Emerging Technologies

What to know about the EU's facial recognition regulation – and how to comply

Facial recognition is about to be heavily regulated in the EU. Here's how tech providers can successfully comply. Image: REUTERS/Kim Kyung-Hoon

Sébastien Louradour

  • Facial Recognition Technology (FRT) is a central concern of the European Commission's proposed AI regulation.
  • To successfully comply, tech providers will need to build tailored approaches to risk management and quality processes.
  • In partnership with industry, government and civil society, the World Economic Forum has developed an audit framework and certification scheme for tech providers.

The European Commission's (EC) proposed Artificial Intelligence (AI) regulation – a much-awaited piece of legislation – is out. While the text must still go through consultations within the EU before its adoption, the proposal already gives a good sense of how the EU intends to govern the development of AI in the years to come: through a risk-based approach to regulation.

Among the identified risks, remote biometric systems, which include Facial Recognition Technology (FRT), are a central concern of the drafted proposal:

  • AI systems intended to be used for the ‘real-time’ and ‘post’ remote biometric identification of natural persons are considered high-risk systems and would require an ex-ante evaluation of the technology provider to attest to its compliance before gaining access to the EU market, as well as an ex-post evaluation of the technology provider (detailed below).
  • In addition, “real-time” remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement are mostly prohibited, with narrow exceptions related to public safety such as the targeted search for missing persons or the prevention of imminent terrorist threats (detailed in Chapter 2, Article 5, p.43-44). Additional requirements for this use-case include an ex-ante evaluation to grant authorisation to law enforcement agencies: each individual use must be granted by a “judicial authority or by an independent administrative authority of the Member State”, unless it is operated in a “duly justified situation of urgency”. Finally, national laws may determine whether to fully or partially authorise the use of FRT for this specific use-case.

Other use-cases, such as FRT for authentication, are not on the list of high-risk uses and should therefore face a lighter level of regulation.
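The tiered obligations above can be summarised in a small sketch. This is purely illustrative: the type names, use-case labels and boolean fields below are our own simplification for clarity, not terms or structures defined by the proposed regulation.

```python
from dataclasses import dataclass

@dataclass
class Obligations:
    """Simplified view of the obligations attached to an FRT use-case."""
    ex_ante: bool                  # conformity assessment before EU market access
    ex_post: bool                  # post-market monitoring once in operation
    per_use_authorisation: bool    # judicial/administrative approval for each use

# Illustrative mapping of the use-cases discussed above (our own labels).
USE_CASES = {
    "remote_biometric_identification": Obligations(True, True, False),
    "law_enforcement_realtime_public": Obligations(True, True, True),
    "authentication": Obligations(False, False, False),  # lighter regime
}

def required_steps(use_case: str) -> list[str]:
    """List the compliance steps a provider would face for a given use-case."""
    o = USE_CASES[use_case]
    steps = []
    if o.ex_ante:
        steps.append("ex-ante conformity assessment")
    if o.per_use_authorisation:
        steps.append("per-use judicial or administrative authorisation")
    if o.ex_post:
        steps.append("ex-post / post-market monitoring")
    return steps
```

For example, `required_steps("authentication")` returns an empty list under this simplification, while the law-enforcement use-case carries all three obligations.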

Ex-ante and ex-post evaluations required by use-case in the EU's facial recognition regulation

Ex-ante and ex-post evaluation of technology providers

The ex-ante evaluation (conformity assessment of providers) would include:

  • A review of compliance with the requirements of Chapter 2;
  • An assessment of the quality management system, which includes the risk management procedures, and the post-market monitoring system; and,
  • The assessment of the technical documentation of the designated AI system.

Certifying the quality of the processes rather than the algorithm performance

While technology providers must maintain the highest levels of performance and accuracy in their systems, this necessary step is not the most critical for preventing harm. The EC does not set any accuracy threshold to meet; instead, it requires a robust and documented risk-mitigation process designed to prevent harm. Deploying a quality-management system is an important step, as it will require providers to design adequate internal processes and procedures for the active mitigation of potential risks.

A focus on risk management and processes

While it will be up to the technology providers to set up their own quality processes, third-party notified bodies will have the responsibility of attesting providers’ compliance with the new EU legislation.

To succeed, tech providers will need tailored approaches to designing, implementing and running adequate processes. Providers will also have to work closely with users of the system to anticipate potential risks and propose mitigation processes to prevent them.

How to anticipate the coming regulation

Over the past two years, the World Economic Forum has partnered with industry players, government agencies and civil society to draft a proposed policy framework for responsible limits on FRT.

Among our proposed oversight strategies, we have detailed a self-assessment questionnaire, a third-party audit and a certification scheme. The EU’s proposed concept of third-party audit (i.e. conformity assessment) suggests the same model of oversight and allows for rapid scale-up and deployment of certification bodies (i.e. notified bodies) to run the third-party audits across the EU.

The proposed conformity assessment procedure – which reviews compliance with the requirements stated in Title III of the proposed regulation – will first require notified bodies to draft dedicated audit frameworks and certification schemes. These two documents will explain to audited organizations how the certification will play out.

In this regard, we encourage providers to consider the audit framework and certification scheme for the quality management system we’ve detailed in the white paper published in December 2020 in collaboration with the French accredited certification body AFNOR Certification.

Steps of the certification scheme detailed in the white paper - World Economic Forum, Responsible Limits on Facial Recognition Use Case: Flow Management, December 2020

Among the requirements of Title III, providers will need to put in place a risk management system focused on the analysis, anticipation and mitigation processes of potential risks. (We go over a similar structured approach in sections 2 and 3 of our audit framework to build the right risk assessment processes and prevent the occurrence of biases and discrimination.)

The post-market monitoring system defined in Article 61 is a mechanism to ensure that compliance with the requirements is maintained once the system is in operation. This critical point is addressed in the audit framework we’ve designed, which considers three stages of analysis when the third-party audit is carried out:

1. Ensuring the right design of the quality management processes to comply with the requirements;

2. Controlling correct implementation of the processes; and,

3. Validating that the system operates in accordance with the requirements.
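The three stages above amount to a checklist that a provider can track internally ahead of an audit. The sketch below is our own simplification for illustration: the stage names and the shape of the evidence record are assumptions, not the audit framework's actual wording.

```python
# Illustrative pre-audit checklist mirroring the three stages of analysis:
# design of the quality management processes, their implementation, and
# the system's operation against the requirements.
AUDIT_STAGES = [
    ("design", "Quality management processes are designed to meet the requirements"),
    ("implementation", "Processes are correctly implemented by the provider"),
    ("operation", "The deployed system operates in accordance with the requirements"),
]

def audit_readiness(evidence: dict[str, bool]) -> list[str]:
    """Return the stages that still lack supporting evidence."""
    return [stage for stage, _ in AUDIT_STAGES if not evidence.get(stage, False)]
```

A provider that has documented its process design but not yet demonstrated implementation or operation would see `audit_readiness({"design": True})` return the two remaining stages, signalling further remediation before the third-party audit.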

Notified bodies will use certification schemes to conduct conformity assessments. These certification schemes will provide clarity and transparency to providers on how the assessment will be conducted. We have dedicated a chapter of our white paper (Part 4) to explaining how to conduct certification of FRT systems, from the preparation phase to the certification phase and the issuance of the certificate.

An additional way of preparing for a third-party audit is to organize an internal self-assessment beforehand. This exercise will produce materials attesting whether the system is audit-ready or requires further remediation. For the pilot phase of our policy project, we partnered with Narita Airport to draft and test a self-assessment questionnaire for the responsible use of FRT in airports. (Narita's responses are publicly accessible in the appendix of our white paper.)

While the certification scheme and the audit framework we have detailed will have to evolve to comply with this legislation, they already provide good examples to follow.

When it comes to the use of FRT for identification purposes, maximum precaution should be taken. The proposed EU legislation is, in this sense, ambitious – and will help build trust and transparency among EU citizens and allow for the benefits of this technology to be safely deployed.

License and Republishing

World Economic Forum articles may be republished in accordance with the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International Public License, and in accordance with our Terms of Use.

The views expressed in this article are those of the author alone and not the World Economic Forum.
