Privacy Concerns with Face Recognition Technologies

Posted by Rodrigo Peplau

Recently I had the honor of speaking at SUGCON Europe 2019. My presentation, entitled "Face Recognition and Personalization: Data-driven Marketing in a Brick and Click Store", drew an amazingly engaged audience that raised several questions during the Q&A.

The most recurrent questions were about privacy concerns in light of GDPR and other similar regulations, so I decided to write this blog post to consolidate what I have been researching on the subject.

Why regulate FRT?

The main reason people are so wary of FRT, and why they push for regulations such as GDPR, is the technology's vast potential: it can be both powerful and dangerous.

There are several positive applications for FRT. You can quickly enumerate a few, such as fast check-ins at events, personalized product recommendations, and non-intrusive authentication. Beyond the obvious, there is a lot more you can do. A store can use FRT to track what people are doing, including their moods, and use Artificial Intelligence to detect and react to certain patterns. The manager can be warned and act when the cashier's queue is getting too long, or when a client is waiting to be served by a salesperson.

AI-powered FRT can also be used for surveillance purposes: locating fugitives from justice, reacting quickly when crimes occur, or even preventing suicides. Too far? Check out this article from Nicole Lindsey, where she describes a few interesting stories:

  1. The Metropolitan Police of London experimented with FRT and got into trouble with a man who covered his face to avoid the cameras: he was stopped, and fined after swearing and becoming hostile.

  2. The US Secret Service is running an experiment with FRT surveillance, to be completed by the end of summer 2019. The experiment involves scanning the faces of all people walking around the White House perimeter, to detect potential "people of interest" (criminals and terrorists).

  3. San Francisco is considering an outright ban on FRT surveillance. The “Stop Secret Surveillance” bill is being brought to a vote, as concerns mount that facial recognition surveillance unfairly targets and profiles certain members of society, especially people of color.

There are a good number of commercial FRT surveillance products out there, such as Panasonic's Facial Recognition and FaceFirst's Surveillance FRT. As FRT becomes more widely used, the push for regulation follows as a natural consequence.

Vulnerabilities

Recently, Apple had to respond to U.S. Senator Al Franken's questions about its Face ID authentication technology, with concerns about privacy, security, and whether FRT performs equally well on different groups of people. And the senator is correct in his concerns: not long after the Face ID technology was released, a group of hackers managed to fool the face authentication using a $150 3D mask, becoming the first to demonstrate this vulnerability. Some time later, researchers from a group of universities and Alibaba Inc. created a baseball cap that can bypass face authentication by using infrared LEDs to project dots onto strategic spots - and the hack applies to other face recognition solutions as well.

Architecturally speaking, there is one more vulnerability: most face recognition solutions are based on, or composed of, APIs that the machine itself accesses to perform FR operations. If a machine can access the API, a person can potentially access it too and misuse its features. This would expose biometric data from several users, which can be as bad as leaking their passwords, or even worse.
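To make the risk concrete, here is a minimal sketch of the pattern, assuming a hypothetical face-recognition endpoint and an API key shipped inside the client application (both names are illustrative): anyone who extracts the key can call the same API the machine uses.

```python
# A minimal sketch: if the client authenticates with a static, embedded key,
# any user who extracts that key can query the API directly. The endpoint
# and key below are hypothetical.
import requests

API_URL = "https://frt.example.com/v1/detect"  # hypothetical endpoint
API_KEY = "key-shipped-inside-the-client"      # extractable by anyone with the app

def detect_faces(image_path: str) -> dict:
    """Send an image to the (hypothetical) face-recognition API."""
    with open(image_path, "rb") as image:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"image": image},
        )
    response.raise_for_status()
    # The response may include biometric metadata (age, gender, mood,
    # face embeddings) - exactly the data GDPR treats as sensitive.
    return response.json()
```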

How GDPR applies to FRT

GDPR introduces new obligations on how personal data must be processed. In this legislation, the definition of personal data is very broad: at a high level, any data that can be used to identify an individual is considered personal and must be protected. This includes any biometric information, such as fingerprints and pictures, directly affecting FRT. Processing this kind of information is prohibited - unless, of course, certain conditions are met.

Some of the lawful bases that permit the processing of personal data are listed in Article 6 of GDPR (extended by Article 9):

  1. Express consent of individuals

  2. Legitimate interests not outweighed by the rights of individuals

  3. Processing necessary for the performance of a contract

  4. Processing necessary to comply with a legal obligation

Consent

FRT applications will generally rely on the first: express consent of individuals. This kind of consent must be freely given, specific, informed and unambiguous, indicating the individual genuinely agrees to have personal data processed. In the context of any FRT application, this means the individual must have previously registered and given consent. The individual also needs to be informed when they are in a place where face recognition is happening. For additional reading on consent, check the guidance published by the UK's ICO.
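In practice, this usually means recording consent per individual and per purpose, with enough detail to demonstrate it later. Below is a minimal sketch of such a record, assuming a hypothetical data model (the class and field names are illustrative):

```python
# A minimal sketch of purpose-specific consent records (hypothetical model).
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class ConsentRecord:
    contact_id: str
    purpose: str                        # e.g. "face-login", "store-recommendations"
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.revoked_at is None

def has_consent(records: List[ConsentRecord], contact_id: str, purpose: str) -> bool:
    """Process faces only when an active, purpose-specific consent exists."""
    return any(
        r.contact_id == contact_id and r.purpose == purpose and r.active
        for r in records
    )
```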

Other obligations

Along with the obligations of Articles 6 and 9, there are more GDPR requirements to consider in the context of FRT applications:

  1. Transparency - The GDPR core states that data controllers must process personal data "lawfully, fairly and in a transparent manner", and must also be able to demonstrate compliance with this statement;
  2. Profiling / Right to object to processing - Article 21 describes the individual's right to "object to processing". Individuals can object to any processing entirely or, less restrictively, refuse only solely-automated processing or the use of their data for direct marketing.

Conclusions

It is a fact that GDPR and other regulations bring challenges for data controllers aiming to use FRT solutions. It is possible, however, to minimize the risks and comply with privacy regulations if you respect the following guidelines (a short code sketch after the list ties them together):

  1. If you don't have express agreement from the contact, you should skip processing and discard all images and metadata obtained. This applies both to known contacts without an agreement and to unknown contacts.
    Without express and specific agreement, you should never do things such as:
    1. Create and track contacts using images captured without consent;
    2. Save pictures of unknown detected faces;
    3. Offer personalized content based on face recognition metadata, such as mood, gender, age, ethnicity, etc.

  2. Authorization should be explicit and specific for each use case - instead of having a single checkbox permitting all kinds of processing, you should have individual permissions for different applications.
    For instance, you can have different permission settings for:
    1. Executing face-login at the website;
    2. Being scanned in a physical store for personalized recommendations;
    3. Enrolling in a promotion campaign that requires face recognition.

  3. Make it easy to revoke permissions (and respect the revocation):
    1. Offer a user-friendly way to quickly review and revoke all permissions given;
    2. When permissions are fully revoked, make sure you clean up everything related to face recognition for that user;
    3. Just like anonymous users, these users should then be ignored during face recognition.

  4. Always use face recognition for the user's benefit - you will earn more goodwill if the technology is used to offer advantages, such as:
    1. Fast track for events, appointments, and so on;
    2. Personalized recommendations;
    3. Discounts and promotions.
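
To tie the guidelines together, here is a minimal sketch in Python. The `consent_store`, `face_db`, and `recommender` services are hypothetical placeholders, not from any real product: unknown or unconsented faces are discarded, consent is checked per use case, and a full revocation purges all face-recognition data.

```python
# A minimal sketch of guidelines 1-3, using hypothetical helper services.
PURPOSES = {"face-login", "store-recommendations", "promo-campaign"}

def on_face_detected(frame, consent_store, face_db, recommender):
    """Discard unknown or unconsented faces; personalize only with consent."""
    match = face_db.identify(frame)                 # hypothetical FR lookup
    if match is None:
        return None                                 # unknown contact: discard images and metadata
    if not consent_store.has_consent(match.contact_id, "store-recommendations"):
        return None                                 # known but unconsented: treat as anonymous
    return recommender.recommend(match.contact_id)  # consented: offer personalized content

def revoke_all(contact_id, consent_store, face_db):
    """Full revocation: drop every permission and purge all FR data."""
    for purpose in PURPOSES:
        consent_store.revoke(contact_id, purpose)
    face_db.delete_templates(contact_id)            # remove images, templates, metadata
```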

Further reading

If you want further information, here is a great article with details about GDPR in the context of FRT.

 
