Recently I had the honor of speaking at SUGCON Europe 2019. My presentation, entitled "Face Recognition and Personalization: Data-driven Marketing in a Brick and Click Store", drew an amazingly engaged audience that raised several questions during the Q&A.
The most recurrent questions were about privacy concerns in light of the GDPR and similar regulations, so I decided to write this blog post to consolidate what I have been researching on the subject.
The main reason people are so scared of Face Recognition Technology (FRT), pushing them to demand regulations such as the GDPR, is the sheer capability of the technology, which can be both powerful and dangerous.
There are several positive applications for FRT. You can quickly enumerate a few, such as fast check-ins at events, personalized product recommendations, non-intrusive authentication, etc. Beyond the obvious, there is a lot more you can do. A store can use FRT to track what people are doing, including their moods, and use Artificial Intelligence to detect and react to certain patterns: the manager can be warned when the cashier's queue is getting too long, or when a client is waiting to be served by a salesperson.
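To make the queue example concrete, here is a minimal sketch of the idea, assuming OpenCV and a camera pointed at the cashier's area; the face detector, the threshold, and the alert mechanism are all placeholders, not a production design:

```python
import cv2

# Load OpenCV's bundled Haar cascade for frontal face detection.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

QUEUE_ALERT_THRESHOLD = 5  # hypothetical: alert when 5+ people are queuing

def count_faces(frame):
    """Return the number of faces detected in a BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

capture = cv2.VideoCapture(0)  # camera pointed at the cashier's queue
while True:
    ok, frame = capture.read()
    if not ok:
        break
    if count_faces(frame) >= QUEUE_ALERT_THRESHOLD:
        # In a real store this would notify the manager (push, SMS, etc.).
        print("Queue is getting long - warn the manager!")
```

In practice you would restrict detection to the queue region and debounce the alert, but the pattern - detect, count, react - is the whole idea.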
AI-powered FRT can also be used for surveillance purposes: locating fugitives from justice, reacting quickly when crimes occur, or even preventing suicides. Too far? Check out this article from Nicole Lindsey, where she describes a few interesting stories:
There are a good number of commercial FRT surveillance products out there, such as Panasonic's Facial Recognition and FaceFirst's Surveillance FRT. As FRT becomes more widely used, the push for regulation follows as a consequence.
Recently, Apple had to respond to U.S. Senator Al Franken's questions about the Face ID authentication technology, with concerns about privacy, security, and whether FRT will perform equally well on different groups of people. And the senator is right to be concerned: not long after Face ID was released, a group of hackers managed to fool the face authentication using a $150 3D mask, becoming the first to demonstrate this vulnerability. Sometime later, researchers from a group of universities and Alibaba Inc. created a baseball cap that can bypass face authentication by using infrared LEDs to project dots onto strategic spots - and the hack applies to other face recognition solutions as well.
Architecturally speaking, there is one more vulnerability: most face recognition solutions are built on or composed of APIs, which the machine itself calls to perform FR operations. If a machine can access an API, a person can potentially access it too and misuse its features. This could expose biometric data from many users, which can be as bad as leaking their passwords, or even worse.
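As a rough illustration, here is a minimal sketch of one mitigation, assuming a Python/Flask service sitting in front of the FR engine; the endpoint, header name, and `run_face_match` helper are all hypothetical. The point is to authenticate every caller and return only a decision, never raw biometric data:

```python
import hmac
import os

from flask import Flask, abort, jsonify, request

app = Flask(__name__)
# Hypothetical shared secret; a real deployment would add per-client keys,
# key rotation, rate limiting, and audit logging on top of this.
API_KEY = os.environ["FR_API_KEY"]

def run_face_match(image_bytes: bytes) -> bool:
    """Placeholder for the actual face recognition engine."""
    return False

@app.route("/match", methods=["POST"])
def match():
    # Reject any caller that cannot prove it is the trusted machine.
    supplied = request.headers.get("X-Api-Key", "")
    if not hmac.compare_digest(supplied, API_KEY):
        abort(401)

    image = request.files.get("image")
    if image is None:
        abort(400)

    # Return only a yes/no decision: never echo back face embeddings or
    # other biometric templates that a malicious caller could harvest.
    return jsonify({"matched": run_face_match(image.read())})
```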
The GDPR introduces new obligations on how personal data must be processed. In this legislation, the definition of personal data is very broad: at a high level, any data that can be used to identify an individual is considered personal and must be protected. This includes biometric information, such as fingerprints and pictures, which directly affects FRT. Processing this kind of information is prohibited - unless, of course, certain conditions are met.
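For example, if your FRT solution stores face embeddings, one common safeguard (a sketch, assuming Python's cryptography package; key management details omitted) is to encrypt the templates at rest so that a database leak does not expose usable biometric data:

```python
from cryptography.fernet import Fernet

# In production the key would come from a secrets manager, never from code.
key = Fernet.generate_key()
fernet = Fernet(key)

def protect_embedding(embedding: bytes) -> bytes:
    """Encrypt a face embedding before writing it to storage."""
    return fernet.encrypt(embedding)

def recover_embedding(token: bytes) -> bytes:
    """Decrypt an embedding for a matching operation, then discard it."""
    return fernet.decrypt(token)

# Treat the raw embedding like a password: it never touches disk in clear.
stored = protect_embedding(b"example-embedding-bytes")
assert recover_embedding(stored) == b"example-embedding-bytes"
```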
Some of the lawful bases that permit the processing of personal data are listed in Article 6 of the GDPR (extended by Article 9):

- Express consent of the individual for one or more specific purposes;
- Processing necessary for the performance of a contract;
- Compliance with a legal obligation;
- Protection of the vital interests of the data subject or another person;
- Performance of a task carried out in the public interest;
- Legitimate interests pursued by the controller or a third party.
FRT applications will generally rely on the first: express consent of individuals. This kind of consent must be freely given, specific, informed, and unambiguous, indicating that the individual agrees to have their personal data processed. In the context of any FRT application, this means the individual must be registered beforehand and have given consent. The individual also needs to be informed whenever they are in a place where face recognition is happening. For additional reading on consent, check the guidance published by the UK's ICO.
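To illustrate what this means in code, here is a minimal sketch of a consent record an FRT application might keep and check before every operation; the field names and purposes are assumptions for the example, not a prescribed schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional

@dataclass
class ConsentRecord:
    """One individual's express consent, tied to a specific purpose."""
    subject_id: str
    purpose: str                  # consent must be specific, e.g. "event check-in"
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    def is_valid(self) -> bool:
        # Consent can be withdrawn at any time, so check on every operation.
        return self.withdrawn_at is None

def may_process(consents: Dict[str, ConsentRecord],
                subject_id: str, purpose: str) -> bool:
    """Process a face only if express consent exists for this exact purpose."""
    record = consents.get(subject_id)
    return record is not None and record.purpose == purpose and record.is_valid()

# Usage: the individual registers and consents first; the check runs before
# any face recognition happens.
consents = {"alice": ConsentRecord("alice", "event check-in",
                                   datetime.now(timezone.utc))}
assert may_process(consents, "alice", "event check-in")
assert not may_process(consents, "alice", "marketing analytics")
```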
Along with the obligations of Articles 6 and 9, there is more in the GDPR to be concerned with in the context of FRT applications:
It is a fact that the GDPR and other regulations bring challenges for data controllers aiming to use FRT solutions. It is possible, however, to minimize the risks and comply with privacy regulations if you respect the following guidelines:
If you want further information, here is a great article with details about GDPR in the context of FRT.