Facial recognition technology is currently the subject of heated debate worldwide. On the one hand, the technology is seen as highly efficient: it is fast and (argued to be) objective in performing its tasks. On the other hand, there is a concern that the biases and inaccuracies inherent in these systems may lead to discrimination and jeopardise fundamental human rights, such as our freedom to protest against measures we disagree with.
What is facial recognition and how does it work?
The meaning of facial recognition is perhaps intuitive, but it is worth delving into some detail about what the technology really entails. The UK’s Information Commissioner’s Office defines facial recognition as “the process by which a person can be identified or otherwise recognized from a digital image.” Just like an ID card, your face can serve as proof of identity.
Facial recognition is used for two purposes: identification and categorisation. Identification works by matching live or recorded digital footage to pre-existing images in a database. Categorisation is more wide-reaching: the technology can automatically sort people into categories according to identifiable features such as age, gender, weight or even presumed sexual orientation, without the involvement of a real person.
The algorithm organises people into categories by comparing images to the “statistical representation of the average of that category.” If available on a wide scale, facial recognition technology therefore raises the possibility of biometric mass surveillance. That’s because CCTV and other cameras, ubiquitous in both public and private spaces, can be upgraded with facial recognition capabilities. Facial recognition technology is already deployed by law enforcement and private actors in a wide range of places, including schools and train stations.
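As a rough illustration of the two modes described above, the sketch below shows identification as matching a face against a database of enrolled images, and categorisation as comparison against per-category averages. Everything here is invented for illustration: real systems use trained models to turn a face image into a numeric "embedding" vector, whereas the vectors, names, categories and threshold below are made up.

```python
import math

def dist(a, b):
    """Euclidean distance between two face-embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical enrolled identities and their stored embeddings. A real
# system would derive these vectors from face images with a trained model.
DATABASE = {
    "person_a": (0.9, 0.1, 0.0),
    "person_b": (0.1, 0.8, 0.3),
}

# Each category is represented by the "statistical average" of its members.
CATEGORY_AVERAGES = {
    "category_1": (0.8, 0.2, 0.1),
    "category_2": (0.1, 0.7, 0.4),
}

def identify(embedding, threshold=0.5):
    """Identification: match against enrolled images; None if nothing is close."""
    name, d = min(((n, dist(embedding, e)) for n, e in DATABASE.items()),
                  key=lambda pair: pair[1])
    return name if d < threshold else None

def categorise(embedding):
    """Categorisation: assign the category whose average embedding is closest."""
    return min(CATEGORY_AVERAGES,
               key=lambda c: dist(embedding, CATEGORY_AVERAGES[c]))

probe = (0.85, 0.15, 0.05)   # embedding of a face seen on camera
print(identify(probe))       # prints "person_a" (the closest enrolled match)
print(categorise(probe))     # prints "category_1" (the closest average)
```

Note that categorisation needs no database of known individuals at all, which is why it scales so easily: every face in view can be sorted, not just faces that were previously enrolled.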
Worryingly, more often than not our images are uploaded to facial recognition databases without our awareness or consent. For example, it was recently discovered that Clearview AI, a US technology firm, had scraped over three billion facial images from social media platforms such as Facebook, YouTube and Twitter, and sold access to this data to governments and private firms.
Such incredibly private information is currently being used by law enforcement to make policing decisions, and might in the future be employed by public or private bodies to make decisions about, for example, your suitability for a job, a university course or a loan, your ability to travel overseas, or your entry into a festival.
How much does facial recognition affect everyday life?
Whilst facial recognition technology may seem akin to science fiction, it is quietly creeping into the European policy mainstream. After trialling facial recognition at Berlin-Südkreuz station since 2018, Germany is planning to roll out biometric surveillance systems in a further 134 train stations and 14 airports, despite criticism. France also relies heavily on facial recognition technologies, with President Macron aiming to catch up with China and the U.S.
The Italian police have also been deploying the Automatic Image Recognition System (SARI) since 2017; rather than targeting just a few individuals, it processes the biometric data of everyone present in a monitored space in order to identify certain individuals. Recently, Belgrade (Serbia) installed hundreds of facial recognition cameras, creating a permanent surveillance system. Unbeknownst to much of the general public, facial recognition is becoming increasingly common throughout Europe. Fortunately, recent legal rulings in some countries have found such facial recognition systems to be unlawful.
What are the Pros?
Face recognition is promoted as something that makes our lives more convenient. Instead of having to enter a password into our phone, for example, or show our ID at an airport, our faces will be enough to verify who we are. At four train stations in Osaka, facial recognition systems were implemented to let people pass simply by scanning their faces, without a ticket or an ID card. A transport official noted that they would “emphasize as an advantage the fact that passengers with large luggage will be able to pass gates simply by showing their faces instead of looking for tickets”.
While the internal processes behind biometric authentication are technical, from a user’s point of view it is incredibly easy and quick. Placing a finger on a scanner and unlocking an account in seconds is faster than typing out a long password full of special characters. And while forgetting a password is something most users do at some point, the chances of you forgetting your own biometrics? 0%!
Another argument is that facial recognition helps ensure security. Biometrics are said to provide increased levels of assurance that a person trying to access a service or make a transaction is real. Those in favour of using biometrics point out that passwords, PINs and other personal identifying information can be compromised by data breaches, allowing fraudsters to access accounts that use traditional authentication methods. In contrast, it’s more difficult for someone other than you to provide your fingerprint on the spot. When it comes to law enforcement, those favouring facial recognition argue that it allows the police to track down suspects more easily.
What about the Cons?
Facial recognition fails to respect our rights in more than one way. First, the technology intrusively acquires some of our most intimate data. Because our data belongs to us (as recognised, for example, under the General Data Protection Regulation), we have the right to know when personal information about us is collected and for what purpose.
Second, knowing that we can be easily identified can lead us to self-censor for fear of negative consequences. For example, people who know that they will be identified and placed in a database for showing up at a protest may be put off attending. This is especially damaging in situations where governments have illegally restricted the right to demonstrate to stifle public criticism.
Third, biometric data carries its own security risk. Unlike a password, biometric data cannot be changed. If your fingerprint data leaks, that is not something you can “reset” like a password. To make things worse, your face can be scanned anytime and anywhere, without your consent. This means that your biometric data might end up stored in a range of databases whose security measures may be inadequate.
Facial recognition technology is also known to work relatively well on white, male faces while having a high rate of inaccuracy for people of colour, especially women. This means that people from ethnic minority groups, who already tend to have poorer access to services and amenities, face an extra barrier when authorities use facial recognition in these contexts.
Not only does face recognition discriminate against people of colour, it is also disproportionately targeted at marginalised individuals. In a law enforcement context, facial recognition is predominantly used in communities that are already over-policed. In Italy, for instance, the police’s facial recognition database contained 2 million images of Italians, compared to 7 million images of refugees and migrants. The way this technology is used is therefore likely to perpetuate the disproportionate targeting of people from certain groups.
Facial recognition might seem attractive for its potential to simplify our lives and enhance public security. But it is more likely to end up eroding our freedoms and choices and reinforcing inequality and discrimination. As it becomes more popular with governments, the law has to catch up and stop this technology from being used at all in most contexts.
Liberties and facial recognition
We at Liberties seek to ban mass surveillance through facial recognition. A few months ago, we joined the ReclaimYourFace movement. If you agree that facial recognition has no place in our public spaces, sign the petition here.