Facial recognition technology (FRT) is now in use across most U.S. federal agencies.
An Aug. 24 report from the U.S. Government Accountability Office revealed that 19 of 24 federal agencies surveyed were using FRT – 16 agencies for cybersecurity (such as unlocking computers or cell phones), six for direct criminal investigations and five for facility access.
Almost half of the surveyed agencies said they plan to expand their use of FRT in the coming years.
Woody Bledsoe, Helen Chan Wolf and Charles Bisson developed a process in the 1960s by which a computer could recognize human faces with human assistance. In recent decades, the technology they pioneered has become commonplace.
Since Apple introduced Face ID on the iPhone X, millions have used FRT to unlock their smartphones. FRT now appears at U.S. and European airports and in the hands of immigration and customs officers.
The U.S. maintains a massive FRT database containing images of over 117 million Americans, drawn largely from driver’s license databases.
This is nothing compared to the ambitions of the Chinese government, which announced the creation of the Skynet Project in 2006. The system now includes over 20 million cameras that officials claim can scan the entire Chinese population in one second, and the government has used it to spot and apprehend individuals who had evaded police for years.
Despite its widespread use, FRT has not been without controversy.
In 2016, the American Civil Liberties Union revealed that Instagram, Twitter and Facebook had provided user data to Geofeedia, a company that markets surveillance tools to police.
The ACLU report showed that law enforcement was using the Geofeedia system to monitor unrest in real time. For example, during protests following the death of Freddie Gray in 2015, Baltimore police were able to run photos of protesters through a facial recognition system and ultimately arrested individuals with outstanding warrants.
Almost half a decade later, the U.S. government is still expanding the use of FRT even though significant moral questions about the technology have not been adequately addressed.
FRTs raise four main ethical issues:
1. Transparency and Accountability
2. Algorithmic Fairness
3. Informed Consent
4. Lawful Surveillance
First, these systems are not generally understood by the public.
The core recognition algorithms are proprietary and not open to public review. For most people, FRT is a black box.
More importantly, law enforcement and government agencies are using these systems, yet it is unclear who oversees their appropriate use or what is done with the information they collect.
Second, little is known about how FRT algorithms are constructed.
All FRTs are based on algorithms that developers create to teach computers how to recognize faces.
Typically, common facial patterns are used as a base. The machine then locates nodal points and takes measurements. This data is converted into a mathematical representation of the face that can be compared against other faces in a database.
For the computer system to learn to recognize faces, it needs to refine the algorithm using a training data set composed of thousands of photos. Most data sets have been curated from online photos in the public domain.
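The matching step described above can be sketched in a few lines of Python. Everything here is illustrative: the four-number "faceprints," the names in the database and the 0.6 threshold are stand-ins for real systems, which compare hundreds of measurements derived from nodal points.

```python
import math

def face_distance(faceprint_a, faceprint_b):
    # Euclidean distance between two faceprints (lists of
    # nodal-point measurements). Smaller means more similar.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(faceprint_a, faceprint_b)))

def match_face(probe, database, threshold=0.6):
    # Return the identity of the closest enrolled faceprint,
    # or None if nothing in the database is close enough.
    best_label, best_dist = None, threshold
    for label, enrolled in database.items():
        d = face_distance(probe, enrolled)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical enrolled faceprints and a probe image's faceprint.
db = {"person_a": [0.1, 0.5, 0.3, 0.9], "person_b": [0.8, 0.2, 0.7, 0.1]}
probe = [0.12, 0.48, 0.31, 0.88]
print(match_face(probe, db))  # prints "person_a"
```

Note that the threshold embodies a trade-off: set it too loose and the system produces false matches, too strict and it misses true ones. This is exactly where biased training data skews error rates for different groups.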
But what are an algorithm’s core rules, and which data sets were used to train and test it?
The issue of fairness arises from a potential lack of diversity in the data set. For example, if the set is composed mostly of photos of white men, the resulting FRT will misidentify people who do not fit that profile.
Even the best systems tend to misidentify Black subjects at five to 10 times the rate of white male subjects. Such errors could easily lead to wrongful arrests, further eroding the trust that minority communities have in law enforcement.
Third, the public is not being informed about how its personal data is being used in FRT research.
Again, the development of these systems is dependent upon data sets. While many are compiled from images in the public domain, being publicly available does not amount to adequate consent for subjecting people’s images to a research project.
The public is generally unaware that its images are being used to develop technology that may then be used to track everyday activities.
Fourth, there is a desperate need for a serious public discussion about the appropriate uses of FRT.
With regard to missing persons, the technology has great promise. If a child is abducted, then traffic cameras could be used to locate the child in real time before a kidnapper could escape. This could be used to save lives.
On the other hand, a large-scale FRT system like the Chinese Skynet can track everywhere an individual goes and even predict future movements. This raises serious questions of personal freedom and privacy.
FRT has come a long way in 60 years, moving from science fiction to an academic project to widespread applications in the real world.
But like all emerging technology, we need to stop and reflect upon how we as human beings intend to use it.
How might this new technology reinforce discrimination? Though intended to help us live freely, how might FRT instead constrain us? How do we continue to integrate technology into our everyday lives while remaining authentically human?
These are essential questions that need to be asked and answered, particularly as FRT becomes more commonplace in our daily lives.