26 May 2021





Here's Looking at You, Kid



Facial recognition and identification are no longer reserved for sci-fi movie screens; they have found a home in the real world. Technology created to easily identify people may seem helpful until, just like in the movies, it falls into the wrong hands.


Facial verification involves scanning facial features and matching them against a known subject. It is used for biometric security in a recognition function. For example, it makes logging into a laptop or smartphone faster, and seemingly more secure.

Facial recognition is used to match an individual against a known database. Law enforcement uses it to identify alleged perpetrators of crimes, often resulting in cases of mistaken identity. This is its most problematic function.

According to Lionbridge AI, the technology works this way:
1. An input image is fed to the algorithm.
2. The algorithm creates a facial embedding for the input image.
3. The algorithm compares the input image’s facial embedding to the embeddings of known faces in the database.
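The three steps above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual pipeline: the embeddings here are tiny toy vectors (real systems produce vectors of 128 or more dimensions from a neural network), and the distance threshold is an assumed value chosen for the example.

```python
import numpy as np

def match_face(input_embedding, database, threshold=0.6):
    """Step 3: compare the input image's embedding to the embeddings of
    known faces, returning the closest match if it is within the threshold."""
    best_name, best_dist = None, float("inf")
    for name, known_embedding in database.items():
        dist = np.linalg.norm(input_embedding - known_embedding)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    # No database entry is close enough: report no match rather than force one.
    if best_dist >= threshold:
        return None, best_dist
    return best_name, best_dist

# Toy 4-dimensional embeddings standing in for steps 1-2 (image -> embedding).
database = {
    "alice": np.array([0.1, 0.9, 0.2, 0.4]),
    "bob":   np.array([0.8, 0.1, 0.7, 0.3]),
}
probe = np.array([0.12, 0.88, 0.21, 0.41])  # embedding of the input image
name, dist = match_face(probe, database)
```

The threshold is the crux: set it too loosely and the system returns confident-looking matches for faces that are not in the database at all, which is exactly the failure mode behind the misidentifications discussed below.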

Notably, these algorithms frequently fail to correctly identify Black people and other people of color. A 2019 study found racial bias in the technology, with BIPOC individuals misidentified more often than non-BIPOC individuals.

Police departments and localities are bearing the cost of these misidentifications as they face lawsuits. Beyond the financial burden, and given the risk of injury during an arrest, including a false arrest that follows an incorrect match, the use of facial recognition technology represents an unacceptable health and safety risk for Black people.

As a commentary, it is mind-boggling to rely on technology to make decisions that affect people, especially one that is inherently flawed. (I mean really, how many of us would continue to even use a keyboard with keys that stick?)

In addition to flaws in recognition, there are also privacy concerns. Microsoft deleted its 100-million-image facial recognition training and testing database in 2019 after a Financial Times investigation revealed that the images were collected without the subjects' consent. Duke University and Stanford also took down their databases. More recently, a facial recognition company used by U.S. law enforcement has faced a growing number of complaints over privacy issues.

Although facial recognition has great promise, as long as the flaws exist, this technology should not be relied upon in matters of life and death, or liberty.





Black men have been incorrectly identified by facial recognition technology used by law enforcement.