Facial recognition is a scam, but that doesn’t mean we should underestimate the threat it poses to freedom and fundamental human rights.
Case in point: the Russian media outlet MBKh discovered that Moscow police officers illegally “monetize” footage from 175,000 surveillance cameras on forums and in messenger groups. For the equivalent of $470, anyone can access facial recognition lookup services that, when provided with the picture of an individual, match it against passersby captured by hundreds of cameras, along with a list of addresses and times they were caught on camera.
Interestingly enough, face recognition tech *does not work*, and the journalist grudgingly admits its low accuracy:
“As for the accuracy of the results, none of the photos returned were of the investigator. However, the facial features were similar to the input and the system assessed a similarity of 67%.”
According to the journalist, the explanation for this suboptimal performance is “the limited number of cameras connected to the face recognition algorithm”. Apparently, the sample is too small, but the technology is not fundamentally called into question…
Another explanation (often ignored by journalists, ever ready to believe the AI hype and therefore to disregard its actual dangers) emerges from the New York Times’s 2018 coverage of China’s surveillance-industrial complex: mass surveillance systems are not automated per se, and rely largely on the intervention of crowds of micro-workers who sift through millions of videos, cut out silhouettes of individuals, tag metadata, fill in databases and annotate information:
“The system remains more of a digital patchwork than an all-seeing technological network. Many files still aren’t digitized, and others are on mismatched spreadsheets that can’t be easily reconciled. Systems that police hope will someday be powered by A.I. [emphasis added] are currently run by teams of people sorting through photos and data the old-fashioned way.”
“The old-fashioned way” here means “by hand”… Data annotation, triage and enrichment, especially for the computer vision models underlying face recognition algorithms, is a blossoming market. Recent research by Bonnie Nardi, Hamid Ekbia, Mary Gray, Sid Suri, Janine Berg, Six Silberman, Florian Schmidt, Trevor Paglen, Kate Crawford, Paola Tubaro and myself documents its development in sectors as diverse as home automation, transportation, advertising, health, entertainment… and the military. It is based on a workforce of hundreds of millions of online laborers, alternatively called microworkers or crowdworkers. They work long hours, under precarious contracts and exploitative working conditions, and are paid very low wages (in some cases less than a cent per micro-task). Although they are present in the global North, they are predominantly based in developing and emerging economies—such as Russia and China. But the companies that recruit them to package their annotated data and resell it as surveillance technologies are mainly located in so-called liberal democracies. Despite China’s market supremacy, US, French, Japanese, Israeli and Finnish corporations are spreading these technologies all over the world, according to the 2019 AI Global Surveillance Index.
Despite the importance of these “humans in the loop”, who constitute the secret ingredient of AI-based technological innovation, the threats of facial recognition, smart cities and predictive policing must not be minimized. The glorification of AI turns it into a powerful psychological deterrent and a disciplinary device. “The whole point,” explains an expert interviewed by the New York Times, “is that people don’t know if they’re being monitored, and that uncertainty makes them more obedient.”
Any action aimed at fighting the alleged omnipotence of these technologies begins with the recognition of their fictitious nature. If automated surveillance is made up of men and women who train, control and impersonate “artificial artificial intelligence”, it is from the awareness of their role in a dystopian and inhuman system that change will come.