I Saw You! Or Maybe I Didn’t

Unease about the growing use of facial recognition technology has clustered around a set of by-now familiar problems: higher error rates for non-white faces; inadequate training of the personnel who “read” the results; adoption of privacy-invading systems without public knowledge or input; and little ongoing monitoring of accuracy.

Despite these concerns, use of the technology is expanding in law enforcement, border control, and airport screening, and even in business, retail included. Remember Rite Aid? How’s it really being used? What are the benefits, and what are the unintended consequences? These questions create a fertile arena for authors of crime fiction.

Properly implemented, facial recognition could help make policing more efficient. It has been used to identify a number of the January 6 rioters, whose participation was then verified by other evidence (often their own Facebook posts). The UK, whose cities are blanketed with camera surveillance, nevertheless still values the human element: a cadre of super-recognizers who “never forget a human face.” That would not be me.

Unfortunately, an identification facilitated by facial recognition technology sometimes trumps other evidence gathered in more traditional ways. The arrestee’s alibi, for example. No matter how sophisticated the technology becomes, it remains true that a person cannot be in two places at once.

Police departments say the technology is used only to generate leads, and that no one should be charged with a crime until there is corroborating evidence. In practice, though, the algorithm’s output often marks not the beginning of an investigation but its end. That perception was borne out by a Washington Post story yesterday: a Texas man has filed a lawsuit against Macy’s (and others) claiming that overconfidence in the technology led to his arrest by the Houston police. While in custody, he was sexually assaulted. His is one of a half-dozen wrongful-arrest cases ongoing around the country.

In the U.S., a growing number of state and local law enforcement agencies have available to them the faces of nearly half the adult population. These photos come from various sources, including billions scraped from social media, as well as government-issued I.D. cards, mug shots, driver’s licenses, and the like. (In that case I don’t need to worry, because my driver’s license photo looks nothing like me! Though it probably matches up with the eighty or so “nodal points” that define a particular face.) Georgetown Law’s Center on Privacy & Technology calls this vast database “the perpetual line-up.” And you’re likely in it, no matter how law-abiding you are.

Maybe you’re thinking, “So what? I’m not a criminal. This doesn’t affect me.” At least not until there’s a misidentification. Crime fiction writers should have a field day with this one. It’s one thing to attempt to verify someone’s identity during a traffic stop or arrest; it’s quite another to use the database for a fishing expedition after the fact. And fish will be caught, possibly by running grainy, out-of-focus, out-of-date candid selfies against the database to create a list of possible matches. For a while, Facebook identified individuals in the photos on our news feeds. My friend’s wife was consistently identified as me. I didn’t think we looked at all alike, but the algorithm did, so I understand how real misidentification is.
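For the technically curious, here is a minimal sketch of the basic idea behind that kind of mix-up, in Python with entirely made-up numbers: reduce each face to a short vector of “nodal point” measurements, then declare a “match” whenever the distance between two vectors falls below a cutoff. The threshold, the four-measurement faces, and the values themselves are all hypothetical, not any vendor’s actual algorithm; the point is only to show why two people who look nothing alike to us can still land inside the same match radius.

```python
import math

# Hypothetical illustration: each face is reduced to a short vector of
# "nodal point" measurements (real systems use ~80 such points, or
# learned embeddings with hundreds of dimensions).
def distance(a, b):
    """Euclidean distance between two face vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

MATCH_THRESHOLD = 0.6  # invented cutoff; real systems tune this trade-off

# Made-up measurements, for illustration only.
my_license_photo = [0.52, 0.31, 0.77, 0.45]
me_today         = [0.50, 0.33, 0.74, 0.47]  # same person, years later
friends_wife     = [0.55, 0.35, 0.71, 0.49]  # a different person entirely

for name, face in [("me today", me_today), ("friend's wife", friends_wife)]:
    d = distance(my_license_photo, face)
    verdict = "MATCH" if d < MATCH_THRESHOLD else "no match"
    print(f"{name}: distance {d:.3f} -> {verdict}")

# Both fall under the cutoff, so the algorithm "matches" my friend's wife
# to my license photo even though humans see two different faces. Loosen
# the threshold (say, to handle grainy probe photos) and the candidate
# list only grows longer.
```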

Police departments in several major American cities are experimenting with street surveillance cameras that can continuously scan the faces of people in real time. More than a whiff of China here. The People’s Daily has reported that China’s facial recognition system needs only one second to scan the faces of its 1.4 billion people.

Warrants aren’t required for a search of facial databases, and the investigations aren’t necessarily limited to serious crimes. Defendants may never be told that it was an algorithm, not a human witness, that identified them. People who don’t trust the justice system may prefer to take a plea deal rather than have their case tried in court and risk a potentially longer sentence. This means the true rate of false-positive identifications is unknowable. All these aspects of the technology and its implementation, good and bad, lend themselves to situations crime writers can exploit.

Graphic by Mike MacKenzie (www.vpnsrus.com) under a Creative Commons 2.0 Generic license.