
A Flawed Facial Recognition System Sent This Man to Jail


In January, Detroit police arrested and charged 42-year-old Robert Williams with stealing $4,000 in watches from a retail store 15 months earlier. Taken away in handcuffs in front of his two children, Williams was sent to an interrogation room where police presented him with their evidence: Facial recognition software matched his driver’s license photo with surveillance footage from the night of the crime.

Williams had an alibi, The New York Times reports, and immediately denied the charges. Police pointed to the image of the suspect from the night of the theft. It wasn’t him. “I just see a big black guy,” he told NPR.

Williams spent the next 30 hours in custody before he was released on bail. With seemingly no other evidence of Williams’ involvement, police eventually dropped the charges. On Wednesday, Williams joined with the ACLU of Michigan to file a complaint against the Detroit Police Department, demanding they stop using the software in investigations.

Williams’ arrest may have been the first in the US to stem from a faulty facial recognition match. But it wasn’t a simple case of mistaken identity. It was the latest link in a chain of investigative failures that critics of law enforcement’s use of facial recognition have warned about for years.

Privacy scholars and civil liberties groups have criticized facial recognition technology because, among other things, it is less accurate on people with darker skin. That’s led cities from San Francisco to Cambridge, Massachusetts, to ban or limit use of the tool; the Boston City Council voted to ban the technology on Wednesday.


It’s best not to think of facial recognition as a single tool but as a multistep process that relies on both human and algorithmic judgment. Critics have spotlighted privacy issues at each step; in Williams’ case, the lack of safeguards led to an avoidable arrest.

Michigan State Police used facial recognition software to compare surveillance footage from the theft to a state database of 49 million images, including Williams’ driver’s license photo. People don’t knowingly opt in to having their images used this way, yet roughly half of all US adults have their photos in a database that police can search. Police around the US have also used social media photos, witness sketches, and even 3D renderings to match against crime scene photos.

The practice is especially pernicious when the databases include photos of people who were arrested but never charged or convicted of a crime. In New York, for example, police have come under fire for using mugshots from stop-and-frisk arrests as part of “probe photo” searches, even though stop-and-frisk was outlawed.

Williams’ photo seemingly became the main lead in the case against him. The Michigan State Police report on the match says facial recognition matches are “not probable cause” to arrest someone. The state police guidelines say facial recognition is not a “form of positive identification” and should be considered “an investigative lead only.”

After the “match,” investigators sought evidence that would corroborate the case against Williams. The Times reports that police didn’t check whether Williams had an alibi or examine his phone records; instead, they asked an outside security consultant, a woman who was not in the store at the time of the theft, whether Williams was the man in the surveillance footage. Her answer was enough to prompt the arrest.


While federal research has found that facial recognition often performs less accurately on people with darker skin, critics also contest the very definition of a “match.”


The Times reports that when Williams’ photo was scanned, the software would’ve returned a list of potential matches alongside respective “confidence scores,” the algorithm’s estimate of how likely each photo was to show the same person as the surveillance footage. Where the confidence threshold is set determines what counts as a match. When the ACLU reported that Amazon’s Rekognition falsely matched members of Congress to a mugshot database, Amazon replied that the report used too low a threshold. Amazon said it recommends a 99 percent confidence threshold; the ACLU had set the threshold at 80 percent.
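To make the role of that threshold concrete, here is a minimal, hypothetical sketch in Python of how a gallery search might rank candidates by a confidence score and how moving the cutoff from 80 to 99 percent changes which candidates get reported. It is not any vendor’s actual API; the embeddings, names, and numbers are all illustrative stand-ins.

import numpy as np

def search_gallery(probe, gallery, threshold):
    """Rank gallery faces by similarity to the probe image and keep those above the threshold."""
    results = []
    for person_id, embedding in gallery.items():
        # Cosine similarity stands in for a vendor's calibrated "confidence score."
        score = float(np.dot(probe, embedding) /
                      (np.linalg.norm(probe) * np.linalg.norm(embedding)))
        results.append((person_id, score))
    results.sort(key=lambda r: r[1], reverse=True)
    return [(pid, s) for pid, s in results if s >= threshold]

# Illustrative data: the probe stands in for the surveillance frame, the gallery
# for enrolled license photos, and a few entries are deliberate "look-alikes."
rng = np.random.default_rng(0)
probe = rng.normal(size=128)
gallery = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
for i in range(15):
    gallery[f"lookalike_{i}"] = probe + rng.normal(scale=0.1 + 0.05 * i, size=128)

# The same search yields far more "matches" at an 80 percent cutoff than at 99 percent.
print("candidates at 0.80:", len(search_gallery(probe, gallery, 0.80)))
print("candidates at 0.99:", len(search_gallery(probe, gallery, 0.99)))

In this toy setup the 80 percent cutoff returns many more candidates than the 99 percent cutoff does, which is the substance of Amazon’s objection: what counts as a “match” depends heavily on where the threshold is set.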


