
Detroit cops overhaul facial recognition policies after wrongful arrest


The Detroit Police Department revised its policies on how it uses facial recognition software to solve crimes as part of a federal settlement with a man who was wrongfully arrested for theft in 2020 based on the technology, authorities said Friday.

Robert Williams was falsely identified as the shoplifter at a Shinola store in October 2018. Fifteen months later, he was arrested in his driveway in front of his family, booked into jail and held for 30 hours before being released on a personal bond, according to his lawsuit.

The settlement between Williams and the city of Detroit was filed in U.S. District Court for the Eastern District of Michigan on Friday. Previously, in May, the Detroit City Council approved paying Williams $300,000 for damages as part of the settlement.

A Detroit detective ran a grainy photo made from poorly lit footage through the department's facial recognition technology, Williams' lawsuit states. In the footage, the shoplifter never looks directly into the camera, the lawsuit said.

The detective sent the poor-quality photo to the Michigan State Police to run a facial recognition search, which returned a possible match to an expired driver’s license photo of Williams, according to the American Civil Liberties Union.

City officials cited "sloppy" detective work for the wrongful arrest, cleared Williams' record and removed his personal information from the police database.

Williams said abuse of facial recognition technology "completely upended my life."

"My wife and young daughters had to watch helplessly as I was arrested for a crime I didn't commit and by the time I got home from jail, I had already missed my youngest losing her first tooth and my eldest couldn't even bear to look at my picture. Even now, years later, it still brings them to tears when they think about it," he said in a statement.

“The scariest part is that what happened to me could have happened to anyone."

National civil rights advocates have decried law enforcement use of the technology for its hazardous misidentifications. At least seven people have been wrongfully arrested across the nation due to "police reliance on incorrect face recognition results," the ACLU said in April. Nearly all of the falsely accused people were Black. Three of the cases were in Detroit, including a woman who was eight months pregnant at the time and arrested in front of her children, the ACLU said.

ACLU: New Detroit policy serves as model for departments across U.S.

The ACLU of Michigan, which sued on behalf of Williams, announced the Detroit police policy changes at a Friday news conference. Among them:

  • Police can't make arrests based on facial recognition results alone, or on the results of photo lineups based on a facial recognition search.
  • Police can't conduct lineups based on facial recognition alone without other independent, reliable evidence linking the suspect to a crime.
  • Police must disclose the flaws of facial recognition technology and when it is used in an arrest. Officers must also disclose when facial recognition technology did not come up with a suspect, or when the results showed different suspects.
  • Officers must be trained on facial recognition software, including the risks and dangers of the technology and the disproportionate rate at which people of color are misidentified.
  • An audit must be conducted of all cases since 2017 where Detroit police used facial recognition technology to get an arrest warrant.

The policies will be enforceable in federal court for four years, the ACLU said. Representatives with the nonprofit group described the controversial facial recognition technology as "dangerous" and the settlement as "groundbreaking."

The new policies in Detroit will serve as a model for other police agencies nationally on best practices of facial recognition technology, said Phil Mayor, senior staff attorney at the ACLU of Michigan.

Detroit police said Friday that the department is pleased with the policy changes and "firmly" believes they will serve as a national example of best facial recognition practices.

"While the work DPD and the ACLU do may differ, our goals are similar — to ensure policing is done in a fair, equitable, and constitutional manner," the department wrote.

Following Williams' wrongful arrest, Detroit police created a facial recognition policy that included three independent sign-offs before the technology could be approved for use in an investigation, the department said. The policy also stated the technology could not be used as the basis for identifying a suspect.

Use of facial recognition software raises questions

A facial recognition system uses biometric software to map a person’s facial features from a video or photo. The system then tries to match that information against databases of known faces to verify someone’s identity.

Police departments use facial recognition to find potential crime suspects and witnesses by scanning through millions of photos. The software is also used to provide surveillance at public venues such as concerts and schools.
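In broad strokes, that map-and-match pipeline can be sketched in a few lines of code. The Python example below is a hypothetical illustration only: extract_embedding is a placeholder standing in for a trained face-embedding model, and the gallery names are invented; it is not the software used by Detroit police or Michigan State Police.

import hashlib
import numpy as np

def extract_embedding(image_bytes: bytes, dim: int = 128) -> np.ndarray:
    """Placeholder for a real face-embedding model: derives a deterministic
    unit vector from the raw image bytes so the example runs end to end."""
    seed = int.from_bytes(hashlib.sha256(image_bytes).digest()[:8], "big")
    vec = np.random.default_rng(seed).normal(size=dim)
    return vec / np.linalg.norm(vec)

def best_match(probe: np.ndarray, gallery: dict) -> tuple:
    """Rank gallery identities by cosine similarity to the probe embedding.
    The top hit is only a lead; a low score should be treated as no match."""
    scores = {name: float(probe @ emb) for name, emb in gallery.items()}
    top = max(scores, key=scores.get)
    return top, scores[top]

# Hypothetical gallery of database photos (e.g., driver's license images).
gallery = {
    "license_photo_A": extract_embedding(b"photo-a"),
    "license_photo_B": extract_embedding(b"photo-b"),
}
probe = extract_embedding(b"grainy-surveillance-still")
identity, score = best_match(probe, gallery)
print(f"Top candidate: {identity} (similarity {score:.2f})")

Even this toy version shows the design point the new Detroit policy turns on: the search always returns a "top candidate," however weak the similarity, which is why the rules now bar arrests based on such results alone.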

But the technology has drawn opposition across the U.S. for misidentifying suspects and for the grave consequences that follow.

A Texas man wrongfully arrested and jailed for nearly two weeks filed a lawsuit in January that blamed facial recognition software for misidentifying him as the suspect in a store robbery. Using low-quality surveillance footage of the robbery, artificial intelligence software at a Sunglass Hut in Houston falsely identified Harvey Murphy Jr. as a suspect, which led to a warrant for his arrest, according to the lawsuit.

In August, Detroit police strengthened its photo lineup and facial recognition technology policies after "shoddy" police work led to the wrongful arrest of a pregnant woman, Police Chief James White previously said. Porcha Woodruff filed a federal lawsuit after she was wrongfully arrested in a carjacking and robbery.

The Federal Trade Commission in December banned Rite Aid from using AI facial recognition technology, accusing the pharmacy chain of recklessly deploying technology that subjected customers – especially people of color and women – to unwarranted searches.

The move came after Rite Aid deployed AI-based facial recognition to identify customers deemed likely to engage in criminal behavior such as shoplifting. The FTC said the technology often based its alerts on low-quality images, such as those from security cameras, phone cameras and news stories, resulting in thousands of "false-positive matches" and customers being searched or kicked out of stores for crimes they did not commit.

Contributing: Terry Collins and Bailey Schulz

Andrea Sahouri covers criminal justice for the Detroit Free Press, part of the USA TODAY Network. She can be contacted at [email protected].
