- There have been multiple cases of wrongful arrest due to facial recognition technology, many involving people of color.
- Organizations such as the ACLU are pushing for laws and regulations to limit or ban police use of the system.
- The technology has been used to track protesters, raising concerns about privacy and civil liberties.
- Some argue the key is for law enforcement to use facial recognition responsibly, saying it has proved effective in certain settings.
- However, one expert told Newsweek: "This is a technology that we think is dangerous when it fails, and dangerous when it works."
Robert Williams knew he did not commit the crime he was being arrested for on January 9, 2020.
What he didn't know was that the detectives accusing him of stealing watches from a Shinola store in 2018 were doing so because of an erroneous hit using facial recognition technology.
Detroit police had tried to identify the thief by feeding a blurry image from the store's surveillance footage into the software—and that led to Williams' arrest in front of his wife and daughters, then 2 and 5, outside his home in the suburb of Farmington Hills.
He was handcuffed and hauled away, then held for 30 hours in an overcrowded detention center. Two weeks later, he appeared in court and the case was dismissed without prejudice, meaning he could be charged again.
It was Williams' wife, Melissa, who figured out that facial recognition technology was the reason for her husband's arrest.
"The day I was arrested, I had no idea it was facial recognition," Williams told Newsweek. "I was arrested for no reason."

In June 2020, the ACLU of Michigan filed a complaint, asking for an apology, dismissal of the case and for the police department to stop using facial recognition technology.
"None of that happened," Melissa Williams told Newsweek. "So the lawsuit came months after."
In a statement to Newsweek, Detroit Police Chief James E. White said the department "uses all available resources to detect and solve crimes in the city of Detroit."
White said: "The DPD has strong policies in place regarding the use of facial recognition technology, including: use is restricted to Part I Violent Crime (robbery, sexual assault, aggravated assault, homicide) or Home Invasion I (unlawful entry into a lawfully occupied home) investigations; and any match is only to be considered an investigative lead, not a positive identification of a suspect.
"There are a number of checks and balances in place to ensure ethical use of facial recognition, including: use on live or recorded video is prohibited; supervisor oversight; and weekly and annual reporting to the Board of Police Commissioners on the use of the software."
The Williams lawsuit, seeking damages and policy changes to stop the use of the technology, noted that what happened to the father of two was the first known case of a wrongful arrest based on facial recognition technology in the U.S.
The first—but far from the last.
Michael Oliver was wrongly accused of a felony in May 2019 in another Detroit case involving facial recognition technology. In February of that same year, Nijeer Parks was accused of shoplifting candy and trying to hit an officer with a car in New Jersey, even though he was 30 miles away at the time.
More recent cases have emerged, too. Alonzo Sawyer was arrested in the spring of 2022 for allegedly assaulting a bus driver near Baltimore, Maryland, after an analyst using facial recognition software labeled him as a possible match with the suspect seen in CCTV footage, Wired reported.
In November 2022, Randall Reid was jailed in Georgia after authorities arrested him on two theft warrants out of Baton Rouge and Jefferson Parish, Louisiana—a state he has never visited.
Reid's attorney, Thomas Calogero, told Newsweek that to free his client, he visited the consignment store in Metairie, Louisiana, that Reid was accused of stealing purses from. There, the owner showed him a photo captured by a surveillance camera.
"I immediately noticed that the guy was heavier than my client, so I sent family photos to this detective and they realized they made a mistake," he said. "It wasn't him, but I'll tell you the face was identical."
The photos also showed a mole on Reid's face that the suspect did not have, and the case against him ultimately fell apart. By that point, Reid had spent a week in jail.
The Issue of Racial Bias
The five men have one thing in common: they are all Black.
Their arrests serve to demonstrate what some studies have shown: that facial recognition technology is far more likely to misidentify Black people and other people of color.
But experts say they are likely just the tip of the iceberg because police rarely reveal when facial recognition technology has been used.
"It is very hard for people accused of crimes and their attorneys to learn that face recognition was used, because prosecutors and police often hide that information," Nathan Freed Wessler, an attorney representing Williams and the deputy director of the ACLU Speech, Privacy and Technology Project, told Newsweek.
"We have no idea how prevalent a problem this is."

Wessler said the growing law enforcement use of the technology is worrying in a country where police are more likely to use force on Black people.
"You take biases in the technology and add it to biases that already exist in policing and you have a recipe for actually amping up false arrests of Black people rather than trying to reduce disparities," he said.
"The risks are so high and so great that banning law enforcement use of it is the safest thing to do."
Calogero does not agree that facial recognition technology should be banned, describing it as a great tool that helps generate leads and solve crimes. But he says police officers need to "do their homework" before making an arrest.
"There are 300 million people in this country and there's somebody out there that looks just like me, and just like you," he said. "They simply need to be more careful how they identify suspects and make arrests."
Concerns about the flaws of facial recognition technology have led some states, cities and counties—including San Francisco and Boston—to ban or regulate police use, but there are no federal laws governing its use.
Efforts To Ban and Regulate Police Use
Last month, Democratic lawmakers introduced the Facial Recognition and Biometric Technology Moratorium Act to prevent federal entities, including law enforcement agencies, from using facial recognition and other biometric technologies.
In Maryland, Sawyer's case led state Senator Charles Sydnor to renew his efforts to restrict police use of facial recognition technology.
After years of pushing for a moratorium, Sydnor said he realized his fellow lawmakers would not support curtailing technology that had proven effective, notably in identifying people involved in the deadly riot at the U.S. Capitol on January 6, 2021.
His proposed bill would limit police use of facial recognition to violent crimes, human trafficking or crimes where there is a "substantial and ongoing threat to public safety or national security."
"We want to make certain that if you're going to use this it's going to be something really serious," Sydnor told Newsweek.
The legislation would also restrict police to using only driver's license photos and booking photos for matches, rather than services offered by companies such as Clearview AI.
The New York Times reported that the Jefferson Parish Sheriff's Office, which sought the first warrant for Reid's arrest, signed a contract with Clearview AI in 2019. According to the Times, none of the documents used to arrest him disclosed that facial recognition technology had been used.
In a statement to Newsweek, Clearview AI CEO Hoan Ton-That said: "More than one million searches have been conducted using Clearview AI. One false arrest is one too many, and we have tremendous empathy for the person who was wrongfully accused.
"Even if Clearview AI came up with the initial result, that is the beginning of the investigation by law enforcement to determine, based on other factors, whether the correct person has been identified."
The sheriff's office has been contacted for comment.
On its website, Clearview AI says its database "enables quicker identifications and apprehensions to help solve and prevent crimes, helping to make our communities safer."
That database contains more than 30 billion images scraped from Facebook and other websites, which Ton-That recently acknowledged in a BBC interview were taken without users' permission. Those photos will often be compared to low-quality stills pulled from surveillance cameras. "We think that is dangerous and that is how these false arrests have happened," Wessler said.
The Threat to Privacy and Civil Liberties
But even a technology that could be made free of flaws is worrisome, critics say, because of the significant threat the growing use of facial recognition technology poses to people's privacy and civil liberties.
Advances in technology and the infrastructure already in place position facial recognition software as a "perfect tool for authoritarian or oppressive ends," Jeramie Scott, director of the Electronic Privacy Information Center's Project on Surveillance Oversight, told Newsweek. "It's not if it will be abused, it will be abused."
He pointed to Baltimore police's use of the technology to identify people protesting Freddie Gray's death in police custody in 2015.
"That type of use obviously undermines our constitutional rights... the technology itself will have a chilling effect on how people act. It really kind of destroys the idea of any type of obscurity in public."
Facial recognition technology does not stop crimes, Scott added, but could become a crutch that officers rely on to solve them.
"But the only way you're going to do that is perfect surveillance," Scott said. "And perfect surveillance is antithetical to a democracy."
Today, police using facial recognition technology feed still photos into software that can search databases for a possible match. Research has found that law enforcement facial recognition networks include images of 117 million American adults, and that at least 26 states allow law enforcement to run or request searches against their databases of driver's license and ID photos.
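The matching step described above typically works by converting each face photo into a numeric vector and ranking the database by similarity to the probe image. The sketch below is purely illustrative—real systems use deep-learning embeddings and proprietary pipelines; the names and tiny hand-picked vectors here are hypothetical stand-ins. It shows why the top-ranked hit is only a lead, not an identification: a look-alike can score nearly as high as the true match, especially when the probe comes from a degraded surveillance still.

```python
# Illustrative sketch only: real face recognition uses deep-learning
# embeddings; these 4-dimensional vectors and names are hypothetical.
import math

def cosine_similarity(a, b):
    """Similarity of two face 'embeddings' (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical gallery: enrolled photos (e.g., license or booking images)
gallery = {
    "person_A": [0.9, 0.1, 0.3, 0.4],
    "person_B": [0.2, 0.8, 0.5, 0.1],
}

# A noisy probe embedding, as might come from a low-quality CCTV still
probe = [0.85, 0.2, 0.35, 0.35]

# Rank every enrolled identity by similarity to the probe.
# The top hit is an investigative lead, not a positive identification.
ranked = sorted(gallery.items(),
                key=lambda item: cosine_similarity(probe, item[1]),
                reverse=True)
best_name, best_vec = ranked[0]
print(best_name, round(cosine_similarity(probe, best_vec), 3))
```

Because the system always returns *some* best-scoring candidate, a confident-looking match can be produced even when the actual perpetrator is not in the database at all—which is the failure mode behind the wrongful arrests described in this article.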
But Wessler and Scott say that facial recognition technology could soon be used on live or recorded video, giving police the power to track people wherever they go.
Considering the vast networks of surveillance cameras in much of the U.S., Wessler says that "would give police a fundamentally new power to automatically and instantaneously identify and then track anyone or everyone as we go about our lives."
Scott says if "that's not possible already, it will be possible in the near future."
The fear is not unfounded—the ACLU has released documents showing that the FBI and Defense Department were involved in research and development of facial recognition software that they hoped could be used to identify people from surveillance camera footage.
"This is a technology that we think is dangerous when it fails, and dangerous when it works," Wessler said.
For the Williamses, what happened in January 2020 is still affecting their lives. They want people to be aware that the technology is being used.
"A lot of people don't realize their driver's license picture puts them in this perpetual lineup," Melissa Williams said.
"Even if there's no resolution really for us, we're hoping to continue raising awareness so that hopefully it doesn't happen to someone else."
Williams sees the benefits of police using technology to track down suspects, but said officers should investigate further and ensure they have the right person before making an arrest.
"If they're not using it properly and it's not yielding the right results, then it shouldn't be used," he said. "Why would you use something that doesn't work?"
Update 04/13/23, 4:50 a.m. ET: This article has been updated with a statement from Detroit Police Chief James E. White.
About the writer
Khaleda Rahman is Newsweek's National Correspondent based in London, UK. Her focus is reporting on education and national news.