Another study suggests that facial recognition software is unreliable when identifying people of color.

The American Civil Liberties Union of California recently released its study of Amazon’s Rekognition software, which is marketed to law enforcement authorities. The software mistakenly matched the faces of more than one in five state legislators, 26 in all, to mugshots in criminal databases.

Among the errant matches was state Rep. Phil Ting, who in May sponsored a bill to ban facial recognition software on the body cameras that an increasing number of police officers wear. More than half the errors involved lawmakers of color, highlighting one of the primary criticisms of the technology: its unreliability with people of color.

“This experiment reinforces the fact that facial recognition software is not ready for prime time — let alone for use in body cameras worn by law enforcement,” Ting, a Democrat from San Francisco, said in a press release issued with the ACLU. “I could see innocent Californians subjected to perpetual police lineups because of false matches.”

Both Ting’s office and the ACLU referred Karma to the release. 

The bill, which has drawn the support of activists and Bay Area lawmakers concerned about misuse of the rapidly evolving technology, passed California’s Democrat-controlled Assembly this spring.

The bill’s supporters also cite privacy concerns. “Even if this technology was accurate, which it is not, face recognition-enabled body cameras would facilitate massive violations of Californians’ civil rights,” said Matt Cagle, a technology and civil liberties attorney at the ACLU of Northern California.

In a similar test of Rekognition last year, the software falsely matched 28 members of Congress to mugshots, including U.S. Representative and civil rights leader John Lewis (D-Ga.) and U.S. Senator Pat Roberts of Kansas. San Francisco, Oakland and Somerville, Massachusetts, have banned the technology.

Law enforcement groups have opposed Ting’s bill and other efforts to restrict the technology, which they say can be an effective crime-solving tool.

California at the Center

Facial recognition software has raised hackles in law enforcement and beyond as organizations embrace the technology to track individual behavior. California, the largest state in the U.S. by population, has been at the center of the debate. 

Earlier this month, an investigation by online technology and science publication OneZero found that law enforcement agencies in some of California’s most densely populated counties were using facial recognition programs from the law enforcement technology firm DataWorks Plus to identify suspects from video footage. 

Los Angeles, San Bernardino and Riverside counties were also part of California Facial Recognition Interconnect, a DataWorks Plus-maintained network enabling them to conduct facial recognition searches on each other’s mugshot databases. The counties’ contract gives them access to nearly 12 million images.

The Greenville, South Carolina-based company has clients in at least nine other states, plus New Zealand. Todd Pastorini, DataWorks Plus executive vice president and general manager, told Karma that facial recognition has been unfairly “demonized.” He likens facial recognition to the automated fingerprint identification systems that have been in use for five decades.

Pastorini describes his company’s products as investigative tools that can help narrow a search. But he adds that law enforcement officials “still have to do all the other investigative work to put someone at the place of a crime.” 

Cagle, among others, has drawn parallels to China, which has erected a massive system of more than 200 million public surveillance cameras. But Pastorini disputed criticism that facial recognition would give government too much power. “There is no Big Brother element,” he said. “We’re not searching Facebook or public pools of data. These are all criminal databases.”

Other research has highlighted similar flaws with recognition technology.

A 2018 study by MIT’s Media Lab found that facial recognition software incorrectly identified about one in three dark-skinned women. In 2016, a Georgetown University report estimated that law enforcement facial recognition databases contained images of 117 million Americans. That report found that about one in four state or local police departments can conduct face recognition searches through their own or another agency’s system.

A 1986 state law allows California counties with more than 1.5 million residents to share and analyze information from fingerprint databases around the state.