Company says facial recognition can’t be used in arrests, but it’s happening in Evansville

EVANSVILLE — The CEO of Clearview AI, whose company provides the Evansville Police Department with facial recognition technology, said the software’s results are “only leads for law enforcement” and should be treated like an “anonymous tip” from the public.

Evansville police, however, have claimed in sworn arrest affidavits that they used Clearview AI to conclusively identify crime suspects, according to a Courier & Press review of probable cause affidavits that refer to facial recognition technology.

The EPD’s use of facial recognition technology did not garner public attention until this year when the Courier & Press reported that the department had quietly utilized Clearview AI’s tools for years with little scrutiny or oversight.

As of June, Vanderburgh County’s top prosecutor and public defender were unsure if Evansville police had ever directly referred to Clearview AI or facial recognition technology in affidavits of probable cause, which often constitute the starting point of a criminal case and can support police efforts to obtain search and arrest warrants from judges.


When the Courier & Press filed a public records request seeking any written EPD policies governing the use of facial recognition technology by officers, the department responded by stating it had no such policy — other than Clearview AI’s Terms of Service and User Code of Conduct.

“The Clearview app is neither designed nor intended to be used as a single-source system for establishing the identity of an individual, and users may not use it as such,” the company’s User Code of Conduct states. “Furthermore, search results produced by the Clearview app are not intended nor permitted to be used as admissible evidence in a court of law or any court filing.”

But the results of Clearview AI facial recognition searches do show up in arrest affidavits filed in Vanderburgh County courts, often without any additional context describing the software’s limitations — a practice that experts said could mislead judges about the accuracy of facial recognition technology.


“Everybody with any seriousness whatsoever agrees that facial recognition technology cannot provide positive identification,” Nate Wessler, the deputy director of the American Civil Liberties Union’s Speech, Privacy, and Technology Project, told the Courier & Press. “Clearview AI does not purport to ‘positively identify,’ and it doesn’t ‘positively identify.’”

Vanderburgh County Prosecutor Diana Moers and the EPD maintain that local law enforcement use Clearview AI and other advanced investigative tools responsibly and abide by all relevant state and federal laws.

In total, the EPD cited Clearview AI or facial recognition technology at least 10 times in arrest affidavits linked to shoplifting or theft cases between 2021 and 2023, public records show, though the true number may be higher: Detectives are under no obligation to disclose their use of facial recognition technology in arrest affidavits.

Furthermore, detectives do not require a warrant to perform facial recognition searches with Clearview AI, and officers also use the software when out on patrol to identify people in real time, according to EPD Chief Billy Bolin.

The Courier & Press submitted a detailed list of questions to the EPD along with relevant court documents and Clearview AI records. An EPD spokesperson twice confirmed receipt of the material but did not provide a written response.

Clearview AI did not return multiple requests for comment or answer questions for this article, though CEO Hoan Ton-That has previously communicated with the newspaper about Clearview AI’s products.

“The results coming back from Clearview AI are only leads for law enforcement, which have to be vetted and verified separately,” Ton-That said. “Think of these results as the same as an ‘anonymous tip’ from the public.”

‘The terminology is important’

The affidavits that refer to Clearview AI or facial recognition technology by name offer a glimpse into the rapid uptake of artificial intelligence-based tools by local police, and in turn, the court system.

On Dec. 20, 2021, EPD officers were sent to the Walmart Supercenter on Evansville’s East Side for a routine run: A woman, whom the Courier & Press will refer to as Jane Doe, reportedly walked out of the store without scanning all of the items in her cart.

Officers didn’t immediately locate the woman, who dispatchers described as “a white female in her 40s,” but Walmart provided detectives with a DVD copy of its surveillance footage.

With access to Clearview AI’s facial recognition technology, investigators could run a still image from the surveillance footage against Clearview’s internal database of more than 30 billion “face print” images, many of them selfies and other photos the company scraped from websites such as Facebook, Twitter, Venmo and mugshot sites.
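In general terms, systems of this kind convert a face image into a numeric “face print” and then rank stored images by similarity. The short Python sketch below is a generic illustration of that approach, using a hypothetical embed_face model and a small in-memory gallery; it is not Clearview AI’s actual code or API, and real systems operate at a vastly larger scale.

import numpy as np

def embed_face(image) -> np.ndarray:
    """Stand-in for a trained face-recognition model that maps a face
    image to a fixed-length 'face print' vector (hypothetical)."""
    raise NotImplementedError

def search_gallery(probe_vec: np.ndarray,
                   gallery: dict[str, np.ndarray],
                   top_k: int = 5) -> list[tuple[str, float]]:
    """Rank gallery entries by cosine similarity to the probe image.

    The output is a ranked list of scored candidates, not a yes/no
    identification of any single person.
    """
    scored = []
    for label, vec in gallery.items():
        sim = float(np.dot(probe_vec, vec) /
                    (np.linalg.norm(probe_vec) * np.linalg.norm(vec)))
        scored.append((label, sim))
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]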

They got a hit.

“(The detective) viewed the footage and utilized the Clearview AI software to positively identify the female suspect as (Jane Doe),” a detective wrote in a sworn affidavit seeking the woman’s arrest.

Aside from Clearview’s apparent positive identification, the affidavit described little else specifically linking the woman to the reported theft. She later pleaded guilty to a felony theft charge, according to court records.

“The technology is not designed to supply positive identifications but rather to generate possible leads, which even the makers of the technology acknowledge may be wrong,” Wessler said upon reviewing the affidavit. “(Facial recognition technology) must be treated with great skepticism and must be followed by independent, reliable investigative work to develop probable cause. An affidavit using this language in seeking a warrant is perpetrating a fraud on the court.”

In another probable cause affidavit, an EPD detective ascribed similar identification abilities to an unnamed face-matching algorithm: “The defendant, later identified by facial recognition software to be John Doe (the Courier & Press is not using his real name), crossed all points of sales without payment for $22.47 in merchandise.”

Writing in a probable cause affidavit related to a 2022 theft, a detective claimed he was “able to identify (suspect one) and (suspect two) by taking still images of both suspects and running the images through Clearview AI.”

Additional evidence would corroborate the software’s findings, the detective wrote.

In all three cases, the detectives never disclosed in the probable cause affidavits that Clearview AI instructs law enforcement customers not to attribute the positive identification of a suspect to its software in sworn court filings.

In a 2022 report by Georgetown Law’s Center on Privacy & Technology, author Clare Garvie wrote that law enforcement had little judicial precedent to go on when deciding how to leverage information gleaned from tools like Clearview AI.

“In the absence of case law, legislation or more robust agency guidance, the limited evidence available suggests that police are not making rigorous distinctions when deciding whether to treat a face recognition match as an investigative lead or as probable cause to make an arrest,” Garvie wrote. “In some cases, officers have collected compelling information that supports the face recognition search findings. In others, however, officers have relied heavily, if not exclusively on, the leads generated by face recognition searches.”

Of the arrest affidavits the Courier & Press reviewed, the vast majority cited additional evidence that would corroborate a suspect’s identity. In some instances, detectives mentioned Clearview AI or facial recognition technology only in passing, noting its use as an investigative tool that helped generate an early lead.

But advocates and other Indiana law enforcement agencies argue that police must always proceed with caution when presenting information obtained from services such as Clearview AI to judges, particularly when they do so in an effort to secure a person’s arrest.

“Everybody who’s studied this at all agrees that the terminology is important,” Wessler said. “You should never say ‘identified’ or ‘positively identified’ (when referring to facial recognition technology).”

The Indiana State Police, which became Clearview AI’s first paying customer in 2020, agrees. The agency has strict policies governing the use of facial recognition technology, particularly when the ISP conducts searches for other law enforcement agencies through the state’s Intelligence Fusion Center.

“The result of a face recognition search is provided by the Indiana Intelligence Fusion Center only as an investigative lead and IS NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT,” its policy stated as early as 2019, according to public documents.


The South Bend Police Department takes it a step further: According to SBPD Policy 343, any information gleaned from services like Clearview AI “shall not be used as a basis for probable cause and shall not be used as evidence when obtaining a search or arrest warrant.”

The policy goes on to state that those who perform facial recognition searches must include the following disclaimer when they forward their findings to an investigator: “The results of a facial recognition search is provided as an investigative lead and is not to be considered positive identification of any subject.”

The EPD has no written policies dictating how its officers and detectives can leverage Clearview AI, or other facial recognition technology, to seek arrest and search warrants.

County prosecutor compares Clearview AI to polygraph tests

Moers told the Courier & Press that she did not see a discrepancy between Clearview AI’s assertion that its findings were not to be cited as evidence in court filings and the EPD’s practice of noting Clearview AI results in arrest affidavits.

“For something to be actual evidence, it has to be able to be weighed by the trier of fact, whether that’s a judge or a jury,” Moers said. “The Clearview AI tool is just a lead tool. They look at the Clearview AI, and they put that in the probable cause affidavit because they’re explaining what they did in their investigation, which is correct.”

Moers compared tools like Clearview AI to polygraph tests, which purport to detect whether a person is telling the truth. In Indiana, the results of such tests are admissible in court only in limited circumstances because of a lack of scientific evidence supporting their accuracy.

“It’s just like a polygraph: AI (artificial intelligence) it’s not reliable in and of itself,” Moers explained.

But if AI tools aren’t reliable by themselves, what about detectives’ claims in sworn affidavits that Clearview AI could “positively identify,” or had “identified,” suspects?

“I think it’s just saying that that’s what the tool did, right?” Moers said. “I mean, you look at Clearview AI, and it’s either a positive or a negative; so it positively identifies someone and then they go talk to the person and do the next thing.”


Wessler said Moers’ statement that Clearview AI nets investigators “positive” or “negative” results betrays a fundamental misunderstanding of how facial recognition technology works.

“It does not give a positive or negative,” Wessler said. “What it does is provide a list of possible candidate matches. It doesn’t purport to give a positive or negative. It doesn’t purport to say, ‘Here’s a match,’ or ‘Here’s where there’s no match.’ It gives possible matches — and those possible matches are very likely to be of multiple people.”
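A simple, hypothetical illustration of Wessler’s point: a facial recognition search returns similarity scores, and whether any of them counts as a “match” depends entirely on a cutoff someone chooses. The scores and names below are invented for the example.

# Hypothetical similarity scores returned for one probe image.
candidates = [("Candidate A", 0.91), ("Candidate B", 0.89), ("Candidate C", 0.87)]

# The same scores produce different "matches" at different cutoffs;
# the software itself never declares a positive or negative.
for threshold in (0.95, 0.90, 0.85):
    hits = [name for name, score in candidates if score >= threshold]
    print(f"cutoff {threshold}: {hits}")
# cutoff 0.95: []
# cutoff 0.9: ['Candidate A']
# cutoff 0.85: ['Candidate A', 'Candidate B', 'Candidate C']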

Jonathan Barry-Blocker, a visiting legal scholar at the University of Florida’s Levin College of Law, said arguments about whether facial recognition searches by police amount to “evidence” can end up devolving into a debate over semantics.

“At the end of the day, it’s evidence; it’s evidence that is used to build a case to put forward and prosecute,” Barry-Blocker said. “The Indiana bar, as well as the national prosecutors’ professional associations, need to create ethics rules around the use of algorithmic evidence, of facial recognition technology, because it is clear that folks do not understand the technology.”

Evidence or not, an affidavit of probable cause, and the claims made therein, provides a basis for judges to issue arrest warrants and for prosecutors to file formal charges. And in many cases, defendants never challenge the information detectives outline in their arrest affidavits: Fewer than 5% of criminal cases in Indiana ever go to trial. Instead, defendants are encouraged to enter into plea agreements.

It remains unclear how often Evansville police leverage Clearview AI results to help establish probable cause that a specific person committed a crime. Police Chief Billy Bolin told the Courier & Press earlier this year that detectives may refer to facial recognition searches only in generic terms when authoring probable cause affidavits, or may not mention them at all.

How the Courier & Press reported this story

Under Indiana’s Access to Public Records Act, the Courier & Press obtained a sample of EPD probable cause affidavits that refer to Clearview AI and confirmed for the first time that Vanderburgh County judges issued arrest warrants based, in part, upon information obtained via facial recognition technology.

“The Evansville Courier & Press seeks documents pertaining to the citation of results from Clearview AI Inc. products in public arrest/probable cause affidavits filed by the Evansville Police Department,” the request stated, in part. The Courier & Press specifically requested to obtain copies of probable cause affidavits connected to “shoplifting” cases that included keywords such as “Clearview AI” or “facial recognition.”

The department later tailored the public records request to include “theft” cases in addition to shoplifting cases and, after reviewing hundreds of affidavits, provided the Courier & Press with documents responsive to the request.

Houston Harwood can be contacted at houston.harwood@courierpress.com

