BENTON COUNTY, Wash. - The artificial intelligence debate has come to the Tri-Cities.
Benton County Sheriff’s Office is holding a fourth public meeting on Feb. 22 at 6 p.m. to discuss the possibility of the agency purchasing Clearview AI to assist in after-the-crime investigations.
Clearview AI is a facial recognition platform powered by a neural network: a machine learning model inspired by the human brain’s pathways and processes.
The platform allows law enforcement to search a database consisting of more than 40 billion images publicly sourced from the internet to quickly find a match to an image of a suspect uploaded to the program.
The sheriff's office says the time saved with this technology could revolutionize the way detectives investigate crime.
“Instead of spending months trying to identify somebody, we may be able to do it in a very short amount of time,” BCSO Lt. Michael Clark said. “We may actually prevent another crime from occurring if we are able to identify these people quicker… maybe we prevent a retaliation event or an innocent bystander from being hurt in the next retaliation.”
Critics raise a slew of concerns about law enforcement's use of facial recognition technology, ranging from accuracy to privacy to preventing abuse.
Accuracy
David Makin, Ph.D., an Associate Professor in the Department of Criminal Justice & Criminology at Washington State University, questioned how accurate facial recognition artificial intelligence programs can be if a person’s appearance in an uploaded image differs from how they appear in pictures online:
“[There are] all these environmental factors. Let me put a hat on, change my glasses, shave my beard. And all of a sudden, it doesn’t look like David. Maybe it is. But David also looks closely like someone else.”
In a statement to NonStop Local, Clearview AI CEO Hoan Ton-That said the platform can work in the presence of disguises, facial hair, glasses and sunglasses.
In the context of identifying minorities, studies have shown artificial intelligence programs are inaccurate and can misidentify people, Makin said.
“The ability to detect with high accuracy and precision, black and brown faces. I mean, there is a very wide range of error there,” Makin said.
“When creating Clearview AI’s algorithm, we made sure to have trained our neural network with training data that reflects each ethnicity in a balanced way,” Ton-That said. “As a person of mixed race, this is especially important to me.”
Ton-That stated that under the National Institute of Standards and Technology’s facial recognition vendor test, which measures the accuracy of the artificial intelligence, Clearview AI matched the correct image from a lineup of 12 million photos at a rate of 99.85 percent. Ton-That claimed this is more accurate than the human eye.
Privacy
Clearview AI’s image database is publicly sourced from the internet. Clark said the use of images that anyone can access does not invade privacy.
Makin contested this idea.
“[It’s a] kind of context collapse. When I put an image out on social media or something that is on my university webpage, I’m putting it out to a very specific audience for a very specific purpose,” Makin said. “Being able to scrape the internet and pull down all images from everywhere across time and space. That’s very different, right? Because the purpose is changed.”
Makin argued that using public images for purposes other than those originally intended breaks a social contract.
“When we scrape the internet in that way, it’s violating that,” Makin said. “This serves a very real commercial interest, and the community gets nothing out of that… It kind of takes the utilitarian aspect of what the internet is.”
When asked how Clearview AI supports the use of images in this manner, Ton-That replied: “Clearview AI can only be used by law enforcement, for after-the-crime investigations, not in a real time manner. In this narrow context, searching already public information to help law enforcement solve crime, is a positive and pro-social use of technology.”
Preventing abuse
The use of facial recognition technology by law enforcement also raises questions of ethical use.
Clearview AI requires an officer identification and a case number for each database search, creating an audit trail.
“It’s one thing to have an audit ability. It’s entirely something else to actually review it,” Makin said.
Makin cited WSU’s CCTV program and its measures. As part of its audit trail, when a user zooms in or otherwise manipulates the standard capturing of surveillance video, the system creates a record of that activity, Makin said.
When asked about other accountability measures that Clearview has in place to ensure ethical use, Ton-That answered:
“Clearview AI provides training for each user of the software on responsible use of facial recognition search results in the context of law enforcement investigations,” Ton-That said. “The administrator for the organization is able to audit all users in their law enforcement department who use Clearview, in order to look for abuse.”
Effective implementation of policy must be carried out to prevent bad actors abusing facial recognition platforms, Makin said.
“That is the level of accountability that has to be built into these technologies, and it has to be paired with the transparency of releasing that information to the community,” Makin said. “[Posting results to social media or] clearly communicating to the community. ‘Here’s all that we are doing right to provide that audit and that accountability. And by the way, what do you think we are missing?'”
The current BCSO policy for using Clearview AI, if implemented, states a facial recognition match must undergo peer-to-peer review, meaning two officers would need to review the result.
“Now it is no different than if [detectives] got a tip from you that maybe witnesses said, ‘I think it’s the guy who lives on the street over there.’ What would they do? They would start their investigative work,” Clark said. “It is not an automatic go to jail or an automatic warrant issued. It is just an embedded investigative tool to get them going down the right track.”
When asked about transparency in sharing audit results with the public, Clark said the policy is still being revised based on feedback BCSO has received at its public meetings.
The facial recognition accountability policy can be found on BCSO’s website.