Australian police explore using controversial AI software

Australia’s federal police have been exploring the use of controversial facial recognition software, according to documents obtained by Guardian Australia via a Freedom of Information (FOI) request.

PimEyes allows users to upload a photo of a person’s face and search the web for other photos of that individual. The algorithm is said to be highly accurate, matching faces despite differences in background, lighting, obscuring objects, or even changes like a new haircut.

According to PimEyes CEO Giorgi Gobronidze, the Tbilisi-based company has a database of about three billion faces and facilitates up to 118,000 searches per day.

Between January and August there were hundreds of connections between Australian Federal Police (AFP) devices and PimEyes as well as a similar site called FaceCheck.ID. When questioned about their use of the programs, the AFP said that only “a small number of members” had accessed the sites for “training” and to explore their potential use in “the law enforcement or criminal environment.”

The AFP has admitted to testing PimEyes for operational use at least nine times, and FaceCheck.ID once, although AFP leadership claims it was unaware of this until the FOI request.

“Pimeyes.com is a particularly dangerous facial recognition tool . . . and has been repeatedly criticised for enabling unlawful surveillance and stalking,” said Australian Greens Senator David Shoebridge, who is part of an inquiry on the AFP’s use of the programs. “This keeps happening with the AFP, whether it’s Clearview, PimEyes or FaceCheck.”

In 2021 the AFP was found to have breached privacy rules through its use of Clearview AI, a service whose database contains more than 30 billion photos of private citizens scraped from the web without permission.

But the AFP is not the only federal law enforcement agency whose officers have used such software while leadership claimed ignorance.

In August a report revealed that the US Department of Homeland Security (DHS) has been using Clearview AI, allegedly to identify perpetrators. The DHS Homeland Security Investigations (HSI) unit has spent approximately $2 million to license Clearview, which it has reportedly used to “investigate child exploitation.”

But it is unknown for what other purposes DHS — or other agencies, for that matter — may be using Clearview AI or similar tools. A BuzzFeed News report in April 2021 found that employees at over 1,800 publicly funded agencies had used Clearview AI, often “without the knowledge or consent of their superiors.” These include military branches, health organizations, local and state police, state attorneys general, federal law enforcement and even public schools.