AFP says AI image database will help catch predators, but many are skeptical


The Australian Federal Police (AFP) is appealing to members of the public to donate their childhood photos to a project being undertaken jointly with Monash University. 

The agency says the photos will be used to programme and train an Artificial Intelligence (AI) system, known as My Picture Matters, so it can recognise pictures of children on the dark web. 

Playing catch up

Rapid developments in AI have meant that local and international policing agencies have some catching up to do in order to combat rising crime. One of the most serious crimes flourishing through improving AI technology is child sexual abuse and the generation of child sexual abuse material.

‘Doctored’ images make it much more difficult for authorities to find the actual children who are being exploited and abused. 

The AI project now being developed needs at least 10,000 images of children in order to programme the system to identify potential images of children on the dark web and also on devices that may have been seized during criminal investigations. 

The images are combined with algorithms that have been developed to detect sexual content or violent material. 

Another surveillance tool in the making?

The AFP says the My Picture Matters project will have strict controls and those who donate their images will be able to withdraw consent at any time. 

They have also assured the public that the dataset of images will not be held by police, but stored and managed by Monash University. They claim that once the project is complete, the dataset will not be used for any other purpose. 

And while it makes sense that police would seek to develop their own intelligence and technology capabilities to combat criminal activity developed using AI, it does raise questions about how the technology will be used once it is in police hands. 

There is a distinct line between such a resource being used as a tool to solve crimes and catch criminals, and it being used as a general surveillance tool.

Facial recognition technology

Earlier this year, senior AFP officials met with US-based facial recognition company Clearview AI just months after the Australian Information Commissioner and Privacy Commissioner jointly determined that Clearview AI, Inc. breached Australians’ privacy by scraping their biometric information from the internet and social media sites and using it in a facial recognition tool.

The investigation found clear breaches of the Privacy Act 1988 (Cth), including: 

  • collecting Australians’ sensitive information without consent
  • collecting personal information by unfair means
  • not taking reasonable steps to notify individuals of the collection of personal information
  • not taking reasonable steps to ensure that personal information it disclosed was accurate, having regard to the purpose of disclosure
  • not taking reasonable steps to implement practices, procedures and systems to ensure compliance with the Australian Privacy Principles.

Clearview AI was ordered to “cease collecting facial images and biometric templates from individuals in Australia, and to destroy existing images and templates collected from Australia.”

The company, which has been one of the forerunners in the area of AI technology, has sold its facial recognition technology to private and government organisations around the world, including law enforcement agencies. But in recent years it has also been the subject of numerous lawsuits and privacy complaints. 

After the news broke that the AFP met with Clearview’s executives, the AFP issued a statement saying the AFP does not use Clearview, and “has not made any recommendations to the Commonwealth to allow the use of the technology”.

Laws not keeping pace with technology 

However, in Australia, police and security services, along with a number of organisations, do already use facial recognition technology (a version of the technology which unlocks your smartphone), and have done so for a number of years, usually through CCTV, by scanning an individual’s face and matching it to images held in a database. 

The New South Wales Police Force uses the technology on a regular basis to “identify ،ential suspects of crime, unidentified deceased and missing persons”.

NSWPF claims the information is only used for “intelligence purposes” and assures the public it is “committed to the responsible and ethical use of facial recognition technology”.

However, there have long been concerns that the technology cannot be wholly relied upon as an identification tool, meaning suspects of crime can be incorrectly identified; that it could potentially encroach on a person’s right to be presumed innocent until proven guilty; and that dependence on the technology in policing only contributes further to our slippery slide into a “surveillance state”.  

Shopping giants such as Kmart and Bunnings have suspended their use of the technology amid privacy concerns. But earlier this year it was reported that major stadiums around Australia, including the Sydney Cricket Ground, Allianz Stadium and Qudos Bank Arena in Sydney, are using technology which records a customer’s faceprint without their knowledge or consent. 

Human rights concerns 

The problem is that there are currently no adequate protections within existing privacy laws around the use of facial recognition technology, and no dedicated laws governing its use. 

In 2021, the Australian Human Rights Commission (AHRC) called for a suspension of all use of the technology until a regulatory or oversight body could be set up with the appropriate skills and expertise to develop technical standards, oversee mandatory human rights risk assessments, and provide advice to developers, deployers and affected individuals. 

But to date, there has been little progress in the way of protecting individuals’ rights, and yet there have been giant leaps in the sophistication and prevalence of the technology itself.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.