AFP and Monash apply AI to protect investigators from child exploitation material

Software will be extended to cover terrorism content that can also cause significant psychological distress

The viewing of child exploitation material found on seized mobile phones and computers during abuse investigations takes an incredible toll on police officers.

Officers have spoken out about the trauma caused by viewing such content, an issue compounded by not being able to talk about it with anybody. Rates of post-traumatic stress disorder (PTSD) are far higher among police officers than the general population, in part due to having to confront such imagery.

To help protect investigators from such material, the Australian Federal Police (AFP) is working with Monash University to develop machine learning algorithms that can identify and classify child exploitation material on seized devices before it is reviewed by officers.

The initiative will help officers “scan through thousands of confronting images and files faster with lower levels of emotional distress”, Monash said.

Over time, the software will be extended to cover content from terrorism cases, which can also cause significant psychological distress for investigators.

“The automated detection of abhorrent material enhances workplace safety by going some way towards reducing the incidental and inadvertent exposure to such material by law enforcement practitioners,” said Dr Janis Dalins, a federal agent and co-director of Monash’s new Artificial Intelligence for Law Enforcement and Community Safety (AiLECS) Lab.

“The ultimate goal of this initiative is to ethically research the use of machine learning and data analytics in advancing law enforcement and community safety,” he said.

A prototype system was created and first described last year by Dalins and researchers at CSIRO’s Data61. The image classifier was built using open-source machine learning frameworks, including Google’s TensorFlow.

Although limited in its ability to classify the severity of the abuse depicted, it proved effective as a ‘forensic triage’ or early-warning system for investigators.
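
The published description stops at the framework level, but the general shape of such a triage classifier is easy to sketch in TensorFlow. Everything below (the model architecture, the image size, the build_triage_model and triage_scores helpers) is illustrative only, not the actual AFP/Data61 system:

```python
import tensorflow as tf

IMG_SIZE = (224, 224)  # assumed input resolution; not from the published work

def build_triage_model() -> tf.keras.Model:
    """A small convolutional classifier that scores images from 0 to 1,
    where higher scores flag files for priority human review."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(*IMG_SIZE, 3)),
        tf.keras.layers.Rescaling(1.0 / 255),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # triage score
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

def triage_scores(model: tf.keras.Model, image_batch) -> "np.ndarray":
    """Return a per-image score; investigators then review only images
    above a chosen threshold, reducing incidental exposure."""
    return model.predict(image_batch, verbose=0).ravel()
```

The point of the triage framing is the threshold: the model does not replace human review, it orders and filters the queue so that most benign files never need to be opened by a person.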

Data Airlock

The system will be further developed within Monash University’s new AiLECS Lab, which launched today. The lab is supported by $2.5 million in funding and is part of Monash’s wider ‘Data Futures’ initiative, which focuses on research into uses of artificial intelligence and data science for social good.

Through the lab, the AFP will also be making real-world data available to researchers via a ‘Data Airlock’.

“This is a service designed to manage legal and ethical restrictions in the field by providing trusted research and industry partners with indirect access to offensive materials on which they can develop and test deep learning-based tools,” Dalins said of the Data Airlock in a blog post last year.

The airlock – built by the AFP and CSIRO’s Data61 and hosted at Monash – enables researchers globally to develop and test machine learning algorithms “without being exposed to confronting data”.
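
The airlock’s internals have not been published, but the pattern Dalins describes (researcher code goes in, only aggregate results come out) can be sketched roughly as follows. The paths, the submitted script’s command-line interface, and the run_in_airlock helper are all hypothetical:

```python
import json
import subprocess
import tempfile
from pathlib import Path

# Sequestered holdings never leave the secure environment.
SEQUESTERED_DATA = Path("/secure/holdings")  # hypothetical location

def run_in_airlock(submitted_script: Path) -> dict:
    """Execute a researcher's evaluation script inside the secure
    environment and return only its summary metrics."""
    with tempfile.TemporaryDirectory() as workdir:
        metrics_file = Path(workdir) / "metrics.json"
        subprocess.run(
            ["python", str(submitted_script),
             "--data-dir", str(SEQUESTERED_DATA),
             "--metrics-out", str(metrics_file)],
            check=True,
            timeout=3600,
        )
        # Only aggregate metrics cross the airlock boundary;
        # raw images and per-item outputs stay inside.
        return json.loads(metrics_file.read_text())
```

The design choice this illustrates is indirection: researchers iterate on models against data they can never see directly, which is what lets partners around the world contribute “without being exposed to confronting data”.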

AFP Commissioner Andrew Colvin welcomed the launch of the AiLECS Lab.

“This is a groundbreaking initiative from Monash University and the AFP that will minimise AFP officer exposure to child exploitation material and other distressing content. At the same time, it will vastly increase the speed and volume at which police can identify and classify this content,” Colvin said.

“The AiLECS Lab will therefore ensure we hold more people accountable for these abhorrent crimes and, just as importantly, we better safeguard the wellbeing of both AFP officers and the community we are here to serve,” he added.

Similar work is being undertaken by the Metropolitan Police’s forensics department in the UK.

Google last year announced it was making available to NGOs and industry partners its Content Safety API, a toolkit to “increase the capacity to review content in a way that requires fewer people to be exposed to it”.

Facebook has hired additional human moderators to tackle content violating the company’s community standards. The moderators have described the huge emotional toll the work takes on them.

