
Researchers in Cork develop technology that will identify child sexual abuse images online

Current systems mean specialists have to view “traumatic and distressing” material.

NEW TECHNOLOGY WILL use artificial intelligence to help police find offenders who present the highest risk to the public, according to a UCC researcher who helped develop it.

The iCOP (Identifying and Catching Originators in P2P Networks) toolkit will automatically identify new and previously unseen images of child sexual abuse for police, helping to reduce the volume of material specialists have to view in order to find children.

Maggie Brennan, a researcher and lecturer in the Schools of Applied Psychology and Criminology at UCC, worked on the research team with Sean Hammond of UCC’s School of Applied Psychology to develop the technology.

‘Traumatic’

“It’s common to seize computers and collections of child sexual abuse materials containing enormous volumes of illegal materials, terabytes of individual files. Having to view this material to find victims can be traumatic and distressing for the specialists working to find these children,” Brennan said.

In a statement released today, UCC said current systems rely on identifying already-known media, so they cannot recognise new material and are unable to assess the thousands of results they retrieve. The iCOP toolkit, by contrast, uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media.
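For context, "identifying known media" typically means matching a file's cryptographic hash against a reference database of previously catalogued material. A minimal Python sketch, in which the hash list is an assumed placeholder, shows why such an approach cannot catch newly produced files:

```python
# Minimal sketch of hash-based "known media" detection. The reference set
# below is an assumed placeholder; in practice it would be populated from a
# law-enforcement hash list. The key limitation: a newly produced file never
# matches any stored hash, so it goes undetected.

import hashlib

KNOWN_ABUSE_HASHES: set[str] = set()  # assumed: loaded from a reference database

def is_known_media(file_bytes: bytes) -> bool:
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_ABUSE_HASHES  # new, unseen media always returns False
```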

The new approach combines automatic filename and media analysis techniques in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
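The article does not publish iCOP's implementation, but a filtering module of this kind can be pictured as fusing two classifier outputs: one score derived from the filename text and one from the media content itself. The following Python sketch is purely illustrative; all names, weights and thresholds are assumptions, not the actual iCOP design:

```python
# Illustrative sketch of an intelligent filtering module that fuses a
# filename-analysis score with a media-analysis score, as the article
# describes at a high level. Weights and threshold are assumed values.

from dataclasses import dataclass

@dataclass
class SharedFile:
    filename: str
    media_score: float     # 0..1, from an image/video classifier (assumed)
    filename_score: float  # 0..1, from a text classifier over the filename (assumed)

def combined_score(item: SharedFile, w_media: float = 0.7, w_name: float = 0.3) -> float:
    """Weighted fusion of the two classifier outputs (illustrative weights)."""
    return w_media * item.media_score + w_name * item.filename_score

def flag_for_review(items: list[SharedFile], threshold: float = 0.8) -> list[SharedFile]:
    """Return items whose fused score exceeds a review threshold, ranked so
    that investigators see the most likely new criminal media first."""
    flagged = [f for f in items if combined_score(f) >= threshold]
    return sorted(flagged, key=combined_score, reverse=True)
```

Ranking the flagged items, rather than returning an unordered list, reflects the stated goal of letting specialists prioritise the most urgent cases rather than wading through everything retrieved.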

Brennan said law enforcement “urgently need these kinds of supports to help them manage the volumes of cases they are being faced with – to find the children who are victimised in these images and videos, as well as those offenders who present the highest risk to the public”.

The research behind the technology was conducted by researchers at UCC, Lancaster University and the German Research Center for Artificial Intelligence (DFKI), as part of an international project funded by the European Commission.

At UCC, the team worked closely with international law enforcement specialists in online child sexual abuse investigation to understand their needs and develop a tool that allows them to find the most urgent cases for intervention.

“Our role also involved developing a psychological profiling system to identify viewers of child sexual abuse images who may be at risk of committing hands-on abuse.

“We have been researching this topic with international law enforcement agencies like Interpol for many years, since the early 2000s. The volume of child sexual abuse images and videos now in circulation is a real concern, and it can be overwhelming for law enforcement.

“Trying to find recent or ongoing cases of child sexual abuse is an absolute priority, but the sheer volume of illegal materials in circulation online makes this task incredibly difficult for the police,” Brennan said.

Hundreds of searches 

There are hundreds of searches for child abuse images every second worldwide, resulting in hundreds of thousands of child sexual abuse images and videos being shared every year.

The people who produce child sexual abuse media are often abusers themselves: the US National Center for Missing and Exploited Children found that 16% of people who possessed such media had directly and physically abused children.

“Identifying new child sexual abuse media is critical because it can indicate recent or ongoing child abuse,” Claudia Peersman, lead author of the study from Lancaster University said. “And because originators of such media can be hands-on abusers, their early detection and apprehension can safeguard their victims from further abuse.”

The researchers tested iCOP on real-life cases and police trialled the toolkit. It was highly accurate, with an error rate of only 7.9% for images and 4.3% for videos.


Órla Ryan