A new toolkit using artificial intelligence will help police identify new images of child sexual abuse online and find offenders who present the highest risk to the public, its developers have said.
The iCOP toolkit will automatically identify new and previously unseen images of child sexual abuse for police, reducing the volume of material specialists must view in order to find victims.
"It's common to seize computers and collections of child sexual abuse materials containing enormous volumes of illegal material, terabytes of individual files," said Maggie Brennan, a researcher on the UCC team.
"Having to view this material to find victims can be traumatic and distressing for the specialists working to find these children."
The iCOP toolkit uses artificial intelligence and machine learning to flag new and previously unknown child sexual abuse media. The new approach combines automatic filename and media analysis techniques in an intelligent filtering module. The software can identify new criminal media and distinguish it from other media being shared, such as adult pornography.
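To illustrate the idea of an intelligent filtering module that fuses filename analysis with media analysis, here is a minimal sketch. It is not the actual iCOP implementation: the function names, the token watchlist, the weights, and the threshold are all hypothetical placeholders, and the media classifier is a trivial stub standing in for a trained machine-learning model.

```python
# Hypothetical sketch of signal fusion for a filtering module.
# None of these names or values come from the real iCOP system.

WATCHLIST = {"keyword1", "keyword2"}  # placeholder tokens, not real indicators


def filename_score(filename: str) -> float:
    """Fraction of filename tokens that match the watchlist."""
    tokens = filename.lower().replace(".", " ").replace("_", " ").split()
    if not tokens:
        return 0.0
    return sum(t in WATCHLIST for t in tokens) / len(tokens)


def media_score(file_bytes: bytes) -> float:
    """Stand-in for a trained visual classifier; here, a trivial stub."""
    return 0.0  # a real system would run an ML model on the decoded media


def fused_score(filename: str, file_bytes: bytes,
                w_name: float = 0.4, w_media: float = 0.6) -> float:
    """Weighted fusion of the two signals into one filtering score."""
    return w_name * filename_score(filename) + w_media * media_score(file_bytes)


def flag_for_review(filename: str, file_bytes: bytes,
                    threshold: float = 0.2) -> bool:
    """Route only high-scoring items to a human specialist."""
    return fused_score(filename, file_bytes) >= threshold
```

The point of fusing the two signals, rather than relying on either alone, is that filename cues are cheap but easily evaded, while content analysis is robust but expensive; combining them lets the filter prioritise which items a specialist sees first.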
Brennan added: "Law enforcement urgently need these kinds of supports to help them manage the volumes of cases they are being faced with - to find the children who are victimised in these images and videos, as well as those offenders who present the highest risk to the public."
The research behind the technology was conducted in the international research project iCOP (Identifying and Catching Originators in P2P Networks), funded by the European Commission's Safer Internet Program and carried out by researchers at UCC, Lancaster University and the German Research Center for Artificial Intelligence (DFKI).