The technology, a toolkit called iCOP (Identifying and Catching Originators in P2P networks), has been trialled by police.
The toolkit uses artificial intelligence and machine learning, a form of automated data analysis, to identify new or previously unseen images. Machine learning enables computers to uncover patterns in data without being explicitly programmed to do so.
Traditionally, staff at law-enforcement agencies have had to manually review large volumes of disturbing material to find offenders and victims.
“It’s common to seize computers and collections of child sexual-abuse materials containing enormous volumes of illegal materials, terabytes of individual files,” said UCC lecturer and researcher Maggie Brennan.
“Having to view this material to find victims can be traumatic and distressing for the specialists working to find these children.”
Ms Brennan, a lecturer in UCC’s Schools of Applied Psychology and Criminology, worked as part of an international project team, which included researchers from Lancaster University.
“Our role also involved developing a psychological profiling system to identify viewers of child sexual-abuse images who may be at risk of committing hands-on abuse,” Ms Brennan said.
She said that the toolkit is “urgently” needed by law-enforcement agencies to manage the large volume of sexual-abuse data.
For example, 327 cases of online child sexual-abuse imagery were confirmed in Ireland in 2015. This does not equate to 327 individual images, as a single case can relate to one website carrying thousands of images.
These figures are from the Irish internet safety watchdog, ISPAI (Internet Service Providers’ Association of Ireland), which runs Hotline.ie.
The website provides a secure, confidential, and anonymous facility for internet users to report suspected illegal content, particularly child sexual-abuse material encountered accidentally online.
“Law enforcement urgently need these kinds of supports to help them manage the volumes of cases they are being faced with, to find the children who are victimised in these images and videos, as well as those offenders who present the highest risk to the public,” said Ms Brennan.
She said the project team had been researching the topic for some 15 years with the support of law-enforcement authorities.
“We have been researching this topic, with international law-enforcement agencies such as Interpol, since the early 2000s,” she said. “The volume of child sexual-abuse images and videos now in circulation is a real concern, and it can be overwhelming for law enforcement.”
The iCOP toolkit has an error rate of just 4.3% for video and 7.9% for images.