AI driving child sex abuse imagery
Artificial intelligence software can take any image of a child and turn it into child sexual abuse material, according to the Irish internet industry watchdog.
A Hotline.ie analyst said it could soon get to the stage where it might be “impossible” to distinguish between what is generated by computer and real images.
In their 2022 annual report, Hotline.ie documented a massive increase in the amount of computer-generated child sex abuse material (CSAM) — accounting for 37% of their caseload in 2022, compared to 9% in 2021.
It reflects what online watchdogs are seeing across the world, with the Internet Watch Foundation (IWF) in the UK flagging concerns since the middle of last year about the use of freely available AI software to create such imagery.
The IWF even found cases of “text-to-image” technology, where a person could type what they want into online generators and the AI software creates the image.
In the Hotline.ie report, an analyst, named Sean, told of how computer-generated CSAM used to consist of “crudely rendered animations”, which had become more photorealistic over the years. “But what we’re seeing now since the advancements in AI-generated art is deeply concerning,” he said.
He said it was often claimed that computer-generated material is not of real children and therefore it is not hurting anyone.
“Well, now any image of a child can be fed through an AI and be turned into CSAM," he said. "There are even communities out there creating new CSAM based on old videos of real child victims being sexually abused.”

He said not only was computer-generated material becoming more realistic, it was also being “mixed in with images of real children”, and that it can be difficult to tell the difference.
“The rate at which the technology is advancing, it might soon be impossible to tell the difference at all,” he said.
The Internet Watch Foundation said AI could create images “at scale” with the clear potential to “overwhelm” those working to fight CSAM.
The Hotline.ie system, which works in conjunction with An Garda Síochána, is based on referrals from the public. Its report said it had 5,105 cases of classified CSAM containing computer-generated imagery, compared to 1,329 cases in 2021.
“Computer generated CSAM was found to be more severe and depict very young child-like renditions,” the report said.
It said that nine out of 10 cases were found to depict pre-teens.
The report pointed out that under the Child Trafficking and Pornography Act 1998, “any visual representation” depicting the sexual abuse of a child, including computer-generated CSAM, is illegal in Ireland.
It said it was not a “harmless activity”, saying it can “normalise” sexual activity with children, be used by adults trying to groom children and lead some people to search for other types of CSAM.