Call for ban on AI tools used to create child sexual abuse imagery and 'AI girlfriends and boyfriends'

The Irish Internet Hotline said AI 'fundamentally changes' the risks by enabling scalable, private, on-demand creation of child sexual abuse material

AI tools capable of producing child sexual abuse imagery should be specifically banned under proposed legislation, as they pose a “systemic risk to children”, the Irish Internet Hotline has said.

The online reporting centre also wants the draft bill to prohibit ‘AI girlfriends’ and ‘AI boyfriends’ for children, stating the tools — which can generate sexually explicit images of minors — constitute a “critical threat” to children.

The Irish Internet Hotline has made the recommendations in a submission to the Oireachtas Committee on Artificial Intelligence, which is examining the General Scheme of the Regulation Of Artificial Intelligence Bill 2026.

“Certain AI systems now present a structural, systemic risk to children and the public generally because of their capability to generate child sexual abuse material without safeguards,” the submission said.

It said AI "fundamentally changes" the risks by enabling scalable, private, on-demand creation of child sexual abuse material (CSAM).

It said this was often “beyond detection” by existing safeguards since it can be done within the home or can be spun up by “nefarious actors” and offered as a service in return for payment.

The hotline said the bill presents an opportunity to recognise that AI systems capable of producing child sexual abuse material, especially bespoke home-made models, should be made illegal.

The submission said privately trained or locally run AI systems could produce illegal content without platforms being involved, without content passing through industry safeguards, and without any practical possibility of detection by hotlines, law enforcement or regulators.

“From the standpoint of those working daily to intercept CSAM, this represents a serious enforcement gap,” it said.

The Irish Internet Hotline said particular concern arose in relation to privately trained, modified, or fine-tuned AI models, sometimes referred to as “home-made” systems.

“Every generative AI engine can be trained to produce CSAM,” it said.

In addition, it said certain AI systems had been linked to suicide, with chatbots giving specific details on how to take one’s own life.

It said anthropomorphism — mimicking human qualities and emotions — in AI tools had escalated with the proliferation of ‘AI girlfriend’ and ‘boyfriend’ apps, often advertised on social media and online platforms.

“Experts regularly warn about the dangers of these apps, which reinforce harmful gender stereotypes and distort the understanding of consent," it said. 

"The issue is that children often use AI companions when bored or seeking entertainment, without safeguards or proper age-verification systems to prevent access to harmful content.” 

It said the bill should specifically prohibit the possession, creation, or distribution of AI tools that have the capability to generate child sexual abuse material.

The Irish Internet Hotline also called for the mandatory implementation of a "child rights impact assessment" for any AI project that might involve children.
