Irish Examiner view: Pornification of our children must be curbed
Irish teenagers 'are being targeted with advertisements for AI girlfriends and nudification apps on all of the spaces they go online'. Picture: iStock
Today’s report by Cormac O’Keeffe on AI girlfriends and boyfriends makes sobering reading.
Perhaps sobering isn’t even strong enough. Nauseating? Certainly depressing.
And the insidious part is that children aren’t seeking these websites out. They’re being presented with them.
The cases outlined in the article — where boys can take images of a classmate, upload them, and then manipulate them into sexual scenarios — should sound alarm bells, because let’s face it, it’s not just boys who are taking advantage of this sort of website.
These sites cater for the most debased sexual instincts, with galleries of virtual women involved in the most degrading acts. It’s an indication of how lawless the AI space is. Where are the safeguards?
Researcher Eoghan Cleary said: “We regularly hear from 15- and 16-year-old students that they are being targeted with advertisements for AI girlfriends and nudification apps on all of the spaces they go online, not just X — as is highlighted in the Collective Shout report — but TikTok, Snapchat, Instagram, YouTube, and TV and film streaming websites and platforms...”
The UK recently brought in legislation requiring the users of pornographic sites to be of legal age, though it is routinely bypassed by the use of virtual private networks (VPNs) that make the user appear to be in another country. Nonetheless, it is something of an attempt to restrict access, and for good reasons.
We have already seen plenty of reports, including in this publication, showing that pornographic scenarios are being replicated by young men, in some cases because they think it’s how sex should be, given that they’ve seen it on screen. How much longer before we have the first case of sexual assault inspired by an AI scenario?
ISPCC head of policy and public affairs Fiona Jennings says the technology “thwarts boys’ views of girls, and it intimates that girls are there to be controlled, manipulated, used, and abused”.
She added: “Girls often talk about the pressure they feel under to look a particular way, dress a particular way, and to perform sexually in a particular way.”
She said there were reports from the Sexual Exploitation Research and Policy Institute and therapeutic child services that “more young girls are presenting for medical treatment after engaging in some of these sexual acts, which is leaving them with serious injuries”.
A more apt comparison, in light of the O’Keeffe report, is the legislation that already exists concerning the sharing of intimate images without consent.
These AI images may not be image-based sexual abuse as widely understood, or what was once known as “revenge porn”, but a strong argument could be made for considering such generated images under that umbrella.
Indeed, the UK recently announced plans to ban these sorts of websites, and it would behove the powers that be here to follow suit.
Given the pace of legislative change in this country, however, there may be concern that it would end up languishing on a pile of draft bills. But that can’t be allowed to happen.
This may end up being the tip of the iceberg, and there is a tendency for legislation concerning technology to be out of date quickly, or fail to keep pace with developments generally. But the people of this country deserve better, particularly the women and girls who will face the brunt of this behaviour.
And the traditional cry — “teach it in schools” — is just a hollow, hand-wringing sort of way of dealing with things.
As any principal or deputy principal will tell you — from the smallest primary to the largest post-primary — they are already inundated with suggestions for things they should bring in.
Jennifer Horgan, writing on Friday, decried a forthcoming British programme to train teachers to spot misogyny (noting that, as 74% of teachers are women, they probably don’t need help with that). But she adds:
“Research from Britain makes the abuse of female teachers abundantly clear. A survey conducted in collaboration with UK Feminista in 2024 recorded one in 10 female support staff in secondary schools as saying they had been sexually harassed, by male pupils but also by their male colleagues.”
She notes further: “What non-teachers don’t realise is that the hidden curriculum is far more powerful than the actual curriculum. The hidden curriculum is the stuff that’s hard to pin down. It’s cultural and social. It’s what children bring to school with them from home, from being online.
“Schools in Britain have been working hard to tackle these issues for years but misogyny is embedded in classrooms because it is embedded in society. It isn’t something that can be solved by a PowerPoint or a list of misbehaving boys.”
It’s not a school’s responsibility to rework human nature.
That goes far deeper than any, or at least most, classes can reach. Given human nature, you will probably not be surprised to learn that such virtual girlfriend apps were some of the earliest to be created as the AI bubble began to inflate.
And there is a parallel, related problem that may manifest more widely as a result of utter reliance on AI apps for, well, pretty much everything.
“AI psychosis” is a growing concern, with apps such as ChatGPT blamed for suicides as well as parasocial, addictive behaviour. AI has also been blamed for dulling critical thinking, with OpenAI chief executive Sam Altman claiming he wouldn’t be able to function as a new parent without ChatGPT. “Clearly, people did it for a long time — no problem... but I have relied on it so much.”
We are not suggesting he is breaking with reality — he is a known workaholic, and you should expect him to tout his product as being useful in all areas of life — but that sense of reliance can have unforeseen consequences.
A recent article on AI psychosis published by the University of Michigan noted that psychosis generally has a trigger event, and often in people who may already be vulnerable for various reasons. Psychiatrist Stephan Taylor is quoted in the piece as saying: “there is a real potential for general chatbots to be used by people who are lonely or isolated, and to reinforce negative or harmful thoughts in someone who is having them already. A person who is already not in a good place could get in a worse place.”
There is no putting the genie back in the bottle, and even the most basic AI app is capable of usefulness as well as destructiveness. But never has it been more apparent that strong regulation of the tech industry is needed. Whether we will get it is another question.