'Build your own AI slut': Boys being targeted online by surge in 'girlfriend' websites
'We regularly hear from 15- and 16-year-old students that they are being targeted with advertisements for AI girlfriends and nudification apps on all of the spaces they go online, not just X but TikTok and Snapchat, Instagram and YouTube, TV and film streaming websites and platforms.'
Take a boy who fancies a girl in his class at school. He’s too awkward to talk to her but follows her on social media.
His own social media feeds include an ad for ‘AI girlfriends’, with sexualised images of a woman and a tagline ‘Build your own AI slut’.
The boy clicks and scrolls through a gallery of AI girlfriends.
The site encourages him to upload a ‘reference’ photo of an actual woman to help customise his ideal virtual girl.
The boy copies an image of the girl he fancies from one of her social media sites and uploads it.
He now has his own digital version of the girl.
The site allows him to highlight areas of the girl’s body and 'nudifying or undressing' technology digitally undresses her.
That’s just the start.
The boy can place his digital girlfriend in a wide range of scenarios, including pornographic.
If a boy doesn’t have a ‘reference image’, he can just choose from the gallery available.
These can include extreme scenarios, of women — and girls — being “penetrated with dildos, covered in semen, performing oral sex, group sex and being sexually tortured and used by men”.
That is the disturbing account of 20 such sites given in new research, ‘Turning Women and Girls into Porn’, by Australian campaigning group Collective Shout.
“Public galleries hosting their pornographic creations are filled with images of women — many who appear to be underage — being sexually tortured by men: bound, blindfolded, tied up with ropes, in stocks (a medieval punishment device), handcuffed, being penetrated by multiple men or machines, covered in semen and used by multiple men,” the report says.
“Women in headscarves, pregnant or modelled on Disney princesses are depicted engaging in humiliating sex acts. Public galleries showcased an abundance of creations depicting underage girls, some who appeared to be prepubescent.”
Boys are encouraged to invite their friends, with one of the websites offering an “invite and earn” programme where users can get “undressing credits” if they get a friend to join.
Report author Caitlin Roper said this was a “worldwide” issue and one that is becoming more popular.
“We are also seeing news coverage documenting boys using these websites against female classmates in schools,” she said.
Eoghan Cleary is conducting research on the exposure of Irish children to pornography and its impact.
“The [Australian] report mirrors exactly what my students tell me they see online and what my own very preliminary research on this issue has also revealed,” he said.
“We regularly hear from 15- and 16-year-old students that they are being targeted with advertisements for AI girlfriends and nudification apps on all of the spaces they go online, not just X — as is highlighted in the Collective Shout report — but TikTok and Snapchat, Instagram and YouTube, TV and film streaming websites and platforms.”
He said it was “unquestionably” the case that use of this technology is growing.
“What people need to realise is that this isn't about young people seeking this out,” he said. “They are being targeted with it in every online space they go.”
He said that as soon as a user clicks on a female character on these sites, there is an offer of video clips, currently up to six seconds long, depicting “any sexually explicit interactions you can think of”.
He said the technology was progressing at such a rate that last June, at a porn conference in Amsterdam, the leading providers promised to soon be able to provide “up to an hour of seamless interaction” with what Mr Cleary calls “AI sex simulating” entities.
He added: “And you cannot tell that the AI girlfriend you are interacting with is not a real person.”
He said these sites were compounding the impact on both boys and girls of increasing violence in adult pornography.
“We know that 90% of mainstream porn is sexually violent, with 94% of that violence perpetrated towards women,” Mr Cleary said.
He said AI was “supercharging” these trends.
“We know peer-on-peer sexual abuse is increasing and we know that sexually violent crime among under-18s in Ireland has multiplied 6.5 times in the 15 years since the majority of Irish adolescents got independent access to the internet through smartphone ownership,” he said.
He said judges overseeing recent cases in court had started to “call out the clear link between exposure to the pornified world online and the harms being perpetrated”.
But, he said, this was before even seeing the impact AI sex simulators are having.
Mr Cleary said money drives these apps: “Each service seems to initially offer itself for free until it has enough engagement to start charging. Other services offer free credits to get you engaged and then charge for some of the more controversial provisions available.”
He said these provisions could include "consensual non-consent", or "CNC", scenarios, "step-sibling" scenarios, or combinations of physical and personal attributes such as petite, innocent, or schoolgirl options.

The ISPCC said the technology was setting back the rights of women and girls greatly.
"It thwarts boys’ views of girls and intimates that girls are there to be controlled, manipulated, used — and abused,” said ISPCC head of policy and public affairs Fiona Jennings.
She said: “It runs the risk of boys’ believing such behaviours are acceptable and perhaps welcome, when the opposite is the case."
She said they had seen the impact on girls through their helpline: “Girls often talk about the pressure they feel under to look a particular way, dress a particular way and to perform sexually in a particular way.”
She said there were reports from SERP and therapeutic child services that “more young girls are presenting for medical treatment after engaging in some of these sexual acts, which is leaving them with serious injuries”.
And she said boys were telling the ISPCC that they too can feel pressure to be able to perform such sexual acts, and that they are encouraged to engage in such behaviours.
“It is hugely concerning that we are growing up in a society where more young people are turning to these technologies to engage in this type of behaviour,” Ms Jennings said.
“That should sound a very loud warning bell for us all.”
CARI, a voluntary therapeutic service for children affected by sexual abuse, said they had “significant concerns” for boys and girls from this technology.
“Children are at serious risk of exposure to harmful content that they should never encounter,” CARI chief executive Emer O’Neill said.
She said for children who see or use this type of harmful content, the impact can include “fear, anxiety, and a distorted understanding of boundaries, relationships, expectations, and even what is real versus artificial”.
For children whose images are misused, the harm is profound: “They may feel unsafe, watched, or exposed. Many experience shame or confusion despite having done nothing wrong, and the emotional impact can lead to long term trauma similar to other forms of exploitation.”
Ms O’Neill added: “I would be concerned that CSAM [child sexual abuse material] could normalise harmful behaviour and increase the risk of abuse. It can also enable grooming, coercion, and peer-to-peer misuse, placing children at significant emotional and physical risk.”
Ms O’Neill called on the State to take “stronger action” on tackling sexual abuse: “Tech companies need tighter regulation and must play a far more active role in protecting children.”

Ms Jennings of the ISPCC said the Government owes a “duty of care” to children.
“The status quo is no longer acceptable,” she said.
“Coimisiún na Meán and similar regulatory bodies are doing what they can within the legislative frameworks they operate under, but unless we get serious about attaching criminal liability to corporate individuals, this issue is not going away.
"The pace and rate at which these technologies are being developed and, in turn, the growth in behaviours they are allowing is creating a perfect storm."
She said these technologies were impacting the mental health of children: “It impacts their notion of self and especially their sexual development at a time when things are hard enough and they are already vulnerable. Our child and adolescent mental health services are already under huge pressure.”
Mr Cleary said the recent announcement by the British government that it was going to ban nudification apps and other non-consensual synthetic imagery should be immediately replicated in Ireland.
“We cannot wait for the EU to do this for us,” he said.
"The creation of digital sex simulators that look like real people and that promote any kind of sexually violent practice like strangulation, 'consensual non-consent', or sex with child-like or teen-like entities, cannot be something we provide for anyone, regardless of age.”
He said the porn industry argues it is just "fantasy", but said research showed its impact on real-world sex.
“To take just one example, directly due to its introduction into porn, strangulation, known in porn as choking or breath play, is now the second most common cause of stroke in young women under 40,” he said.
“And because it’s so dangerous and yet so normalised in porn, in 2023 in Ireland we had to pass legislation to outlaw what is now termed non-fatal strangulation — which in its first year led to 67 prosecutions.”
He added: “As well as criminalising the adult user, they must criminalise the tech producer, as well as the social media platforms and search engines that facilitated its advertisement. Not with fines, because they don't work: they already owe the Irish state €4bn in fines they're not paying. We need criminal liability for the individual executives.”
Mr Cleary stressed what society must not do is blame boys, or teenagers generally: “Whatever we do, we cannot place the blame for this at the feet of our teenagers, which I so often see happening. They are being bombarded with this stuff every day by an industry bent on recruiting them.
"No one is protecting them from it, no one is talking to them about it, no one is teaching them about it. All the while it is becoming silently and successfully normalised in their lives.
"No nine-year-old goes looking for porn, nudification apps or AI girlfriends," he said, but added that research indicated one in 10 children had already been exposed to porn by that age.
He said it could not be a child's responsibility to protect themselves from these harms: "There is currently no avoiding it online, and if educating children is the only approach we've taken, when they do get targeted with it, which they will, it immediately becomes the child's fault because they've failed to protect themselves from what we told them was out there, and they will tell no one about what's happened."
He said the tech companies needed to stop facilitating the targeting of children by the porn industry, "our boys by the likes of Pornhub to normalise their dependence on it, our girls by OnlyFans to normalise and glamorise it as a career opportunity".
Mr Cleary said they could do it, but would not, as engagement increases profit.
"The Government needs to force their hand — by legislating for this issue without delay like the UK are doing," he said.
"They need to ban nudification apps, and AI sex simulators that facilitate the use of images of real people; what they facilitate is already illegal."
The Department of Justice said the Harassment, Harmful Communications and Related Offences Act 2020 (Coco’s Law) created a serious offence of distribution or publication of an intimate image without consent with intent to cause harm to the victim.
It said the definition of an intimate image was intentionally broad to include “any visual representation” of a person.
“This is to include intimate images which have been altered or doctored prior to their distribution or publication, more commonly referred to as ‘deep fakes’,” it said.
“Therefore, under this law it is an offence to send or post an intimate image purporting or claiming to be of another person, even if the image is not actually of them.”
It said Coimisiún na Meán regulates online safety under the Online Safety and Media Regulation Act 2022, which was the basis for the Online Safety Code.
In turn, Coimisiún na Meán said the code made online platforms accountable for how they protect people, especially children, from harm online.
It added: “Potential further legislative change in this area is a matter for the Government.”
It said a user who sees the non-consensual sharing of intimate imagery should report this to the online platform, as well as An Garda Síochána and Hotline.ie.
The agency said if they were unhappy with the platform’s response, then they could contact Coimisiún na Meán.
There was much focus recently on Australia’s decision to set a mandatory minimum age of 16 to use social media.
Ms Roper said the age delay should have “significant positive impacts” for young people. But she said it was limited to a number of designated social media platforms rather than AI girlfriend websites.
“Some states have taken steps to criminalise not just distribution of nudified/deepfake content, but creation also as a standalone offence, and the federal government has announced plans to outlaw nudifying apps,” she said.
“Some of these apps are also advertised on social media, actively encouraging users to use them to create nude images of any woman, so it’s possible young people will have less exposure to some of them.”
Mr Cleary said the Irish Government was running out of time to deal with this issue head on: “The window of opportunity to act is closing fast, as the upcoming generation's understanding of sex becomes one of simply sexual violence. And the arrival of AI is rapidly accelerating the whole process.”
- Childline 1800 66 66 66 or chat online at Childline.ie;
- CARI 0818 924567;
- Coimisiún na Meán usersupport@cnam.ie or call the contact centre on 01 9637755.
- If you are affected by any of the issues raised in this article, a list of support services is available.