Children being 'bombarded' online by 'AI girlfriend' porn apps
Australian research has shown how some sites allow boys to digitally create scenes of women and underage girls being sexually tortured by men.

Children are being “bombarded” online by so-called AI girlfriend porn apps, which are “grooming” boys to perpetrate sexual violence and girls to accept such behaviour.

Irish children’s charities and pornography researchers say the only way to combat this growing problem is to “criminalise” porn company bosses.

The call comes as the British government announced it was banning AI girlfriend and ‘nudification’ apps and websites and after Australian research showed how some sites allow boys to digitally create scenes of women and underage girls being sexually tortured by men.

“These are not girlfriends, they are sex slaves that will do whatever you tell them,” said Eoghan Cleary, a researcher with independent research institute the Sexual Exploitation Research and Policy Institute (SERP).

“Calling them ‘girlfriends’ also suggests that our girlfriends should do whatever we tell them to ‘because my AI girlfriend does’.”

An Australian report, Turning Women and Girls into Porn, examined 20 AI girlfriend and nudification apps which allow users to customise their own porn girlfriend either from a digital gallery or from an image they can upload of a girl or woman.

“Users can turn images of women and girls into extreme and degrading porn scenarios,” the report said.

The research, carried out by campaign group Collective Shout, said these apps also produce digital child sexual abuse imagery: “Public galleries showcased an abundance of creations depicting underage girls, some who appeared to be prepubescent.”

'Children being groomed by the porn industry'

Mr Cleary, who is conducting research on the exposure of Irish children to pornography, said: “The [Australian] report mirrors exactly what my students tell me they see online and what my own very preliminary research on this issue has also revealed.

“We regularly hear from 15- and 16-year-old students that they are being targeted with advertisements for AI girlfriends and nudification apps in all of the spaces they go online, not just X — as is highlighted in the Collective Shout report — but TikTok and Snapchat, Instagram and YouTube, TV and film streaming websites and platforms.”

He said these AI technologies were “supercharging” the already increasing violence in adult pornography. He said: 

Our kids are being groomed by the porn industry — our boys to be the perpetrators of sexual violence, our girls to think that sexual violence is what they should be consenting to.

“The window of opportunity to act is closing fast as the upcoming generation’s understanding of sex becomes one of simply sexual violence.”

He said this was not about young people seeking out this technology: “They are being targeted with it in every online space they go. We cannot place the blame for this at the feet of teenagers. They are being bombarded with this stuff by an industry bent on recruiting them.”

Fiona Jennings, policy and public affairs officer of the ISPCC, said the technology was setting back the rights of women and girls. She said: 

It thwarts boys’ views of girls and intimates that girls are there to be controlled, manipulated, used — and abused.

“Girls often talk about the pressure they feel under to look a particular way, dress a particular way and to perform sexually in a particular way.”

She said there were reports from SERP and therapeutic child services that “more young girls are presenting for medical treatment after engaging in some of these sexual acts, which is leaving them with serious injuries”.

Both Mr Cleary and Ms Jennings said the State needs to make the executives running these websites and services criminally liable.

Cari, a voluntary therapeutic service for children affected by sexual abuse, said they had “significant concerns” for boys and girls from this technology.

The Department of Justice said 2020 legislation made it a criminal offence to use “any visual representation” of a person which has been altered or doctored to produce an intimate image.

© Examiner Echo Group Limited