'It should be illegal,' says survivor of child sexual abuse about AI 'nudification' technology

The State's specialist service for survivors of child sexual abuse has called for a "total ban" on AI technology, including X tool Grok, that can produce sexual abuse imagery of children and adults.

The Alders Unit at Children's Health Ireland (CHI) said the harm caused by the production of these images is "equivalent" to that perpetrated face-to-face.

In addition to Grok, people have also called for urgent action on other AI tools, including so-called 'AI girlfriend' websites and apps, which allow the production of graphic deepfake pornography of adult women and child sexual abuse material (CSAM).



© Examiner Echo Group Limited