'It should be illegal,' says survivor of child sexual abuse about AI 'nudification' technology

The State's specialist service for survivors of child sexual abuse has called for a "total ban" on AI technology, including X's Grok tool, that can produce sexual abuse imagery of children and adults.

The Alders Unit at Children's Health Ireland (CHI) said the harm caused by the production of these images is "equivalent" to that perpetrated face-to-face.

In addition to Grok, people have also called for urgent action on other AI tools, including so-called 'AI-girlfriend' websites and apps, which allow the production of graphic deepfake pornography of adult women and child sexual abuse material (CSAM).



© Examiner Echo Group Limited