'It should be illegal,' says survivor of child sexual abuse about AI 'nudification' technology
The State's specialist service for survivors of child sexual abuse has called for a "total ban" on AI technology, including X tool Grok, that can produce sexual abuse imagery of children and adults.
In addition to Grok, people have also called for urgent action on other AI tools, including so-called 'AI-girlfriend' websites and apps, which allow the production of graphic deepfake pornography of adult women and child sexual abuse material (CSAM).
The Alders Unit at Children’s Health Ireland (CHI) said the harm caused by the production of these images is “equivalent” to that perpetrated face-to-face.