When an AI tool can ‘nudify’ a child, ‘we’re engaging’ is not good enough
Picture: Yui Mok/PA
As a former chair in Irish broadcasting media, I’m used to regulators asking detailed questions about broadcast minutiae — logs, complaints, scheduling, standards, and whether the public got what the licence promised. That culture of oversight matters.
This is why the official response to the Grok controversy — where an AI tool on X has reportedly been used to generate and circulate sexualised, exploitative imagery involving children — has felt alarmingly weightless.
Grok is an AI system built by xAI and integrated into X. In recent days, it has been at the centre of international concern because of evidence it has been used to create and share sexualised “nudified” images, including those involving minors. The British regulator, Ofcom, has opened a formal investigation into X under the UK Online Safety Act. Malaysia and Indonesia have gone further, restricting or blocking access to Grok over the same concerns.
To be clear about structures: The Broadcasting Authority of Ireland was the long-standing broadcast regulator and Coimisiún na Meán is its successor, inheriting those broadcasting functions and expanding into online safety.
Stations are required to retain recordings, maintain transmission logs, operate complaint procedures, and respond within set timelines.
Coimisiún na Meán’s rules require broadcasters to keep programme recordings for 90 days and to preserve complaint records for up to two years.
Central to that framework is the code of programme standards, which sets out what audiences should expect and what broadcasters must deliver. It covers a wide range of areas: Violence, sexual content, and offensive language; protection from harm; protection of children; respect for persons and groups in society; protection of the public interest; respect for privacy.
The code also provides a complaints mechanism for audiences and potential sanctions for breaches. Its very existence reflects a belief that the power to communicate carries a duty of care.
Ireland has witnessed the consequences of that accountability in practice. A high-profile investigation programme more than a decade ago breached fairness and privacy standards, resulting in a substantial fine and a published decision setting out what went wrong and why. The lesson was simple: Even major players are not above the code, and breaches have reputational and financial consequences.
Grok has confirmed that it will disable nudification features in jurisdictions where such imagery is illegal. The timing of this change is less important than the pattern it reveals: Platforms often expand features and then retreat selectively, region by region, in response to backlash or enforcement action. For Ireland, the lesson is clear: Safety cannot rely on corporate discretion or vary by country; it must be a consistent, enforceable standard with real penalties for non-compliance. Back again to Coimisiún na Meán.
Why, when the harm is this serious, does the public mainly hear that Coimisiún na Meán is merely “engaging with the European Commission”?
Under the EU Digital Services Act, the European Commission leads oversight of very large online platforms such as X, particularly on systemic risk duties.
Coimisiún na Meán has said it is engaging with the Commission and with An Garda Síochána regarding Grok. That engagement is important — but it is not sufficient as a public-facing posture when children are being targeted.
As X’s EU establishment is in Ireland, Coimisiún na Meán acts as the State’s digital services co-ordinator and a key enforcement partner under the act, even as the Commission leads supervision of the largest platforms.
Domestically, Coimisiún na Meán has already shown that these companies are not beyond Irish rules: X lost a High Court challenge last year against the online safety code, which requires video-sharing platforms to have strong systems in place to protect users — especially children — from harmful and illegal content.
Against that backdrop, the mismatch in regulatory intensity is hard to ignore and difficult to understand.
In broadcasting, we have long debated what some might call “soft” obligations — quotas, balance rules, and commitments designed to shape culture. Whether or not one supports those interventions, they demonstrate something vital: When political and regulatory will exist, we can draft detailed, enforceable expectations and insist they are met.
Yet here we are, in an era where the industrial-scale creation of child-abuse imagery can be facilitated by a built-in feature of a mainstream platform, and the dominant official language remains procedural: Engagement, liaison, review.
Worse still, the political signalling has been confused. The minister for media, Patrick O’Donovan, was criticised for saying X is not responsible for making child sexual abuse images, placing responsibility on the individuals who use the platform to generate them.
If a broadcaster repeatedly facilitated illegal material, we would not shrug and say: “It’s the callers’ fault.”
The Government has repeatedly acknowledged the epidemic of sexual and gender-based violence in Ireland.
Yet, when technology enables the rapid production and dissemination of exploitative material, the national response is too often couched in the jargon of jurisdiction. It is time to stop treating these curated digital spaces as neutral conduits. They are designed environments that shape behaviour, and they must be held accountable for foreseeable risks created by that design.
Protecting children starts with clarity and urgency: Publishing what Ireland is asking the European Commission to do, on what timeline, and what consequences we believe should follow.
It includes visible co-ordination with gardaí and clear public reporting pathways for schools, parents, and victims. It means using every domestic lever that already exists — especially around online-safety obligations — and, where gaps remain, naming them plainly and proposing targeted legislative fixes rather than relying on vague reassurances.
If a platform builds a capability that predictably increases the risk of generating and circulating exploitative content, then it has a moral — and increasingly legal — duty to design out that risk, not merely react after the damage is done. And if it will not, then we, as a society, must enforce it.
Broadcasting learned a hard lesson: Oversight has to be felt, not merely filed.
When the issue is children, “we’re engaging” cannot be the headline.
Regulators can be technical in their work — ask any radio station in Ireland — but they must be clear, urgent, and public in their purpose. The public understands standards, timelines, and consequences. It’s time online platforms did too.
- Lucy Gaffney is a former chair in Irish broadcasting media with a background in business. She chaired Communicorp Media prior to its sale to Bauer Media in 2021. She is now focused on online harms, sexual exploitation, and violence against women and girls, and is a non-executive director with the Sexual Exploitation Research and Policy Institute, as well as Women’s Aid Ireland and Ruhama.