Ireland could prosecute Twitter/X for AI 'child porn' deepfakes

The horrific child sex abuse material produced every day by Grok is already banned under Irish law. The real test is whether we are ready to enforce that law against large tech companies

‘Ever since Elon Musk acquired and rebranded Twitter, the platform has been plumbing new depths almost by the week.'   Picture: Michael M Santiago/Getty

New technologies often pose new problems — not least for the law. Legislators and law enforcement agencies can find themselves playing catch-up when new technologies emerge, and aiming at a moving target when seeking to regulate them.

The rise of so-called artificial intelligence (AI) tools is a case in point.

However, just because new technologies often necessitate new legal frameworks doesn’t mean that is always the case. 

Occasions also arise where existing laws adequately address harms or abuses that may be perpetrated with the latest tech.

Child sex abuse material ('child porn')

A case in point is the use of technology to create or distribute child sex abuse material (CSAM) — what our existing laws refer to as ‘child pornography’ (a term now generally considered outdated, since it carries a connotation of consent).

CSAM is unfortunately not a new problem; what has changed is the ease with which it can be created and distributed at scale, and the types of material that can be created.

AI tools have added a new dimension of automation that makes the production of CSAM quick and easy. Unlike in the past — when access to children was required and cameras or video cameras were the tools of the trade — no real children might be involved in the production of AI-generated CSAM.

At the same time, a huge problem is the ease with which AI tools can manipulate images in ways that affect real children. 

Digital technologies have facilitated the creation of highly realistic ‘deepfakes’ in which a real child’s face can be merged with fake sexual imagery — not only in pictures, but potentially in video also.

Twitter/X plumbing new depths

It is into this hellscape that Grok (the AI assistant tool on social media platform X — formerly known as Twitter) has ventured. 

The Grok AI tool on Twitter/X instantly complies with user requests to re-edit pictures of people — particularly women, but also children — to 'digitally undress' them or otherwise present them in a sexualised manner. Picture: Yui Mok/PA

Ever since Elon Musk acquired and rebranded Twitter, the platform has been plumbing new depths almost by the week.

The latest grim landmark has been an update to Grok that allows users to instantaneously generate images (either entirely fake, or by manipulating real images) that represent children in a sexualised manner.

Ministers differ on who's responsible  

The response among the public has been one of horror and outrage. The political response, however, has been more mixed. 

The Minister of State for AI, Niamh Smyth, said she believes Irish law has been broken by online platforms.

However, communications minister Patrick O’Donovan expressed the view that this is a matter of individual rather than corporate responsibility, and that the technological advances “are far faster than the law is able to respond”.

Minister of State for artificial intelligence and digital transformation Niamh Smyth said on Thursday that X is breaking Irish law. File picture: Peter Houlihan

In that context, it is worth examining exactly what the relevant law is. 

What does current Irish law say? 

The Child Trafficking and Pornography Act 1998 (as amended in 2017) defines child pornography as “any visual representation that shows … a person depicted as being a child and who is engaged in real or simulated sexually explicit activity”. 

Also covered are visual representations that show “for a sexual purpose, the genital or anal region of a child or of a person depicted as being a child”.

This definition is sufficiently broad to capture real images, manipulated deepfakes, or entirely fake images — the phrasing of “a person depicted as being a child” is key, as is the reference to “real or simulated” sexual activity.

A person who creates such an image would clearly be guilty of an offence of producing child pornography under section 5 of the Act, while a person who shares such an image would be guilty of distributing it.

Crucially, section 5(1)(f) also provides that anyone who “knowingly causes or facilitates” the production or distribution of child pornography is also guilty of an offence under the section.

Companies can be prosecuted 

Section 9 of the Act specifically provides that offences under the Act can be committed by bodies corporate.

Accordingly, any company whose technology is being used to create AI-generated CSAM is open to prosecution for knowingly facilitating the production and distribution of material that falls within the definition of child pornography under the Act.

If successfully prosecuted, the company would be liable to significant fines.

Penalties up to 14 years in prison

Moreover, under section 9, any director, manager, or officer of the company would be liable to a maximum of 14 years in prison if it were proven that the offence was committed with their consent or connivance, or was attributable to their negligence.

There will be plenty of times in the coming years when the law needs to adjust to the reality of generative AI tools.

However, this is not one of them. Existing laws in Ireland are sufficiently clear and flexible in their wording to prohibit companies from making AI tools available that can be used for mass-scale creation of CSAM.

Professor Conor O’Mahony is the Dean of the School of Law at UCC.

The real test is whether there is willingness to enforce those laws against large tech companies which are based in Ireland. 

These companies are crucial contributors to the Irish economy; but if CSAM isn’t a red line that can never be crossed, then one wonders what else ever could be.

  • Conor O’Mahony is the Dean of the School of Law at University College Cork

• If you are affected by any of the issues raised in this article, please click here for a list of support services.

• Here is the full text of the Child Trafficking and Pornography Act, 1998 on the Irish Statute Book. 



© Examiner Echo Group Limited