Minister claims users, not X, responsible for generating explicit images with Grok

Communications minister Patrick O’Donovan said technological advancements around AI are moving faster than laws can respond, and that some sharing of images is covered under Coco’s Law

Responsibility for generating explicit images through Elon Musk's artificial intelligence chatbot Grok lies with users, rather than social media firm X, communications minister Patrick O’Donovan has claimed.

In a break with the consensus of his Government counterparts, Mr O’Donovan said it is “reprehensible” for people to use AI and generate explicit images without consent, but did not lay the blame on X.

Grok is designed to perform a multitude of functions, such as answering questions, summarising convoluted documents, and generating imagery.

Users have availed of a new tool called “edit image” on Grok since late last month, and X has faced sustained criticism because some are using it to remove clothing from real people.

“Ultimately, at the end of the day, it’s a choice of a person to make these images,” Mr O’Donovan said.

Asked why Government departments were continuing to use X amid the controversy, Mr O’Donovan said: “Every Government department will obviously make its own decision with regard to that, but it’s not necessarily the app that’s making the images."

The minister said technological advancements around AI are moving faster than laws can respond, and that some sharing of images is covered under Coco’s Law. This law criminalises the sharing of intimate images without consent.

Facilitating

Professor Conor O’Mahony, Dean of the School of Law at UCC and former Government Special Rapporteur on Child Protection, rejected Mr O’Donovan’s assertion that users alone were responsible.

Mr O’Mahony said existing law, the Child Trafficking and Pornography Act 1998, already deals with companies facilitating the production of child sexual abuse material (CSAM).

“The keyword is facilitating, so that word is in the legislation that it’s an offence to facilitate the production or distribution of child pornography,” Mr O’Mahony said.

Mr O’Mahony said the legislation allows for prosecutions to be brought against companies, as well as individuals involved within the companies.

“Section nine of the Act specifically says that where a company commits an offence, you can also pursue a prosecution against any director, manager or officer of the company where the offence was committed with their consent or connivance, or as a result of their negligence,” he said.

If convicted, companies can face significant fines or individuals within the companies can face a maximum prison sentence of 14 years, Mr O’Mahony added.

Mr O’Mahony said that with these laws already in place, there is no need for the Government to design bespoke laws to tackle AI-generated CSAM.

“In effect, right now we could begin a criminal investigation of X or any other online platform in the morning,” he said. “If that investigation indicated there was something worth pursuing, a prosecution could proceed right now.”


© Examiner Echo Group Limited