Ireland urged to seize 'seatbelt moment' and regulate against online harm for rest of world

Ireland is facing the equivalent of the “seatbelt moment” for cars, where protecting people from online harms has to go beyond education and awareness-raising, according to experts.

Speaking before the Oireachtas Committee on Media, Mary Aiken, Professor of Forensic Cyberpsychology at the University of East London, said that with so many internet giants headquartered in Ireland, the action Ireland takes to regulate the online space will impact the rest of the world.

“There was a famous book back in the 60s about the automobile industry, titled Unsafe at Any Speed. It led to the introduction of safety belts. I think the parallel here is that the internet is Unsafe at Any Speed. I think this is Ireland's seatbelt moment,” she said. 

“What we do in Ireland will impact the rest of the world, and I don't say that lightly.”

Looking at progress in the UK, on an equivalent online safety bill, Prof. Aiken recommended that Ireland also provide for the establishment of an advisory committee on disinformation and misinformation.

“The UK government aims to tackle the problem of disinformation through a requirement for Ofcom, the UK communications regulator, to establish an advisory committee. The report also notes that the viral spread of misinformation and disinformation poses a serious threat to societies around the world and that media literacy is not a standalone solution,” she said.

Professor Brian O’Neill of Media Literacy Ireland echoed the concern that education can only go so far.

“It is important to note that while media literacy increases resilience to many of the issues that are associated with digital communications, it should not be seen as, nor should it ever be, a solution on its own. People form beliefs for complex reasons, and skills and knowledge alone may not be enough to guarantee informed decision-making,” he said.

Prof. Aiken also warned that the Online Safety and Media Regulation Bill will not be “practicable, feasible, workable, or successful” without first compiling a full taxonomy of the different online harms which can be experienced by internet users.

“You can't look at bullying without thinking about harassment. You can't look at harassment without thinking about misinformation and disinformation. You can't consider online harms without factoring in aspects like cyber fraud. We need a framework and classification system, and then, one by one, we can begin to make sense of these harms and look at legislation that may tackle some or, hopefully in time, all of them,” she said.

Prof. Aiken said this taxonomy could then be used to build on the safety tech sector, which could automate safety measures to protect users from harm, such as using AI to help flag and take down misinformation or offensive content.

However, Dr Eileen Culloty of the Institute of Future Media, Democracy and Society, Dublin City University, warned that the nuances of harmful content cannot always be spotted by artificial intelligence (AI).

“We have to be very cautious about assuming that these technologies can be the solution to this. It's very difficult to say that even a fact checker or a journalist or someone could come along and say something is categorically true or false. And so extending that out, it's extremely difficult to say you could rely on a piece of AI to start categorising disinformation,” she said.

Dr Culloty said it is fundamentally important to require platforms to be open to independent audits or sharing of information, to assess the effectiveness of current protection measures, such as AI content regulation.


© Examiner Echo Group Limited