Social media ban for under-16s takes responsibility away from Big Tech, says lawyer

Irish human rights lawyer Michael O’Flaherty said: 'Children are exposed to violent, sexual, or distressing content, grooming, and rapidly spreading disinformation.'

Banning social media for under-16s would “shift the responsibility for safety from the platforms that create the environment to the children who navigate it”, Europe’s most senior human rights official has said.

The commissioner for human rights at the Council of Europe, Irish human rights lawyer Michael O’Flaherty, suggested that while such calls, which are gathering momentum, arise from legitimate concerns, they should not deflect accountability from the social media platforms themselves.

“The current online ecosystem is failing children,” Mr O’Flaherty said. “Children are exposed to violent, sexual, or distressing content, grooming, and rapidly spreading disinformation.

“Opaque algorithms direct them toward extreme material, while manipulative designs influence their behaviour, and pervasive data collection compromises their privacy.

“These outcomes are foreseeable results of specific design choices and business models, necessitating regulatory intervention at the source.” 

The topic has come into sharp focus in recent months, after the Grok undressing controversy and Australia’s bringing such a ban into force.

It had appeared the Government would legislate in a similar manner, with Tánaiste Simon Harris recently remarking that he believes there needs to be a minimum age in relation to social media and children “are not safe on the internet, simple as”.

Last week, however, its new digital and AI strategy stopped short of an outright pledge for a ban on under-16s on social media.

It said Ireland would take action “if necessary”, but would work with like-minded EU member states to “explore options” on the topic of age restrictions on the use of social media.

Regulation needed

Mr O’Flaherty said the “pervasiveness” of algorithmic systems that are used on social media sites reinforces the need for regulators to step in.

This includes ensuring there is transparency around the algorithms being used, as well as effective reporting and redress mechanisms, children’s rights risk assessments, independent audits and restrictions on targeted advertising.

“These obligations must be enforceable, subject to independent oversight, and supported by sanctions and liabilities that are effective deterrents,” he said.

The human rights lawyer said the European Commission is already taking steps against platforms, citing recent action against TikTok over the platform’s allegedly addictive design. TikTok strongly rejected the commission’s assessment, and the investigation is ongoing.

Mr O’Flaherty suggested that individual EU member states should adopt an approach similar to the European Commission’s: allocating resources to regulators, co-ordinating actions and ensuring any penalties imposed outweigh the economic benefit platforms derive from such practices.

“Before considering a ban, governments and parliaments should pause and exercise caution,” he added.

“They should consult with experts, civil society, and children, and ensure that proposals are human rights compliant. The source of harm is rooted in the design and incentives of the platforms. That should be the primary focus of regulation.”


© Examiner Echo Group Limited