Social media ban for under-16s takes responsibility away from Big Tech, says lawyer
Irish human rights lawyer Michael O'Flaherty said: 'Children are exposed to violent, sexual, or distressing content, grooming, and rapidly spreading disinformation.'
Banning social media for under-16s would "shift the responsibility for safety from the platforms that create the environment to the children who navigate it", Europe's most senior human rights official has said.
The commissioner for human rights at the Council of Europe, Irish human rights lawyer Michael O'Flaherty, suggested that while such calls, which are gathering momentum, arise from legitimate concerns, they should not deflect accountability from the platforms themselves.
"The current online ecosystem is failing children," Mr O'Flaherty said. "Children are exposed to violent, sexual, or distressing content, grooming, and rapidly spreading disinformation.
"Opaque algorithms direct them toward extreme material, while manipulative designs influence their behaviour, and pervasive data collection compromises their privacy.
"These outcomes are foreseeable results of specific design choices and business models, necessitating regulatory intervention at the source."
The topic has come into sharp focus in recent months, following the Grok undressing controversy and Australia bringing such a ban into force.
It had appeared the Government would legislate in a similar manner, with Tánaiste Simon Harris recently remarking that he believes there needs to be a minimum age for social media and that children "are not safe on the internet, simple as".
Last week, however, its new digital and AI strategy stopped short of an outright pledge for a ban on under-16s on social media.
It said Ireland would take action "if necessary", but would work with like-minded EU member states to "explore options" on the topic of age restrictions on the use of social media.
Mr O'Flaherty said the "pervasiveness" of algorithmic systems used on social media sites reinforces the need for regulators to step in.
This includes ensuring there is transparency around the algorithms being used, as well as effective reporting and redress mechanisms, childrenâs rights risk assessments, independent audits and restrictions on targeted advertising.
"These obligations must be enforceable, subject to independent oversight, and supported by sanctions and liabilities that are effective deterrents," he said.
The human rights lawyer said the European Commission is already taking action against platforms, citing its recent proceedings against TikTok over the platform's alleged addictive design. TikTok strongly rejected the commission's assessment, and the investigation is ongoing.
Mr O'Flaherty suggested individual EU member states should follow the European Commission's approach: allocating resources to regulators, co-ordinating actions, and ensuring any penalties imposed outweigh the economic benefit platforms derive from such practices.
"Before considering a ban, governments and parliaments should pause and exercise caution," he added.
"They should consult with experts, civil society, and children, and ensure that proposals are human rights compliant. The source of harm is rooted in the design and incentives of the platforms. That should be the primary focus of regulation."