After 23 years of Safer Internet Days, are children any safer?

Governments say the era of self-regulation is over, but until we tackle the 'launch first, figure out the harm later' thinking, nothing will change, writes Alex Cooney of CyberSafeKids
It’s been 23 years since the first Safer Internet Day, and the sad fact is that if we ask ourselves whether we are any closer to having made the online world safe for children, the answer is no.

I say that having spent the last decade working for a safer internet, and as a parent of teenagers who are now active online users themselves, like all their friends.

I say it despite new regulatory frameworks at both European and national levels. None go far enough. I'm heartily sick of hearing the oft-repeated phrase "the era of self-regulation is over", when it clearly isn't.

So much is still left to the discretion of the immensely powerful companies behind the online spaces we — and our children — inhabit daily.

Recommender systems

The truth is that the problem is the model itself: a model based on engagement. The more engagement, the more money. Big Tech has reaped the rewards while the harm has intensified.

Since around 2009, Facebook and other platforms discovered they could boost engagement by using algorithmic recommendations rather than simple chronological feeds. 

Today, most major social platforms — including gaming environments — rely on algorithm-driven recommender systems designed to maximise engagement, fundamentally changing how content reaches users.

This model has proven deeply problematic. It turns out that engagement thrives on shock, horror, outrage, and negativity. 

For children, this too often means exposure to content that has no positive place in their lives: eating disorder material, suicide and self-harm content, misogyny, hate speech, and pornography — though that latter term fails to capture the reality of today's online pornography, which is increasingly violent, demeaning, and coercive.

On Friday, the European Commission published preliminary findings under the Digital Services Act that TikTok has breached EU rules through "addictive design" features such as infinite scroll, autoplay, push notifications, and its highly personalised recommender system. Similar investigations are ongoing into Meta's Facebook and Instagram.

If history is any guide, these companies will use their vast resources to fight these findings tooth and nail — as they have with numerous regulatory actions to date, dragging out proceedings for years — all the while children continue to be exposed to harmful content. 

It is also worth noting that representatives from Google, TikTok, and Meta appeared before a Joint Oireachtas Committee hearing last week, during which they flat-out denied that their products were designed to be addictive.

Digital parenting

While researchers debate the scale of harm, the evidence that children are being harmed is undeniable. The precautionary principle tells us: act now to prevent harm, rather than wait for definitive proof while another generation of children pays the price.

Given the momentum of the Smartphone Free Childhood Ireland movement and the growing number of school and community-based "hold off" groups nationwide, it's clear many parents share these concerns.

Our research confirms parents are struggling: among 1,700 parents surveyed, 25% said they "do not understand" the apps and games their children use, fewer than half were using parental controls, and a quarter wouldn't know how to implement them.

These findings align with 2025 research on digital parenting from Block W and the ESRI, which found parents struggle to manage their own screen use, let alone their children's. Crucially, that research emphasises that even with the best skills and intentions, parents cannot manage this alone — regulation remains vital.

Regulation

Which brings me to the critical question: what can actually be done while children continue to be harmed?

Ireland is not alone in grappling with this. Australia introduced a ban on social media for under-16s in December 2024. France and Spain are proposing similar measures. Ireland continues to prevaricate on what steps it plans to take.

These European proposals to ban are particularly interesting because, in theory, national-level restrictions conflict with EU harmonisation rules designed to reduce regulatory burden on tech companies. 

Yet if we're going to restrict children's access to anything, surely it should be pornography first — the extreme violence, torture, and degrading content our children can freely access today. 

The fact that most pornography providers operate outside the State is often cited as a barrier to action. Yet Italy, Spain, and France have moved to block such content regardless. So why are Ireland's Government and regulator sitting on their hands on such a critical child safety issue?

Children's rights

But beyond Ireland's failure to act, there's a deeper problem with how we're framing this debate. We need a more radical approach that respects children's rights. 

Children have the right to participate in the digital world, to learn, connect, and play online. But that right to participate must be matched by an equally fundamental right to protection from harm. 

An outright ban fails this test — it limits access to some platforms while requiring no reform of the underlying business model. That's the critical flaw: we're not addressing the root cause of harm.

It's time to treat tech platforms like any other product marketed to children. Would we allow a toy company to sell products to children without safety testing? Would we accept "trust us, it's fine" from a pharmaceutical company in relation to a new drug?

Yet we've given tech giants exactly that free pass. Platforms like Snapchat, TikTok, YouTube, Instagram, and Roblox must prove — through independent, verifiable safety assessments — that they're safe for children before they're granted access to young users. 

Not, as it is now, long after the harm is done.
