We can protect children online, but the digital wallet isn't the right approach

Additional legislation is required to close the many loopholes that will exist after this measure of online protection is put in place next year, write Alex Cooney and Eoghan Cleary

Social media companies have clearly shown they are more focused on their profit margins than on the welfare and safety of the children using their services. Australia's recent ban on social media has opened the conversation on how best to regulate these platforms.

So Australians under 16 have been banned from "the socials", or some of them anyway. The amendment to their legislation has already attracted plenty of criticism, while at the same time garnering widespread public support. Australia is to be applauded for leading the world in its aim to protect children online, but it is in the implementation that its effectiveness will be defined.

Enter, stage left, our own Government's deliberations on how best to protect children online here: the planned introduction in 2026 of a national digital wallet, a Government-approved age verification tool. But again, how effective will it really be? And is the intention to make it optional or to enforce it?

Importantly, age verification is not the silver bullet solution to online child protection, but it’s a big part of it and needs to be done correctly and effectively. So, a few things need to be clarified:

Myth: It’s not technically possible.

Reality: It is. It’s possible and it’s effective. Apart from what’s being rolled out in Australia, the UK has legally forced a number of porn providers to adopt age verification measures and since doing so, pornography viewing has fallen
significantly, with Pornhub visitors dropping from 9.8m in August to 7.2m last month.

Other mainstream pornography providers are reporting a similar, sustained downturn in traffic.

Myth: It’s not legally allowed.

Reality: It is. Initially, the Digital Services Act prohibited member states from imposing age verification systems at a national level on entities not based in their state — this was to make it easier for tech companies to comply.

That has changed. Since the publication of the EU Commission’s guidelines on Article 28 of the act, the question of national age verification measures has been raised in multiple EU member states.

Countries such as Spain, Italy, and France have already put in place, or are in the process of putting in place, specific legislative measures to restrict age-based access to adult content, while Austria, Belgium, Denmark, and the Netherlands are looking to do so more broadly for all social media platforms.

Myth: Even if age verification is technically and legally possible, it would compromise people’s privacy to an unacceptable degree.

Reality: Privacy-preserving age-verification systems already exist and can prove that someone is an adult without revealing their identity, birthdate, or any additional personal information. Techniques such as zero-knowledge proofs and selective-disclosure credentials allow a person to confirm they are over 18 while keeping all other data private. These systems can be managed by independent, non-government, non-platform-owned third-party providers, ensuring that neither the Government nor tech companies gain access to additional personal data.

This design approach minimises surveillance risks while still enabling effective age checks. It sounds complicated and it is, but in reality, it takes seconds; we’ve tested it.
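To make the selective-disclosure idea concrete, here is a toy sketch in Python. It is deliberately simplified: it uses a shared-secret signature (HMAC) rather than a real zero-knowledge proof, and all names and keys are invented for illustration. The point it demonstrates is the one made above: the issuer checks the birthdate once, and the platform only ever sees a signed "over 18" claim, never the birthdate or identity behind it.

```python
# Toy sketch of a selective-disclosure age credential (illustrative only).
# Real deployments use asymmetric signatures and zero-knowledge proofs
# (e.g. BBS+ credentials), so that the verifier cannot forge credentials.
import hmac
import hashlib

ISSUER_KEY = b"issuer-secret"  # hypothetical key held by the trusted third party


def issue_credential(user_id: str, birth_year: int, current_year: int = 2025):
    """The issuer checks the birthdate once, then signs ONLY the 'over 18' claim."""
    over_18 = (current_year - birth_year) >= 18
    claim = f"{user_id}:over18={over_18}".encode()
    tag = hmac.new(ISSUER_KEY, claim, hashlib.sha256).hexdigest()
    # The credential carries no birthdate: just the boolean claim and a signature.
    return {"claim": claim, "tag": tag}


def verify(credential) -> bool:
    """The platform checks the signature; it never learns the user's birthdate."""
    expected = hmac.new(ISSUER_KEY, credential["claim"], hashlib.sha256).hexdigest()
    return (hmac.compare_digest(expected, credential["tag"])
            and credential["claim"].endswith(b"over18=True"))


cred = issue_credential("anon-7f3a", birth_year=1990)
print(verify(cred))  # True: age confirmed, birthdate never disclosed
```

In a production system the issuer and verifier would not share a key as they do in this sketch; a zero-knowledge scheme lets the verifier confirm the claim without being able to mint credentials itself, which is what keeps the Government and the platforms out of the loop.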

In keeping with the Digital Services Act, this approach puts no additional obligations on digital service providers in Europe. It avoids an outright ban on children from the online world, balancing their right to participate in life online with their right to be protected from harm.

But it is imperative that we also establish an independent risk assessment process that can be applied to all digital environments that children can access, like an online equivalent of the Irish Film Classification Office. Right now, the act's guidelines allow the tech companies to assess their own risk, but they have long proven that their interests lie solely in their profit margins and not in the welfare or safety of our children.

The likes of TikTok, Instagram, Snapchat, and Roblox should have to prove they are safe by design (if they can) before we classify them as being appropriate for our children to access.

Do virtual private networks (VPNs) render the whole thing redundant?

Initially, yes, until we require zero-knowledge-proof age verification to access VPN providers as well. This would mean that adults could still enjoy anonymity online while children would be restricted from accessing VPNs.

Who’s going to pay for it all?

The technology companies behind the social media and pornography platforms who have their billing headquarters in Ireland to avail of our low corporate tax rate should have to pay for it all.

Why? Because they already collectively owe the State almost €4bn in fines, as revealed by the Data Protection Commission’s investigations, each one of which they continue to fight against in the courts.

A few more very important questions:

  • Is the seemingly protective and effective solution described above what the government is proposing with the national digital wallet? No. Unfortunately, by the sounds of it so far, the national digital wallet will be employed as an optional tool for informed parents to enforce for their kids.
  • Will every child automatically be protected? No; just those who elect to use it.
  • Will they be protected from all harms online? No; only on platforms based in the State — which is why, like other European countries have done, we need to legislate around this limitation.
  • Will there be countless ways for kids to get around it? Yes.
  • Will there be any incentive for the online services to create online spaces for children that are safe by design? No.
  • Will the most vulnerable children be the ones left at the mercy of big tech to be exploited ever more? Yes.
  • Will there even be a minimum age requirement enforced for social media access? No, not unless we legislate for it.

So why, in light of all the above information, is the Government not legislating for a system that would enable very high levels of protection for all children equally, while at the same time allowing the rest of us to continue living our online lives with the level of privacy we’ve become accustomed to?

We urge you to put this question to Government. Other countries are moving on this, because their people demand it; we need to do the same.

  • Alex Cooney is the CEO of CyberSafeKids. Eoghan Cleary is a secondary school teacher, working as an educational expert with the Sexual Exploitation Research and Policy Institute. Both are involved in GenFree, the national campaign to protect children online, and both were members of the online health taskforce whose report was recently published.

