Will tech giants finally take online safety for children seriously?

The wild west of social media self-regulation has come to an end, but the battles that will define this new era have only just begun after a very busy week in this hotly contested space.
Last Monday, the second part of the Online Safety Code from Ireland’s regulator, Coimisiún na Meán, came into force. It came after a nine-month lead-in time for companies to prepare their systems for the code, which is aimed at keeping people, particularly children, safe online.
This Part B of the nascent code means that the video-sharing platforms under its remit that allow pornography, like X, must use effective age assurance controls to make sure children can’t watch it.
In other words, the Elon Musk-owned platform formerly known as Twitter must make sure people are aged 18 or over to view porn that is available on it.
There are other aspects to it too, including prohibiting the sharing of content harmful to children, such as content promoting eating disorders, self-harm or suicide, cyberbullying, hate speech, and extreme violence.
Critics have claimed parts of the code are too vague and don’t provide clear enough timelines to take action against those in breach. These same critics say it will be on the regulator to show it has the teeth to hold platforms to account.
In theory, X or any of the other firms to which it applies, such as Meta and YouTube, could face heavy penalties if they don’t adhere to it. Fines for breaches of the code can reach €20m or 10% of turnover, whichever is greater. The latter percentage figure could run into billions of euro for some firms.
But, just because the code came into force on Monday, it didn’t mean things had changed overnight.
Fine Gael TD Keira Keogh, who chairs the Oireachtas Children’s Committee, said the following day that children could still set up accounts which “opens a doorway to unlimited inappropriate, disturbing and damaging content”.
“Parents are understandably frustrated that as of now, nothing has changed and their kids are still at risk of being exposed to all that is sinister in the world of social media,” she said.
Given the availability and proliferation of the kinds of nasty content people have become used to on social media feeds, advocates had stressed that firms should not be allowed to avoid their obligations any longer now that Coimisiún na Meán had its powers in place.
“Platforms have benefited from a substantial nine-month implementation period since the Code's publication in October 2024, allowing them more than enough time to develop robust age verification systems other than self-declaration, stringent content controls to prevent child exposure to harmful material, and clear and easy-to-use reporting systems,” charity CyberSafeKids said.
It appears that the regulator agreed.
On Wednesday, Coimisiún na Meán wrote to X seeking an explanation as to why there were still no age checks in place for viewing pornography, and asking it to set out by Friday how it was complying with its obligations.
“Platforms have had nine months to come into compliance with Part B of the Code,” it said.
“We expect platforms to comply with their legal obligations. Non-compliance is a serious matter which can lead to sanctions including significant financial penalties.”
The regulator also said it would take further action if there is evidence of non-compliance with the Online Safety Code.
“We are continuing to review all of the designated video-sharing platforms to assess their compliance with the Code and will take any further supervisory, investigative or enforcement action required,” it added.
The pressure on X and other platforms isn’t just coming from Ireland. Across Europe, regulators are trying to get to grips with regulating this kind of content online.
In the UK, the Online Safety Act’s children’s codes came into force on Friday, meaning some services, including pornographic websites, must start checking the age of UK users. Again, non-compliance can result in a fine of 10% of turnover, or even see executives jailed.
From Friday, anyone trying to access pornographic content in the UK was met with a new age check before they could reach the site, as platforms clearly got the UK’s message. On the other hand, concerns have been raised over a wider restriction on content deemed “unsuitable” and whether that amounts to censorship online.
At home, the Irish regulator’s work also fits in with wider European legislation, namely the Digital Services Act, and investigations from the European Commission into major platforms.
It’s all very complex, but our Online Safety Code sits with the Digital Services Act and the EU’s laws on terrorist content online. All together, they’re supposed to allow regulators to hold the social media companies to account in a variety of ways.

Under the Digital Services Act, for example, the European Commission recently opened formal proceedings against sites including Pornhub and XVideos while member states also grouped together to take action against smaller pornographic platforms.
The Commission said these major sites hadn’t put in appropriate age verification tools to safeguard minors. An in-depth investigation is now under way.
In a curiously timed move, falling within the same week that Ireland’s and the UK’s safety codes came into force, X published the methods it will use to check users’ ages. These include a live selfie analysed by AI to determine age, or using someone’s email address to estimate their age.
“We are required by regulations including the UK's Online Safety Act, the Irish Online Safety Code and the European Union Digital Services Act, to verify your age for access to certain types of content,” X said on its website.
In Ireland, the regulator prescribes that age checks must be robust, effective, and protective of privacy, and it is understood it will be considering X’s proposals in this regard. Even ahead of that, age verification on X appeared to have already come into effect, as access to such content became restricted over the weekend.
Things are changing and changing quickly.
Charities working in this space have said that while the legislative obligations on platforms are now clearly present where they hadn’t been before, enforcement will be key.
In a statement, CyberSafeKids said: “What we expect to see over the next 12-24 months is tech companies finally stepping up and accepting responsibility and accountability to ensure children are not accessing platforms that were not designed for them in the first place and that they’re shielded from the kinds of harmful content they contain.”
It said that if companies continue to drag their heels, the regulator must act firmly to impose quick and substantial financial penalties for non-compliance.
Meanwhile, Noeline Blackwell, online safety co-ordinator at the Children’s Rights Alliance, said that given Coimisiún na Meán had opted for a principles-based approach, much will depend on the regulator being proactive in ensuring companies meet their obligations.
“Its Commissioners will need to ensure that they have the people, the expertise, the finances that they need and they will then need to have the will to follow up with the companies,” she said.
“We believe that it is extremely urgent that platforms are scrutinised for compliance and taken to task if they do not comply.
“The real urgency with these regulations is that every day, every hour that the appropriate safeguards are missing is an hour, a day that children active on these platforms are at risk of harm from all the issues that the Code is meant to protect them from. That’s the whole point of the legislation.
“It’s not a game between the regulator and any or all of the platforms. It’s a real threat to children when these systems are not in place.”