It's crystal clear that an unregulated digital world carries far too many risks for the young

The guidelines under the Digital Services Act emphasise that platforms must identify how risky their content is to children, and then take practical and workable steps so that children don’t access it. File picture
An EU-wide survey published last month found that, overall, one in three Europeans surveyed were concerned that there were insufficient online protections for children and young people.
Digging a little deeper, Irish people were more sanguine than the average European about how digital rights and principles were applied, with almost half of those questioned in Ireland (49%) being satisfied compared to a European average of 42%. And yet, for those who care for children, work with them, and have seen the harms that can occur, it is crystal clear that the digital world carries far too many risks for children and young people.
Regulating tech platforms to produce systems and material that do not harm children is an ongoing topic at both Irish government and regulator level, and at EU level too. Now, in this second half of July, some important advances have been made.
After long deliberations, the European Commission issued guidelines on Monday, July 14, to a wide range of tech platforms setting out how they are to apply EU legislation – particularly the Digital Services Act – to better protect children and young people from engaging with illegal and harmful content online.
While the Act was already in place, these guidelines are more practical and more precise, with a clearer emphasis on ensuring children’s human rights and equality are respected. They emphasise that platforms must identify how risky their content is to children, and then take practical and workable steps so that children don’t access it.
They will have to ensure that their recommender systems – which control features such as the ‘For You’ feed on apps – are stricter for young people. They must have practical, effective age assurance systems to stop children from accessing inappropriate content.
Meanwhile, back at home, the second and final section of Coimisiún na Meán’s Online Safety Code comes into effect today. This applies to a smaller number of companies, but includes some of the largest social media companies, which have their European headquarters in Ireland.
Those companies have had obligations under the first part of the Online Safety Code since last November, which meant that they were obliged to take steps to protect children and young people from content which could "impair their physical, mental or moral development".
The second part of the code, going live today, puts specific obligations on companies to effectively prevent children from accessing adult-only content such as pornography, and to have good parental control and flagging systems.
While we would want to see the regulators playing a much stronger role in enforcing uniform, high safety standards, rather than each platform setting its own rules, these requirements will mean better protections for children and young people using these online platforms and services.
At the end of the day, that is progress.
While tech companies, for the most part, will say that they welcome fair regulation, there is some restiveness. Coimisiún na Meán, the Irish regulator, has had to issue a formal notice to X – formerly Twitter – to say that it’s not satisfied with the information X has submitted about the protection of young people on its platform.
And the same company is challenging the regulator about whether the entire code should apply to it at all.
These regulatory changes come with few fireworks and clashing cymbals, but they need to be acknowledged. They are a step forward. They also need to be widely known.
They will have to be monitored and implemented by Coimisiún na Meán, but the rest of us – caregivers, educators, child rights organisations and advocates – will also need to know what’s there to highlight discrepancies, to complain about breaches, and to require the tech platforms to be a bit more responsible for children’s safety.
We will also have to continue to highlight that while technology is with us and brings great benefits to everyone, including children and young people, tech platforms – and those in the business of making sometimes extraordinary profits from selling their digital products and advertising to us – must do so in a way that is decent and fair.
We will need to be a lot less complacent about the welfare and rights of our children than the recent survey suggests we have been.
And we will need to continue to insist that our government and our regulators scrutinise the tech industry to ensure the safety and rights of children and young people in Ireland in this, our digital world.
- Noeline Blackwell is the Online Safety Co-Ordinator with the Children's Rights Alliance