'Digital identity wallet' could block children from accessing harmful content online
The European Commission wants to use an age verification app to prevent children from accessing pornographic or gambling websites.
An EU “digital identity wallet” could be used to prevent children from accessing pornographic or gambling websites as part of wider measures to keep them safe online, the European Commission has said.
In new guidelines, the commission said online platforms must use “accurate, reliable and robust” age verification methods to restrict access to adult content or risk facing heavy penalties.
“The adoption of the guidelines marks a milestone in the Commission’s efforts to boost online safety for children and young people under the Digital Services Act,” it said.
“The guidelines set out a non-exhaustive list of proportionate and appropriate measures to protect children from online risks such as grooming, harmful content, problematic and addictive behaviours, as well as cyberbullying and harmful commercial practices.”
The Commission said that each member state is in the process of providing its citizens with an EU Digital Identity Wallet, and must do so by the end of 2026.
It said this could be a safe, reliable and private means of confirming a person’s age without the online platform receiving any of their personal data.
Prior to that rollout, the European Commission said it is now testing a standalone age verification app that would meet the standards required.
“Users will be able to easily activate the app and receive the proof in several different ways,” it said.
“The proof only confirms if the user is 18 years or older. It does not give the precise age, nor does it include any other information about the user.”
In its guidelines, the European Commission also indicated it is open to allowing individual countries to set minimum ages for people to access platforms such as social media sites.
Furthermore, it said that children’s accounts on social media sites should be set to private by default, and that children should be able to block and mute any user.
Other accounts should also be restricted from downloading or taking screenshots of any content posted by minors, to prevent the “unwanted distribution of sexualised or intimate content and sexual extortion”.
It added that it will use the guidelines as a reference point when assessing whether online platforms are meeting their legal obligations, and that they may also inform enforcement action taken by national regulators.
Responding to the guidelines, Ireland's regulator Coimisiún na Meán said it will use them to help ensure companies are doing what is required of them to protect children online.
“This will include formal enforcement action in cases where a failure to implement the guidelines leads to a suspicion of non-compliance with the Digital Services Act,” it said.