Meta urged to build safer systems as Irish media regulator investigates ‘dark patterns’
Social media company Meta has been called on to build safer products and systems and to engage with Coimisiún na Meán in the regulator's investigation into the company's practices.
The regulator on Tuesday announced it is investigating Meta's platforms, Instagram and Facebook, to determine whether they have breached provisions of the Digital Services Act (DSA) by failing to provide information and transparent options regarding content recommender feeds.
It said there are concerns that so-called “dark patterns” may be preventing people from accessing a feed not based on profiling. Profiling is the use of automated systems to personalise content or ads based on patterns in a person’s data or behaviour.
If a platform is found to be in breach of the DSA, Coimisiún na Meán can apply financial sanctions, including a fine of up to 6% of turnover.
The investigation will examine whether Facebook and Instagram interfaces deceive or manipulate users away from choosing a recommender system feed that is not based on profiling of their personal data.
In a statement, the commission said the move followed initial assessments by its platform supervision team and a review of complaints it received.
Noeline Blackwell, online safety coordinator with the Children's Rights Alliance, said that at present, when people go online, the platform they use collects information about them. That information can be used to build a profile, which may then be used to push new material at users unless they opt out.
“People should be able to switch that off easily. The concern that the regulator has, and that they're investigating, is that the company is using children's and young people's profiles in particular to push information at them," Ms Blackwell said.
“So they might be looking at something that they genuinely want to look at, but the information that they get from the company after that might not be in their best interest and might not be suitable and might be harmful.”
Such information could take people “down rabbit holes”, she warned.
“Look, that doesn't only happen to children and young people. Lots of people can go down rabbit holes, can find that they have an interest in one thing, and they're fed more and more information, and that they get into a stage where they could end up getting anxious about something," Ms Blackwell said.
“These companies are very big, profitable companies. They are an industry themselves. They want to make money.”
She added that EU regulations require companies to operate in a safe way.
“In the Children's Rights Alliance, we always say there's almost nothing in the European Union that you can do without a Certificate of Safety about it. These are the exceptions. So this is where, when the companies are not behaving in a way that is safe, then the regulator can come in," Ms Blackwell said.
“The regulator isn't the fastest operator in the world. It's not the quickest, but it can be the most comprehensive. And if Coimisiún na Meán investigates this and finds that these dark patterns are in fact being applied, it can hold the company accountable. And the problem in some ways is that Meta is saying, again, nothing to see here.
“Meta will insist on fair processes. There will be a preliminary finding of fact. And what would be really great would be if Meta would engage with this. And instead of saying, as they are doing, nothing to see here, if they would actually say, we can build a safer system. Because that's really what the industry needs to do, is build safe products and systems," she added.
It is the latest in a series of investigations being carried out by the regulator into potential breaches under the EU Digital Services Act.
These include:
- TikTok, in respect of suspected non-compliance related to the protection of minors;
- TikTok, in relation to suspected systemic risks to election integrity in the context of the Romanian presidential elections;
- X, in relation to suspected systemic risks linked to its deployment of Grok;
- Shein, in relation to suspected non-compliance on the sale of illegal products, addictive design, and recommender system transparency.