Social media firms to tell TDs and senators children are safe online
Concerns about social media use by minors include appalling instances of online abuse facilitated by social media firms. Picture: iStock
Social media giants will today seek to convince TDs and senators that they are committed to keeping children safe online.
Representatives from Snapchat, TikTok, Meta, Google, and Microsoft will address the Oireachtas children’s committee, with Meta representatives set to say “the safety and wellbeing of young people on our platforms is a core priority”.
Meta, which owns the Facebook and Instagram platforms, will tell politicians it has teen-specific accounts which come with protections. On Instagram, these built-in protections include private accounts by default, the strictest content control settings, overnight notifications being turned off, and app usage reminders after 60 minutes.
“We know that 97% of teens aged 13-15 have stayed in these built-in restrictions,” Meta will say.
“Last year, we expanded teen accounts to Facebook and Messenger, meaning that teens now benefit from these built-in protections across all of our major platforms.
"Hundreds of millions of teen accounts are now active globally and this number continues to grow as the rollout extends to further countries.”

Earlier this week, Meta announced it is expanding its technology to “proactively identify accounts we suspect belong to teens — even where an adult birthday has been provided — and place those accounts into teen account protections”.
A representative for TikTok will tell politicians that it, too, has similar settings, but that “safety doesn’t stop with settings”.
“When teens need support, parents are often the first people they can turn to, making them one of our most important partners.
"That’s why we remain focused on reaching parents with the information they need about TikTok.
"Since launching our parental controls five years ago, we’ve continually added new features based on feedback from families, as well as guidance from leading experts.”
The TikTok submission will say 95% of user-reported content was removed within two hours, 98% of all violative videos were proactively detected and removed before any user report, and 90% of rule-breaking videos were removed before receiving a single view.
In its statement, Google will say it acknowledges that “children and teens are spending more time online — and we recognise that protecting them as they do so poses unique challenges”.
“For over two decades, we have worked closely with policymakers, law enforcement, educators, and child safety experts to design and develop tools and resources to protect children and help families navigate technology and make informed choices that are right for their specific circumstances.”
Snapchat will tell the committee it “invests heavily in proactive detection of the gravest harms, including grooming for sexual purposes, sexual exploitation and self harm, and we report illegal child sexual abuse material to the relevant authorities.
"We know we must continually improve these systems as offenders constantly adapt.”
The idea of a social media ban for under-16s has been mooted in recent months, but the recently published digital and AI strategy stopped short of an outright pledge for a ban.
It said that Ireland would take action “if necessary”, but would work with like-minded EU states to “explore options” on the topic of age restrictions on the use of social media.
- Paul Hosford, Acting Political Editor