Meta and TikTok say ban on under-16s would not make social media safer
TikTok's head of public policy and government relations Susan Moss, and minor safety public policy lead Richard Collard arriving at Leinster House this morning for the Oireachtas children's committee. Picture: Sam Boal/Collins
Meta has claimed an outright ban on social media for children would not create a safer online environment for them.
The Oireachtas joint committee on children heard on Thursday from representatives of TikTok, Meta (WhatsApp, Facebook, and Instagram), Snapchat, Microsoft, and Google. The Government has been considering plans to ban social media for under-16s, following the example of Australia.
However, a digital strategy announced in February stopped short of committing to a ban, saying the Government would instead work with the EU on the issue.
Meta's head of public policy for Ireland, Dualta Ó Broin, said that if the aim is to “achieve safer environments for young children”, a ban would not achieve that.
TikTok's minor safety public policy lead, Richard Collard, added that while TikTok did not support or oppose a ban, “badly designed legislation carries risk”.
“And if restrictions only apply to a certain number of platforms, or a named list of platforms, or there are exemptions, any risk will move to those platforms,” Mr Collard added.
Separately, Meta said the creation of “teen accounts” would help restrict the type of content that appears in the feeds of users aged between 13 and 17.
But it said that while it uses both human reviewers and AI to verify a person’s age, age verification mechanisms need to be set up at app store level.
Mr Ó Broin said this is already happening with some apps and that it would make sense “for that verification to happen once, potentially when the device is being set up and that the (age) signal can then be shared out with all of the apps that are coming on the App Store”.
However, Google's child safety public policy manager, Chloe Setter, said deferring these responsibilities to app stores is a “cynical attempt to shift responsibility elsewhere”.
She said requiring app stores to verify children's ages is like having a shopping mall check ID at the entrance rather than at the alcohol counter.
“The reason I say it’s ineffective is because apps are not the only way that content is accessed,” Ms Setter said.
She added that any solution created at app store level would not cover websites, meaning children would not be protected or have an age-appropriate experience there.
“Also, many apps come preloaded or sideloaded onto phones when you buy them. So they're not actually downloaded from the App Store.
“They're already on the phones. And there are privacy issues around sharing age data across every single app that we have on the App Store,” Ms Setter added.
Meanwhile, TikTok has claimed it is not “addictive” and, while it takes seriously the European Commission's preliminary finding that it breaches EU law, it disagrees with that finding.
It said its default 60-minute screen time limit can only be bypassed by entering a passcode, which it argued reduces a child’s screen time.
“We acknowledge that screen time is something that parents and children care a lot about,” Mr Collard said. “And that's why we have put in a range of different solutions to help people manage their control. So we have a 60-minute screen time cap by default for everyone under 18.”
TikTok said push notifications stop for children under 16 after 9pm and, for 16- and 17-year-olds, after 10pm.