TikTok exposing children to harmful content even when passively scrolling, study finds
Researchers at Trinity College Dublin and the HSE's Department of Public Health said there was an 'urgent need' for stronger regulatory enforcement and digital health education to protect children from exposure to harmful content.
Passive scrolling on the social media platform TikTok is exposing children to harmful content, including hateful ideologies, depictions of suicide, and disordered eating, a new study has found.
Researchers at Trinity College Dublin and the HSE's Department of Public Health found almost one in 20 clips suggested for teenagers on the platform's 'For You' feed violated TikTok's own safety policies.
They said there was an 'urgent need' for stronger regulatory enforcement and digital health education to protect children from exposure to harmful content, finding voluntary compliance was proving insufficient.
The findings were based on an experiment in which the researchers created four 'dummy' TikTok accounts representing two male and two female users aged between 13 and 15.
Each account scrolled the 'For You' feed for three hours, pausing to view content relating to one of four popular themes: conflict, mental health, drugs and alcohol, and diet and body image.
The scrolling was passive as none of the users conducted any searches, followed any accounts, commented on any videos, or 'liked' any of the content that was recommended for them.
A total of 12 hours of screen recordings were reviewed, comprising 3,023 videos from the 'For You' feed. The study found 128, or 4.2%, of these violated TikTok's published safety policies.
The most frequent types of harmful content suggested to the four users included depictions or glamorisation of suicide and self-harm, disordered eating or extreme dieting, and promotion of alcohol consumption.
Other flagged videos featured content relating to weapons and gun violence, and extremist or hateful ideologies, according to the study, published in the latest edition of the .
"These reels appeared without any search conducted by the users; the platform's algorithm promoted potentially harmful content based solely on passive viewing time within themed content areas," the authors explained.
The researchers noted this phenomenon was not unique to TikTok, saying similar algorithmic risks had been reported across other major social media platforms.
"While these platforms offer entertainment and peer connection, growing concerns exist about children's exposure to harmful content," they added.
The authors said problematic use of social media had been linked to negative mental health outcomes, including anxiety, depression and self-harm.
"Ireland, as the regulatory base for several major technology companies, is uniquely positioned to lead on protecting children's digital wellbeing," they wrote, claiming the study had shown voluntary compliance to be insufficient.