Girl's suicide cited in call to regulate social media

The tragic death of a girl who took her own life after being drawn into websites presenting suicide as a “romantic way” to end a life has formed part of a large-scale submission urging the regulation of online media.
More than 60 organisations, co-ordinated by the Irish Council for Civil Liberties, have demanded that the so-called “recommender systems” on social media platforms, which place suggested content into users’ feeds, be turned off, citing the harms they cause.
In a submission to Coimisiún na Meán, the new media regulator, the organisations share real-life stories detailing the sometimes devastating consequences.
In one case, an uncle said that his niece was dragged into websites that were detrimental to her mental health and contributed to her taking her own life.
He said: “My beautiful, intelligent, accomplished niece was encouraged, incited to see suicide as a romantic way to end her life.
“She did end it.”
He added: “Earlier she had been encouraged to see more and more sites by people who espoused the idea that people suffering from mental health issues should stop their medications and force society to accept them as they were.
“This led her to a dangerous downturn from which she never recovered, leaving her poor parents devastated and her family changed for the worse.”
In another case, an individual described how their father had become radicalised by disinformation promoted on his social media feeds.
“My father has slowly been radicalised by the content pushed to his feed on Facebook. He watches the short videos and accepts all the information in the video without any verification on his part.”
The person said that if they questioned him, he would call them a liar.
“The videos can directly state conflicting information, but he will accept it all as fact without thinking about it,” the person said.
The cases were collected by one of the 62 organisations, Uplift.
A cover letter in the submission said recommender systems select “emotive and extreme content” and show it to the people they estimate are most likely to be outraged.
“These people then spend longer on the platform, which allows Big Tech corporations to sell ad space,” the letter said.
“Meta’s own internal research disclosed that a significant 64% of extremist group joins were caused by their algorithms.
“Even more alarmingly, Amnesty found that TikTok’s algorithms exposed a 13-year-old child account to videos glorifying suicide in less than an hour of launching the account.”
- If you are affected by any of the issues raised in this article, a list of support services is available.