Roughly 30 cases have been taken to the High Court by content moderators of Facebook, the Oireachtas committee on enterprise, trade and employment was told, amid a growing appetite for better conditions for staff handling difficult and, at times, traumatic content.
The need for statutory regulation of the content on social media channels is “over-ripe”, it was argued.
The committee has heard from a content moderator with Facebook, Isabella Plunkett, together with union representatives, and Cori Crider of non-profit Foxglove, which is representing moderators in their quest for better working conditions.
Ms Crider told the committee that in terms of social media, “light-touch regulation has failed” in view of the extreme content posted there.
“This should be regulated in the same manner that broadcast media is. It is over-ripe for it,” she said.
“Using outsourced moderators is to outsource a core business function, a function key to the health of the public square.”
Fionnuala Ni Bhrogain, head organiser with the Communications Workers Union, said: “There are no social media companies without moderators. They could not exist without moderators. Once the moderators realise this, they will see an improvement in their terms and conditions.”
A moderator’s job at Facebook, in Ms Plunkett’s words, is to “train the algorithm”: to sift through the various forms of content posted and flag what is inappropriate, thus teaching the platform’s artificial intelligence what is acceptable and what is not.
“When a terrorist tries to livestream a beheading, a content moderator steps in,” Ms Crider said.
Ms Plunkett said that in the context of her role, in which she has worked for two years, she routinely has to view hate speech, bullying, graphic violence, suicide, abuse, and child exploitation.
In return, she is afforded 90 minutes of wellness coaching a week, she said, in order to manage her mental health “if we’ve seen some particularly bad content”.
“I have horrible dreams, about work, about work from yesterday. I’d like to be able to separate it, but your subconscious mind is always thinking about it,” she said, adding that morale among her and her colleagues is “dreadful”.
When asked about the limited nature of mental health supports for moderators at Facebook, Ms Ni Bhrogain said: “You would think that [there should be more], but sadly that is all there is”.
Ms Crider said that to her knowledge, there are roughly 30 cases before the High Court which have been taken by moderators as a result of the alleged post-traumatic stress disorder they have suffered in their jobs.