Facebook moderators should be given supports similar to those provided to frontline workers, says privacy expert

Social media content moderators need to be afforded supports similar to those provided to frontline workers due to the stresses associated with the job, according to a privacy expert.

Elizabeth Farries of the Irish Council for Civil Liberties made the comparison following an Irish Examiner report which revealed that content moderators at Facebook are to take legal action in Ireland, alleging they now suffer post-traumatic stress disorder as a result of viewing graphic content at work.

Ms Farries said she could not comment on the specifics of the Facebook case, but that the ICCL believes social media content moderators must be seen as belonging to a similar category of profession to social workers, doctors, or first-responder medics, in that they too deal with distressing incidents as part of their work.

She referenced previous media coverage of social media moderation work and its demands on staff, who witness egregious acts of violence, end-of-life events, and other grotesque assaults.

Ms Farries said interviews with social media moderators have revealed many struggle to cope with the material they have seen, displaying problems such as difficulty regulating emotions, and recurring thoughts of the graphic content they have viewed.

The symptoms displayed, Ms Farries said, match NHS definitions of those associated with trauma.

As such, she said, these are frontline workers who need adequate supports, and those supports cannot be tokenistic.

“The best way to deal with trauma is to talk about it,” Ms Farries said.

“It’s not enough to just say there’s a counsellor on-site,” she said.

Such supports would mean moderators are free to access counselling within working hours, without fear of falling behind on company-imposed work quotas.

Education and awareness prior to undertaking moderation work is also key, Ms Farries argued, so that staff are prepared as well as possible for the content they will be exposed to, and provided with the right tools to cope with the demands of the role.

Earlier this year Ms Farries authored the ICCL’s submission to the Department of Communications, Climate Action and Environment on the regulation of harmful content on online platforms.

The ICCL has warned the Government that the need to moderate online material must be balanced against individual rights.

“Legislation or regulations permitting generalised monitoring of content based on the concern that it might be harmful could allow governments and corporate platforms to surveil people in Ireland in a manner that contravenes constitutional and human rights standards and the principles of legality, necessity and proportionality,” the ICCL warned.

Similarly, imprecisely drafted laws or regulations that do not intend to but nonetheless increase the chances of blanket surveillance would also run afoul of these standards.

"States must clarify definitions of harmful content so that they may be subject to a rights balancing analysis. It is unlikely that states can define harmful content to a level of specificity that avoids the need for an independent and impartial judicial authority to evaluate individual circumstances when applying this definition,” it said.