Facebook moderators, who claim their mental and physical health has suffered because of the graphic content they have had to view as part of their work, are to take personal injury claims against the social giant through the Irish courts.
The development follows a series of media reports and allegations from moderators worldwide, including claims that staff have suffered post-traumatic stress disorder after viewing violent and explicit material including beheadings, self-harm, and child pornography.
The Irish Examiner understands moderators from across Europe are either initiating legal proceedings or considering action against Facebook in Dublin, as the city is the social media company’s headquarters for its operations in Europe, the Middle East and Africa.
David Coleman, of Coleman Legal Partners, said they are in the process of issuing letters of claim to Facebook as a precursor to legal action against the company.
“I have been instructed by many people who have worked as moderators within the Facebook organisation, and have been instructed to vindicate their rights following the distress caused by their methods of work and the effect that has had on their mental and physical health,” Mr Coleman told the Irish Examiner.
Facebook’s own Community Standards Enforcement Report said that in the first three months of 2019 alone, the platform took action on 5.4m pieces of content that violated its standards on child nudity and exploitation.
In the same period, 33.6m items of content were found to go against the platform’s rules on violence and graphic content.
Facebook says it has around 15,000 people who review content for its platform. They are spread across full-time employees, contractors and people working at partner companies such as Accenture, CPL and Majorel.
They work at sites in Ireland, Germany, Spain, Latvia, Kenya and the US, amongst others.
Facebook told the Irish Examiner it is “committed to providing support for our content reviewers as we recognise that reviewing certain types of content can sometimes be hard”.
“Everyone who reviews content for Facebook goes through an in-depth, multi-week training programme on our Community Standards and has access to extensive psychological support to ensure their wellbeing.
“This includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment.
“We are also employing technical solutions to limit their exposure to graphic material as much as possible. This is an important issue, and we are committed to getting this right,” the statement said.
In 2017, the company pledged to do more to remove violent content from Facebook following a spate of what founder Mark Zuckerberg described as videos and live streams of “people hurting themselves and others on Facebook”.
“It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Mr Zuckerberg said.
The measures he announced at the time included hiring more content moderators to enable Facebook to respond faster to user reports.
“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” he said.
“And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it - either because they’re about to harm themselves, or because they’re in danger from someone else.”