Facebook has admitted that it needs to do more to support the wellness of moderators who remove harmful content from the social network.
The admission comes in response to a report about the working conditions of contractors tasked with the difficult job.
The company has stepped up its efforts to remove disturbing posts such as violent crime, violent pornography and hate speech, relying on specialist partner firms to sift through the deluge of potentially offending content and decide whether it needs to be taken down.
Former workers made a number of alarming claims, such as that they coped with the stress of the job by turning to drugs, alcohol, offensive jokes, and sex both in and away from the workplace.
Speaking to The Verge, employees said that they felt therapeutic activities and counselling were inadequate, while bosses at one partner firm are said to have stopped moderators from using the nine-minute wellness break to go to the toilet.
Another worker had been diagnosed with PTSD and sleeps with a gun by his side, traumatised by a video of a man being stabbed to death.
“We are committed to working with our partners to demand a high level of support for their employees; that’s our responsibility and we take it seriously,” said Justin Osofsky, Facebook’s vice president of global operations, following the report.
“We’ve done a lot of work in this area and there’s a lot we still need to do.”
Mr Osofsky detailed how Facebook has already taken steps to ensure moderators are provided enough support, including making it explicitly clear in contracts that good facilities and wellness breaks are on offer, and making regular visits to partner sites to help address any issues.
“Given the size at which we operate and how quickly we’ve grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis,” he continued.
Facebook uses several partners, including Accenture, Cognizant and Genpact.
- Press Association