Facebook has agreed to pay out $52m (£42m) to current and former content moderators who sued it for the mental trauma allegedly inflicted upon them by the job.
In a preliminary deal filed in US courts last week, the company committed to compensating each worker with $1,000, with extra awards of up to $5,000 for those diagnosed with mental health conditions such as post-traumatic stress disorder (PTSD).
The settlement covers 11,250 contractors who waded through waves of unpleasant, potentially distressing content, often for little more money than they would earn in a call centre.
Facebook will also implement new safety and hiring standards, including better counselling and mental health care, resilience screening for job applicants and measures to dampen the effect of disturbing videos.
But for other tech firms such as YouTube and Twitter that rely heavily on content moderators, the payout is likely to raise fears of copycat lawsuits seeking similar damages.
Steve Williams, one of the plaintiffs’ lawyers, said: “We are so pleased that Facebook worked with us to create an unprecedented program to help people performing work that was unimaginable even a few years ago. The harm that can be suffered from this work is real and severe.”
A spokesman for Facebook said: “We are grateful to the people who do this important work to make Facebook a safe environment for everyone. We’re committed to providing them additional support through this settlement and in the future.”
The settlement, first reported by The Verge, a tech news site, caps off years of criticism of Facebook for its treatment of what it calls “content reviewers”, charged with protecting its users from the scum of the internet.
Selena Scola, a former moderator in the US, had sued the company in 2018 claiming that she had suffered PTSD from being required to examine images and videos of suicides, rapes and terrorist beheadings.
Separately, Facebook announced on Tuesday that some of its moderators were returning to their offices on a voluntary basis, while others were doing more of their work from home.
The company sent all its moderation contractors home on full pay in March, but most of their tasks could not be done remotely due to concerns about user privacy and corporate leaks.
It also published its first statistics on bullying and sexual content on Instagram, which it owns, saying it had removed 3m pieces of content relating to bullying and harassment and 14.4m relating to sex and nudity.