Facebook content moderators required to sign PTSD forms

Via Financial Times

Content moderators working at a European facility for Facebook have been required to sign a form explicitly acknowledging that their job could cause post-traumatic stress disorder, according to documentation and employee confirmation obtained by the Financial Times.

The facility, which is operated by global professional services company Accenture, hosts roughly 400 content moderators who trawl through hundreds of disturbing images and videos — ranging from bestiality and child abuse to hate speech, self-harm and terrorism — across Facebook and Instagram every day.

The moderators’ jobs entail making granular decisions about why each image or video is objectionable. One employee at the facility, who asked not to be named, said that people working there “cry every day”, and that many “take sick leave for mental health issues, sometimes three or six months”.

Accenture runs at least three content moderation sites for Facebook in Europe, including in Warsaw, Lisbon and Dublin, where workplace safety rules are some of the most stringent in the world and include protections for mental health.

The document was emailed to all moderators at the European facility in early January, who were asked to sign it immediately. It stated: “I understand the content I will be reviewing may be disturbing. It is possible that reviewing such content may impact my mental health, and it could even lead to post-traumatic stress disorder (PTSD).”

The two-page form also outlines Accenture’s WeCare programme, which provides employees with access to “wellness coaches” from whom they can receive mental health support. The company says, however, that “the wellness coach is not a medical doctor and cannot diagnose or treat mental disorders”.

Facebook is facing lawsuits in California brought by two former moderators, and a slew of personal injury claims in Ireland, where its international headquarters are based, brought by a dozen Facebook content moderators who have all experienced severe mental health conditions, ranging from panic attacks to PTSD.

A similar document was also provided by Accenture to workers at a YouTube content moderation facility in Austin, Texas, according to the Verge.

“Now we see it in black and white: Big Tech knows full well that content moderation causes PTSD in its workers,” said Cori Crider, director of Foxglove, a UK-based litigation non-profit organisation that is assisting with investigation, strategy and campaigning in one of the Irish cases.

“The question is, when are Google and Facebook going to clean up their unsafe factory floor? Pushing responsibility on to the individual worker, as this document tries to do, won’t cut it. It’s on them to make their workplace safe.”

Facebook said it did not review or approve forms like the one Accenture had sent and was not aware that its content moderators were being asked to sign it. It did say, however, that it required its partners to offer extensive psychological support to its moderators on an ongoing basis.

“Facebook themselves were part of an industry group called the Technology Coalition that proposed standards for protecting moderators’ mental health years ago — in 2015,” Ms Crider added. “But they didn’t follow those standards. So these companies are going to be hard pressed to say senior management weren’t aware of the problem.”

According to an employee who signed one of these acknowledgment forms, every moderator at the facility was emailed a link and asked to sign immediately. The employee said they had seen multiple instances of severe mental health conditions among their colleagues, and had also been diagnosed with depression themselves, something they believed was exacerbated by their working conditions. However, they had never previously been asked to sign a form acknowledging the potential for damage to their health.

“When I started to work there, I thought graphic violence and sexual and animal abuse would be the hardest part of this job for me, and I think they were,” said the moderator. “But if you work on hate speech six hours a day, five days a week, it gets to you. I’m a cis white heterosexual male, so I can’t imagine how it affects the people that represent minorities.”

The moderator explained that it was not just the content that caused severe mental health problems among employees, but also that Accenture’s running of the facility contributed to the overall stress levels.

“I would stress . . . that the employment conditions are a factor in the high rate of mental health problems in our workplace,” the person said. “When I started, we had five possible decisions to make; now there are more than 250 possible combinations of labels. The content policies are changing every two weeks.”

Employees are expected to hit quality scores of 98 per cent, which means their decisions on why a piece of content is egregious have to match those of their quality reviewer in almost every instance.

Every few minutes, a notification with quality scores pops up on an employee’s screen, showing them how many mistakes they have made. Their scores determine if their short-term contracts are terminated or extended. “The anticipation of the quality score is what is very stressful for me, and for most of us,” the person said.

Accenture did not immediately respond to a request for comment.
