Former Facebook Content Moderators Sue Facebook for Psychological Injuries

Former Facebook content moderators have decided to sue Facebook for psychological injuries and are seeking compensation from the social media network after developing post-traumatic stress disorder (PTSD) from viewing extremely disturbing violent content and other graphic material at work.

Working for Facebook may seem like a dream job for many people, but not all work that needs to be performed for the social media network is glamorous and fun. The job of a content moderator at Facebook requires employees to constantly review content that has been uploaded by its users and determine whether it violates Facebook’s policies.

All manner of content is uploaded to the social media network by users. Facebook content moderators must manually assess those posts and make decisions about whether the content should be allowed to remain or if it needs to be filtered out or deleted.

Facebook’s Q1 2019 Community Standards Enforcement Report revealed that its army of content moderators removed 33.6 million pieces of content in the first three months of 2019 for breaching its rules on violent and other graphic content. A further 5.4 million posts were deleted because they depicted the exploitation of children, child abuse, or child pornography.

Facebook content moderators have to view that content before a decision can be made. Workers have been exposed to so much extreme content that they have suffered severe psychological injuries as a result.

Naturally, the job of a content moderator on a platform where users can upload their own content is likely to involve some exposure to unsavory content and, in some cases, extreme material. Facebook recognizes that this is inevitable. A spokesperson for Facebook said, “reviewing certain types of content can sometimes be difficult.” For that reason, its content moderators are provided with extensive training and access to full-time support.

However, the reality appears to be somewhat different. In 2018, lawsuits were filed in California by Facebook workers who had developed PTSD as a result of the constant exposure to extreme content. Former Facebook content moderator Selena Scola was diagnosed with PTSD following the exposure to extreme material at work and she is not alone.

In September 2019, Ireland’s Personal Injuries Assessment Board authorized a group of Facebook content moderators to take their case to the High Court in Ireland. The case was filed with the High Court on December 4, 2019.

Facebook does not directly employ content moderators; they are usually provided by third-party companies. One of those companies, CPL Resources, provides contractors for social media content moderation in Ireland and was named in the lawsuit along with Facebook Ireland.

Chris Gray (53), lead plaintiff in the lawsuit, was employed by CPL Resources and worked as a content moderator for Facebook for 10 months between 2017 and 2018. During that time, he was exposed to extreme violent content, graphic sexual content, and sexual violence on a daily basis. The extreme content was relentless, and it soon started to cause him problems.

Gray found his moods changing and outside of work he was often irritable and aggressive. He claims that his personal and political views started to change in what he describes as a ‘slow creep’. He experienced difficulty sleeping, had nightmares, and couldn’t get the extreme images out of his head. He was later diagnosed as having PTSD from repeated exposure to extreme online content.

On a daily basis, Gray had to view videos of people being stabbed, shot, beaten to death, and beheaded. He witnessed mass murder by machine guns, torture of people and animals, and videos of child abuse. He was also exposed to extreme sexual content, including child pornography and bestiality.

Sean Burke, another former Facebook content moderator, told Vice, “My first day on the job, I witnessed someone being beaten to death with a plank of wood with nails in it and repeatedly stabbed.”

In addition to the constant exposure to extreme, psychologically damaging content, working conditions were tough. Facebook content moderators were required to rate content with a high degree of accuracy and achieve a 98% success rate. Meeting the content moderation targets was a constant challenge, and the extreme pressure was a major cause of stress. The job paid only slightly more than minimum wage.

The combination of the stress of meeting performance targets and the extreme content has led to employees suffering panic attacks and severe psychological problems. Many have turned to antidepressants to help them cope; others have abused alcohol and drugs to blot out the images and allow them to sleep.

Gray and others claim that the support provided was minimal, there was no screening process to weed out candidates who were unlikely to cope, and the full-time support failed to materialize. When a free 45-minute support session was provided by CPL Resources, many staff felt they could not take full advantage as it would hamper their ability to meet their daily targets.

There are at least a dozen former Facebook content moderators taking legal action in Ireland, and many others in Spain, Germany, Sweden, and other countries are also considering seeking compensation for PTSD and may sue Facebook for psychological injuries suffered at work.

According to Gray’s legal team, there are around 15,000 Facebook content moderators provided by third-party companies. One employee stated that at least 40,000 content moderators work for Facebook, either through third parties or directly.

Several former employees have now spoken to the media about their mental health issues caused as a result of the work and lack of support. PTSD appears to be a common workplace injury, yet nothing appears to have been done to reduce the risk.

Many former employees have held back from speaking out or taking legal action because they have signed nondisclosure agreements and are not permitted to talk about their work. They fear repercussions such as legal action and worry that talking about their problems may harm their future job prospects, so many suffer in silence.

The media coverage and the High Court case are likely to trigger a wave of cases against Facebook and the companies that supply contractors for the job. Over the next few months, it is likely that many former employees will sue Facebook for psychological injuries and seek compensation for their work as social media moderators.

Author: Richard Anderson

Richard Anderson is the Editor-in-Chief of NetSec.news