Top Class Actions’s website and social media posts use affiliate links. If you make a purchase using such links, we may receive a commission, but it will not result in any additional charges to you. Please review our Affiliate Link Disclosure for more information.

Former Facebook moderators filed a class action lawsuit against the company for exposing them to graphic images without providing mental health services.

More than a dozen former moderators claim that they and others in similar positions sift through thousands of graphic images as part of their job removing Facebook content that violates the social network’s terms of use.

In addition to Facebook, Cognizant is also named as a defendant in the Facebook moderator class action lawsuit because it employs the moderators who review the content.

“From their cubicles during the overnight shift in Cognizant’s Tampa and Phoenix offices, plaintiffs witnessed thousands of acts of extreme and graphic violence,” alleges the Facebook class action lawsuit.

The plaintiffs say that they and other moderators face a constant stream of “child sexual abuse, rape, torture, bestiality, beheadings, suicide, racist violence and murder” as a part of their daily work.

As a result, “class members suffered PTSD and other psychological disorders, physical injuries including stroke and epilepsy, lost pay, lost future earning capacity, emotional distress and loss of enjoyment of life,” states the Facebook class action. The plaintiffs also allege that employees sustained severe physical injuries, such as a heart attack, while on the job.

The Facebook moderator class action lawsuit acknowledges that Facebook has implemented workplace safety measures; however, these measures, as enacted, only go so far to protect workers. The plaintiffs say that the measures are, in fact, a way for Facebook to “cultivate its image.”

Workplace safety measures allegedly include screenings, counseling, and mental health support for Facebook moderators, as well as altering attributes of graphic images, such as their color and size. However, the plaintiffs say moderators are often let go when their contracts expire, making them ineligible for these services.

Further, “the multibillion-dollar corporations affirmatively require their content moderators to work under conditions known to cause and exacerbate psychological trauma,” contends the complaint.

“Content moderators are essentially the first responders of the internet,” asserts the Facebook class action lawsuit. “Plaintiffs and the other content moderators, at a minimum, deserve the same protections as other first responders, which includes workers’ compensation/health coverage for the PTSD caused by the working conditions.”

This Facebook moderator class action lawsuit was initially filed in Florida federal court in February, but it is not the first time these allegations have been made by a moderator against the social networking titan.

In 2018, a former content moderator, who was hired through Pro Unlimited, an independent company similar to Cognizant, alleged that Facebook failed to protect workers from viewing traumatic content. The plaintiff in the 2018 class action lawsuit alleged that she was forced to view graphic and violent acts, such as rape, murder, suicide, and child sex abuse.

She claimed she suffered from post-traumatic stress as a result and that Facebook ignored its own protective measures, such as mandatory counseling and PTSD training. The 2018 Facebook moderator class action lawsuit alleged that the company, as well as the independent contracting company, violated California law as a result.

The Florida Facebook moderator class action lawsuit proposes to represent moderators in Florida and Arizona. The plaintiffs are seeking the creation of a medical monitoring fund covering Class Members who may be suffering from post-traumatic stress disorder or other psychological damage from viewing the graphic content they are required to view as part of their job.

According to the Facebook class action, “the medical monitoring requested by Plaintiffs includes (1) baseline screening, assessments, and diagnostic examinations to assist in diagnosing adverse health effects, (2) secondary interventions to reduce the risks of PTSD, (3) tertiary interventions to reduce symptoms of those suffering from PTSD, and (4) evidence based treatments to help individuals recover.”

The Facebook moderator class action lawsuit points out that screening for traumatic brain injuries, including PTSD, can cost between $3,800 and $11,100 each year.

The plaintiffs are also seeking compensatory damages, including lost wages, medical expenses, and emotional and physical distress, along with attorneys’ fees and costs.

Are you a Facebook moderator? Have you suffered emotional distress or other psychological effects after viewing graphic images as part of your work? Tell us your story in the comment section below!

The plaintiffs and proposed Class are represented by Jay P. Lechner of Lechner Law.

The Facebook Moderator Class Action Lawsuit is Debrynna Garrett, et al. v. Facebook Inc., et al., Case No. 8:20-cv-00585, in the U.S. District Court for the Middle District of Florida.



76 thoughts on “Facebook Class Action Says Graphic Images Cause Moderators PTSD”

  1. Sharonda Sherman says:

    Please add me to the lawsuit. I worked for Facebook for 9 months and I still don’t sleep more than 4 hours at a time. I always have dreams of the children I have seen raped and murdered in front of my eyes, and I can’t even begin to count the number of people I saw shot, beheaded, or subjected to any other gruesome thing you could think of.

  2. Craig Lawrence says:

    Hello, I worked as a Facebook moderator for 2 years and have suffered due to the work performed: day in and day out of severely graphic images. How do I get added to this class action?

  3. Krystal Anthony says:

    Please add me. I’ve seen so many traumatizing graphic videos and pictures on Facebook. It’s hard to sleep at night.

  4. Krystal Anthony says:

    Please add me

