Jessy Edwards  |  July 14, 2022

Category: Legal News


Close-up of the YouTube homepage on a laptop screen. (Photo Credit: PixieMe/Shutterstock)

Update (Oct. 27, 2022):

  • YouTube Inc. agreed to pay a $4.3 million settlement to end claims it failed to mitigate the graphic content its moderators were subjected to and didn’t provide support for the psychological trauma they suffered on the job.
  • In a motion filed in a California federal court July 12, YouTube and the anonymous plaintiff asked a judge to approve the proposed settlement.
  • The settlement would see about 1,300 U.S. content moderators receive $2,079 in compensation each. Lawyers are seeking $1.4 million in fees.
  • The settlement benefits moderators who work or have worked for contractors of YouTube anytime since January 1, 2016. 
  • Going forward, YouTube has also agreed to provide content moderators with onsite and virtual counseling services by licensed clinicians, as well as access to telephonic counseling.

A YouTube moderator has filed a class action lawsuit against the “video library of the world” over how it handles the disturbing and illegal images its moderators must review.

Filed anonymously in San Mateo County Superior Court in California, the complaint claims Google fails to mitigate the graphic content YouTube moderators are subjected to and doesn’t provide adequate support after the fact.

The filing argues the role of the YouTube moderator is crucial in preventing the seedy underbelly of humanity from being published online.

On any given day, a YouTube moderator is expected to review 100 to 300 flagged video clips in an effort to address the millions of user-submitted alerts on questionable content, the class action lawsuit contends. 

“From genocide in Myanmar to mass shootings in Las Vegas and Christchurch to videos of children being raped and animals being mutilated, content moderators spend hours a day making sure that disturbing content like this never appears to YouTube’s users,” the plaintiff said in the class action lawsuit.

Once viewed, a YouTube moderator has to determine if the video falls into certain categories, which include violent extremism, adult content, hate and harassment, and child sexual abuse imagery.

In an average week, YouTube’s moderators collectively review “hundreds of thousands, if not millions,” of these flagged posts, the filing explains.

The YouTube moderator’s duties cannot be completely replaced by artificial intelligence, either, the plaintiff says, because algorithms lack human nuance. 

YouTube content moderators are necessary “because human judgment is critical to making contextualized decisions on content,” the plaintiffs said. 

The sheer volume of uploaded video, about 500 hours per minute according to the complaint, led YouTube CEO Susan Wojcicki to promise a hiring blitz of 10,000 moderators in 2018.

The class action lawsuit points to several studies showing “unmitigated exposure to highly toxic and extremely disturbing images” causes post-traumatic stress disorder, anxiety, depression, cardiovascular conditions, pain syndromes, diabetes, and dementia.

Disorders like PTSD can lead to secondary problems, according to the plaintiffs. Between one-third and one-half of people with PTSD also have a substance abuse disorder, the complaint says.

The filing references the American Psychiatric Association’s Diagnostic and Statistical Manual of Mental Disorders, which “recognizes that secondary or indirect exposure to trauma meets the first diagnostic criterion for PTSD.”

YouTube recognized these realities and created a program to address the issues but failed to implement any “workplace safety measures,” the plaintiffs argue. 

The responsibility of tech companies in guarding staff well-being was considered in a similar class action against Facebook this year. A settlement was reached in that case. 

The true nature of the work isn’t fully revealed to YouTube moderator candidates during interviews, according to the lawsuit. New hires are put through a two-week training of PowerPoint slides showing “extremely graphic” content, but “little to no time was spent on wellness or resiliency.”

The new hires were told they could leave the room at any time but reported feeling they might lose their jobs if they failed the test at the end of training, the plaintiffs said.

This treatment of YouTube moderators in the U.S. contrasts with that of their counterparts in the United Kingdom, according to the complaint.

The U.K.’s Internet Watch Foundation requires that psychologists assess a new content moderator’s “suitability” for the work, inquires about individual support networks, and “eases” new hires into the graphic imagery aspects of the job, the plaintiffs said.

Training for a YouTube moderator in the U.K. lasts six months and includes elements of criminal law, the dark web and “crucially, building trauma resilience,” the lawsuit said.  

Instead of professional clinicians, YouTube moderators in the U.S. received guidance from “underqualified” and “undertrained” wellness coaches, one of whom, according to the complaint, advised moderators to “trust in God.”

Another wellness coach referred a moderator to a psychologist but gave no details on how to find treatment. Many of the overnight moderators had no access to these coaches at all.

The plaintiffs assert Google and its contractors were more concerned about getting non-disclosure agreements signed, especially after The Verge published a story on the gory details of being a YouTube moderator.

YouTube responded to that article, the lawsuit said, “by requiring its Content Moderators to sign a document acknowledging that performing the job can cause PTSD.”

Before that, in 2017, “Content Moderators were told to stop talking or posting about the negative effects of reviewing graphic content,” according to the complaint.

Plaintiffs further assert Google had a plan to mitigate the negative impacts on YouTube moderators through the formation of the Technology Coalition in 2006.

They claim Google published specific protocols in its Employee Resilience Guidebook for Handling Child Sex Abuse Images in 2015, but that YouTube never followed them.

Those protocols included altering the imagery’s resolution, color, and size, but that was never implemented either, according to the lawsuit.

The anonymous plaintiff says a YouTube moderator was reprimanded after asking an engineer to flag “ultra violent” content following the Myanmar genocide. In that instance, the moderator had requested the change once before but YouTube’s Global Vice President of Operations reportedly told them it wasn’t a priority. 

Another mitigating step Google could take to help YouTube moderators, the plaintiffs said, would be to limit their exposure to graphic images to four hours, but chronic staffing shortages mean that doesn’t happen.

With “low wages, short-term contracts, and the trauma associated with the work—many Content Moderators remain in the position for less than one year,” the lawsuit said. 

What’s more, the plaintiffs claim YouTube moderators are pressured with quotas requiring them to review 100 to 300 pieces of content per shift.

“YouTube requires Content Moderators to engage in an abnormally dangerous activity. And by failing to implement the workplace safety standards it helped develop, YouTube violates California law,” the lawsuit argues. 

Last year, according to the complaint, Google and YouTube brought in some $150 billion in advertising revenue. YouTube content is uploaded online at a rate of about 720,000 hours of video a day.  

The plaintiffs formally accuse YouTube of three counts of negligence and of violating California’s unfair competition laws. The proposed class also seeks to establish a medical fund to support YouTube moderators’ mental health.

Have you worked as a YouTube moderator? Let us know in the comments below.

Counsel representing the plaintiffs in this class action lawsuit are Joseph R. Saveri, Steven N. Williams, Kevin Rayhill, Kate Malone and Kyle Quackenbush of the Joseph Saveri Law Firm, Inc.

The YouTube moderator class action lawsuit is Doe v. YouTube, Inc., Case No. 4:20-cv-07493-SK, in the U.S. District Court for the Northern District of California.


