Facebook, Inc. (NASDAQ: FB) reached a settlement agreement in May with content moderators who filed a class-action lawsuit alleging that Facebook ignored its responsibility to provide a safe and healthy workplace for its workers.
The plaintiffs filed the lawsuit against Facebook in September 2018. They worked as content moderators for the social network giant through various third-party vendors and contract companies like Pro Unlimited, Inc., Accenture LLP, Accenture Flex LLC, and U.S. Tech Solutions, Inc.
The final approval hearing on the settlement agreement is scheduled for November 20. The agreement provides a medical screening payment of $1,000 to members of the class-action lawsuit.
Certain class members with qualifying diagnoses — including post-traumatic stress disorder, acute stress disorder, anxiety disorder, depression, or an unspecified trauma- or stress-related disorder — may be eligible for additional damage awards of up to $50,000, depending on the amount remaining in the settlement fund.
The settlement covers any current or former content moderator who performed work for Facebook in California, Arizona, Texas, or Florida between September 15, 2015, and August 14, 2020. Whether the work was performed as an employee or as a subcontractor, workers may be entitled to benefits from the class-action settlement.
Payments to class members will be made only after the Court grants its final approval of the settlement and after the resolution of any appeals. Following the Court’s final approval, class members have 180 days to submit claims documenting diagnoses and any other damages.
If you worked for Facebook as a content moderator during the covered period, whether directly or through a contract company, go to the Scola v. Facebook Proposed Settlement website for the latest information.
Over 10,000 content moderators across the U.S. will benefit from the settlement
Under the proposed settlement entered in the Superior Court in San Mateo County, California, Facebook agreed to pay $52 million and implement workplace improvements to resolve the litigation.
Facebook will fund and maintain a testing and treatment program for previous and current content moderators. The program is meant to protect workers who remain at risk for PTSD and psychological trauma because of their job requirements. They will receive ongoing medical testing and monitoring, and the social network giant will cover any necessary medical and psychiatric treatment and associated costs.
The settlement is expected to provide relief to over 10,000 former and current Facebook content moderators across the United States.
Facebook content moderators exposed to disturbing images
The content moderators were responsible for viewing and removing a range of offensive, horrifying images and videos posted on Facebook.
Every day, they were exposed to thousands of “videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder.”
Their job was stressful and caused many of them to suffer from psychological trauma and post-traumatic stress disorder.
It is estimated that Facebook receives more than a million reports of potentially objectionable content each day, and most of these reports are reviewed by a content moderator.
Facebook failed to implement the tech industry’s workplace safety standards
The safety standards that Facebook helped develop include obtaining workers’ “informed consent during the initial employment interview process; providing moderators with robust and mandatory counseling and mental health support; altering the resolution, audio, size, and color of trauma-inducing images and videos; and training moderators to recognize the physical and psychological symptoms of PTSD.” These standards were not previously put in place at Facebook workplaces. The settlement requires that Facebook create a safe and healthy work environment for these at-risk workers.
The settlement includes a new set of safety standards and Well-Being Preference tools
Under the proposed settlement, Facebook agreed to implement a suite of Well-Being Preference tools on the Single Review Tool platform that content moderators can use, including:
- Viewing images in black and white
- Blurring images
- Blocking faces within posted images
- Blurring video previews
- Auto-muting videos on start
Other tooling enhancements include:
- The ability to preview videos using thumbnail images when feasible
- Default settings preventing automatic video playback
Facebook will provide these business practices and tooling enhancements to mitigate the effects of exposure to graphic or objectionable material.