Facebook $52M Content Moderator Settlement: How to Receive Your Benefits

It is estimated that Facebook receives more than a million reports of potentially objectionable content each day. Most of these posts are viewed by a content moderator.

Facebook failed to implement the tech industry’s workplace safety standards

To maintain the platform, maximize profits, and enhance its public image, Facebook relies on content moderators to view objectionable posts and remove those that violate the company’s terms of use.

The safety standards that Facebook helped develop include obtaining workers’ “informed consent during the initial employment interview process; providing moderators with robust and mandatory counseling and mental health support; altering the resolution, audio, size, and color of trauma-inducing images and videos; and training moderators to recognize the physical and psychological symptoms of PTSD.” These standards had not previously been put in place at Facebook workplaces. The settlement requires Facebook to provide a safe and healthy work environment for these “at-risk” workers.


The settlement includes a new set of safety standards and Well-Being Preference tools

Under the proposed settlement, Facebook agreed to implement a suite of Well-Being Preference tools that content moderators can use on the Single Review Tool platform.