Big Tech Fires Back: No Duty to Prevent Harm
The defendants, however, argued that their reporting tools are not “products” under product liability law and that they owe no legal duty to prevent harm. YouTube and TikTok insisted that the tragic deaths were not a reasonably foreseeable consequence of their content moderation decisions.
Judge DeMarchi agreed, finding that the plaintiffs failed to pinpoint a specific design defect. She ruled that their complaints essentially challenge the platforms’ decisions on whether to remove reported videos, an issue more aligned with editorial discretion than product liability.
“This is an objection to defendants’ decisions, after receiving plaintiffs’ reports, to remove or not remove certain videos,” the judge wrote, making it clear that content moderation policies do not fall under traditional product liability laws.
The court also found no basis for emotional distress claims, determining that the plaintiffs’ suffering was tied to content moderation choices, not a defect in a tangible product.