Parents of sextortion victims are suing Meta, accusing the social media giant of putting profits and engagement ahead of children’s safety — with fatal consequences. Lawsuits unsealed Wednesday allege that Meta Platforms Inc. and its Instagram app knowingly connected child users with adult sexual predators, leading to the suicides of two boys, one in Scotland and one in the United States.
The cases, filed in California and Delaware courts, were brought by Rosalind and Mark Dowey, whose 16-year-old son from Scotland is identified in the filings as M.D., and by Tricia Maciejewski, whose 13-year-old son from Pennsylvania is identified as L.M. The parents say their children were “sextorted” after being targeted through Instagram’s recommendation tools.
Allegations of Longstanding Knowledge
According to the complaints, Meta knew as early as 2019 that Instagram was recommending children’s accounts to adult predators, yet failed to act. Newly unsealed filings contend that internal analyses warned executives the platform was facilitating dangerous connections but that the company chose growth over safety.
Central to the lawsuits is Instagram’s AI-driven “Accounts You May Follow” feature, designed to boost user engagement. The parents allege that a 2019 Meta analysis estimated about 3.5 million profiles were seeking “inappropriate interactions with children.”
By 2023, Meta’s own estimates allegedly showed that nearly 2 million children’s accounts were recommended to adult predators in just three months, with 22% of those recommendations leading to follow requests.