Meta CEO Mark Zuckerberg faced intense questioning in a Los Angeles courtroom this week as a closely watched trial examines whether his company knowingly created social media platforms that are addictive and harmful to children.
The case, brought by a now-20-year-old plaintiff identified in court filings as “KGM,” alleges that using Facebook, Instagram and YouTube from a young age led to compulsive behavior and serious mental health consequences. The lawsuit claims the platforms’ design features — including recommendation algorithms and endless scrolling — were intentionally engineered to maximize engagement, even among minors.
Zuckerberg was questioned about Instagram’s policies regarding children under 13, as KGM’s attorney pressed him on how the platform handles underage users. The plaintiff reportedly began using Instagram at age 9.
Zuckerberg acknowledged that users under 13 are not permitted on Instagram but said enforcing that rule is challenging. He told the court that a significant number of young users misrepresent their age to access the platform.
The questioning also focused on whether Meta prioritizes increasing the amount of time users spend on Instagram. Zuckerberg said the company tracks time spent as a performance metric to gauge competitiveness against rivals such as TikTok, rather than as a standalone goal.
“It’s different than trying to just increase time,” Zuckerberg told the court, describing it as a way to measure how Meta’s products compare within the industry.
Another point of scrutiny involved Instagram’s beauty filters, which critics have argued can negatively affect body image. Zuckerberg said Meta at one point halted certain filters amid concerns that they promoted unrealistic beauty standards or encouraged plastic surgery. He added that while the company supports creative expression, it believes it should not actively create or promote filters that could cause harm.
The trial marks the first time Zuckerberg has defended Meta before a jury, though he has previously testified before lawmakers about youth safety and content moderation.
Legal observers say the case could carry major implications for the broader social media industry. Thousands of similar lawsuits have been filed across the country alleging that platforms are designed in ways that foster dependency among young users. TikTok and Snapchat were initially named in this lawsuit but reached settlements before the trial began.
Some analysts have compared the proceedings to past litigation against tobacco companies, arguing that the case may test whether tech companies can be held accountable for how their products are designed and marketed.
Meta has strongly denied the allegations. In statements prior to the testimony, the company said it is committed to supporting young users and pointed to tools and safeguards aimed at improving teen safety online. Meta also argues that the plaintiff experienced mental health challenges prior to using social media.
A spokesperson for Google, which owns YouTube, similarly rejected the claims, calling them unfounded.
Instagram chief Adam Mosseri testified earlier in the trial, stating that he does not believe social media use meets the clinical definition of addiction. Instead, he described what he termed “problematic use,” in which individuals spend more time on the platform than they feel comfortable with. Mosseri also said teens generate less advertising revenue than other user groups.
As testimony continues, the trial is drawing national attention for its potential to reshape the conversation around youth mental health, digital design and corporate responsibility in the tech industry.
The outcome could influence not only future litigation but also how social media companies develop and regulate their platforms in the years ahead.