A Los Angeles jury has found Meta and Google negligent in a closely watched social media addiction case, awarding $3 million in compensatory damages to a young woman who said Instagram and YouTube contributed to years of anxiety, depression, and body dysmorphia. Meta was assigned 70% of the fault, with Google bearing the remaining 30%, and the panel left open the possibility of additional damages as deliberations continue.
The plaintiff, identified as K.G.M. or Kaley, now 20, alleged that design choices and recommendation systems on Instagram and YouTube intensified compulsive use during adolescence, fueling harmful comparisons and self-image issues. TikTok and Snap, also originally named, settled prior to trial, leaving Meta and Google to defend their products before jurors in Los Angeles County Superior Court.
Jurors were presented with internal research indicating that platform teams understood teens' heightened susceptibility to engagement-optimizing features and, according to testimony, continued to lean on those levers. The revelations echo prior disclosures from the so-called Facebook Papers, including findings that Instagram exacerbated body image issues for 32% of teen girls who already felt bad about their bodies, a data point that has shaped public debate about platform responsibility.
Meta’s defense emphasized alternative causes, pointing to family instability and other stressors as more proximate drivers of Kaley’s mental health struggles. The jury’s verdict suggests it concluded that design and recommendation choices on the platforms materially contributed to harm, even amid complex, real-world factors.
Why This Verdict Matters For Social Media Platforms
The case advances a legal theory that focuses on product design and negligence rather than user-posted content—an approach that may skirt some protections under Section 230 of the Communications Decency Act. Courts have increasingly distinguished between hosting speech and designing potentially hazardous product features; notably, a Ninth Circuit ruling in Lemmon v. Snap allowed a negligence claim over the company’s speed filter to proceed on design grounds.
The decision also arrives amid mounting scrutiny of youth online harms. The U.S. Surgeon General issued an advisory urging stronger safeguards for adolescents and has called for warning labels on social media. If upheld on appeal, the Los Angeles verdict could energize similar suits and settlement talks across the country, especially alongside other recent rulings that found platforms responsible for failing to protect young users.
Evidence and the Addiction Playbook Presented at Trial
At the center of the trial were engagement mechanics now ubiquitous across apps: infinite scroll, autoplay, push notifications tuned to prompt return visits, and algorithmic feeds optimized for time-on-platform. Psychologists describe these systems as delivering variable rewards, intermittent bursts of validation or novelty, that can be especially potent for the developing brains of adolescents, who are more sensitive to social comparison and peer feedback.
Data on teen usage underscores the exposure. Pew Research Center reports that 95% of U.S. teens use YouTube, 62% use TikTok, and 59% use Instagram, with roughly 19% saying they are online “almost constantly.” Common Sense Media estimates average daily screen media use by tweens and teens has climbed past seven hours, not including schoolwork. Meanwhile, the CDC has documented rising rates of persistent sadness among teen girls. While correlation does not equal causation, the jury’s decision suggests that evidence of deliberate design choices carried significant weight.
What It Means for Platforms and Product Design Choices
Even though $3 million is a modest sum for tech giants, the allocation of fault—70% to Meta, 30% to Google—signals potential exposure if more juries deem youth-oriented design choices negligent. Plaintiffs may pursue punitive damages in future cases, and investors will be watching for disclosure risks as legal liabilities evolve alongside regulatory demands.
Operationally, platforms may accelerate measures like age assurance, stricter default settings for minors, daily time caps, and more transparent recommendation controls. International frameworks already push in this direction: the EU’s Digital Services Act mandates systemic risk assessments, and the UK’s Online Safety Act and Age Appropriate Design Code emphasize child safety by design. In the U.S., proposals such as the Kids Online Safety Act and COPPA 2.0 would set clearer guardrails if enacted.
This verdict could also ripple into the federal multidistrict litigation in the Northern District of California, where hundreds of claims alleging adolescent addiction and related harms have been consolidated. Plaintiffs there will likely cite the Los Angeles case as persuasive authority on design negligence and causation.
Appeals and Next Steps in the Social Media Case
Meta and Google are expected to appeal, potentially challenging the sufficiency of the evidence on causation, jury instructions, and whether certain claims are preempted by federal law or shielded by Section 230. They may also argue that algorithms and feed curation are protected expression. Appellate courts will be asked to clarify where product design ends and speech moderation begins—a line that now carries major financial and policy consequences.
Regardless of the appellate path, this ruling tightens the focus on how engagement-driven features affect young users. The message to boardrooms is clear: design choices are no longer just growth levers; in the eyes of jurors, they can be duties of care. That reframing will shape product roadmaps, risk assessments, and the next wave of courtroom battles over the cost of keeping teens hooked.