Mark Zuckerberg took the stand in a Los Angeles courtroom, facing a jury and pointed questions about whether Meta’s products are engineered to keep young users hooked. In measured testimony, the Meta CEO defended Instagram’s design and safety choices while rejecting the characterization that the company’s strategy is to maximize time spent at all costs.
Inside Zuckerberg’s Testimony on Instagram Safety Design
Pressed by plaintiff’s counsel Matthew Bergman of the Social Media Victims Law Center, Zuckerberg said Meta has stepped up efforts to curb underage use on Instagram and revised internal metrics that once put heavy emphasis on engagement. He insisted the company does not optimize for raw minutes spent, arguing that “if people don’t feel good about their experience, they won’t come back,” according to courtroom accounts reported by NBC.

The questioning zeroed in on age verification, a hot-button issue for platforms balancing privacy with child safety. Zuckerberg defended Meta’s mix of AI signals, user reports, and verification prompts to detect under-13 accounts, while acknowledging the broader industry challenge of reliably proving age online without creating new risks for minors’ data. He also pushed back on attempts to frame prior statements as an endorsement of addictive design, calling such portrayals misleading.
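To make that layered approach concrete, here is a minimal sketch of how a platform might blend weak signals into a single under-13 risk score before deciding whether to show a verification prompt. It is illustrative only; the signal names, weights, and threshold are assumptions for this example and do not describe Meta's actual system.

    from dataclasses import dataclass

    @dataclass
    class AccountSignals:
        stated_age: int               # age the user entered at sign-up
        reported_underage: bool       # another user filed an underage report
        writes_like_minor: float      # 0-1 output of a hypothetical text classifier
        follows_teen_clusters: float  # 0-1 score from hypothetical graph features

    def underage_risk(s: AccountSignals) -> float:
        """Blend weak signals into one risk score; the weights are invented."""
        score = 0.0
        if s.stated_age < 16:
            score += 0.2   # a young stated age carries some weight on its own
        if s.reported_underage:
            score += 0.4   # user reports are treated as the strongest signal
        score += 0.25 * s.writes_like_minor
        score += 0.15 * s.follows_teen_clusters
        return min(score, 1.0)

    # A high combined score routes the account to a verification prompt;
    # any one signal alone is too noisy to justify removing the account.
    if underage_risk(AccountSignals(14, True, 0.8, 0.6)) > 0.5:
        print("route account to age-verification flow")

The design point, and the tension Zuckerberg described, is that every added signal improves detection but also means collecting more data about minors.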
A High-Stakes Case for Tech Liability and Platform Design
The Los Angeles case, brought by a 20-year-old user, is part of a larger wave of litigation that consolidates claims from more than 1,600 plaintiffs against major platforms. The suits contend companies knowingly deployed attention-capturing features that can worsen mental health in young people. TikTok and Snapchat have reportedly settled out of the current trial, while Meta and YouTube parent Google remain central defendants.
At issue is whether social networks can be held liable under product liability and negligence theories when harm is tied to design choices rather than user-generated content. For years, platforms have leaned on protections like Section 230 of the Communications Decency Act to deflect claims rooted in content. Plaintiffs here argue the core problem is product design itself—algorithms, notifications, and interface mechanics—an area where Section 230’s shield may be less absolute.
Meta has signaled it will ask jurors to focus on causation. In a statement previewing its defense, the company noted the jury must decide whether Instagram was a substantial factor in the plaintiff’s mental health struggles, arguing she faced significant challenges unrelated to social media before joining the platform.
Addiction or Problematic Use: Defining Online Behaviors
Instagram chief Adam Mosseri, who testified earlier in the proceeding, drew a line between “clinical addiction” and “problematic use,” saying not all heavy use meets a clinical threshold. That debate reflects the state of the science: while the World Health Organization’s ICD-11 recognizes gaming disorder, there is no consensus diagnosis specific to social media addiction. Leading psychology groups urge caution, calling for better screening, parental guidance, and platform-level safety by design.

Design features are under a microscope. Infinite scroll, personalized ranking, push notifications, and variable rewards—the “pull-to-refresh” slot-machine analogy is often cited—can create powerful engagement loops. Plaintiffs argue these loops target developmental vulnerabilities in teens. Platforms counter that many features also deliver user benefits, such as surfacing content from friends and communities while offering robust parental controls and screen-time tools.
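The slot-machine analogy refers to variable-ratio reinforcement: rewards that arrive unpredictably condition checking behavior more durably than rewards that arrive every time. A toy simulation, with made-up probabilities and no connection to any real feed, illustrates the mechanic:

    import random

    random.seed(7)  # fixed seed so the toy run is reproducible

    def pull_to_refresh(p_new_content: float) -> bool:
        """One refresh; True means the feed surfaced something novel."""
        return random.random() < p_new_content

    # Variable-ratio schedule: roughly 3 in 10 refreshes "pay off".
    # In operant-conditioning terms this is the slot-machine pattern:
    # intermittent, unpredictable rewards that keep the next pull tempting
    # and make the habit slow to extinguish once rewards stop.
    hits = sum(pull_to_refresh(0.3) for _ in range(20))
    print(f"{hits} of 20 refreshes surfaced new content")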
What the Data Shows About Teens’ Online Use and Well-Being
Independent research paints a complex picture. Pew Research Center reports that roughly 95% of U.S. teens use YouTube, about 60% use Instagram, and nearly half say they are online almost constantly. Health studies cited by journals like JAMA Pediatrics and The Lancet Child & Adolescent Health have linked heavy social media use with elevated risks of depressive symptoms, poor sleep, and body image concerns, while also noting that causation remains difficult to establish and experiences vary widely.
Internal documents first reported by The Wall Street Journal suggested Instagram’s experience can affect some teen girls’ body image, a claim Meta disputes and says was taken out of context. The U.S. Surgeon General has urged more transparency from platforms and stronger safeguards for younger users, alongside guidance for families on setting boundaries and media diets.
Policy and Industry Fallout to Watch After the Trial
Advocacy groups see the trial as a catalyst for new rules. The Tech Oversight Project’s executive director Sacha Haworth said proceedings confirm long-standing concerns that social media systems “exploited children for profit,” urging Congress to move swiftly on measures like the Kids Online Safety Act. International regimes, including the EU’s Digital Services Act and the UK’s Online Safety Act, have already raised the bar on youth protections and risk assessments, adding pressure on U.S. companies.
Whatever the verdict, this case will reverberate. A ruling that treats engagement mechanics as a product safety issue could expose tech firms to broader liability and accelerate design changes, from stricter age assurance to default limits on notifications and nighttime use. A defense win would not end the debate but could push the battleground back to legislatures and regulators, where momentum for youth protections continues to build.