Meta is pushing to sharply narrow the scope of evidence a New Mexico jury can hear in a closely watched child safety lawsuit, seeking to exclude research on youth mental health, accounts of teen suicides linked to social media, the company's finances and past privacy issues, and even references to Mark Zuckerberg's college years. The requests, detailed in pretrial filings reviewed by Wired, preview a courtroom fight not only over what Meta allegedly did, but also over what jurors will be allowed to consider when deciding liability.
The case, brought by New Mexico Attorney General Raúl Torrez, accuses Meta of failing to protect minors from exploitation and abuse on its platforms. It is among the first state-led trials testing whether design choices and enforcement gaps on major social networks can constitute unlawful practices when children are harmed.
What Meta Wants Kept Off the Table at Trial
Meta's legal team has filed broad motions in limine, a standard pretrial device for keeping out evidence deemed irrelevant or unfairly prejudicial. But legal scholars told Wired the breadth here is unusual. Among the items Meta wants excluded: third-party research on social media's impact on youth mental health, anecdotal stories of adolescent self-harm or suicide allegedly tied to social platforms, references to its revenues and profitability, and past privacy violations found by federal and international regulators.
The company also asks the court to bar any mention of its AI chatbots in the context of child safety, as well as internal or external surveys about the prevalence of inappropriate content on Facebook and Instagram, including surveys Meta itself conducted. Another specific request targets the public health advisory issued by U.S. Surgeon General Vivek Murthy warning of potential harms to youth mental health from social media, arguing that such materials could sway jurors emotionally while offering little probative value on the precise claims at issue.
In effect, Meta is urging the judge to focus the trial narrowly on alleged failures tied to specific products, policies, and incidents in New Mexico, rather than broader narratives about social media and youth well-being. The company contends that sweeping contextual evidence would confuse jurors, create bias, and violate rules that bar unfair prejudice.
The Legal Stakes and Meta’s Pretrial Strategy
Judges typically weigh requests like these under relevance and prejudice standards, akin to Federal Rules of Evidence 401 and 403, which many state courts mirror: does the evidence make a material fact more or less likely, and is its probative value substantially outweighed by the risk of unfair prejudice or of misleading the jury? Companies often try to keep out emotionally charged material that could inflate damages or overshadow technical proof about product design and enforcement systems.
Here, the trial’s statewide scope raises the stakes. A win for New Mexico could embolden other attorneys general and private plaintiffs challenging how platforms police grooming, trafficking, and child sexual abuse material. A Meta victory, particularly if the court narrows admissible evidence, might become a blueprint for defendants in parallel cases, including consolidated litigation over youth mental health harms in federal court.
Two dynamics make the evidentiary battle pivotal. First, jurors’ understanding of what counts as “reasonable” child safety depends heavily on industry norms and public health context. Second, platform-scale data—like prevalence estimates, internal audits, and complaint volumes—can illuminate whether reported fixes match the magnitude of the problem. Excluding those categories would materially constrain the narrative the state can present.
Child Safety by the Numbers on Major Platforms
The scale of online child exploitation is not in dispute. The National Center for Missing & Exploited Children reports that its CyberTipline receives tens of millions of reports annually; in recent years the annual total has exceeded 36 million, with Facebook and Instagram among the largest sources of industry reports. Advocates say those numbers reflect both detection efforts and the sheer reach of major platforms.
Meta, for its part, has said it employs more than 40,000 people on safety and security and has invested billions of dollars since 2016 in moderation, detection tools, and policy enforcement. The company points to features like default private settings for younger teens, expanded parental supervision tools, and age-verification technologies. It also argues that end-to-end encryption can be paired with proactive safety measures, though child protection groups and some law enforcement agencies worry encryption may hinder the discovery of abuse.
On the mental health front, findings are complex. The Surgeon General has urged caution pending more definitive research, while peer-reviewed studies in journals such as JAMA Pediatrics have linked heavy social media use with elevated risks of anxiety and depression for some adolescents. The court's task is not to settle that debate, but to decide whether such research should inform jurors' views of what Meta knew or should have anticipated about risks to children.
A First Test With Wider Implications for Platforms
If the court grants Meta’s requests in full, the trial could focus tightly on a handful of incidents, local enforcement actions, and technical design questions. If the judge allows broader evidence—including public health advisories, internal prevalence surveys, and testimony on prior regulatory run-ins such as the company’s long-running FTC consent orders—the jury will get a more panoramic view of risk and responsibility.
Either way, the evidentiary rulings will echo beyond New Mexico. Lawmakers debating proposals like the Kids Online Safety Act, regulators monitoring platform safety promises, and plaintiffs in other jurisdictions will all be watching how the court calibrates relevance versus prejudice in the emerging law of social media child safety.
For now, both sides are maneuvering for narrative control. The state wants jurors to see a systemic safety gap; Meta wants them to stick to the four corners of the complaint. The judge’s decisions on what the jury can see may prove as consequential as the verdict itself.