FindArticles © 2025. All Rights Reserved.

Jury Orders Meta To Pay $375 Million For Child Exploitation

By Gregory Zuckerman
Last updated: March 25, 2026 4:23 pm
Technology | 6 Min Read

A New Mexico jury has found Meta liable for misleading users about the safety of Facebook and Instagram and enabling child sexual exploitation, ordering $375 million in civil penalties. The award reflects the maximum $5,000 per violation under the state’s Unfair Practices Act, indicating the jury counted roughly 75,000 violations tied to the company’s conduct.
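The implied violation count can be checked with simple arithmetic, a minimal sketch assuming the jury applied the $5,000 statutory cap uniformly:

```python
# Back-of-the-envelope check of the jury's implied violation count.
# New Mexico's Unfair Practices Act caps civil penalties at $5,000 per violation.
total_penalty = 375_000_000   # $375 million awarded
cap_per_violation = 5_000     # statutory maximum per violation

implied_violations = total_penalty // cap_per_violation
print(implied_violations)  # 75000
```

This matches the roughly 75,000 violations the award suggests, though the actual count the jury used was not itemized publicly.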

How New Mexico Built Its Case Against Meta at Trial

The lawsuit, brought by New Mexico Attorney General Raúl Torrez, leaned on an undercover probe that created accounts posing as 14-year-olds on Meta’s platforms. Investigators reported receiving sexually explicit messages and images from adults, evidence that became central to the state’s claim that Meta failed to protect minors from foreseeable harms on its services.

Table of Contents
  • How New Mexico Built Its Case Against Meta at Trial
  • Meta’s Response And Safety Record Claims
  • The Data Context And Industry Implications
  • What Comes Next For Meta And Possible Court Remedies

Jurors also heard about platform design choices—such as infinite scroll and auto-play—that the state argued were engineered to maximize youth engagement and time-on-platform, while amplifying exposure risks. Internal company documents introduced at trial, according to the attorney general’s office, indicated Meta knew about exploitation problems at scale but did not implement sufficiently effective countermeasures.

The trial spanned several weeks and culminated in a sweeping liability finding under New Mexico’s consumer protection statute. Torrez’s office said the verdict marks the first time a state has prevailed at trial against a major tech platform for harms to young users, setting a playbook other attorneys general are likely to study.

Meta’s Response And Safety Record Claims

Meta said it will appeal, arguing the verdict mischaracterizes its safety efforts and that it is “confident” in its record of protecting teens online. The company has highlighted tools such as default private settings for younger users, limits on adult-to-teen messaging when the adult does not already follow the teen, expanded parental supervision features, and safety nudges designed to interrupt risky interactions.

The case also spotlighted encryption. New Mexico has asked the court to require operational changes, including stronger age verification, proactive removal of predators, and safeguards around encrypted communications. Meta has recently shifted its approach on Instagram direct messages, indicating that DMs there will not be end-to-end encrypted going forward, a change the state portrays as necessary to curb abuse but one that reopens the ongoing debate over privacy versus safety.

Outside this case, Meta continues to face a thicket of litigation over alleged youth addiction and mental health impacts on social media. Reuters has reported thousands of suits from families, school districts, and municipalities, underscoring how platform design, moderation capacity, and youth protections are under unprecedented legal scrutiny.


The Data Context And Industry Implications

The scale of online child exploitation remains daunting. The National Center for Missing & Exploited Children’s CyberTipline receives tens of millions of reports annually from tech companies and the public, a reminder that large platforms—by sheer size—face extraordinary detection and enforcement challenges. While Meta frequently accounts for a significant share of industry reports, experts caution that higher reporting volumes can reflect both underlying risk and the company’s detection throughput.

Law and policy pressures are converging. State attorneys general have grown more aggressive, lawmakers are weighing age-assurance and design mandates, and regulators in Europe and elsewhere are testing how far safety-by-design obligations can go without undermining privacy or freedom of expression. For platforms, the practical challenge is operational: scale up age estimation, improve proactive detection of predatory behavior, reduce algorithmic amplification of risky content, and do so without unduly intruding on legitimate communication.

Academic research has highlighted friction-based approaches, such as safety prompts, stricter default settings, and limits on unsolicited adult contact, that can meaningfully reduce harm exposure. The verdict suggests juries may increasingly expect technology companies to deploy such measures comprehensively and to validate their effectiveness with data, narrowing the margin for good-faith but unproven interventions.

What Comes Next For Meta And Possible Court Remedies

Beyond the appeal, New Mexico has a pending bench trial on a separate public nuisance claim, where the state is seeking remedies that could force Meta to implement platform changes around age verification, predator removal, and communications safeguards. If the court orders structural modifications—rather than just monetary penalties—it could become a template for other jurisdictions.

For Meta and its peers, the signal is clear: jurors are increasingly receptive to arguments that platform architecture and product decisions can constitute deceptive or unfair practices when they expose minors to exploitation risks. The $375 million penalty raises the stakes on proving not only intent to protect teens, but measurable outcomes. The industry’s next chapter will turn on who can show their safety systems actually work at scale—and who can’t.

By Gregory Zuckerman
Gregory Zuckerman is a veteran investigative journalist and financial writer with decades of experience covering global markets, investment strategies, and the business personalities shaping them. His writing blends deep reporting with narrative storytelling to uncover the hidden forces behind financial trends and innovations. Over the years, Gregory’s work has earned industry recognition for bringing clarity to complex financial topics, and he continues to focus on long-form journalism that explores hedge funds, private equity, and high-stakes investing.