Texas has sued Roblox, the sprawling gaming and creation platform that claims more than 150 million monthly users, accusing it of endangering minors and failing to act on complaints about grooming and sexual exploitation. The state attorney general's office says Roblox has become "a breeding ground for predators" and failed to comply with state and federal online safety laws, prioritizing profits over protections for children.
The platform at the center of the suit has an outsized youth presence: Texas cites internal estimates that around 40% of those users are under 13. State lawyers say that demographic, combined with user-generated content and social features, creates a high-risk environment in which children are exposed to sexually explicit material, exploitation and grooming.

What Texas Alleges About Roblox and Child Safety Failures
According to the court papers, Roblox "flagrantly" ignored its online safety duties, allowing harmful content and interactions to remain on the platform. The attorney general's office described the company's choices as putting "pixel pedophiles and corporate profit over the safety of Texas children." Texas is asking for penalties and court-ordered safeguards that could strengthen age verification, limit risky features and step up enforcement.
The suit follows similar actions filed by Louisiana and Kentucky this year, part of an expanding state-level effort to hold social and gaming platforms responsible for harms to children. Advocacy groups have escalated the pressure: a 2024 report from the National Center on Sexual Exploitation called Roblox "a tool for sexual predators" and pushed for more robust parental controls, stricter chat rules and stronger age gating.
Roblox Pushes Back Against Texas Allegations in Lawsuit
Roblox denies the allegations, calling them sensationalized and inaccurate. The company contends its policies are "deliberately stricter" than those of many platforms and that it invests heavily in safety-by-design mechanisms, moderation and detection tools. It also argues that the issues Texas cites affect the entire industry and should be addressed in cooperation with regulators and experts, not in court.
Roblox has introduced additional safety measures over the past year. It added age verification, handled by the identity provider Persona, to unlock certain communication features only for users who can demonstrate they are 13 or older. In late 2024, it barred users under 13 from "Hangouts," a category of social spaces, such as clubs and islands, that it deemed higher risk. It has also discouraged vigilante operations on its service, removing users who mount sting-style entrapments that it says can compromise investigations and user safety.
Real-World Cases Fuel Scrutiny of Roblox Safety Risks
Roblox's risks have come into sharper focus in recent weeks after news organizations reported on law enforcement data. Bloomberg documented at least 24 arrests in 2024 of people accused of abusing children they met on Roblox, and other reporting has tallied six more arrests connected to the platform so far in 2025. The cases, though a tiny fraction of overall activity on Roblox, are the latest reminders of the stakes when bad actors target online communities geared toward young people.

Safety researchers say grooming often begins with innocuous contact before moving to private channels or off-platform apps, a pattern that makes such abuse hard for even well-resourced teams to detect and prevent. That complexity has pushed platforms to layer moderation tools, including text filtering, behavior analysis and user reporting, while cutting features whose risks outweigh their engagement benefits.
A Wider Battle Over Kids’ Online Safety in the U.S.
Texas has expanded its focus beyond Roblox. The state sued TikTok earlier this year, accusing the app of exposing minors to mature and explicit content despite being marketed as appropriate for young users. Nationwide, regulators say a rise in reports of online child exploitation is evidence that platforms must do more: in 2023, the National Center for Missing & Exploited Children received more than 36 million CyberTipline reports, a record high driven in part by the sheer volume of user-generated content.
Legal observers say states are increasingly framing claims around product design and marketing decisions rather than the content of any individual post, an approach designed to force companies to address their safety architecture without colliding with the broad federal liability shield for third-party content under Section 230. The Texas case could test how far states can go in setting standards for platforms aimed at young people, and what companies must do to prove their protections work.
What Comes Next in the Texas v. Roblox Safety Case
The court will decide whether Texas can show "systemwide failures and practices" in Roblox's safety efforts, and whether remedies should include new protections. Discovery could shed light on Roblox's moderation workflows, age verification performance and risk modeling, potentially creating a template for how other platforms are expected to document child-safety outcomes.
For now, the gap between the state's accusations and Roblox's rebuttal underscores the central question every youth platform confronts: can identity checks, feature limits and enforcement penalties keep pace, in real time, with user growth and ever more sophisticated tactics from abusers? The answer will not only determine the outcome of this case but also set a standard for judging child-centered online ecosystems.
