Roblox is also making age verification a requirement worldwide for anyone who wants to use chat, expanding a safety measure the company had previously tested in certain markets, including Britain. The company will now gate all communication features behind an in-app age verification flow, a change meant to reduce risk for its younger players and tighten moderation across one of the world's largest gaming platforms.
What Changes for Players Under Roblox’s New Chat Rules
Chat will be gated behind an age check. Once verified, players are sorted into six age bands: under 9, 9–12, 13–15, 16–17, 18–20, and 21+. Users can chat within their own band and with adjacent bands, a guardrail meant to prevent contact between widely separated age groups. For instance, players ages 9 to 12 can message others in their band, as well as those under 9 and those 13 to 15.
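In practice, the rule amounts to a simple adjacency check over the ordered bands. The sketch below is illustrative only; the band labels and function are assumptions for clarity, not Roblox's actual implementation.

```python
# Minimal sketch of the adjacent-band chat rule described above.
# Band names and structure are assumptions, not Roblox's API.
AGE_BANDS = ["under_9", "9_12", "13_15", "16_17", "18_20", "21_plus"]

def can_chat(band_a: str, band_b: str) -> bool:
    """Two users may chat if their bands are the same or directly adjacent."""
    i, j = AGE_BANDS.index(band_a), AGE_BANDS.index(band_b)
    return abs(i - j) <= 1

# Example: a 9-12 player can reach under-9 and 13-15 players, but not 16-17.
assert can_chat("9_12", "13_15")
assert not can_chat("9_12", "16_17")
```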
Chat is turned off by default for children under 9. Parents can opt in on a child's behalf, after completing an age check, giving families more say over how younger children interact on the platform. The goal, according to Roblox, is to make conversations feel age-appropriate while preserving social play, long a core part of the platform's appeal.
How the Age Check Is Supposed to Work in the App
Users start the process in the Roblox app, granting camera access and following on-screen instructions for facial age estimation. Roblox says the check is processed by Persona, a third-party identity provider, and that no images or video captured for it are stored once the check is complete. Players aged 13 and over can choose ID-based verification as an alternative to facial analysis.
If the system misjudges someone's age, users can appeal and re-verify through alternative methods such as an ID check. Parents can also correct a child's age through parental controls. Roblox notes that it monitors multiple behavioral signals on each account and may ask users to re-verify if their activity suggests a mismatch with the stated age.
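Taken together, the flow boils down to two decisions: which verification methods a user is offered, and when to prompt them to verify again. A minimal sketch follows, assuming hypothetical function names, method labels, and band strings; nothing here reflects Roblox's or Persona's actual APIs.

```python
# Hypothetical sketch of the verification decision flow described above.
# All names are illustrative assumptions, not real Roblox or Persona APIs.

def available_methods(claimed_age: int) -> list[str]:
    """Facial age estimation is offered to everyone; an ID-based check is
    only offered as an alternative to users who say they are 13 or older."""
    methods = ["facial_age_estimation"]
    if claimed_age >= 13:
        methods.append("id_verification")
    return methods

def should_reverify(verified_band: str, behavior_inferred_band: str) -> bool:
    """Prompt re-verification when behavioral signals suggest a mismatch
    with the previously verified age band."""
    return verified_band != behavior_inferred_band

# Example: a 12-year-old only sees facial estimation; a 16-year-old also sees ID.
assert available_methods(12) == ["facial_age_estimation"]
assert available_methods(16) == ["facial_age_estimation", "id_verification"]
```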
Privacy and Compliance Questions Raised by Verification
Age verification is increasingly a matter of regulation. In the United States, COPPA governs data collection from children under 13, and state and federal regulators have been scrutinizing youth safety and privacy on social apps. The U.K. is rolling out robust age verification requirements under the Online Safety Act, and the ICO's Age Appropriate Design Code feeds into that work by nudging platforms toward proportionate age checks and data minimization. In the European Union, the Digital Services Act requires very large platforms to address systemic risks to minors.
Roblox stresses that images captured for age estimation aren't stored and are deleted by both Roblox and Persona after use. Still, facial analysis, even for age estimation rather than identity recognition, has faced scrutiny under biometric and consumer privacy laws. Transparent retention limits, auditability, and strong appeal paths will be essential to public trust.
Why Roblox Is Rolling Out Chat Age Checks Worldwide Now
The company is under growing legal and public pressure to improve child safety. The attorneys general of Texas and Louisiana have sued, alleging the platform has failed to protect young users from harms including grooming and exposure to explicit material. More broadly, lawmakers and regulators around the world are pushing platforms to restrict adult–minor contact and verify user ages in substantive ways.
Roblox has added layers of safety over the years, including filtered text chat, parental controls, and age gating on voice chat, which it debuted earlier for older teens. The new mandate unifies these efforts by locking all chat behind a verified age, going further than simply asking for a birthdate at sign-up. Given that the platform has more than 70 million daily active users worldwide, a significant share of them under 13, any incremental improvement in age accuracy could have an outsized safety impact.
How Players, Parents, and Creators Could Be Affected
Expect some friction: users who decline verification will be locked out of chat, and families may need to navigate consent flows for younger children. Misclassifications in either direction will generate support tickets, so the appeal path and ID fallback matter in practice.
For developers, age-banded chat may change the makeup of the communities that form around popular experiences. Those who use text chat to coordinate between staff and fans may see a brief dip in engagement while users complete checks. Over time, clearer age separation could lessen the moderation burden and improve retention among younger users by reducing unwanted interactions.
The wider industry is going the same way. Instagram relies on third-party age estimation in some markets to verify teenagers, and Epic Games added cabined accounts and stricter voice chat defaults for children following enforcement actions from U.S. regulators. Roblox’s move is an investment in similar rigor on a platform where play and socializing are tightly intertwined.
The upshot: Roblox is trading some convenience for clarity. If the system works as promised (accurate, noninvasive, and appealable), it could set a lasting precedent for age-aware chat at scale. If not, expect intensified scrutiny and another cycle of design changes aimed at keeping millions of children safer without tearing the social fabric that keeps them coming back.