Roblox is putting numbers and details behind its age-assurance push, saying nearly half of its daily users have now completed on-platform age checks and outlining how those checks unlock age-based chat controls while adding new safety signals in the background.
The company has moved to mandatory facial verification for chat access, pairing that with automated systems designed to catch mismatches between a declared age and how an account behaves. The result is a sizable shift in who can talk to whom on one of the world’s most popular social gaming platforms.

Why Roblox Introduced Age Checks Across Its Platform
The strategy follows growing legal and regulatory pressure to better protect minors online. State attorneys general in Texas, Kentucky, and Louisiana have sued over child safety concerns on large platforms, and researchers and advocacy groups have warned for years about grooming and exposure to explicit content in open chat systems. Age assurance gives Roblox a way to segment interactions and demonstrate a risk-reduction plan to watchdogs.
Industry regulators from the Federal Trade Commission to European data authorities have signaled that “appropriate for age” experiences are fast becoming a baseline expectation. Roblox’s move aligns with that direction, even as debates continue over privacy trade-offs and the accuracy of AI-led age estimation.
How Roblox’s Age Verification Works In Practice
Verification happens inside the app. Users grant camera access and follow on-screen prompts that guide a short facial verification sequence. Roblox says it performs liveness detection to ensure a real person is present and then estimates age. Processing is handled by Persona, a third-party vendor specializing in identity and age checks. Both Roblox and Persona say they delete images or video clips once the check is complete.
If the system gets it wrong, users can appeal. Alternatives include ID-based verification or parent-driven updates through family controls. This “step-up” approach mirrors age-assurance frameworks recommended by child-safety organizations: start with the least intrusive method and escalate only when necessary.
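To make that escalation concrete, here is a minimal sketch of how a step-up flow might be structured. The method order reflects what Roblox has described publicly (facial estimate first, then ID or parental confirmation), but the function names and return conventions are assumptions for illustration, not the platform's actual code.

```python
from enum import Enum, auto

class Method(Enum):
    FACIAL_ESTIMATE = auto()   # least intrusive: camera-based age estimate
    ID_DOCUMENT = auto()       # escalate: government ID check
    PARENTAL_CONFIRM = auto()  # escalate: parent confirms via family controls

def step_up_verify(user, run_check):
    """Try the least intrusive method first; escalate only on failure or appeal.

    `run_check(user, method)` is a hypothetical callable that returns an
    estimated age (int) or None if the check fails or is disputed.
    """
    for method in (Method.FACIAL_ESTIMATE, Method.ID_DOCUMENT, Method.PARENTAL_CONFIRM):
        age = run_check(user, method)
        if age is not None:
            return age
    return None  # no method succeeded; chat stays restricted
```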
Age-Gated Chats And Segmentation For Safer Play
Once verified, chat is limited by age bands: under 9, 9–12, 13–15, 16–17, 18–20, and 21+. Users can talk within their own band and with the adjacent bands above and below. For example, a user in the 9–12 band can chat with the under-9 and 13–15 bands, but not with adults. The aim is to reduce cross-generational contact in chat by default while keeping gameplay social for peers.
This granular ladder—rather than a single “under 13” wall—reflects how social dynamics change quickly during adolescence. It also gives developers and moderators clearer guardrails for voice and text experiences in popular genres.
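The adjacency rule itself is simple to express. The sketch below uses illustrative band boundaries (with the 21+ band capped arbitrarily for the range check); it shows the logic of same-or-adjacent band chat, not Roblox's implementation.

```python
# Illustrative age bands as (low, high) tuples; 21+ capped arbitrarily at 120.
BANDS = [(0, 8), (9, 12), (13, 15), (16, 17), (18, 20), (21, 120)]

def band_index(age: int) -> int:
    """Return the index of the band containing `age`."""
    for i, (low, high) in enumerate(BANDS):
        if low <= age <= high:
            return i
    raise ValueError(f"age out of range: {age}")

def can_chat(age_a: int, age_b: int) -> bool:
    """Two users may chat if their bands are the same or adjacent."""
    return abs(band_index(age_a) - band_index(age_b)) <= 1

# A 10-year-old (9-12 band) can chat with an 8-year-old and a 14-year-old,
# but not with an 18-year-old.
assert can_chat(10, 8) and can_chat(10, 14) and not can_chat(10, 18)
```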

Stopping Loopholes And Verifying It’s You
Age checks are only as strong as their enforcement. After media reports highlighted listings for “verified” Roblox accounts on marketplaces like eBay, the platform said it is layering behavioral signals to confirm the person using an account matches the verified age. That includes keystroke patterns, emoji usage, who a user chats with, and the kinds of experiences they frequent—signals that can trigger rechecks if something looks off.
Marketplaces have removed some listings, but the risk of account resale and impersonation remains. Roblox’s background systems are meant to catch this with minimal friction for legitimate users, similar to fraud detection on payments platforms. The company says additional checks roll out when inconsistencies appear.
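Conceptually, this kind of background check resembles a weighted anomaly score: several behavioral signals are combined, and crossing a threshold triggers a re-verification. The sketch below is a generic version of that pattern under assumed signal names, weights, and threshold; Roblox has not published how its system scores these signals.

```python
from dataclasses import dataclass

@dataclass
class BehaviorSignals:
    """Hypothetical per-account features, normalized to 0..1, where higher
    means 'less consistent with the verified age band'."""
    typing_pattern_drift: float
    emoji_usage_shift: float
    contact_graph_shift: float
    experience_mix_shift: float

# Illustrative weights; a production system would learn these from labeled data.
WEIGHTS = {
    "typing_pattern_drift": 0.35,
    "emoji_usage_shift": 0.15,
    "contact_graph_shift": 0.30,
    "experience_mix_shift": 0.20,
}
RECHECK_THRESHOLD = 0.6  # assumed cutoff for prompting a re-verification

def needs_recheck(signals: BehaviorSignals) -> bool:
    """Combine mismatch signals into one score and compare to the threshold."""
    score = sum(getattr(signals, name) * weight for name, weight in WEIGHTS.items())
    return score >= RECHECK_THRESHOLD
```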
What The Data Shows And Why It Matters For Roblox
Roblox reports that 45% of daily active users have completed age checks. Among those verified, 35% are under 13, 38% are 13–17, and 27% are 18+. The verified population skews younger than self-reported ages had suggested, an expected outcome given how often teens inflate their age when signing up for services across the internet.
There is also a business dimension. Roblox’s leadership says the 18+ cohort is growing at over 50% and monetizes 40% higher than younger groups, prompting investment in genres favored by older players, such as shooters, RPGs, sports, and racing. Accurate age data sharpens that strategy while giving advertisers and developers clearer audience signals.
The Bigger Picture For Online Safety On Platforms
Roblox’s approach sits within a broader shift to “age assurance” across social media and gaming. Meta’s Instagram has tested face-based age estimation for teens through specialized vendors, while YouTube and Snap have expanded teen safety controls and parental tools. Regulators in the UK and EU have urged platforms to calibrate features to a user’s maturity, not just their stated age.
Key open questions remain: independent auditing of accuracy and bias, handling of false positives and appeals, transparency around data retention, and active enforcement against account trading. Roblox says its systems will keep evolving as risks change—a necessity for a platform where trends and tactics move fast.
The bottom line: age checks on Roblox now meaningfully shape who can talk to whom, with AI-driven verification at the front door and behavior-based checks behind the scenes. The combination is designed to curb harm without upending social play, and the early adoption figures suggest the model is gaining traction.