Roblox is now requiring age verification for anyone who wants to use its chat, locking text and other chat features behind a facial scan the company says will provide stronger safety controls across its sprawling youth audience. Those who decline will have those features deactivated until they complete the process.
Company executives call the move an industry first for a mainstream gaming platform and position facial age checks as the "gold standard" in online communication safety. The rollout follows tests that began last year and folds age estimation into a broader redesign of chat and parental tools.
How Age Estimation Becomes Mandated Across Roblox Chat
The system estimates age with AI, using third-party vendor Persona, from a face image captured through the device's camera. Rather than collecting IDs, the model processes a face to predict an age range and sorts users into one of six cohorts: Under 9, 9–12, 13–15, 16–17, 18–20, and 21+. These groups help determine which chat experiences a user can access, with more stringent filtering for younger users.
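The cohort logic described above can be pictured as a simple age-to-band mapping. The sketch below is purely illustrative: Roblox has not published its implementation, and the function name, boundaries, and labels here are assumptions drawn only from the six bands named in this article.

```python
# Illustrative sketch only -- Roblox's actual cohort logic is not public.
# The six bands mirror those described in the article.
COHORT_BOUNDS = [
    (8, "Under 9"),
    (12, "9-12"),
    (15, "13-15"),
    (17, "16-17"),
    (20, "18-20"),
]

def chat_cohort(estimated_age: int) -> str:
    """Map an AI-estimated age to one of the six chat cohorts."""
    for upper_bound, label in COHORT_BOUNDS:
        if estimated_age <= upper_bound:
            return label
    return "21+"  # anyone estimated above 20
```

In a real system the input would be an age range with uncertainty rather than a single integer, which is part of why estimates at cohort boundaries are the hardest cases.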
Age estimation first appeared on Roblox as part of a verified chat pathway for teenagers. The company now says the check will be mandatory everywhere chat is accessible, and it plans to require it for real-time collaboration features in Roblox Studio, the tool used to build games on its platform. New forum rules are also on the way with the expansion.
Safety Objectives and Early Adoption Trends on Roblox
Roblox says millions of users have already completed voluntary checks. In regions where the company tested mandatory age estimation for chat, including Australia, New Zealand, and the Netherlands, over 50% of users entered age-verified chat, suggesting adoption rises when features are gated on estimates.
The stakes are high. Roblox has said it reached 151 million daily active users in 2025 and has touted that it reaches nearly half of U.S. users under 16. As scrutiny over child safety mounts, the platform has added activity tracking, enhanced blocking features, tighter chat moderation, and more granular parental controls. Age verification is meant to strengthen these systems, curb age misrepresentation, and limit voice and text access accordingly.
Limits to Accuracy and Ongoing Privacy Questions
Executives at Persona and Roblox say the AI is typically accurate to within about two years and works best with younger users. Still, age estimation is imprecise at the margins: it may shunt some older teens into more restrictive pools or let adults posing as minors slip into younger cohorts. Vendors often use liveness checks to guard against spoofing with photos or deepfakes, but no system is perfect.
Privacy advocates will be monitoring how Roblox and its partner handle biometric data and disclosures. The company is keen to stress that this is age estimation, not identification, and that checks use on-device cameras rather than uploaded IDs. Groups like the Electronic Frontier Foundation and the Future of Privacy Forum have called for clear retention limits, transparency reports, and independent audits for any age-verification technology, and similar calls can be expected here.
Legal and Industry Context for Age Checks and Safety
Regulators are increasingly requiring platforms to do more to keep minors safe in chat and other high-risk features. The UK's Online Safety Act empowers Ofcom to require "robust age assurance" where appropriate, and the EU's Digital Services Act bans targeting minors with certain ads and strengthens duty-of-care obligations. In the United States, COPPA regulates data collection from children under 13, and lawmakers and attorneys general have proposed broader guardrails that would effectively nudge platforms toward age checks.
Roblox also continues to face lawsuits from families who claim it fails to stop the grooming and exploitation of children. The cases are being consolidated in federal court in San Francisco, and some also name Meta and other platforms like Discord and Snapchat over off-platform communications related to the incidents. Roblox has contested the suits, and the new age-estimation policy is widely viewed as part of its answer to legal as well as regulatory pressure.
What’s New for Users and Developers on Roblox
Players can expect a straightforward but non-negotiable process: complete a facial age scan if they want to send or receive chat messages. Younger groups will encounter tighter communication boundaries and more visible safety reminders. Parents can still use Roblox's parental controls to set contact preferences, oversee spending, and review activity.
That may cause friction for creators as users complete checks, but stronger age gating lets studios design experiences knowing who should be in them, and it may unlock features for older groups. Roblox says it will add mandatory age estimation to real-time collaboration in Studio and update policy guidance for developers building social experiences.
The bottom line: age assurance is becoming a hygiene factor on youth platforms. Roblox's decision to condition chat access on a facial scan reflects where online safety standards are headed, and how much trade-off between convenience, privacy, and protection the industry is now willing to compel.