Discord is tightening access to mature features by requiring many users to confirm their age, with identity checks ranging from video selfies to government IDs. By default, all accounts will shift to a teen-appropriate experience unless Discord can determine you're an adult through its background age inference system or you verify proactively.
How Discord’s age checks and verification work
Unverified accounts will face new limits: they won't be able to join age-restricted servers or channels, speak in Stage channels, reply to certain DMs, change certain settings, or view mature content unblurred. The aim is a safer default that gates the features and spaces most likely to host adult conversations or media.

Discord offers two main verification routes. The first is a video selfie: Discord says the liveness and facial-feature checks run entirely on-device to estimate whether you meet the age threshold, without the video ever being uploaded. The second is submitting a government ID, which Discord says is deleted shortly after your age is confirmed. In both paths, a third-party vendor handles the technical evaluation.
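To make the selfie route concrete, here is a minimal sketch of how an on-device check could be structured. Everything in it is an assumption for illustration: the function names, thresholds, and result shape are invented, and the point is the claimed data flow, where frames stay local and at most a pass/fail verdict leaves the device.

```python
# Hypothetical sketch of an on-device selfie age check. This is not
# Discord's or its vendor's code; it only illustrates the claimed data
# flow: video frames stay on the device, and at most a boolean leaves it.

from dataclasses import dataclass

ADULT_THRESHOLD = 18      # assumed age gate
MIN_CONFIDENCE = 0.90     # assumed confidence floor for the local model

@dataclass
class SelfieEstimate:
    is_live: bool          # liveness: a real person, not a photo or deepfake
    estimated_age: float   # local model's age estimate in years
    confidence: float      # model's confidence in that estimate, 0..1

def run_local_models(frames: list[bytes]) -> SelfieEstimate:
    """Stand-in for the vendor's on-device liveness and age models."""
    raise NotImplementedError("illustrative placeholder only")

def check_selfie(frames: list[bytes]) -> bool | None:
    """Return True/False when the device can decide, None when inconclusive."""
    est = run_local_models(frames)
    if not est.is_live or est.confidence < MIN_CONFIDENCE:
        return None  # inconclusive: the app would offer the ID route instead
    # Only this verdict would ever be transmitted, never the frames.
    return est.estimated_age >= ADULT_THRESHOLD
```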
Many people may never see a prompt at all. Discord's age inference model runs in the background and looks at signals such as account age, participation patterns, and the types of communities you belong to. Staff have indicated on the official subreddit that the vast majority of adults could be cleared automatically, with manual verification triggered only when the model can't make a confident call.
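Discord hasn't published how the inference model works, but a toy scorer conveys the general idea of combining weak signals into a single confidence value. The signals, weights, and cutoff below are invented for the example.

```python
# Toy illustration of background age inference. The signals, weights, and
# cutoff are invented; Discord's actual model and inputs are not public.

from dataclasses import dataclass

@dataclass
class AccountSignals:
    account_age_days: int          # long-lived accounts weakly suggest adults
    activity_pattern_score: float  # 0..1, e.g. participation timing and style
    adult_community_ratio: float   # 0..1, share of joined servers skewing adult

def adult_confidence(s: AccountSignals) -> float:
    """Blend signals into a 0..1 confidence that the account holder is 18+."""
    tenure = min(s.account_age_days / 3650, 1.0)  # saturate around ten years
    return (0.5 * tenure
            + 0.2 * s.activity_pattern_score
            + 0.3 * s.adult_community_ratio)

def needs_verification(s: AccountSignals, cutoff: float = 0.8) -> bool:
    """Prompt for a selfie or ID only when inference can't clear the account."""
    return adult_confidence(s) < cutoff
```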
Privacy risks and security claims around age verification
Age assurance is colliding with a hard reality: users are deeply wary of handing over face scans or IDs to chat apps. The skepticism isn't unfounded. In a breach Discord disclosed last year, roughly 70,000 ID images connected to a prior verification vendor were exposed, along with other personal data the vendor handled. Discord ended that relationship and says its new provider employs stronger safeguards, with on-device processing for selfies and rapid deletion for ID uploads.
Security experts caution that “deleted quickly” is not the same as “never at risk.” Breaches routinely hit well-defended platforms, and biometric or identity data is especially sensitive. Regulators are watching closely: the UK Information Commissioner’s Office emphasizes data minimization under the Children’s Code, and the Australian eSafety Commissioner has pushed platforms toward proportionate age assurance that collects the least data necessary. Discord’s claims of local processing, narrow retention windows, and third-party isolation align with that direction, but trust will hinge on execution and independent scrutiny.
Practically, users should expect standard fraud controls behind the scenes, including liveness checks to deter deepfakes and document forensics to spot tampering. False positives and false negatives can occur, which is why Discord leans on inference first and asks for ID only when needed. Even so, responsive appeals and support will matter for edge cases, such as older adults with sparse activity or privacy-minded users who avoid public servers.
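Tying those pieces together, the escalation order Discord describes (inference first, manual verification only when needed) might look roughly like the sketch below. It reuses the hypothetical needs_verification and check_selfie helpers from the earlier sketches, and the document check and appeal fallback are likewise assumptions, not confirmed behavior.

```python
# Hypothetical end-to-end flow: least invasive check first, escalating only
# when a step is inconclusive. Reuses the invented helpers sketched earlier;
# the document check and appeal routing are likewise assumptions.

from enum import Enum, auto

class AgeStatus(Enum):
    ADULT = auto()           # full access to age-restricted features
    TEEN_DEFAULT = auto()    # safer defaults stay in place
    PENDING_APPEAL = auto()  # edge case routed to human review

def verify_document(doc: bytes) -> bool:
    """Stand-in for vendor-side document forensics (tamper/forgery checks)."""
    raise NotImplementedError("illustrative placeholder only")

def resolve_age(signals, selfie_frames=None, id_document=None) -> AgeStatus:
    if not needs_verification(signals):        # 1. silent background inference
        return AgeStatus.ADULT
    if selfie_frames is not None:
        verdict = check_selfie(selfie_frames)  # 2. on-device selfie estimate
        if verdict is True:
            return AgeStatus.ADULT
        if verdict is False:                   # possible false negative:
            return AgeStatus.PENDING_APPEAL    #    route to appeal, not a dead end
    if id_document is not None:                # 3. ID upload, deleted after check
        return (AgeStatus.ADULT if verify_document(id_document)
                else AgeStatus.PENDING_APPEAL)
    return AgeStatus.TEEN_DEFAULT              # user declined both routes
```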

What users are saying online about Discord’s age checks
Initial reaction across Reddit, X, and creator forums is split. Moderators of large communities welcome default teen protections and clearer gating for mature topics. Power users and privacy advocates, however, are voicing frustration at ID collection and the opacity of automated profiling. A recurring theme: some are fine with age checks for specific content, but balk at platform-wide restrictions that nudge them toward verification even if they rarely visit adult channels.
Discord’s line that most adults will never see a prompt has tempered some criticism, yet many want explicit opt-outs, documented retention policies from the vendor, and transparency reports on verification accuracy and disputes. If those assurances don’t materialize, users say they’ll migrate chats and communities elsewhere.
Alternatives if you don’t want to verify
There is no one-to-one replacement for Discord’s blend of voice, text, streaming, and bots, but several platforms are gaining attention:
- Matrix with the Element client offers decentralized, end-to-end encrypted rooms, federation across servers, and strong admin controls. Setup and moderation are more involved, but it reduces reliance on a single company’s policies (a minimal client sketch follows this list).
- Guilded, owned by Roblox, mirrors Discord’s server-and-channel model, with events and calendar tools popular among gaming clans. Its bot ecosystem is growing, though not as extensive as Discord’s.
- Slack fits professional and school communities that value integrations and structured channels over casual voice chat. It’s not designed for large public servers, but it’s stable and enterprise-tested.
- Community-run options like Stoat Chat and emerging open-source projects can be viable for small groups that prioritize control and minimal data collection, accepting trade-offs in reliability and feature depth.
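For a feel of what the Matrix route involves in practice, here is a minimal sketch using matrix-nio, a community Python client library. The homeserver, account, and room alias are placeholders, and real code would need error handling plus encryption setup (matrix-nio’s e2e extra) for encrypted rooms.

```python
# Minimal Matrix client sketch using matrix-nio (pip install matrix-nio).
# The homeserver, user, password, and room alias are placeholders.

import asyncio
from nio import AsyncClient

async def main() -> None:
    client = AsyncClient("https://matrix.example.org", "@alice:example.org")
    await client.login("correct-horse-battery-staple")
    joined = await client.join("#community:example.org")  # join by room alias
    await client.room_send(
        room_id=joined.room_id,  # assumes the join succeeded; no error handling
        message_type="m.room.message",
        content={"msgtype": "m.text", "body": "Hello from a federated room"},
    )
    await client.close()

asyncio.run(main())
```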
What to do before you decide on Discord verification
Review which servers or channels you actually use and whether they’re age-restricted. If your activity is already teen-appropriate, you may be unaffected even without verifying. If you expect to need access, weigh the video selfie route, which keeps processing on-device, against submitting a government ID, and look for updated privacy notices from Discord’s vendor describing retention, auditing, and data access.
For parents and educators, the shift can add guardrails with less manual configuration. For privacy maximalists, federation and self-hosted communities on Matrix may be a better fit. Either way, the industry trend is clear: pressure from safety regulators in markets like the UK and Australia is moving chat platforms toward formal age assurance. Discord’s approach is one of the most visible tests yet of whether that can be done with meaningful privacy protections, and of whether users will accept the trade-off.
