Discord will require facial recognition or ID verification for all users starting in March

Discord will require facial recognition or ID verification globally in March, restricting features for unverified users to enhance teen safety.

Feb 9, 2026
4 min read
Technobezz


Discord announced Monday that it will require facial recognition or ID verification for all users globally starting in March. Accounts belonging to the chat platform's 200 million monthly users will default to teen-appropriate settings until age verification is complete.

Users must submit a video selfie for AI age estimation or upload government identification.

Verified adults gain access to age-restricted servers, sensitive content, and full communication features. Unverified accounts face message filtering and limited interaction capabilities.

The rollout begins early March with a phased global deployment. Discord tested similar systems in the UK and Australia last year to comply with local safety regulations.

"Nowhere is our safety work more important than when it comes to teen users," the company said in its announcement.

Discord claims video selfies are processed locally on the device, with no data transmitted. Government ID images are routed to third-party vendors for verification and are reportedly deleted immediately after confirmation. The platform plans to add more verification methods in future updates.

Discord disclosed a security incident in October 2025 where approximately 70,000 user ID photos potentially leaked from a verification vendor. The company says it implemented enhanced data protection measures following the breach.

New Jersey filed a lawsuit against Discord in April 2025 alleging inadequate child protection measures. The state's attorney general claimed the platform engaged in "deceptive and unconscionable business practices" despite safety feature additions in 2023.

The age verification push matches industry trends. Meta's Facebook and Instagram deploy AI age detection and teen account restrictions. TikTok implements 60-minute daily screen limits for users under 18. Roblox began requiring facial verification for chat access in January.

Half of US states have enacted or proposed social media age regulations, though courts have blocked some measures on free speech grounds. Australia banned under-16 social media use, with other countries considering similar restrictions.

Discord founder Jason Citron testified at a 2024 US Senate hearing on child safety alongside Meta's Mark Zuckerberg and TikTok's Shou Chew. The platform reportedly explored a public listing earlier this year amid increased regulatory scrutiny.

Verification typically completes within minutes, according to Discord documentation. Users receive a notification and a direct message confirming their age group once assignment is complete.

The system includes background age inference algorithms that estimate user maturity without direct verification.

Unverified accounts cannot join age-restricted communities, speak in Stage channels, or disable content filters. Friend requests from unknown users are routed to a separate inbox with warning prompts, and message request controls remain limited until age is confirmed.

Discord's announcement follows media reports of confidential IPO preparations in January. The platform introduced mobile advertisements in 2025, allowing users to watch promotions for in-app rewards. Company representatives emphasize privacy protections despite biometric data collection requirements.
