Tags: discord, age verification, facial recognition, biometric data, privacy, data breach, surveillance

Discord Wants to Scan Your Face. Here's Why I'm Worried.

Snugg Team | February 12, 2026 | 12 min read
[Hero image: Discord facial recognition age verification privacy concerns]


Starting March 2026, Discord will require facial scans or government IDs to unlock adult features, just five months after a breach at one of its vendors exposed the government IDs of roughly 70,000 users.


I saw the Discord announcement on Tuesday.

"Enhanced teen safety protections," they called it. Global rollout. Privacy-forward age verification.

Sounded responsible. Until you read the details.

Starting in March, Discord is putting every single account—yours, mine, everyone's—into a default "teen mode." Want to turn off content filters? Prove you're an adult. Want to access age-restricted servers? Prove you're an adult. Want to change your privacy settings? Prove you're an adult.

How do you prove you're an adult?

Option 1: Let Discord's AI analyse your behaviour patterns to guess your age.
Option 2: Record a video of your face for "facial age estimation."
Option 3: Upload a photo of your government-issued ID.

This is the same company whose age verification vendor got hacked five months ago, exposing 70,000 government IDs.

I sat there reading the announcement, and something clicked.

This isn't about teen safety.

This is about normalising biometric surveillance.


What's Actually Happening

Here's what Discord announced on February 9th, 2026:

Starting March (phased rollout):

  • Every account defaults to "teen-appropriate" experience

  • Content filters automatically blur sensitive content

  • Age-restricted channels, servers, and commands locked

  • Direct messages from non-friends go to separate inbox

  • Can't change these settings without age verification


The three ways to verify you're an adult:

1. Behavioural Tracking ("Age Inference Model")

Discord says most adults won't need to verify manually because their AI will figure it out by watching:

  • How you use the platform

  • What servers you're in

  • Your communication patterns

  • Device information

  • Account age

  • "Several other signals"


Translation: We're already tracking everything you do. Now we're using it to profile you.
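
To make concrete what that implies, here's a rough sketch of how a signal-scoring heuristic might work. This is purely illustrative and not Discord's actual model, which they haven't published; the point is that any such system only works if the platform keeps a running profile of your activity.

```python
# Purely illustrative: a naive signal-scoring heuristic, NOT Discord's real
# age inference model. It shows why behavioural inference requires ongoing
# profiling of what you do, where you hang out, and when you're online.
from dataclasses import dataclass


@dataclass
class AccountSignals:
    account_age_days: int       # how long the account has existed
    server_topics: list[str]    # rough topic tags for joined servers
    active_hours: list[int]     # hours of day (0-23) with the most activity
    device_type: str            # "desktop", "mobile", etc.


def adult_confidence(s: AccountSignals) -> float:
    """Return a crude 0-1 confidence that the account belongs to an adult."""
    score = 0.0
    if s.account_age_days > 5 * 365:       # long-lived accounts skew older
        score += 0.4
    if any(t in s.server_topics for t in ("finance", "parenting", "diy")):
        score += 0.3                       # interests as a weak age proxy
    if all(h >= 22 or h < 7 for h in s.active_hours):
        score += 0.2                       # late-night activity patterns
    if s.device_type == "desktop":
        score += 0.1
    return min(score, 1.0)


signals = AccountSignals(2200, ["gaming", "finance"], [23, 0, 1], "desktop")
print(adult_confidence(signals))  # 1.0 -> quietly treated as an adult, no prompt
```

Even this toy version needs a log of which servers you join, when you're active, and what device you're on. A production model needs far more, collected continuously.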

2. Facial Age Estimation

Record a video selfie. Their AI analyses your face to estimate age. Discord says the video is "processed on device and never stored."

But it still means recording your face and handing biometric data to a third party's software for analysis, on the strength of their word that nothing leaves your device.

3. Government ID Upload

Send a photo of your driver's licence, passport, or other government ID to Discord's vendor partners.

Discord promises the ID is "deleted quickly—in most cases, immediately after age confirmation."

Just like they promised last time.


The Part They're Not Advertising

September 2025—five months ago:

Hackers breached 5CA, one of Discord's customer service vendors. The breach exposed personal data for approximately 70,000 Discord users.

Including government-issued IDs used for age verification.

The same age verification system Discord is now rolling out to over 250 million monthly active users globally.

Discord's response in their official announcement?

"These vendors were not involved in the September 2025 data breach."

Notice they said "these vendors" (plural). Not "this vendor" (singular).

They're using multiple vendors. We don't know which ones. We don't know their security track record. We're just supposed to trust.

Five months after 70,000 IDs were compromised.


Why This Matters (Even If You're Not On Discord)

I don't use Discord much. I'm a yacht surveyor in the Caribbean—it's not really my demographic.

But I'm paying attention to this anyway.

Because this isn't just about Discord.

This is about normalising biometric surveillance as the price of admission to the internet.

Discord isn't the first. They won't be the last.

The UK's Online Safety Act is driving this. Other countries will follow. Soon every platform will have the same requirements:

  • Prove you're an adult

  • Submit to facial scans or ID checks

  • Accept that tech companies will store your biometric data

  • Trust that they'll keep it secure


This despite the fact that data breaches are constant. Equifax: 147 million records. Marriott: 500 million records. Yahoo: 3 billion accounts.

Now we're adding faces and government IDs to that list.

And here's the clever part:

They're framing it as protecting children. Who can argue with that?

But protecting children doesn't require building a global biometric database. It doesn't require behaviour tracking AI. It doesn't require normalising facial scans as standard practice.

There are other ways to create age-appropriate experiences. Parental controls. Device-level restrictions. Education.

But those solutions don't give companies what they actually want: more data.


The Slippery Slope Isn't Theoretical Anymore

"Slippery slope" arguments usually feel like paranoia.

Not this time.

2018: Social media platforms are free and ad-supported. That's just how the internet works.

2020: Maybe a paid tier for power users. But the core experience stays free.

2022: Actually, to fight spam, you need to verify your phone number.

2023: To protect the platform, we need your ID for certain features.

2025: For everyone's safety, we need your face or ID just to use basic features.

2026: This is normal now. Every platform does it.

We're watching privacy expectations shift in real-time.

Each step seems reasonable in isolation. But together, they're building infrastructure for surveillance that would have seemed dystopian a decade ago.

And we're accepting it.

Because what's the alternative? Stop using platforms where everyone you know already is?

That's the genius of network effects. They trap you.


What Discord Could Have Done Instead

Here's the thing that frustrates me most:

There were better options.

Option 1: Device-Level Controls

Apple, Google, and Microsoft all have parental control systems built into their operating systems. They're mandatory for kids' accounts. They don't require biometric scanning.

Discord could have integrated with these existing systems.

Option 2: Parent-Managed Age Verification

Make parents verify their kids' ages through a simple declaration. If problems arise, parents are responsible—same as they are for every other aspect of their child's internet use.

Option 3: Age-Gated Servers By Default

Make server creation "all ages" by default. If a server owner wants to allow adult content, they set age restrictions. Let individual communities manage their own standards.
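
As a sketch of what "restrictive by default" could look like in practice (my own illustration, not Discord's or Snugg's actual data model), the safe settings are simply the defaults, and only a deliberate opt-in by the server owner changes them:

```python
# Illustrative only: a "safe unless the owner opts in" settings model.
# No biometrics involved -- the default state is already age-appropriate.
from dataclasses import dataclass


@dataclass
class ServerSettings:
    name: str
    age_restricted: bool = False        # all-ages unless explicitly changed
    content_filter: str = "strict"      # strongest filter by default
    dms_from_non_members: bool = False  # strangers can't message members


def opt_in_to_adult_content(settings: ServerSettings) -> ServerSettings:
    """The owner takes an explicit, auditable action to age-restrict a server."""
    settings.age_restricted = True
    settings.content_filter = "off"
    return settings


community = ServerSettings(name="caribbean-sailors")
print(community.age_restricted)  # False: new servers start safe, no ID checks
```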

Option 4: Education Over Enforcement

Provide better tools for parents. Better reporting systems. Better moderation. Focus on making unsafe behaviour easy to spot and stop.

None of these require scanning faces or collecting IDs.

But none of these give Discord the data infrastructure they're building.


The Real Question

I keep coming back to this:

If Discord's goal is teen safety, why are they building systems that:

  • Profile all users behaviourally

  • Collect biometric data

  • Store government IDs (however briefly)

  • Create infrastructure that could be repurposed for other tracking


When simpler solutions exist?

The answer, I think, is that teen safety is the justification.

But it's not the goal.

The goal is building the infrastructure while public opinion is on their side.

Because once it's built, once it's normalised, once everyone accepts that you scan your face or show your ID to use the internet...

That infrastructure can be used for anything.


What You Can Actually Do

1. Don't normalise this

When Discord asks for your face or ID, remember: you're not being unreasonable for thinking this is invasive.

This is invasive.

2. Use platforms that don't require surveillance

Discord isn't your only option. There are smaller, privacy-focused alternatives out there.


Or build your own server. It's easier than you think.

3. Be honest about the trade-offs

If you choose to stay on Discord, fine. I'm not judging.

But be honest with yourself about what you're trading: your biometric data, your behavioural patterns, your ID—all to access a free chat platform.

That's the exchange. Make it knowingly.

4. Demand better from platforms you pay for

If you're paying for Discord Nitro (£8.99/month), you're funding this system.

Ask yourself: is this what you want your money supporting?

5. Talk about this

Share this post. Start the conversation. Don't let this slide by as "just another update."

Because if we accept biometric age verification as normal, we're accepting a fundamentally different internet.

One where anonymity isn't an option.

Where privacy is conditional.

Where access requires surveillance.


Why I'm Building Snugg

This Discord situation is exactly why I started building Snugg this year.

I watched platform after platform make the same choices:

  • Free access funded by data extraction

  • Privacy as a premium feature

  • Surveillance infrastructure justified by safety

  • Users treated as resources to be mined


And I realised: it doesn't have to be this way.

Here's what Snugg doesn't require:

❌ Your face
❌ Your ID
❌ Your behavioural profile
❌ Your location history
❌ Your trust that we're securing your biometric data

Here's what Snugg does:

✅ End-to-end encryption (we physically can't read your content; see the sketch below)
✅ Simple subscription (£5/month—you're the customer, not the product)
✅ Open source (verify what we're actually doing)
✅ Zero data collection beyond what's needed to run the service
✅ Your data, your control, your export anytime
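
That first item isn't marketing language. For anyone curious what "we physically can't read your content" means mechanically, here's a minimal sketch of end-to-end encryption using PyNaCl (Python bindings for libsodium). It is not Snugg's actual implementation, just an illustration of the property: only the people holding the private keys can decrypt, so a server relaying the message never can.

```python
# Minimal end-to-end encryption sketch with PyNaCl (pip install pynacl).
# Illustrative only -- not Snugg's production code.
from nacl.public import PrivateKey, Box

# Each user generates a keypair on their own device; private keys never leave it.
alice_secret = PrivateKey.generate()
bob_secret = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_secret, bob_secret.public_key)
ciphertext = sending_box.encrypt(b"meet at the marina at nine")

# The server only ever sees and relays `ciphertext`. Without a private key,
# it's opaque bytes -- there is nothing to hand over, leak, or mine.
receiving_box = Box(bob_secret, alice_secret.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at the marina at nine'
```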

We're not trying to beat Discord or Meta or any of the giants.

We're building an alternative for people who are tired of surveillance being the price of connection.

If that's you, we're building this for us.


The Bigger Picture

Discord's age verification announcement happened the same week Elon Musk quietly dismantled encryption on X DMs.

Same week Meta announced they're testing new premium subscriptions across Instagram, Facebook and WhatsApp—paying for features that used to be free.

Same week Reddit confirmed they're selling user data to AI training companies.

This isn't coincidence.

This is the direction.

Platforms are simultaneously:

  • Reducing privacy

  • Increasing surveillance

  • Charging more for degraded experiences

  • Normalising data extraction as the cost of access


And they're doing it while we're all scattered across different platforms, unable to coordinate resistance, trapped by network effects.

That's not a sustainable future.

That's not a future I want.


What Happens Next

Discord's global rollout starts in March 2026.

Most users will get automatically "verified" as adults by the behavioural AI. They won't even notice the change.

Some users will get flagged and asked to submit facial scans or IDs. Most will comply. Because what's the alternative?

A few will refuse and lose access to features. Some will leave.

Discord will call it a success. Other platforms will see that compliance is high. They'll roll out similar systems.

Within two years, biometric verification will be standard across major platforms.

And we'll have normalised giving tech companies our faces and IDs.

Unless we don't.

Unless enough people say: this isn't acceptable.

Unless enough people choose platforms that don't require surveillance.

Unless we build alternatives that actually respect users.


Join Us

I'm looking for 1,000 founding members for Snugg.

1,000 people who believe connection shouldn't require giving up your face, your ID, and your privacy.

If you're tired of platforms treating you as a surveillance target, join the waitlist.

If you want to help build something better, I want to hear from you.

Join the waitlist: snugg.social
Email me directly: hello@snugg.social

This is why I'm building Snugg. Not to compete with Discord or Meta.

But to prove there's another way.

A way that respects you.



Sources & Further Reading

  • Discord Age Verification Announcement (February 2026)

  • 5CA Data Breach (September-October 2025)

  • UK Online Safety Act

  • Discord User Statistics

  • Privacy Analysis


About the Author

I'm a yacht surveyor based in the Caribbean and the founder of Snugg. After 15 years watching social media platforms prioritise profits over privacy, I decided to build the alternative. I previously ran a successful sailing holiday business before algorithm changes destroyed organic reach. I'm not a privacy activist—just someone who thinks you shouldn't need to scan your face to chat with friends. When I'm not building Snugg or surveying yachts, I'm wondering when we started accepting surveillance as normal.

Connect: Twitter/X | LinkedIn | Email

About Snugg: I'm building the social media platform I wish existed. No ads. No tracking. No algorithms. No surveillance. Just you, your friends, and actual control over your digital life.

If this resonated with you, please share it. The more people who understand what's happening, the harder it is to normalise.
