Why does end-to-end encryption matter? What does it matter, really, whether a few people who work for our digital service providers can theoretically look at our DMs, documents, and photos? Why is it worth all this energy to build sophisticated secure channels that encrypt data all the way from one user’s device (the first “end”) to the device of the person they’re talking to (the other “end”), with no decryption possible anywhere in between?

I left a nice career as a communications scholar to expand access to E2EE tools in our daily lives via the hardest contest in the world, building a startup. So put another way, why do I care so much?

Because: we either live in a world where everything we do and say is recorded, stored, and used to manage our experiences in perpetuity, or we don’t. 

That’s it. Which version of the future do you want to live in? I know my answer. 

What the last 20 years of social media have taught us—what I learned, studying them—is that promises made in policy will be broken. Only cryptography will even try to keep its word.

The system will be hacked. The data will be leaked. The data will be monetized, the platform you trusted will be acquired, the terms of service will be changed. Most pressingly, these days, the GenAI will be implemented for instant insights you didn’t ask for—and all of your years of content will be sitting there, in plaintext, ready to be read. 

But the cryptography—mathematics—will stand up for you and the assurances you were given. Policy is a promise, but cryptography is an oath. The data does not exist. The service operators cannot access it. There is nothing to renege on. What’s yours is yours. Forever.

They want us to focus on the extreme use cases—the whistleblowers, the activists, the sexters. Authoritarian forces have been working hard to convince us that we should be ashamed to want to close our digital doors and windows. That if we want to have a private conversation with our loved ones, we must have something to hide. 

It’s true: more and more of our lives are being criminalized every day. A teen telling her friend her period is late, a parent talking to his child about their immigration status or gender identity, a college student sharing a meme: all are now grounds for law enforcement action.

But in some ways, these are the exceptions that prove the rule. Why else do four billion people choose iMessage and WhatsApp every day?

Because we feel safe there. Despite what the ideologues tell us, most of the time, every day, in mundane ways, privacy is safety. 

It’s not intellectual. We’re not planning anything. We just want to talk to our friends without being manipulated. In end-to-end encrypted channels, we can feel that we’re not being listened to. How? Because the products don’t act like they’re listening to us. They don’t show us ads referencing things we just said. They don’t try to trap us in content other than the conversations we signed on for. They let us relax, stay in flow, do what we came to do. End-to-end encrypted spaces are rooms where the doors close, the shades come down, and we can be ourselves, together.

Lately, living in San Francisco, I’ve been feeling like we’re crossing over into the sci-fi timeline. Generative AI is colliding with total access to our social data and autonomous machines in a conflagration that calls forth the Terminator, Black Mirror, and Minority Report. The recent TV series Murderbot is basically a show about the Humane pin inside Alexander Skarsgård’s beautiful body, plus arm guns. In the show, Murderbot is a weapon who (that?) knows everything about the mission and is listening to, biomeasuring, and analyzing his human companions at all times.

In Minority Report, people are constantly being identified by their eyeballs at omnipresent retinal scanners. The superficial violation is incessant personalized advertisements, but the deeper threat is real-time location and apprehension by hostile state security forces. That slope is feeling real slippery lately, isn't it?

In real life Minnesota over the last few weeks, [redacted]

And in China over the last decade, [redacted]

Stanford philosopher Lowry Pressly argues that we’re not looking for privacy, but oblivion. Privacy is the protection of our personal data. But oblivion is the freedom in which that data does not exist. “Privacy is valuable not because it empowers us to exercise control over our information, but because it protects against the creation of such information in the first place.”

His publisher describes the argument further: "Privacy deepens our relationships with others as well as ourselves, reinforcing our capacities for agency, trust, play, self-discovery, and growth. Without privacy, the world would grow shallow, lonely, and inhospitable."

Humans are resilient. Some of us are already surviving the panopticon. Will we continue to let it spread, unchallenged?  

Over the past couple of years building Germ Network, I’ve talked to a lot of people about digital privacy. As a company, we’ve found that many of our earliest adopters and advocates are developers. But after hundreds of conversations, I don’t think developers intrinsically care more about privacy than other people: I think they just understand that when we’re online, we’re being watched.

Developers understand that unless they are end-to-end encrypted, things called “direct messages” or “private messages”… are not. Our Slacks, Snaps, Tinder DMs… are sitting pretty in spreadsheets on someone else’s computer.
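
For the developers in the room, the core contract fits in a few lines. What follows is only an illustrative sketch using the PyNaCl library (my choice for the example, not a description of any particular product); real messengers layer much more on top, like identity verification and ratcheting keys. The point is simply that the service in the middle only ever touches ciphertext.

```python
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; private keys never leave the device.
alice_device = PrivateKey.generate()
bob_device = PrivateKey.generate()

# Alice encrypts on her phone, using her private key and Bob's public key.
sending_box = Box(alice_device, bob_device.public_key)
ciphertext = sending_box.encrypt(b"can you talk tonight?")

# `ciphertext` is all the provider ever stores or relays: random-looking bytes.
# There is no plaintext spreadsheet to leak, subpoena, or feed to a model.

# Only Bob's device, holding his private key, can turn it back into words.
receiving_box = Box(bob_device, alice_device.public_key)
assert receiving_box.decrypt(ciphertext) == b"can you talk tonight?"
```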

End-to-end encryption is the singular tool that lets people have a conversation over the internet with the same privacy they have sitting in a room together. If we are allowed to sit in rooms together IRL, we should be allowed to do so online. You might say that the management of our online gathering is increasingly constricting how we assemble in the real world. I might agree. 

We meet online. We hang out online. We make friends, find lovers, build communities online. Online is a place, connected to the other places we move through. If we don’t have freedom of speech, thought, movement, or assembly on the internet, we don’t have it in the real world. Our digital and material behaviors are intertwined, with what we do on the internet already feeding back into manipulations of our material lives. Go here, try this, watch this, buy that. Every second, platforms poke what Roger McNamee called the data voodoo doll, manipulating us to move.

It’s true that privacy can be abused. The privacy of physical spaces is abused just as it is online. But that doesn’t mean we banish windows and doors. It means we create systems of reporting, accountability, and most importantly, community, so that no one is alone, uncared for, without anyone they trust to talk to.

And where might they have that trusted conversation? 

In an end-to-end encrypted message.