Privacy might survive the AI revolution, but mass surveillance could still win. That’s the warning from Proton CEO Andy Yen, who told the Semafor World Economy summit that his company’s encryption tools can protect user data from Big Tech’s AI models – but there’s one threat even end-to-end encryption can’t stop. As AI systems demand ever-more data and governments push for backdoor access in the name of child safety, Yen’s making the case for local AI processing while sounding the alarm on the surveillance infrastructure being built around it.
Proton built its reputation on keeping Big Tech out of your inbox. Now CEO Andy Yen is trying to do the same thing for AI – and he’s losing sleep over whether it’ll work.
Speaking at the Semafor World Economy summit, Yen laid out a vision for privacy in the AI era that depends on keeping your data on your device instead of shipping it to OpenAI, Google, or Microsoft data centers. It’s a direct challenge to the cloud-first AI model that’s made those companies billions, and it puts Proton in an increasingly awkward position as governments worldwide demand access to encrypted communications.
“Privacy in the AI era is possible,” Yen told the summit, according to coverage from ZDNet. But that confidence comes with a major asterisk. The real threat isn’t whether Meta can train its models on your messages or Amazon can analyze your shopping habits. It’s whether governments can force companies like Proton to build backdoors into the encryption that makes privacy possible in the first place.
The tension is playing out right now in debates over child safety online. Legislators in the US and Europe are pushing laws that would require platforms to scan encrypted messages for child abuse material – a worthy goal, pursued through a method that privacy advocates say would fundamentally break encryption. You can’t have a system that’s secure against hackers but transparent to law enforcement. The math doesn’t work that way.
Yen’s solution to the AI privacy problem is local processing. Instead of sending your emails, documents, and search queries to cloud servers where they can train the next generation of large language models, Proton wants AI features to run on your phone or laptop. Apple is making similar moves with its upcoming Apple Intelligence features, keeping sensitive data processing on-device whenever possible.
It’s technically feasible now in ways it wasn’t two years ago. Smaller, more efficient AI models can run on consumer hardware without draining your battery in an hour. Nvidia and other chip makers are racing to pack more AI processing power into mobile devices. The technology exists to keep your data local while still delivering AI features that feel magical.
But local AI only solves half the problem. Proton can encrypt your data in transit and at rest. It can process your queries locally instead of sending them to the cloud. What it can’t do is stop governments from demanding access to that data through legal channels – or building mass surveillance infrastructure that captures communications before encryption ever kicks in.
That’s what keeps Yen up at night, and it’s the vulnerability he discussed at Semafor. Even perfect encryption doesn’t help if authorities can compel you to hand over the keys, or if they’re capturing metadata about who you talk to and when. The surveillance state doesn’t need to read your messages if it can map your entire social network and track your movements.
The privacy community has been warning about this for years, but AI makes it dramatically worse. Surveillance systems that once required human analysts can now operate at machine scale, processing billions of data points to identify patterns and flag individuals. Tesla CEO Elon Musk and others have warned about AI-powered authoritarianism – the combination of surveillance infrastructure and artificial intelligence that could make mass control frighteningly efficient.
Proton has positioned itself as the anti-Big Tech alternative, offering encrypted email, cloud storage, and VPN services to users who don’t trust Google or Microsoft with their data. The company says it has over 100 million users – a fraction of Gmail’s billions, but a meaningful audience that’s willing to pay for privacy.
The question now is whether that model can survive the AI era. If privacy requires local processing, Proton needs to build competitive AI features that work on-device. If governments push harder for encryption backdoors, the company needs to hold the line legally even as pressure mounts. And if mass surveillance becomes the norm, Proton needs to convince users that its tools still matter.
Yen’s clearly wrestling with these tradeoffs. Protecting children is a legitimate concern. So is preventing terrorism and serious crime. But the history of surveillance suggests that once you build the infrastructure for good reasons, it gets used for everything. China’s social credit system started with noble goals. So did many Western surveillance programs.
The AI privacy debate is happening right now in corporate boardrooms, legislative chambers, and standards bodies around the world. Meta is training AI on public posts. Google is integrating AI into search and email. Microsoft is embedding it in Windows. The defaults being set today will determine how much privacy exists tomorrow.
Proton’s betting that enough people care about those defaults to seek out alternatives. Yen’s betting that local AI can deliver the features users want without the surveillance they don’t. And he’s warning that even if Proton gets all of that right, mass surveillance could still win if governments decide privacy is a luxury they can’t afford.
It’s a sobering message from someone who’s spent years building privacy tools. The technology works. The encryption is solid. The threat is political, not technical – and that’s much harder to solve.
Yen’s warning cuts through the usual privacy debate. This isn’t about whether Google reads your email or Meta tracks your clicks – those battles are mostly lost. It’s about whether the combination of AI and government surveillance creates a system that’s impossible to escape, even with the best encryption money can buy.

Proton can protect your data from hackers and corporations. It can’t protect you from laws that require backdoors or surveillance systems that operate at the network level. That’s a problem technology alone can’t solve, and it’s why Yen’s message at Semafor matters.

The AI era makes privacy harder and surveillance easier. Local processing helps, but it’s not enough if the political will to protect privacy disappears. Watch what happens with encryption legislation over the next year – that’ll tell you whether Yen’s nightmare scenario becomes reality.