I started Tinfoil with my cofounders in 2024 with the ambition of building a private AI platform. Back then, there were no alternatives to chatbots like ChatGPT and Claude that would keep your conversations private from the provider. The only options were running mediocre models locally or spending tens of thousands of dollars on hardware. Apple had only just pioneered the idea of using secure hardware for cloud-based AI inference[1], and NVIDIA's confidential computing mode on their GPUs was still a new concept.
In just over a year, things have changed a lot. People no longer ask us "but why?" or say "I have nothing to hide" or "it's no worse than using Google!" It recently occurred to me that at some point in the last three to six months, the skepticism I used to hear whenever I talked about Tinfoil diminished dramatically. People are starting to believe in the work we're doing and to see that we might be on to something.
First, I think there has been growing awareness of just how much more personal a chat history is than a Google search history, not to mention AI agents running freely on our personal devices. We no longer use AI as a search engine but as a personal assistant, a confidant, and a coworker.
Second, perhaps as a by-product of the first, there is a growing awareness that with great power often comes great abuse of said power. This has led to a harsh realization of just how terrifyingly bad life can get for humans if AI falls into the wrong hands.
On Friday, the Pentagon told Anthropic that it wanted to use Claude to analyze data collected on American citizens[2], data "that could include information such as the questions you ask your favorite chatbot, your Google search history, your GPS-tracked movements, and your credit-card transactions, all of which could be cross-referenced with other details about your life."
This was perhaps the first time a true warning siren went off, and a larger-than-usual circle of people paused to consider the consequences of this panopticon. Props to Dario and Anthropic[3] for sticking to their principles and sounding the alarm. My hope is that this ushers in a new era of understanding: privacy on the internet was always important, and it is now becoming CRITICAL if we want a future that isn't hostile to the majority of humanity.
It's nice to see more people join in building for a future where AI is a tool that helps humans be creative rather than a tool used to subjugate and control them. One year ago our vision was seen as fringe and Tinfoil was viewed as just a research demo. Today we get thank-you emails from journalists, lawyers, and appreciative users. More people are now building tools to improve AI privacy: Moxie Marlinspike, cofounder of Signal, is building Confer; DuckDuckGo introduced duck.ai with a zero-data-retention promise; researchers are designing unlinkable AI routing at openanonymity.ai; and Phala Network is building distributed confidential computing. To me, it's reassuring to see more interest, research, and startups in this space. If enough people care, we might yet avoid the perils of a total surveillance state.
At Tinfoil, my cofounders and I will continue to push the frontier of AI privacy and work tirelessly on advancing our mission. Right now is a rare window of opportunity to play a part in building tools for a better future. AI privacy is just one part of this. Constitutional principles, verification of AI workloads, guardrails, abuse prevention, open-source models, checks and balances on power, and many AI safety initiatives will all be important in preventing a future where humans are surveilled and controlled by ever more powerful Agents of the state.
[1] Apple announced Private Cloud Compute in June 2024 for verifiable private AI inference in Apple Intelligence. security.apple.com
[2] Ross Andersen, "Inside Anthropic's Killer-Robot Dispute With the Pentagon," The Atlantic, March 1, 2026. theatlantic.com
[3] Dario Amodei's statement on the situation with the Department of War. anthropic.com