With Privatemode, you can finally use generative AI without having to worry about data privacy and security. Powered by Confidential Computing, not even we can access your data. AI without privacy tradeoffs—finally! 🚀🔒
🚀 Hey Product Hunt! I'm the founder and CEO of Edgeless Systems. Today, we’re excited to launch Privatemode, the first GenAI service without privacy tradeoffs.
Why Privatemode?
AI should be private by design. But most AI tools process your data unencrypted in the cloud, store it externally, or even use it for training. If you handle sensitive information, this creates a serious security risk.
That’s why we built Privatemode—to give you AI without tradeoffs. Privatemode is an AI assistant and inference API that keeps your data encrypted at all times—even during processing. No one can access your data, including us at Edgeless Systems and the cloud provider we run on. Privatemode achieves this with confidential computing, a powerful technology available in recent AMD and Intel server CPUs and the Nvidia H100.
Key features
🔒E2E encryption: With the help of confidential computing, your data (i.e., prompts) is protected at all times and remains encrypted even in memory during processing.
🔍E2E attestation: The integrity and authenticity of the Privatemode backend is automatically verified by your client software before sending any data. This mechanism is also made possible by confidential computing.
🛡️Advanced zero-trust architecture: Combining the above, the Privatemode service is architected to prevent any external party from accessing your data, including even Edgeless Systems.
📦Available as app or via API: The app is available for Windows and Mac. The app is required because it verifies the backend and encrypts your data client-side before anything leaves your device. As with Signal, there's no browser-based version, for security reasons.
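For developers, the post describes an inference API alongside the app. As a rough sketch only: if the API follows the common OpenAI-compatible chat-completions shape behind a local proxy that handles attestation and encryption (the endpoint URL, port, and model name below are assumptions, not confirmed here), a call could look like this:

```python
# Hypothetical sketch of calling an OpenAI-compatible inference API through
# a local proxy. The URL, port, and payload shape are illustrative assumptions;
# consult the official Privatemode docs for the real interface.
import json
import urllib.request

PROXY_URL = "http://localhost:8080/v1/chat/completions"  # assumed local proxy address

def build_request(prompt: str, model: str = "latest") -> dict:
    """Build an OpenAI-style chat payload. In the described architecture, the
    local client would verify the backend's attestation and encrypt this
    payload before it ever leaves the machine."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def ask(prompt: str) -> str:
    """Send the prompt to the local proxy and return the model's reply."""
    payload = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        PROXY_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask("Summarize this contract clause."))
```

The key design point is that attestation and encryption live in the client/proxy, so application code stays a plain HTTP call.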
Under the hood
Currently, Privatemode serves Llama 3.3 70B, and we're working on supporting additional models like DeepSeek R1. We built Privatemode with our Contrast framework for confidential containers and run it on AMD SEV-SNP and Nvidia H100s in an EU data center. The source code is available for reference, along with extensive docs on the architecture.
🎁 Try it now – free for 14 days, no credit card required.
Really proud of what we’ve built with Privatemode. Our goal was to make AI more private by default—so everyone can use LLMs without handing over sensitive data. Excited to see how people use it!
@mightymo Incredible work from your team Moritz. Congrats! 🚀
Using LLMs easily without exposing any data to cloud/model/service providers? This is a real game-changer for AI!
@thomasstrottner Thanks for the support and your efforts in making this possible! 🚀
Privatemode AI is now live! Built with confidential computing, it's the only AI service that keeps data end-to-end encrypted. The goal is to make privacy the default, so businesses and individuals can work with AI securely. Looking forward to seeing people try it!
@lml662 Couldn't have done it without you! 🚀 Your efforts were key to making this launch a success. Excited for what’s ahead!
Love the idea of Confidential Computing protecting prompts even during processing. That’s a huge plus over traditional AI models that store or analyze unencrypted data.
Good Job :)
I’ve always been fascinated by the potential of Generative AI to drive efficiency and innovation—but data security is often overlooked. Privatemode ensures you can leverage AI’s power while keeping all inputs encrypted and secure. Excited to see this launch! 🚀
Really great stuff! 🚀🔥
This opens up so many opportunities for building new AI products for sensitive data without the need to self-host. I love it!
And super happy to be joining the team right now :)
@marko_rosenmueller We’re thrilled to have you on board—looking forward to what we’ll build together!