Every time you paste logs, configs, or code into an AI assistant, you risk leaking API keys, passwords, internal hostnames, and customer data.
Privatiser strips it all out locally before you hit send, then lets you reverse the anonymisation on the AI's response.
Works as a browser extension, VS Code extension, and web tool. Nothing leaves your machine.
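The replace-then-restore flow above can be sketched in a few lines. This is a minimal illustration of reversible anonymisation, not Privatiser's actual internals: the pattern set, placeholder format, and function names are all assumptions for the sake of the example.

```python
import re

# Two illustrative detection rules; a real tool would ship many more.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "IPV4": re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b"),
}

def anonymise(text):
    """Swap detected values for stable placeholders.

    The mapping stays on the local machine, so the AI only ever
    sees placeholders like <EMAIL_1>.
    """
    mapping = {}
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            placeholder = mapping.get(match)
            if placeholder is None:
                placeholder = f"<{label}_{len(mapping) + 1}>"
                mapping[match] = placeholder
            text = text.replace(match, placeholder)
    return text, mapping

def deanonymise(text, mapping):
    """Reverse the substitution on the AI's response."""
    for original, placeholder in mapping.items():
        text = text.replace(placeholder, original)
    return text

safe, mapping = anonymise("host 10.0.0.5 reported by ops@example.com")
restored = deanonymise(safe, mapping)
```

Because the mapping never leaves the machine, restoring the original values in the model's reply is a pure local string substitution.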

Privatiser: Anonymise sensitive data before it hits any AI
privatiser left a comment
Hey PH! Built this after watching a colleague paste a production .env file into an AI assistant. The data was gone before anyone noticed. Privatiser detects 30+ pattern types across secrets, PII, network identifiers, AWS/GCP/Azure resources, and more. All processing is local. Would love feedback on what patterns you'd want added, or what's getting flagged that shouldn't be.
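To make the "30+ pattern types" concrete, here is the kind of rule a local scanner can apply. The two rules below are assumptions about the style of detection, not Privatiser's actual rule set; the AWS key format (`AKIA` plus 16 uppercase alphanumerics) is a widely documented example of an easily matched secret.

```python
import re

RULES = {
    # AWS access key IDs: "AKIA" followed by 16 uppercase alphanumerics.
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    # key=value password assignments, as seen in .env files.
    "env_password": re.compile(r"(?im)^\s*\w*password\w*\s*=\s*\S+"),
}

def scan(text):
    """Return the names of every rule that matches the text."""
    return [name for name, rule in RULES.items() if rule.search(text)]

findings = scan("DB_PASSWORD=hunter2\nAWS key: AKIAIOSFODNN7EXAMPLE")
```

Running all rules locally before send means nothing sensitive has to reach a remote service just to be classified as sensitive.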

