VoidLLM

The LLM proxy that never sees your prompts

VoidLLM is a self-hosted LLM proxy and AI gateway that routes requests across OpenAI, Anthropic, Azure, Ollama, and vLLM, with load balancing, failover, API key management, usage tracking, and rate limiting. What makes it different is its zero-knowledge architecture: VoidLLM never stores or logs any prompt or response content. That is not a toggle; it is how the proxy is designed, which makes it GDPR-compliant from day one. Built in Go for sub-2ms overhead, it ships as a single binary with a full web UI, and the free tier supports unlimited users.
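To make the load-balancing-with-failover idea concrete, here is a minimal Go sketch of round-robin routing that skips unhealthy upstreams. All names (`upstream`, `balancer`, `pick`) are hypothetical illustrations, not VoidLLM's actual types or configuration:

```go
package main

import "fmt"

// upstream represents one configured LLM backend.
// (Hypothetical type; not VoidLLM's real config schema.)
type upstream struct {
	name    string
	healthy bool
}

// balancer cycles through upstreams round-robin, skipping
// unhealthy ones so requests fail over automatically.
type balancer struct {
	upstreams []upstream
	next      int
}

// pick returns the next healthy upstream's name, or an error
// if every upstream is currently down.
func (b *balancer) pick() (string, error) {
	for i := 0; i < len(b.upstreams); i++ {
		u := b.upstreams[b.next%len(b.upstreams)]
		b.next++
		if u.healthy {
			return u.name, nil
		}
	}
	return "", fmt.Errorf("no healthy upstream available")
}

func main() {
	b := &balancer{upstreams: []upstream{
		{"openai", true},
		{"anthropic", false}, // simulated outage: traffic routes past it
		{"ollama", true},
	}}
	for i := 0; i < 4; i++ {
		name, err := b.pick()
		if err != nil {
			fmt.Println("error:", err)
			continue
		}
		fmt.Println(name) // alternates openai, ollama while anthropic is down
	}
}
```

A real gateway would layer health checks, per-key rate limits, and retry policies on top of this selection loop; the sketch only shows the routing core.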

VoidLLM makers

Here are the founders, developers, designers, and product people who worked on VoidLLM.