VoidLLM is a self-hosted LLM proxy and AI gateway that routes requests across OpenAI, Anthropic, Azure, Ollama, and vLLM, with load balancing, failover, API key management, usage tracking, and rate limiting.
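To illustrate the gateway model, here is a minimal Go client sketch. It assumes VoidLLM exposes an OpenAI-compatible `/v1/chat/completions` endpoint on a local port and accepts a gateway-issued key in the `Authorization` header; the port, endpoint path, key format, and model name below are all hypothetical, not confirmed details of VoidLLM's API:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Hypothetical: VoidLLM listening locally with an OpenAI-compatible API.
	// The gateway chooses the upstream provider (OpenAI, Anthropic, Azure,
	// Ollama, or vLLM) according to its routing and load-balancing rules.
	body, _ := json.Marshal(map[string]any{
		"model": "gpt-4o", // routed to whichever provider serves this model
		"messages": []map[string]string{
			{"role": "user", "content": "Hello"},
		},
	})

	req, _ := http.NewRequest("POST",
		"http://localhost:8080/v1/chat/completions", bytes.NewReader(body))
	req.Header.Set("Content-Type", "application/json")
	// Hypothetical gateway-issued key; the gateway maps it to real
	// provider credentials, so clients never hold upstream API keys.
	req.Header.Set("Authorization", "Bearer vk-example-key")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```

Because the gateway speaks a single API, swapping or failing over between providers requires no client-side changes; only the gateway configuration decides where a request lands.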
What makes it different: zero-knowledge architecture. VoidLLM never stores or logs any prompt or response content. Not a toggle; by design. GDPR-compliant from day one.
Built in Go, adding under 2 ms of overhead per request. Single binary, full web UI included. Free tier with unlimited users.