VoidLLM is a self-hosted LLM proxy and AI gateway that routes requests across OpenAI, Anthropic, Azure, Ollama, and vLLM — with load balancing, failover, API key management, usage tracking, and rate limiting.
What makes it different: zero-knowledge architecture. VoidLLM never stores or logs any prompt or response content. Not a toggle — by design. GDPR-compliant from day one.
Built in Go for sub-2ms overhead. Single binary, full web UI included. Free tier with unlimited users.
VoidLLM: The LLM proxy that never sees your prompts
Christian Romeni left a comment
Hey PH! I built VoidLLM because every LLM proxy I tried either logged my prompts for "observability" or added 10ms+ overhead with Python. VoidLLM is a Go-based LLM gateway that sits between your apps and any LLM provider. It handles auth, rate limiting, cost tracking, and load balancing across multiple deployments — but it never touches your prompt data. Zero-knowledge by architecture, not by...
