VoidLLM
The LLM proxy that never sees your prompts
Christian Romeni

27d ago

VoidLLM - The LLM proxy that never sees your prompts

VoidLLM is a self-hosted LLM proxy and AI gateway that routes requests across OpenAI, Anthropic, Azure, Ollama, and vLLM, with load balancing, failover, API key management, usage tracking, and rate limiting. What makes it different is its zero-knowledge architecture: VoidLLM never stores or logs any prompt or response content. That's not a toggle; it's by design, and GDPR-compliant from day one. Built in Go for sub-2ms overhead. Single binary, full web UI included. Free tier with unlimited users.
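To give a feel for the load-balancing and failover side, here is a minimal Go sketch of round-robin backend selection that skips unhealthy providers. This is an illustrative sketch, not VoidLLM's actual code; the `Backend` and `Balancer` types and their fields are hypothetical names invented for this example.

```go
package main

import (
	"errors"
	"fmt"
	"sync/atomic"
)

// Backend represents one upstream LLM provider endpoint.
// (Hypothetical type for illustration, not VoidLLM's API.)
type Backend struct {
	Name    string
	Healthy atomic.Bool
}

// Balancer cycles through backends round-robin, skipping
// any backend currently marked unhealthy (failover).
type Balancer struct {
	backends []*Backend
	next     atomic.Uint64
}

func NewBalancer(names ...string) *Balancer {
	b := &Balancer{}
	for _, n := range names {
		be := &Backend{Name: n}
		be.Healthy.Store(true)
		b.backends = append(b.backends, be)
	}
	return b
}

// Pick returns the next healthy backend in round-robin order,
// advancing past unhealthy ones; it errors only when every
// backend is down.
func (b *Balancer) Pick() (*Backend, error) {
	for range b.backends {
		i := b.next.Add(1) - 1
		be := b.backends[i%uint64(len(b.backends))]
		if be.Healthy.Load() {
			return be, nil
		}
	}
	return nil, errors.New("no healthy backends")
}

func main() {
	lb := NewBalancer("openai", "anthropic", "ollama")
	lb.backends[1].Healthy.Store(false) // simulate an anthropic outage
	for i := 0; i < 4; i++ {
		be, _ := lb.Pick()
		fmt.Println(be.Name) // requests flow only to healthy backends
	}
}
```

Note that selection state is a single atomic counter, so the hot path needs no mutex; that kind of lock-free design is one plausible way a Go proxy keeps per-request overhead in the low milliseconds.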