Christian Romeni

Building VoidLLM


VoidLLM - The LLM proxy that never sees your prompts

VoidLLM is a self-hosted LLM proxy and AI gateway that routes requests across OpenAI, Anthropic, Azure, Ollama, and vLLM, with load balancing, failover, API key management, usage tracking, and rate limiting.

What makes it different is its zero-knowledge architecture: VoidLLM never stores or logs any prompt or response content. That's not a toggle, it's how the proxy is built, which makes it GDPR-compliant from day one. It's written in Go for sub-2ms overhead and ships as a single binary with a full web UI included. The free tier supports unlimited users.
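To give a feel for the failover behavior described above, here is a minimal Go sketch of routing a request across a prioritized list of upstream providers. All names (`Provider`, `Complete`, `routeWithFailover`) are illustrative assumptions, not VoidLLM's actual API:

```go
package main

import (
	"errors"
	"fmt"
)

// Provider is a hypothetical upstream LLM backend.
type Provider struct {
	Name    string
	Healthy bool
}

// Complete simulates forwarding a prompt; it fails when the provider is down.
func (p Provider) Complete(prompt string) (string, error) {
	if !p.Healthy {
		return "", errors.New(p.Name + ": unavailable")
	}
	return "[" + p.Name + "] response", nil
}

// routeWithFailover tries each provider in priority order and returns the
// first successful response -- the basic failover pattern a gateway like
// this relies on. If every provider fails, the last error is returned.
func routeWithFailover(providers []Provider, prompt string) (string, error) {
	var lastErr error
	for _, p := range providers {
		out, err := p.Complete(prompt)
		if err == nil {
			return out, nil
		}
		lastErr = err
	}
	return "", lastErr
}

func main() {
	providers := []Provider{
		{Name: "openai", Healthy: false},   // primary is down
		{Name: "anthropic", Healthy: true}, // fallback succeeds
	}
	out, err := routeWithFailover(providers, "hello")
	fmt.Println(out, err)
}
```

In a real gateway the health check would come from live probes or recent error rates rather than a static flag, and the prompt body would pass straight through without being written to disk or logs.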