After months of obsessing over speed, scale, and reliability, we're launching Bifrost, a blazing-fast, open-source LLM gateway built for real-world AI workloads.
We built Bifrost because we kept hitting scaling issues with existing gateways. So we went deep. Pure Go. Microsecond overhead. 1000+ models. MCPs. Governance. Self-hosted. Open source. It's the kind of infra product we wish existed when we were scaling our own stack. If you're building with LLMs and care about performance at scale, this one's for you.
We're live on Product Hunt now! Check it out here: https://www.producthunt.com/prod...
Hi folks! I'm Pratham, an engineer, former founder, and full-time infra nerd.
I've been building since college, when I founded a startup called Interact that scaled to more than 7,000 active users in just 3 months. Since then, I've stayed obsessed with backend systems that are fast, stable, and scale effortlessly. Lately, I've been working on Bifrost, an open-source LLM gateway written in Go.
If you're building LLM apps and running into bottlenecks with tools like LiteLLM, Bifrost might be worth a look. It's fully self-hosted, adds only ~11 µs of mean overhead at 5K RPS, and supports providers like OpenAI, Anthropic, Mistral, Groq, Bedrock, and more. It also comes with built-in monitoring, real-time configuration, governance, MCP support, and a clean web UI to manage it all.
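To give a feel for the drop-in experience, here's a minimal sketch of sending a request through a locally running Bifrost instance. The port (8080) and the OpenAI-compatible /v1/chat/completions path are assumptions for illustration, not the confirmed API; check the Bifrost docs for the endpoint your deployment actually exposes.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

func main() {
	// Build a request body in the familiar OpenAI chat format; the gateway
	// then routes it to whichever provider you've configured
	// (OpenAI, Anthropic, Mistral, Groq, Bedrock, ...).
	body, err := json.Marshal(map[string]any{
		"model": "gpt-4o-mini", // example model name, swap in your own
		"messages": []map[string]string{
			{"role": "user", "content": "Say hello from Bifrost!"},
		},
	})
	if err != nil {
		panic(err)
	}

	// Assumed local gateway address and endpoint path -- adjust to match
	// your Bifrost configuration.
	resp, err := http.Post(
		"http://localhost:8080/v1/chat/completions",
		"application/json",
		bytes.NewReader(body),
	)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Print the raw JSON response from the gateway.
	out, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```

Because the request shape stays the same across providers, switching models is a config change at the gateway rather than a code change in every service.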