Kent Phung

COO - LLM Gateway

A high-performance reverse proxy that intelligently distributes requests across multiple LLM providers (OpenAI, Google Gemini, Anthropic Claude) and API keys. It provides seamless OpenAI API compatibility, advanced load balancing algorithms, and cost optimization.


Replies

Kent Phung
Maker
I built this in Go because I wanted strong performance and the ability to handle heavy loads across different environments. It integrates easily with various types of storage, and scaling is as simple as adding new nodes — no complicated setup required. I also designed a smart balancing mechanism that manages load by key and priority, making it flexible for real-world workloads. Everything runs inside Docker, so deployment is as easy as configuring a single file with your environment and security variables. Ultimately, I wanted something lightweight, fast, and robust enough to serve as an AI/LLM infrastructure backbone — and that’s how this project came to life.
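The comment doesn't show the actual implementation, but the "balancing by key and priority" idea can be sketched in Go roughly like this: keys are grouped into priority tiers, the gateway routes each request to the least-loaded key in the best tier, and falls through to lower tiers only when a tier is saturated. All names here (`apiKey`, `balancer`, `maxLoad`) are hypothetical, not taken from the project.

```go
package main

import (
	"fmt"
	"sort"
	"sync"
)

// apiKey is one provider key with a priority tier
// (lower number = preferred) and a live in-flight counter.
type apiKey struct {
	id       string
	priority int
	inFlight int
}

// balancer picks the least-loaded key from the highest-priority
// tier, falling through to lower tiers only when a tier is full.
type balancer struct {
	mu      sync.Mutex
	keys    []*apiKey
	maxLoad int // per-key in-flight cap before falling through
}

func (b *balancer) pick() (*apiKey, error) {
	b.mu.Lock()
	defer b.mu.Unlock()

	// Order candidates by (priority, current load) so the
	// best available key comes first.
	sort.Slice(b.keys, func(i, j int) bool {
		if b.keys[i].priority != b.keys[j].priority {
			return b.keys[i].priority < b.keys[j].priority
		}
		return b.keys[i].inFlight < b.keys[j].inFlight
	})
	for _, k := range b.keys {
		if k.inFlight < b.maxLoad {
			k.inFlight++
			return k, nil
		}
	}
	return nil, fmt.Errorf("all keys saturated")
}

func (b *balancer) release(k *apiKey) {
	b.mu.Lock()
	k.inFlight--
	b.mu.Unlock()
}

func main() {
	b := &balancer{
		keys: []*apiKey{
			{id: "openai-1", priority: 0},
			{id: "openai-2", priority: 0},
			{id: "gemini-1", priority: 1},
		},
		maxLoad: 2,
	}
	for i := 0; i < 5; i++ {
		k, err := b.pick()
		if err != nil {
			fmt.Println("overload:", err)
			continue
		}
		fmt.Println("routing to", k.id)
	}
}
```

With two priority-0 keys capped at two in-flight requests each, the fifth request spills over to the priority-1 key — which is the fallback behavior the comment describes, realized here as a simple sort-and-scan under a mutex.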