LLM Gateway

Use any AI model with just one API

5.0 • 1 review • 315 followers

Route, manage, and analyze your LLM requests across multiple providers with a unified API interface.

Ismail Ghallou

Hey all! We're excited to finally launch llmgateway.io on Product Hunt 🚀

A fully open-source AI gateway for your LLM needs: route, manage, and analyze your LLM requests across multiple providers through a unified API interface.

Since launch, LLM Gateway has continued to grow, mostly through subreddits, X, and LinkedIn. Now it's time to bring it to Product Hunt.

Luca and I (Ismail) are excited to launch LLM Gateway with the Product Hunt community with an exceptional offer: 50% off the Pro plan, forever! Use promo code "PRODUCTHUNT".

Here is a list of our top features:

◆ Simple usage overview dashboard (total requests, tokens used, cost estimate, organization credits, etc.)

◆ BYOK (bring your own keys), use credits, or go hybrid (we support 11+ providers and 60+ models)

◆ Configurable caching to suit your preferences

◆ Activity stats: track each model's usage, with charts showing cost estimates per provider, request volume, and more

◆ Advanced activity: a detailed view of each request, including prompt, model, provider, cost, timing, and response metrics

◆ Self-hostable: check out the Docs
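To make the unified-interface idea concrete, here is a minimal Python sketch of building the same chat payload for models served by different providers. The endpoint URL and model identifiers are assumptions for illustration only; check the actual docs for the real values.

```python
import json

# Assumed gateway endpoint -- verify against the real documentation.
GATEWAY_URL = "https://api.llmgateway.io/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """One payload shape for every provider; the gateway routes by model id."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Switching providers is just a different model string (names are
# illustrative); the client code itself does not change.
req_a = build_chat_request("gpt-4o", "Summarize this changelog.")
req_b = build_chat_request("claude-3-5-sonnet", "Summarize this changelog.")

assert req_a["messages"] == req_b["messages"]
print(json.dumps(req_a))
```

In practice you would POST this payload to the gateway with your gateway API key in the `Authorization` header, exactly as you would with a single-provider API.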

We've put a lot of time, effort, and care into building LLM Gateway, and we honestly hope you like it! Let us know if you have any feedback and we'll work on it. Thank you for your support!

JaredL

Great to see an open-source solution like LLM Gateway! 🚀 The BYOK feature is intriguing. How does it manage security when users bring their own keys?

Luca Steeb

@jaredl They are stored in plaintext at this time, but we may add an encryption wrapper around them to make it harder to leak the secret. This applies to both cloud and self-hosted.
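For readers wondering what such an encryption wrapper might look like, here is a minimal at-rest-encryption sketch using Fernet from the third-party `cryptography` package. This is purely illustrative, not how LLM Gateway stores keys today, and the master-key handling (KMS, environment variable) is an assumption.

```python
# Illustrative only: wrapping stored provider keys with symmetric
# encryption at rest. Not LLM Gateway's implementation -- the maintainers
# note that keys are stored in plaintext today.
from cryptography.fernet import Fernet

# In production this would be loaded from a KMS or environment variable,
# never generated per-process.
master_key = Fernet.generate_key()
cipher = Fernet(master_key)

def seal(provider_key: str) -> bytes:
    """Encrypt a user's provider key before writing it to the database."""
    return cipher.encrypt(provider_key.encode())

def unseal(token: bytes) -> str:
    """Decrypt just-in-time when forwarding a request upstream."""
    return cipher.decrypt(token).decode()

stored = seal("sk-user-provided-key")
assert unseal(stored) == "sk-user-provided-key"
```

Even a simple wrapper like this means a database dump alone is not enough to recover user keys; an attacker would also need the master key.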

Joey Judd

I saw it is open source. Does that mean we can combine it with our own product, either charging users for LLM usage through our product or letting them bring their own keys to use it for free? And in both cases, they can manage their token usage in the Gateway dashboard, right?

Luca Steeb

@joey_zhu1 Not sure I understand, to be honest; right now you're free to do whatever you want. We might restrict reselling llmgateway directly by charging users credits, since that's not the intention, but using the gateway as an end user will always stay free and open source.

JefferyHus

Good luck with your project

Ismail Ghallou

@jefferyhus Thanks, mate! Can't wait for what's coming!

Harun Đulić

This looks great! It's something I wanted to exist to make integrating LLMs in my product easier.
Congrats on the launch!

Also, wanted to let you know that your Docs link in the main menu and in the footer points to http://localhost:3005/

Ismail Ghallou

@harund The issue has been fixed! Thanks a lot for the support and feedback.

CaiCai

The backend monitoring page looks very cool, congratulations on the release!

Ismail Ghallou

@hi_caicai Thanks!

Germán Merlo

Yeahhh Ismail! That's the way. The sheer number of LLMs is driving me crazy, so I'm glad to see you helping with this. Wish you all the best!

Luca Steeb

@german_merlo1 Thanks!
