MakeHub.ai

LLM Provider arbitrage to get the best performance for the $

5.0 (1 review) · 79 followers

OpenAI-compatible endpoint. A single API that routes each request to the cheapest and fastest provider for each model. Works with closed and open LLMs. Real-time benchmarks (price, latency, load) run in the background. Usable directly now in Roo and Cline forks.

Romain Batlle
The main difference from OR is that we do real-time arbitrage among the many providers we reference, so you always get the absolute best value for your $ at the exact moment of inference, which we find very cool.
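To make the idea concrete, here is a toy sketch of per-request provider selection — not MakeHub's actual algorithm, just an illustration of picking the provider with the best current price/latency trade-off from background benchmark data (names, fields, and weights are all made up):

```python
# Toy illustration of real-time provider arbitrage: given live benchmark
# numbers per provider, score each one and route the request to the best.
# This is NOT MakeHub's real scoring; weights and units are arbitrary.

def pick_provider(benchmarks, price_weight=0.7, latency_weight=0.3):
    """benchmarks: list of dicts with 'name', 'price_per_1m_tokens', 'latency_ms'.

    Returns the provider dict with the lowest weighted cost score.
    """
    def score(b):
        # Lower is better on both axes, so a simple weighted sum works here.
        return (price_weight * b["price_per_1m_tokens"]
                + latency_weight * b["latency_ms"])
    return min(benchmarks, key=score)

# Hypothetical live benchmark snapshot for one model.
providers = [
    {"name": "provider_a", "price_per_1m_tokens": 5.0, "latency_ms": 400},
    {"name": "provider_b", "price_per_1m_tokens": 3.0, "latency_ms": 900},
]

best = pick_provider(providers)
```

With these weights, the pricier but much faster provider_a wins; change the weights and the routing flips, which is the point of re-scoring at the moment of inference.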
Evgenii Zaitsev

Really exciting! The real-time arbitrage feature sounds like a game-changer for optimizing cost and performance. How does it handle model compatibility across different providers, especially with closed LLMs?

Romain Batlle

@evgenii_zaitsev1 It handles those cases very well. We spent a lot of time standardizing everything from tool calls to prompt caching, so overall it works as if you had a single API key. Closed LLMs were particularly difficult: most of them have their own frameworks and claim to expose OpenAI-compatible endpoints, but not all of their features actually work that way, so we had to build proxies to bridge the gaps. Hope that answers your question!

Erliza. P

💸⚙️ MakeHub.ai launches today! Smart LLM provider arbitrage = max performance for every dollar spent. AI devs, it’s optimization season 📊🤖

Liam Xavier

Hi MakeHub team,

Big congratulations on your product launch — it looks truly impressive and caught our attention! 🚀

I’m Liam from Scrapeless, and we’d love to explore a potential collaboration with you.

We offer a robust Deep SERP API, providing high-quality access to both Google Search and Google Trends data — fast, reliable, and tailored for AI-native products and analytics workflows.

We’d love to offer you free access to our API in exchange for a mention or shoutout on your Twitter or LinkedIn, and we're also happy to cover the promotion costs to help boost your visibility.

If this sounds interesting, I’d love to chat more — feel free to suggest a time or just reply here!

Rachit Magon

@romain_batlle Awesome work! Just curious, how can we use Crew AI with LangChain?

Romain Batlle

@rachitmagon Trivially! We ship an OpenAI-compatible endpoint, meaning you simply point base_url at MakeHub and you are good to go with both CrewAI and LangChain:
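A minimal standard-library sketch of what that swap amounts to — building a chat completion request against a custom base URL. The base URL and model name below are placeholders, not confirmed values; check the quick-start doc for the real ones:

```python
import json
import urllib.request

# Placeholder values for illustration only; the real base URL, model names,
# and API key come from the MakeHub docs and your account.
BASE_URL = "https://api.makehub.ai/v1"
API_KEY = "YOUR_MAKEHUB_API_KEY"

def build_chat_request(model: str, user_message: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-compatible chat completion request."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode("utf-8")
    return urllib.request.Request(
        url=f"{BASE_URL}/chat/completions",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o", "Hello!")
# With a real key, sending it is one urllib.request.urlopen(req) away.
```

In practice you would use the official `openai` SDK or LangChain's `ChatOpenAI` and pass the same `base_url` (plus your key) at construction; nothing else in your code changes.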

More details in the doc here: https://www.makehub.ai/docs/basic-usage/quick-start

Rachit Magon

@romain_batlle Awesome, that makes it super easy. Thanks for clarifying