Romain Batlle

MakeHub.ai
Co-Founder, and sometimes I like Finance

Forums

Romain Batlle

7mo ago

MakeHub.ai - LLM Provider arbitrage to get the best performance for the $

OpenAI-compatible endpoint. A single API that routes each request to the cheapest and fastest provider for the requested model. Works with both closed and open LLMs. Real-time benchmarks (price, latency, load) run in the background. Usable directly now on Roo and Cline forks.
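
Since the endpoint is OpenAI-compatible, a call could look like the sketch below, using the standard OpenAI Python SDK. The base URL, model id, and API key variable are placeholders for illustration, not confirmed values.

    import os
    from openai import OpenAI

    # Point the standard OpenAI client at the MakeHub endpoint.
    # base_url and the env var name are assumptions, not documented values.
    client = OpenAI(
        base_url="https://api.makehub.ai/v1",   # hypothetical endpoint
        api_key=os.environ["MAKEHUB_API_KEY"],  # hypothetical key variable
    )

    # The request is the same chat.completions call you would send to OpenAI;
    # the router picks the cheapest/fastest provider for the model behind the scenes.
    response = client.chat.completions.create(
        model="llama-3.1-70b-instruct",  # placeholder model id
        messages=[{"role": "user", "content": "Explain provider arbitrage in one line."}],
    )
    print(response.choices[0].message.content)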