Inferbase - AI model and infrastructure decision platform
A unified platform for comparing AI models across pricing, benchmarks, and capabilities. Filter hundreds of models, compare up to 4 side-by-side, estimate token costs, and calculate GPU requirements for self-hosting. Data sourced from provider APIs, benchmark leaderboards, and official docs. Free during beta.
Maker
We built Inferbase because we got tired of juggling 15 browser tabs every time we needed to evaluate AI models.
Pricing is scattered across provider sites with no standard format. Benchmarks live on separate leaderboards. Estimating GPU requirements means back-of-napkin math. We wanted one place where all of that comes together.
What it does:
- Browse and filter hundreds of AI models by provider, cost, and capability
- Compare up to 4 models side-by-side on benchmarks, pricing, context length
- Estimate token costs across providers
- Calculate GPU requirements for self-hosting (VRAM, throughput, latency)
- Get model recommendations based on your use case
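To show the kind of napkin math the GPU calculator replaces, here's a rough sketch of a weights-only VRAM estimate. The function name, the 20% overhead factor, and the precision default are my illustrative assumptions, not Inferbase's actual methodology (which is published on the site):

```python
def estimate_vram_gb(params_b: float, bytes_per_param: float = 2.0,
                     overhead: float = 1.2) -> float:
    """Weights-only VRAM estimate in GB.

    params_b: parameter count in billions.
    bytes_per_param: 2.0 for fp16/bf16, 1.0 for int8, 0.5 for 4-bit.
    overhead: assumed ~20% headroom for KV cache and activations
    (varies a lot with batch size and context length).
    """
    return params_b * bytes_per_param * overhead

# A 70B model in fp16 (2 bytes/param) with 20% headroom:
print(round(estimate_vram_gb(70), 1))  # prints 168.0
```

The real calculator also accounts for throughput and latency targets, which this one-liner ignores.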
All data is sourced from provider APIs, benchmark leaderboards, and official docs. Methodology is public.
Free during beta. Would love feedback on what's missing or what would make this more useful for your workflow.