liteLLM

One library to standardize all LLM APIs

Rating: 5.0 (20 reviews) · 176 followers

Simplify using the OpenAI, Azure, Cohere, Anthropic, Replicate, and Google LLM APIs.

TL;DR:
- Call all LLM APIs using the chatGPT format: completion(model, messages)
- Consistent outputs and exceptions across all LLM APIs
- Logging and error tracking for all models
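The unified call format the TLDR describes can be sketched as follows. The model names, the environment-variable check, and the build_request helper are illustrative assumptions, not part of the listing; the point is that one request shape serves every provider:

```python
# Hedged sketch: one OpenAI/chatGPT-style request shape for every
# provider behind liteLLM's completion() interface. Model names and
# the OPENAI_API_KEY check are assumptions for illustration.
import os

def build_request(model, user_prompt):
    """Assemble the provider-agnostic request liteLLM expects."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }

# The same structure works whether the backend is OpenAI, Anthropic,
# Cohere, Replicate, etc. -- only the model string changes.
openai_req = build_request("gpt-3.5-turbo", "Say hello")
claude_req = build_request("claude-2", "Say hello")

# Only attempt a real call when credentials are configured.
if os.environ.get("OPENAI_API_KEY"):
    try:
        from litellm import completion  # pip install litellm
        resp = completion(**openai_req)
        print(resp["choices"][0]["message"]["content"])
    except ImportError:
        pass  # litellm not installed in this environment
```

Because both requests share one format, switching providers is a one-string change rather than a rewrite against a different SDK.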

liteLLM Reviews

The community submitted 20 reviews to tell us what they like about liteLLM, what liteLLM can do better, and more.

5.0 · Based on 20 reviews
Reviewers describe liteLLM as a practical LLM proxy and abstraction layer for teams juggling multiple model providers. One user highlights caching, load balancing across services like Groq, OpenRouter, and Ollama, and broad compatibility through an OpenAI-style API that pairs easily with Langfuse for monitoring. Founder feedback from the makers of Budibase, JDoodle.ai, and Crossnode echoes this: they value faster integration, easier model switching, fallbacks, and reduced vendor lock-in.
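The load-balancing and fallback setup reviewers mention could look roughly like the sketch below, using liteLLM's Router. The deployment alias, model strings, api_base, and the GROQ_API_KEY guard are placeholder assumptions, not a verified configuration from the listing:

```python
# Hedged sketch: balancing one logical model across two providers
# (Groq and a local Ollama server) with liteLLM's Router. All names
# and endpoints here are illustrative assumptions.
import os

# Two interchangeable deployments registered under one alias, so
# calls to "my-llm" can be balanced or fall back between them.
model_list = [
    {
        "model_name": "my-llm",
        "litellm_params": {"model": "groq/llama3-8b-8192"},
    },
    {
        "model_name": "my-llm",
        "litellm_params": {
            "model": "ollama/llama3",
            "api_base": "http://localhost:11434",
        },
    },
]

# Only attempt a live call when credentials are configured.
if os.environ.get("GROQ_API_KEY"):
    try:
        from litellm import Router  # pip install litellm
        router = Router(model_list=model_list)
        resp = router.completion(
            model="my-llm",
            messages=[{"role": "user", "content": "ping"}],
        )
        print(resp["choices"][0]["message"]["content"])
    except ImportError:
        pass  # litellm not installed in this environment
```

Registering several deployments under one model_name is what makes provider switching and fallbacks transparent to calling code.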