
liteLLM
One library to standardize all LLM APIs
5.0•16 reviews•129 followers
Simplify using OpenAI, Azure, Cohere, Anthropic, Replicate, and Google LLM APIs. TL;DR: call every LLM API using the ChatGPT format, completion(model, messages), with consistent outputs and exceptions across all providers, plus logging and error tracking for every model.
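In practice, the unified call looks roughly like the sketch below. It assumes the litellm package is installed and provider API keys are supplied via environment variables; the model names are illustrative, not an endorsement of specific versions.

```python
import os
from litellm import completion

# Provider credentials are read from environment variables;
# the values here are placeholders, not real keys.
os.environ["OPENAI_API_KEY"] = "sk-..."
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."

messages = [{"role": "user", "content": "Summarize LiteLLM in one sentence."}]

# The same completion(model, messages) call is used for every provider;
# only the model string changes.
openai_response = completion(model="gpt-4o-mini", messages=messages)
anthropic_response = completion(model="claude-3-haiku-20240307", messages=messages)

# Responses follow the OpenAI chat-completion shape regardless of provider.
print(openai_response.choices[0].message.content)
print(anthropic_response.choices[0].message.content)
```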
liteLLM Reviews
The community submitted 16 reviews to tell us what they like about liteLLM, what liteLLM can do better, and more.
5.0
Based on 16 reviews
liteLLM earns strong praise for simplifying multi-provider setups and keeping code consistent while swapping models. Makers of PingPrompt, CamelAI, and Strix highlight its role as a versatile API hub and LLM gateway, noting easy switching and support for custom models. Other makers commend its unified interfaces and routing. Users echo that it works well as an OpenAI-compatible proxy, enabling caching, load balancing across providers (including local models via Ollama), and smooth integration with observability tools like Langfuse for monitoring and performance analysis.
Summarized with AI
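The OpenAI-compatible proxy workflow mentioned in the reviews looks roughly like this sketch. It assumes a LiteLLM proxy is already running locally; the base URL, API key, and model alias are assumptions for illustration and depend on how the proxy is configured.

```python
from openai import OpenAI

# Point the standard OpenAI client at a locally running LiteLLM proxy.
# Base URL, key, and model alias below are placeholders.
client = OpenAI(base_url="http://localhost:4000", api_key="anything")

response = client.chat.completions.create(
    model="ollama/llama3",  # routed by the proxy, e.g. to a local Ollama model
    messages=[{"role": "user", "content": "Hello from behind the proxy"}],
)
print(response.choices[0].message.content)
```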