LiteLLM

Python SDK and proxy to call 100+ LLMs with one interface

LLM Frameworks · Free · Open source

LiteLLM is an open-source Python library and proxy server that provides a unified OpenAI-compatible interface for 100+ LLM providers. Call Anthropic Claude, Google Gemini, Azure OpenAI, Cohere, Groq, and others with the same `completion()` function. The LiteLLM proxy adds load balancing, rate limiting, and cost tracking.

Key specs

16,000 GitHub stars (as of 2026-03-27)
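The proxy's load balancing is driven by a `config.yaml`. A sketch under assumed names (the deployment names, aliases, and environment variables are placeholders): listing two entries under the same `model_name` makes the proxy balance requests across them.

```yaml
# Hypothetical LiteLLM proxy config; deployment names and env vars are placeholders.
model_list:
  - model_name: gpt-4o                    # alias clients request
    litellm_params:
      model: azure/my-gpt4o-deployment    # Azure OpenAI deployment
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
  - model_name: gpt-4o                    # same alias -> load-balanced pool
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY

router_settings:
  routing_strategy: simple-shuffle        # or least-busy, latency-based-routing
```

Run with `litellm --config config.yaml`, then point any OpenAI client at the proxy's base URL.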

Integrations

None listed.

Built on

None listed.