$ ai-evals

LiteLLM

Open-source Python SDK and proxy that translates requests across 100+ LLM providers into the OpenAI format.

Score: 8.0
Tags: LLM gateway · multi-provider routing · open-source
www.litellm.ai

Verdict

The OSS standard for self-hosted LLM gateway deployments. LiteLLM is what teams reach for when "we need to route across providers" turns into "we need to operate it ourselves" — which it usually does once compliance, cost, or scale gets serious. Widely deployed, well-maintained, with a credible commercial team behind it.

What it is

LiteLLM is an open-source Python SDK + proxy server that translates calls to 100+ LLM providers into a unified OpenAI-compatible interface. Run it as a Python library inside your app, or stand up the proxy server in front of all your LLM traffic. Free under MIT-style licensing; BerriAI (the company) sells managed and enterprise tiers on top.
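The "OpenAI-compatible interface" claim is concrete: the request body is the standard OpenAI chat shape, and only the model string changes per provider. A minimal illustration of that shape (plain Python, no LiteLLM install needed; the model identifiers are just examples):

```python
def chat_payload(model: str) -> dict:
    # OpenAI-style chat request body; LiteLLM accepts this same shape
    # regardless of which provider the model string points at
    return {
        "model": model,
        "messages": [{"role": "user", "content": "Say hi"}],
        "temperature": 0.2,
    }

anthropic_req = chat_payload("anthropic/claude-3-opus-20240229")
openai_req = chat_payload("gpt-4o")

# Only the model identifier differs between the two requests
diff = {k for k in anthropic_req if anthropic_req[k] != openai_req[k]}
print(sorted(diff))  # ['model']
```

That single-field difference is the whole portability story: switching providers means editing one string, not rewriting request-handling code.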

Developer experience

As a library:

from litellm import completion

# Any "provider/model" string LiteLLM supports works here
response = completion(
    model="anthropic/claude-3-opus-20240229",
    messages=[{"role": "user", "content": "..."}],
)

As a proxy: deploy the server, point your existing OpenAI SDK at it, and switch model names. These two patterns cover most of what real production deployments need.
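In the proxy pattern, routing is declared in a config file rather than in application code. A minimal sketch of a LiteLLM-style `config.yaml` (the aliases and deployment details here are placeholders, not a drop-in config):

```yaml
model_list:
  - model_name: claude-opus            # alias your clients send as "model"
    litellm_params:
      model: anthropic/claude-3-opus-20240229
      api_key: os.environ/ANTHROPIC_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

Clients then call the proxy with the alias; swapping the underlying provider becomes a config change, not a code change.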

Where it shines

  • Self-hosted control. Run it in your own VPC, behind your own auth, with your own keys. Compliance and security teams love this.
  • Coverage. 100+ providers. Slightly behind OpenRouter on raw catalog size, but more than enough for most production deployments.
  • Virtual keys + budgets. Real multi-team governance — issue scoped keys, set per-user spending caps, track who's burning tokens where.
  • Observability friendly. Native Braintrust, Langfuse, and OpenTelemetry exporters mean you don't have to rebuild instrumentation.
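The virtual-key governance model is easy to picture: each issued key carries a spending cap, and a request that would push a key past its cap gets rejected. A hypothetical sketch of that bookkeeping (illustrative only, not LiteLLM's implementation; names like `KeyLedger` are made up):

```python
class BudgetError(Exception):
    """Raised when a key would exceed its spending cap."""

class KeyLedger:
    def __init__(self) -> None:
        self.caps: dict[str, float] = {}   # key -> max budget (USD)
        self.spend: dict[str, float] = {}  # key -> spend so far (USD)

    def issue_key(self, key: str, max_budget: float) -> None:
        self.caps[key] = max_budget
        self.spend[key] = 0.0

    def record(self, key: str, cost_usd: float) -> None:
        # Reject before recording if this request would blow the cap
        if self.spend[key] + cost_usd > self.caps[key]:
            raise BudgetError(f"{key} over budget")
        self.spend[key] += cost_usd

ledger = KeyLedger()
ledger.issue_key("team-search", max_budget=5.00)
ledger.record("team-search", 4.50)       # fine, under the $5 cap
try:
    ledger.record("team-search", 1.00)   # would exceed the cap
except BudgetError:
    print("request rejected")
```

The real proxy adds persistence (Postgres) and per-user/per-team scoping on top, but the cap-then-track shape is the core of what "virtual keys + budgets" buys you.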

Where it falls short

  • Ops burden. "Self-hostable" is a feature when you have DevOps capacity and a tax when you don't. Redis + Postgres + the proxy itself is real infrastructure.
  • Documentation sprawl. Coverage is broad; quality is uneven across the long tail of providers.
  • Enterprise gating. SSO, advanced guardrails, and some governance features sit behind the paid tier. Worth knowing before assuming "OSS = unlimited."

Bottom line

The right pick if your team can run infrastructure and your priority is control: own the keys, own the data, own the routing logic. If you don't have DevOps capacity, OpenRouter or Portkey will move faster.
