Give Developers Azure, Gemini, Bedrock, OpenAI, and Anthropic Access

An LLM gateway that provides model access, fallbacks, and spend tracking across 100+ LLMs, all in the OpenAI format.
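
Because every model is exposed in the OpenAI format, the standard OpenAI Python SDK can talk to the gateway directly. A minimal sketch, assuming a LiteLLM proxy running at http://localhost:4000 and a placeholder virtual key (both are assumptions, not fixed values):

    from openai import OpenAI

    # Point the standard OpenAI client at the LiteLLM proxy
    # (URL and key below are placeholders for your own deployment).
    client = OpenAI(base_url="http://localhost:4000", api_key="sk-1234")

    # The same request shape works for any model the proxy exposes
    # (Azure, Gemini, Bedrock, OpenAI, Anthropic, ...).
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello from the gateway"}],
    )
    print(response.choices[0].message.content)

Switching providers is then a one-line change to the model name; the client code stays the same.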

[Diagram: users send requests through the 🚅 LiteLLM gateway, which provides Cost Tracking, Batches API, Guardrails, Model Access, Budgets, LLM Observability, Rate Limiting, Prompt Management, S3 Logging, and Pass-Through Endpoints]

  • 0M+ docker pulls

  • 1B+ requests served

  • 80% uptime

  • 425+ contributors

What is LiteLLM?

LiteLLM simplifies model access, spend tracking, and fallbacks across 100+ LLMs. Watch this demo to learn more.

Features

LiteLLM makes it easy for platform teams to give developers LLM access:

  • Spend Tracking

  • Budgets & Rate Limits

  • OpenAI-Compatible

  • LLM Fallbacks (a minimal sketch follows this list)
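
As a rough illustration of fallbacks, here is a minimal sketch using the litellm Python SDK's Router; the model names and API keys are placeholders, and many teams configure the same behavior in the proxy's config file instead:

    from litellm import Router

    # One primary model alias plus a fallback alias (names and keys are placeholders).
    router = Router(
        model_list=[
            {
                "model_name": "gpt-4o",
                "litellm_params": {"model": "openai/gpt-4o", "api_key": "<openai-key>"},
            },
            {
                "model_name": "claude-fallback",
                "litellm_params": {
                    "model": "anthropic/claude-3-5-sonnet-20240620",
                    "api_key": "<anthropic-key>",
                },
            },
        ],
        # If a gpt-4o call fails, retry the request on claude-fallback.
        fallbacks=[{"gpt-4o": ["claude-fallback"]}],
    )

    response = router.completion(
        model="gpt-4o",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)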

Pricing Plans

Open Source

$0

Free

  • 100+ LLM Provider Integrations

  • Langfuse, LangSmith, OTEL Logging

  • Virtual Keys, Budgets, Teams (a key-creation sketch follows this list)

  • Load Balancing, RPM/TPM limits

  • LLM Guardrails
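
As a sketch of how virtual keys, budgets, and RPM limits fit together, the proxy exposes a key-generation route; the URL, master key, and limit values below are placeholder assumptions for illustration:

    import requests

    # Ask a running LiteLLM proxy to mint a virtual key scoped to one model,
    # with a spend budget and a requests-per-minute limit (all values are placeholders).
    resp = requests.post(
        "http://localhost:4000/key/generate",
        headers={"Authorization": "Bearer sk-master-key"},
        json={"models": ["gpt-4o"], "max_budget": 25.0, "rpm_limit": 60},
    )
    print(resp.json()["key"])  # hand this key to a developer or team

Spend against the key is then tracked by the proxy, which can reject requests once the budget or rate limit is exhausted.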

Enterprise

Get In Touch

Cloud or Self-Hosted

For giving LLM access to a large number of developers and projects.

  • Everything in OSS

  • Enterprise Support + Custom SLAs

  • JWT Auth, SSO, Audit Logs

  • All Enterprise Features - Docs

Netflix uses LiteLLM to give developers Day 0 LLM access

David Leen

Staff Software Engineer - Netflix

LiteLLM has let my team provide the latest LLM models to our users usually within a day of them being released. Without LiteLLM this would be hours of work each time a new model is announced. It means we don't have to transform inputs and outputs across providers and has saved us months of work.

Mark Koltnuk

Principal Architect (GenAI Platform) - Lemonade

Our experience with LiteLLM and Langfuse at Lemonade has been outstanding. LiteLLM streamlines the complexities of managing multiple LLM models.

Steve Farthing

Staff Engineer - RocketMoney

The LiteLLM proxy has streamlined our management of LLMs by standardizing logging, the OpenAI API, and authentication for all models, significantly reducing operational complexities. This enables us to quickly adapt to changing demands and swiftly adopt new models.
