Anthropic SDK

Pass-through endpoints for Anthropic - call any provider-specific Anthropic endpoint in its native request/response format (no translation).

Just replace `https://api.anthropic.com` with `LITELLM_PROXY_BASE_URL/anthropic` 🚀

Example Usage

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header "Authorization: bearer sk-anything" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Supports ALL Anthropic Endpoints (including streaming).

See All Anthropic Endpoints
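Streaming works through the pass-through as well. A stdlib-only sketch that sets `"stream": true` and reads the server-sent events off the response line by line (the proxy URL and `sk-anything` key are placeholders):

```python
# Sketch: stream from the proxied /v1/messages endpoint by setting
# "stream": true and parsing the SSE "data:" lines as they arrive.
import json
import urllib.request

URL = "http://0.0.0.0:4000/anthropic/v1/messages"
PAYLOAD = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "stream": True,
    "messages": [{"role": "user", "content": "Hello, world"}],
}

def stream_messages():
    req = urllib.request.Request(
        URL,
        data=json.dumps(PAYLOAD).encode(),
        headers={
            "x-api-key": "sk-anything",  # or a LiteLLM virtual key
            "anthropic-version": "2023-06-01",
            "content-type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        for raw in resp:
            line = raw.decode().strip()
            if line.startswith("data: "):
                event = json.loads(line[len("data: "):])
                # text deltas arrive as content_block_delta events
                if event.get("type") == "content_block_delta":
                    print(event["delta"].get("text", ""), end="", flush=True)

if __name__ == "__main__":
    stream_messages()
```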

Quick Start

Let's call the Anthropic /messages endpoint

  1. Add Anthropic API Key to your environment

export ANTHROPIC_API_KEY=""

  2. Start LiteLLM Proxy

litellm

# RUNNING on http://0.0.0.0:4000

  3. Test it!

curl http://0.0.0.0:4000/anthropic/v1/messages \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Examples

Anything after http://0.0.0.0:4000/anthropic is treated as a provider-specific route, and handled accordingly.

Key Changes:

| Original Endpoint | Replace With |
| --- | --- |
| `https://api.anthropic.com` | `http://0.0.0.0:4000/anthropic` (`LITELLM_PROXY_BASE_URL="http://0.0.0.0:4000"`) |
| `bearer $ANTHROPIC_API_KEY` | `bearer anything` (use `bearer LITELLM_VIRTUAL_KEY` if Virtual Keys are set up on the proxy) |

Example 1: Messages endpoint

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Example 2: Token Counting API

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages/count_tokens \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: token-counting-2024-11-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages/count_tokens \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: token-counting-2024-11-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'
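The same token-counting call as a stdlib-only Python sketch (URL and key are placeholders; the beta header is taken from the curl examples above):

```python
# Sketch: call the proxied token-counting endpoint with the stdlib only.
import json
import urllib.request

URL = "http://0.0.0.0:4000/anthropic/v1/messages/count_tokens"
PAYLOAD = {
    "model": "claude-3-5-sonnet-20241022",
    "messages": [{"role": "user", "content": "Hello, world"}],
}
HEADERS = {
    "x-api-key": "sk-anything",  # or a LiteLLM virtual key
    "anthropic-version": "2023-06-01",
    "anthropic-beta": "token-counting-2024-11-01",
    "content-type": "application/json",
}

def count_tokens():
    req = urllib.request.Request(URL, data=json.dumps(PAYLOAD).encode(), headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. {"input_tokens": ...}

if __name__ == "__main__":
    print(count_tokens())
```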

Example 3: Batch Messages

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages/batches \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: message-batches-2024-09-24" \
--header "content-type: application/json" \
--data \
'{
"requests": [
{
"custom_id": "my-first-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}
},
{
"custom_id": "my-second-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hi again, friend"}
]
}
}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages/batches \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: message-batches-2024-09-24" \
--header "content-type: application/json" \
--data \
'{
"requests": [
{
"custom_id": "my-first-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}
},
{
"custom_id": "my-second-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hi again, friend"}
]
}
}
]
}'
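Batches are asynchronous, so after submitting you typically poll the batch by id (a GET on `/v1/messages/batches/{id}` in the Anthropic API). A stdlib-only sketch of both steps through the proxy; URL and key are placeholders:

```python
# Sketch: submit a message batch through the proxy, then retrieve it by id.
import json
import urllib.request

BASE = "http://0.0.0.0:4000/anthropic/v1/messages/batches"
HEADERS = {
    "x-api-key": "sk-anything",  # or a LiteLLM virtual key
    "anthropic-version": "2023-06-01",
    "anthropic-beta": "message-batches-2024-09-24",
    "content-type": "application/json",
}

def _request(url, body=None):
    # POST when a body is given, otherwise GET
    data = json.dumps(body).encode() if body is not None else None
    req = urllib.request.Request(url, data=data, headers=HEADERS)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def create_batch(requests_):
    return _request(BASE, {"requests": requests_})

def get_batch(batch_id):
    return _request(f"{BASE}/{batch_id}")

if __name__ == "__main__":
    batch = create_batch([
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hello, world"}],
            },
        }
    ])
    print(get_batch(batch["id"])["processing_status"])
```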

Advanced - Use with Virtual Keys

Pre-requisites

Use this to avoid giving developers the raw Anthropic API key while still letting them call Anthropic endpoints.

Usage

  1. Setup environment

export DATABASE_URL=""
export LITELLM_MASTER_KEY=""
export ANTHROPIC_API_KEY=""

litellm

# RUNNING on http://0.0.0.0:4000

  2. Generate virtual key
curl -X POST 'http://0.0.0.0:4000/key/generate' \
-H 'Authorization: Bearer sk-1234' \
-H 'Content-Type: application/json' \
-d '{}'

Expected Response

{
...
"key": "sk-1234ewknldferwedojwojw"
}
  3. Test it!
curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header "Authorization: bearer sk-1234ewknldferwedojwojw" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'
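The same call in Python, sending the generated virtual key in the Authorization header instead of the raw Anthropic key (the key value below is the placeholder from the expected response above):

```python
# Sketch: call the proxy with a LiteLLM virtual key instead of the
# raw Anthropic API key (key and URL are placeholders).
import json
import urllib.request

URL = "http://0.0.0.0:4000/anthropic/v1/messages"
VIRTUAL_KEY = "sk-1234ewknldferwedojwojw"
PAYLOAD = {
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello, world"}],
}

def call_proxy():
    req = urllib.request.Request(
        URL,
        data=json.dumps(PAYLOAD).encode(),
        headers={
            "accept": "application/json",
            "content-type": "application/json",
            "Authorization": f"bearer {VIRTUAL_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(call_proxy())
```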