Claude Code Integration
Configure Claude Code to use LLM Gateway for access to any model through the Anthropic API format
LLM Gateway provides a native Anthropic-compatible endpoint at /v1/messages that allows you to use any model in our catalog while maintaining the familiar Anthropic API format. This is especially useful for Claude Code users who want to access models beyond Claude.
Video Tutorial
Watch this quick video guide on setting up Claude Code with LLM Gateway:
Quick Start
Configure Claude Code to use LLM Gateway with these environment variables:
export ANTHROPIC_BASE_URL=https://api.llmgateway.io
export ANTHROPIC_AUTH_TOKEN=llmgtwy_your_api_key_here
# optional: specify a model, otherwise it uses the default Claude model
export ANTHROPIC_MODEL=gpt-5 # or any model from our catalog

# now run claude!
claude
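These exports only apply to your current shell session. To keep the configuration across sessions, you can append the same lines to your shell profile, for example ~/.bashrc or ~/.zshrc (a generic shell sketch, not an LLM Gateway-specific step):

# persist the configuration for future shells (zsh example)
echo 'export ANTHROPIC_BASE_URL=https://api.llmgateway.io' >> ~/.zshrc
echo 'export ANTHROPIC_AUTH_TOKEN=llmgtwy_your_api_key_here' >> ~/.zshrc
echo 'export ANTHROPIC_MODEL=gpt-5' >> ~/.zshrc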
Why Use LLM Gateway with Claude Code?
The Anthropic endpoint transforms requests from Anthropic's message format to the OpenAI-compatible format used by LLM Gateway, then transforms the responses back to Anthropic's format. This means you can:
- Use any model available in LLM Gateway with Claude Code
- Maintain existing workflows that use Anthropic's API format
- Access models from OpenAI, Google, Cohere, and other providers through the Anthropic interface
- Leverage LLM Gateway's routing, caching, and cost optimization features
Choosing Models
You can use any model from the models page. Popular options for Claude Code include:
Use OpenAI's Latest Models
# Use the latest GPT model
export ANTHROPIC_MODEL=gpt-5

# Use a cost-effective alternative
export ANTHROPIC_MODEL=gpt-5-mini
Use Google's Gemini
export ANTHROPIC_MODEL=google/gemini-2.5-pro
Use Anthropic's Claude Models
export ANTHROPIC_MODEL=anthropic/claude-3-5-sonnet-20241022
Environment Variables
When configuring Claude Code, you can use these environment variables:
ANTHROPIC_MODEL
Specifies the main model to use for primary requests.
export ANTHROPIC_MODEL=gpt-5
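ANTHROPIC_SMALL_FAST_MODEL
Sets the smaller, faster model Claude Code uses for lightweight background tasks. This is a standard Claude Code variable rather than an LLM Gateway-specific one; the complete example below points it at gpt-5-nano.

export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-nano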
Complete Configuration Example
export ANTHROPIC_BASE_URL=https://api.llmgateway.io
export ANTHROPIC_AUTH_TOKEN=llmgtwy_your_api_key_here
export ANTHROPIC_MODEL=gpt-5
export ANTHROPIC_SMALL_FAST_MODEL=gpt-5-nano
Making Manual API Requests
If you want to test the endpoint directly, you can make manual requests:
curl -X POST "https://api.llmgateway.io/v1/messages" \
  -H "Authorization: Bearer $LLM_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5",
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ],
    "max_tokens": 100
  }'
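Anthropic's Messages format signals streaming with "stream": true. Here is a sketch of a streaming request, assuming the endpoint passes Anthropic-style server-sent events through; the -N flag keeps curl from buffering the stream:

curl -N -X POST "https://api.llmgateway.io/v1/messages" \
  -H "Authorization: Bearer $LLM_GATEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5",
    "stream": true,
    "max_tokens": 100,
    "messages": [
      {"role": "user", "content": "Hello, how are you?"}
    ]
  }'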
Response Format
The endpoint returns responses in Anthropic's message format:
{
  "id": "msg_abc123",
  "type": "message",
  "role": "assistant",
  "model": "gpt-5",
  "content": [
    {
      "type": "text",
      "text": "Hello! I'm doing well, thank you for asking. How can I help you today?"
    }
  ],
  "stop_reason": "end_turn",
  "stop_sequence": null,
  "usage": {
    "input_tokens": 13,
    "output_tokens": 20
  }
}
Benefits of Using LLM Gateway
- Multi-Provider Access: Use models from OpenAI, Anthropic, Google, and more through a single API
- Cost Control: Track and limit your AI spending with detailed usage analytics
- Unified Billing: One account for all providers instead of managing multiple API keys
- Caching: Reduce costs with response caching for repeated requests
- Analytics: Monitor usage patterns and costs in the dashboard
- Discounts: See the models page for models with discounted pricing
Get Started
Ready to enhance your Claude Code experience? Sign up for LLM Gateway and get your API key today.