# Documentation

Welcome to the AI Quota Router API. Our infrastructure lets you access multiple premium LLM providers (OpenAI, Anthropic, Google, Meta, etc.) through a single, unified endpoint with predictable, capped pricing.
## Authentication

All API requests must include your API key in the `Authorization` header as a Bearer token:

```
Authorization: Bearer YOUR_API_KEY
```
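Since every request carries the same two headers, it can be convenient to build them once. A minimal sketch in Python (the helper name is illustrative, not part of the API):

```python
def auth_headers(api_key: str) -> dict:
    """Build the headers the API expects on every request."""
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

headers = auth_headers("YOUR_API_KEY")
print(headers["Authorization"])  # Bearer YOUR_API_KEY
```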
## Endpoints

We provide an OpenAI-compatible interface. Use our base URL for your requests:

```
POST https://api.aiquotarouter.com/v1/chat/completions
```
## Integration Examples
### Python

```python
import requests

url = "https://api.aiquotarouter.com/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json",
}
data = {
    "model": "gpt-4o",  # Or claude-3-5-sonnet, llama-3.1-70b, etc.
    "messages": [
        {"role": "user", "content": "Hello, how can I integrate your API?"}
    ],
}

response = requests.post(url, headers=headers, json=data)
print(response.json())
```
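The example above prints the raw JSON body. Assuming the response follows the standard OpenAI chat-completion shape (which our OpenAI-compatible interface implies), the assistant's reply lives at `choices[0].message.content` — a minimal sketch for pulling it out:

```python
def extract_reply(response_json: dict) -> str:
    """Return the assistant's message text from a chat-completion response."""
    return response_json["choices"][0]["message"]["content"]

# Demonstrated with a stubbed response body rather than a live call:
sample = {
    "choices": [
        {"message": {"role": "assistant", "content": "Hello! Happy to help."}}
    ]
}
print(extract_reply(sample))  # Hello! Happy to help.
```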
### Node.js

```javascript
const axios = require('axios');

// Top-level await is not available in CommonJS modules,
// so the request is wrapped in an async function.
async function main() {
  const response = await axios.post('https://api.aiquotarouter.com/v1/chat/completions', {
    model: 'claude-3-5-sonnet',
    messages: [
      { role: 'user', content: 'What are the benefits of a unified API?' }
    ]
  }, {
    headers: {
      'Authorization': 'Bearer YOUR_API_KEY',
      'Content-Type': 'application/json'
    }
  });
  console.log(response.data);
}

main();
```
### cURL

```shell
curl https://api.aiquotarouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gemini-1.5-pro",
    "messages": [{"role": "user", "content": "Say hello!"}]
  }'
```
## Supported Models

You can use any supported model by changing the `model` parameter in your request body. No other code changes are required.
- **OpenAI:** `gpt-4o`, `gpt-4-turbo`, `gpt-3.5-turbo`
- **Anthropic:** `claude-3-5-sonnet`, `claude-3-opus`
- **Google:** `gemini-1.5-pro`, `gemini-1.5-flash`
- **Meta:** `llama-3.1-405b`, `llama-3.1-70b`
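Since switching providers is just a parameter change, a small client-side guard can catch model-name typos before a request is sent. A minimal sketch — the mapping below simply mirrors the list above and is not fetched from the API:

```python
# Client-side copy of the supported-model list (keep in sync with the docs).
SUPPORTED_MODELS = {
    "openai": ["gpt-4o", "gpt-4-turbo", "gpt-3.5-turbo"],
    "anthropic": ["claude-3-5-sonnet", "claude-3-opus"],
    "google": ["gemini-1.5-pro", "gemini-1.5-flash"],
    "meta": ["llama-3.1-405b", "llama-3.1-70b"],
}

def is_supported(model: str) -> bool:
    """Check a model name against the local list before sending a request."""
    return any(model in models for models in SUPPORTED_MODELS.values())

print(is_supported("gpt-4o"))   # True
print(is_supported("gpt-2"))    # False
```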