Documentation

Welcome to the AI Quota Router API. Our infrastructure lets you access multiple premium LLM providers (OpenAI, Anthropic, Google, Meta, and others) through a single, unified endpoint with predictable, capped pricing.

Authentication

All API requests must include your API key in the Authorization header as a Bearer token:

Authorization: Bearer YOUR_API_KEY

Endpoints

We provide an OpenAI-compatible interface. Point your requests at our endpoint:

POST https://api.aiquotarouter.com/v1/chat/completions

Integration Examples

Python

import requests

url = "https://api.aiquotarouter.com/v1/chat/completions"
headers = {
    "Authorization": "Bearer YOUR_API_KEY",
    "Content-Type": "application/json"
}
data = {
    "model": "gpt-4o", # Or claude-3-5-sonnet, llama-3, etc.
    "messages": [
        {"role": "user", "content": "Hello, how can I integrate your API?"}
    ]
}

response = requests.post(url, headers=headers, json=data)
print(response.json())
                    
Node.js

const axios = require('axios');

// Top-level await is not available in CommonJS modules,
// so the request is wrapped in an async function.
(async () => {
    const response = await axios.post('https://api.aiquotarouter.com/v1/chat/completions', {
        model: 'claude-3-5-sonnet',
        messages: [
            { role: 'user', content: 'What are the benefits of a unified API?' }
        ]
    }, {
        headers: {
            'Authorization': 'Bearer YOUR_API_KEY',
            'Content-Type': 'application/json'
        }
    });

    console.log(response.data);
})();
                    
cURL

curl https://api.aiquotarouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gemini-1.5-pro",
    "messages": [{"role": "user", "content": "Say hello!"}]
  }'
                    

Supported Models

You can use any supported model by changing the model parameter in your request body. No other code changes are required.
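As a sketch of what this looks like in practice, the helper below (hypothetical names; not part of any official SDK) isolates request construction so the only thing that varies between providers is the model string:

```python
import requests

API_URL = "https://api.aiquotarouter.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

def build_request(model, prompt):
    """Build the headers and JSON body for a single-turn chat request.

    Only the "model" field changes when switching providers.
    """
    headers = {
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    }
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, payload

def chat(model, prompt):
    """Send the request and return the parsed JSON response."""
    headers, payload = build_request(model, prompt)
    response = requests.post(API_URL, headers=headers, json=payload, timeout=30)
    response.raise_for_status()
    return response.json()

# The same call works for any supported model:
# chat("gpt-4o", "Say hello!")
# chat("claude-3-5-sonnet", "Say hello!")
# chat("gemini-1.5-pro", "Say hello!")
```

Because the endpoint is OpenAI-compatible, this same pattern applies to official OpenAI client libraries configured with our base URL.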