RouterLink exposes three API surfaces. Pick the one that matches the SDK or tool you already use:
| Surface | Base URL | When to use |
|---|---|---|
| OpenAI-compatible | https://router-link.world3.ai/v1 | OpenAI SDK, LangChain, anything that speaks the OpenAI Chat Completions schema |
| Anthropic-compatible | https://router-link.world3.ai/api/anthropic | Anthropic SDK, Claude Code, anything that speaks /v1/messages |
| RouterLink native | https://router-link.world3.ai/api/v1 | Account, billing, key management |
The beta environment's base URL is https://router-link-beta.world3.ai. Same paths, separate credit balance.
## Authentication
All endpoints use bearer-token auth. Get a key from the Console and pass it in the Authorization header:
```
Authorization: Bearer rl-...
```
For the Anthropic-compatible surface, the SDK sends the key in the x-api-key header — both work.
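Both header forms carry the same RouterLink key; only the header name differs. A minimal sketch of what each surface expects (the helper name is ours, for illustration):

```python
def auth_headers(key: str, surface: str = "openai") -> dict:
    """Return the auth header the given compatibility surface expects."""
    if surface == "anthropic":
        # Anthropic SDKs send the key as x-api-key; RouterLink accepts it too.
        return {"x-api-key": key}
    # Every other surface uses standard bearer auth.
    return {"Authorization": f"Bearer {key}"}
```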
## OpenAI-compatible

### Chat Completions

`POST /v1/chat/completions`
Drop-in compatible with OpenAI’s Chat Completions API. Streaming, tool use, vision, JSON mode, and structured outputs all work the same way.
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://router-link.world3.ai/v1",
    api_key="<your-routerlink-key>",
)

stream = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Stream me a haiku."}],
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```
### Cross-provider routing
The same endpoint accepts any model name from any supported provider — RouterLink dispatches to the right upstream:
```jsonc
{ "model": "gpt-4o" }            // OpenAI
{ "model": "claude-sonnet-4-6" } // Anthropic
{ "model": "gemini-2.5-pro" }    // Google
```
For the live model list, call `GET /v1/models` or check the Models page in the Console.
### Other endpoints

- `POST /v1/embeddings` — text embeddings
- `POST /v1/images/generations` — image generation
- `POST /v1/audio/transcriptions` — speech-to-text
- `POST /v1/audio/speech` — text-to-speech
## Anthropic-compatible

RouterLink serves the native `/v1/messages` and `/v1/messages/count_tokens` endpoints under `/api/anthropic`:
```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://router-link.world3.ai/api/anthropic",
    api_key="<your-routerlink-key>",
)

msg = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hi!"}],
)
```
Supported features: streaming, tool use, prompt caching (`cache_control`), extended thinking, vision, and the Files API (`/v1/files`).
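Prompt caching works as on Anthropic's native API: mark a content block with `cache_control`. A sketch of building such a block (the helper function is ours; `"ephemeral"` is the cache type Anthropic's API defines today):

```python
def cacheable_block(text: str) -> dict:
    """A text content block marked for Anthropic prompt caching."""
    return {
        "type": "text",
        "text": text,
        "cache_control": {"type": "ephemeral"},
    }
```

Pass blocks like this in `system` or in a message's content list; the cached prefix is reused across requests.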
## Streaming

Both surfaces support Server-Sent Events. Set `stream: true` (OpenAI) or use the Anthropic streaming client. SSE event names match the upstream provider exactly — your existing parsing code keeps working.
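The SDKs handle SSE framing for you, but if you consume the wire format directly, each event payload arrives on a `data:` line, with OpenAI-style streams terminated by `data: [DONE]`. A minimal line parser, as a sketch:

```python
import json

def parse_sse_data(line: str):
    """Parse one SSE line into its JSON payload.

    Returns None for non-data lines (comments, event: names, keep-alives)
    and for the OpenAI-style [DONE] terminator.
    """
    if not line.startswith("data: "):
        return None
    payload = line[len("data: "):]
    if payload.strip() == "[DONE]":
        return None
    return json.loads(payload)
```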
## Errors

Errors follow the upstream provider’s schema, with a stable RouterLink-specific code in the `error.type` field:
| HTTP | `error.type` | Meaning |
|---|---|---|
| 400 | `routerlink_invalid_request` | Malformed payload or unsupported parameter |
| 401 | `routerlink_unauthorized` | Missing or invalid API key |
| 403 | `routerlink_forbidden` | Key disabled, account suspended, or model not enabled |
| 404 | `routerlink_not_found` | Unknown route or resource |
| 429 | `routerlink_budget_exceeded` | Per-key monthly budget cap hit — raise the cap or add credits |
| 429 | `routerlink_rate_limited` | Per-key request rate limit hit |
| 502 | `routerlink_upstream_error` | Upstream provider returned an error |
| 503 | `routerlink_upstream_unavailable` | Upstream provider unreachable |
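One practical use of the stable codes is deciding what to retry. The classification below is our judgment, not a RouterLink guarantee; note that `routerlink_budget_exceeded` is deliberately excluded, since retrying cannot help until the cap is raised:

```python
# Transient failures worth retrying; everything else indicates a request
# or account problem that a retry won't fix.
RETRYABLE = {
    "routerlink_rate_limited",
    "routerlink_upstream_error",
    "routerlink_upstream_unavailable",
}

def should_retry(error_type: str) -> bool:
    return error_type in RETRYABLE
```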
## Rate limits & budgets
Each API key has two independent guardrails:
- **Rate limit** — requests per minute, configured per key. Returns 429 `routerlink_rate_limited` with a `Retry-After` header.
- **Monthly budget** — hard cap on USD spend per calendar month. Returns 429 `routerlink_budget_exceeded`. Email alerts fire at 70 / 90 / 100% of the cap.
Both are configured in the Console under API Keys → Edit.
## Webhooks
Coming soon — subscribe to billing events, budget alerts, and key lifecycle changes. Track the roadmap for status.
## SDKs and integrations

- OpenAI SDKs — Python, Node, Go, .NET — all work without modification; just override `base_url`.
- Anthropic SDKs — Python, TypeScript — set `base_url` to the `/api/anthropic` path.
- LangChain / LlamaIndex — use the OpenAI provider with our base URL.
- Claude Code — see the dedicated guide.
- AI SDK (Vercel) — use `@ai-sdk/openai-compatible` with our base URL.
## Need help?
- Operational status: status.routerlink.ai
- Issues / feature requests: file via the Console Support menu