RouterLink is a unified LLM gateway. One API key, one endpoint, every major model provider — OpenAI, Anthropic, Google, and more — with built-in billing, credits, per-key budgets, and observability.

Get started

Start here

Send your first request in under 60 seconds.

What you get

OpenAI-compatible API

Drop-in replacement for api.openai.com. Use any OpenAI SDK, point it at RouterLink, and call any provider.

Anthropic-compatible API

Native /v1/messages and /v1/messages/count_tokens. Works with the Anthropic SDK and Claude Code.
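The wire format is the standard Anthropic Messages API. A minimal stdlib sketch of the request shape, assuming a hypothetical base URL (the Anthropic SDK works the same way with only `base_url` changed):

```python
import json
import urllib.request

BASE_URL = "https://api.routerlink.example"  # hypothetical; use your real endpoint

def build_messages_request(prompt: str) -> urllib.request.Request:
    # Standard Anthropic Messages payload; only the host differs.
    body = {
        "model": "claude-sonnet-4",
        "max_tokens": 256,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/v1/messages",
        data=json.dumps(body).encode(),
        headers={
            "content-type": "application/json",
            "x-api-key": "rl-your-key",  # placeholder key
            "anthropic-version": "2023-06-01",
        },
        method="POST",
    )

req = build_messages_request("Hello")
# urllib.request.urlopen(req) would send it; omitted here.
```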

Claude Code support

Use Claude Code with RouterLink credits — no Anthropic subscription required.

Per-key budgets & alerts

Hard-cap monthly spend per API key, with email alerts at 70 / 90 / 100%.

Build with the tools you already use

Claude Code

Two env vars and you’re routing Claude Code through RouterLink.

OpenAI SDK

Override base_url — Python, Node, Go, and .NET all work unchanged.

Anthropic SDK

Native /v1/messages with streaming, tool use, prompt caching, and Files API.

LangChain & AI SDK

Use the OpenAI provider with our base URL and pick any model.

Run it your way

Streaming

Server-Sent Events on both surfaces — your existing parser keeps working.
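The chunks arrive as standard `data:` lines. A small sketch of a parser for the OpenAI-style delta format (the chunk shape shown is the usual Chat Completions streaming delta):

```python
import json

def parse_sse_chunks(raw: str):
    """Yield parsed JSON events from an SSE stream body, stopping at [DONE]."""
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue
        data = line[len("data: "):]
        if data == "[DONE]":
            return
        yield json.loads(data)

stream = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    'data: [DONE]\n'
)
text = "".join(e["choices"][0]["delta"]["content"] for e in parse_sse_chunks(stream))
# text == "Hello"
```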

Cross-provider routing

Same endpoint, different model field — RouterLink dispatches upstream.
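In practice that means one request builder for every provider — only the `model` string varies (model ids below are illustrative):

```python
def chat_payload(model: str, prompt: str) -> dict:
    # Identical request shape; only `model` selects the upstream provider.
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

openai_req = chat_payload("gpt-4o", "Hi")
anthropic_req = chat_payload("claude-sonnet-4", "Hi")
# Both go to the same RouterLink endpoint; RouterLink dispatches upstream.
```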

Errors & rate limits

Stable error codes and per-key rate limits with Retry-After headers.
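A client can honor those headers with a small retry loop — a stdlib sketch, not RouterLink-specific code (the exponential-backoff fallback is an assumption):

```python
import time
import urllib.error
import urllib.request

def retry_delay(headers, attempt: int) -> float:
    # Honor the server's Retry-After when present; otherwise back off exponentially.
    value = headers.get("Retry-After")
    return float(value) if value is not None else float(2 ** attempt)

def request_with_retry(req: urllib.request.Request, max_retries: int = 3):
    """Retry a request on HTTP 429, sleeping per Retry-After between attempts."""
    for attempt in range(max_retries + 1):
        try:
            return urllib.request.urlopen(req)
        except urllib.error.HTTPError as err:
            if err.code != 429 or attempt == max_retries:
                raise
            time.sleep(retry_delay(err.headers, attempt))
```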

Status & support

Live operational status and incident history.