
Documentation Index

Fetch the complete documentation index at: https://docs.routerlink.ai/llms.txt

Use this file to discover all available pages before exploring further.

This documentation is provided for informational purposes only and demonstrates how to configure and use our API with third-party AI chat interfaces. Any third-party software, websites, or services mentioned are not operated, controlled, or endorsed by us.

Introduction to LobeChat

LobeChat is an open-source, multi-modal AI chat platform and framework that serves as a unified interface for various large language models (LLMs). Designed as a personal AI “operating system,” it enables users to access, orchestrate, and interact with different AI models and tools through a single, cohesive interface. Key features include:
  • Multi-provider model support via OpenRouter protocol
  • Plugin ecosystem for extended functionality
  • Customizable conversation management
  • Cross-platform accessibility

Step 1: Access LobeChat

Navigate to https://lobechat.com and authenticate using your preferred sign-in method (e.g., GitHub, Google, or email).
Step 2: Navigate to AI Service Provider Settings

  1. Click your user avatar in the top-left corner of the interface.
  2. Select “Settings” from the dropdown menu.
  3. In the Settings panel, navigate to “AI Service Provider” in the left sidebar.
  4. Scroll down and locate the OpenRouter provider option.
Step 3: Configure RouterLink API Connection

Configure the OpenRouter provider with your RouterLink credentials:
API Configuration Parameters:
  • API Key: obtain one from the RouterLink API.
  • API Proxy URL: https://router-link.world3.ai/api/v1
After entering the credentials, click the “+” button next to “Fetch Models” to add a custom model configuration.
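The same credentials can be checked outside LobeChat. The sketch below builds a request to the standard OpenAI-style `/models` endpoint on the API Proxy URL, assuming the proxy is OpenAI-compatible (the guide later selects API format: OpenAI); `ROUTERLINK_API_KEY` is a placeholder for your own key, and the request is only constructed here, not sent.

```python
# Minimal sketch (assumption: the proxy exposes an OpenAI-compatible REST API).
import urllib.request

BASE_URL = "https://router-link.world3.ai/api/v1"

def build_models_request(api_key: str) -> urllib.request.Request:
    """Build a GET /models request with the Bearer auth header the UI would send."""
    return urllib.request.Request(
        f"{BASE_URL}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

req = build_models_request("ROUTERLINK_API_KEY")
print(req.full_url)  # https://router-link.world3.ai/api/v1/models
```

Sending the request with `urllib.request.urlopen(req)` should return a model list if the key and proxy URL are correct.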

Model Registration

Specify the model you want to use. This example configures Gemini 3 Pro Image Preview.
  • API format: select OpenAI for optimal compatibility.
  • Provider: world3-router-north-america
  • Model ID: enter the fully qualified model ID: world3-router-north-america/google/gemini-3-pro-image-preview
Click “OK” to confirm the model configuration.
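Once the model is registered, requests carry its fully qualified ID. This sketch shows what an OpenAI-format chat-completions body would look like for the model configured above; the prompt text is illustrative, and the endpoint path is the standard OpenAI one assumed by the API format setting.

```python
# Sketch of an OpenAI-format request body for the registered model.
import json

MODEL_ID = "world3-router-north-america/google/gemini-3-pro-image-preview"

payload = {
    "model": MODEL_ID,  # must match the registered model ID exactly
    "messages": [
        {"role": "user", "content": "Draw a red circle on a white background."}
    ],
}

# Would be POSTed to {API Proxy URL}/chat/completions.
body = json.dumps(payload)
print(body)
```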

Enable the Provider

Toggle the OpenRouter master switch in the top-right corner to enable the provider connection.
Step 4: Initiate a Conversation

  1. Click “Chat” in the top-left navigation area to access the conversation interface.
  2. Select “Just Chat” and choose your configured OpenRouter model from the model selector.
To streamline model selection, navigate to Settings and disable unused model providers, leaving only your RouterLink-configured models active.
  3. If you are using the free daily WORLD3 credits, set the max tokens to ensure the service works properly (for example, set it to 4096).
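In OpenAI-format terms, the max-tokens setting above corresponds to the `max_tokens` field of the request body. A minimal sketch, assuming the OpenAI chat-completions format; 4096 is the example value from the step above:

```python
# Sketch: capping max_tokens for the free daily WORLD3 credits.
payload = {
    "model": "world3-router-north-america/google/gemini-3-pro-image-preview",
    "messages": [{"role": "user", "content": "Hello"}],
    "max_tokens": 4096,  # keeps responses within the free-tier limit
}
print(payload["max_tokens"])  # 4096
```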

Troubleshooting

  • Authentication failed: verify your API key is copied correctly, without leading or trailing spaces.
  • Model not responding: confirm the model ID matches the exact format from the RouterLink documentation.
  • Connection timeout: check your network connectivity and ensure the API Proxy URL is correct.
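For the first two issues, a quick local sanity check can catch the common mistakes before retrying. The helper below is a sketch: the provider/vendor/model three-part shape is an assumption drawn from the example ID in this guide, not a documented RouterLink rule.

```python
# Sketch: pre-flight checks for common configuration mistakes.
def key_has_no_stray_spaces(api_key: str) -> bool:
    """Catch the 'Authentication failed' case of a key pasted with whitespace."""
    return api_key == api_key.strip()

def looks_like_full_model_id(model_id: str) -> bool:
    """Check for the provider/vendor/model shape used in this guide (assumed)."""
    parts = model_id.split("/")
    return len(parts) >= 3 and all(parts)

print(looks_like_full_model_id(
    "world3-router-north-america/google/gemini-3-pro-image-preview"))  # True
print(looks_like_full_model_id("gemini-3-pro-image-preview"))          # False
```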