Follow the steps below to configure a custom model provider.

Before you begin, make sure you have:
  • An active Agent 37 account
  • A running OpenClaw instance in Agent 37
  • The base URL for your custom provider
  • An API key if your provider requires authentication
  • A model ID supported by that provider

When to use a custom provider

Use a custom provider when:
  • your preferred vendor is not in the built-in provider list
  • you use an OpenAI-compatible proxy or gateway
  • you route traffic through a private or self-hosted endpoint
  • you want multiple external providers behind one custom base URL

Step 1: Gather the provider details

Before you open the setup flow, make sure you have:
  1. The provider base URL, for example https://your-provider.example.com/v1
  2. The API compatibility type:
    • OpenAI-compatible
    • Anthropic-compatible
    • Unknown, if you want OpenClaw to auto-detect
  3. The API key, if required
  4. The model ID you want to use
  5. An endpoint ID or label so you can recognize this provider later
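Before opening the setup flow, you can sanity-check the details you gathered. The sketch below is a minimal, standalone check; the URL, model ID, and endpoint ID are placeholders from this guide, not values OpenClaw itself validates:

```python
import re
from urllib.parse import urlparse

def check_provider_details(base_url: str, model_id: str, endpoint_id: str) -> list:
    """Return a list of warnings about the gathered provider details."""
    warnings = []
    parsed = urlparse(base_url)
    if parsed.scheme not in ("http", "https"):
        warnings.append("base URL should start with http:// or https://")
    if not parsed.path.rstrip("/").endswith("/v1"):
        warnings.append("base URL usually includes the API path, often /v1")
    if not model_id.strip():
        warnings.append("model ID must not be empty")
    if not re.fullmatch(r"[a-z0-9-]+", endpoint_id):
        warnings.append("endpoint ID should be a short lowercase slug, e.g. myproxy")
    return warnings

# Placeholder values from Step 1 -- substitute your provider's real details.
print(check_provider_details("https://your-provider.example.com/v1", "Minimax-M2.7", "minimax"))  # → []
```

An empty list means the details look plausible; any warning points at the item in the checklist above to revisit.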

Step 2: Open your Agent 37 instance

  1. Go to the Agent 37 dashboard and sign in.
  2. Open an existing instance, or create a new one.
  3. Open Terminal.

Step 3: Start the configuration flow

In Terminal, run:
openclaw configure
When prompted, choose:
  1. Where will the gateway run? → Local
  2. Select Section to Configure → Model
  3. Model / Auth Provider → Custom Provider

Step 4: Enter the custom provider details

OpenClaw will guide you through the custom provider setup. According to OpenClaw’s onboarding flow, Custom Provider is the correct choice for endpoints that are not listed directly, including providers that expose OpenAI-compatible or Anthropic-compatible APIs. You will be asked to complete the following prompts:
  1. API Base URL → your provider’s base API URL, such as https://your-provider.example.com/v1 (for example, https://api.minimax.io/v1)
  2. How do you want to provide this API key now? → Paste API key now
  3. Endpoint compatibility:
    • Choose OpenAI-compatible if your provider exposes an OpenAI-style /v1 API
    • Choose Anthropic-compatible if it uses Anthropic’s Messages API
    • Choose Unknown if you want OpenClaw to auto-detect compatibility
  4. Enter API key → paste the API key you copied from your provider
  5. Model ID → the exact upstream model name, for example Minimax-M2.7
  6. Alias (optional) → a friendly display name for the model
  7. Endpoint ID → a short provider identifier, such as myproxy or minimax
If your provider claims OpenAI compatibility, start with the OpenAI-compatible option. If requests fail, verify the exact API format your provider supports.
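The two compatibility types use different request shapes, which is why this choice matters. The sketch below contrasts the public OpenAI-style chat-completions payload with the Anthropic-style Messages payload; the model name is a placeholder, and real requests would also need authentication headers:

```python
def build_chat_request(api: str, model: str, prompt: str) -> tuple:
    """Return (endpoint path, JSON body) for the given compatibility type."""
    if api == "openai-compatible":
        # OpenAI-style chat completions: messages list, no required max_tokens
        return "/v1/chat/completions", {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        }
    if api == "anthropic-compatible":
        # Anthropic-style Messages API: max_tokens is a required field
        return "/v1/messages", {
            "model": model,
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        }
    raise ValueError(f"unknown compatibility type: {api}")

path, body = build_chat_request("openai-compatible", "Minimax-M2.7", "Hello")
```

If a request built one way is rejected by your provider, rebuilding it the other way is often the quickest compatibility test.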

Step 5: Continue and verify the model

After you finish the prompt flow, run:
openclaw models list
You should see the custom provider model in the output, listed as <endpoint-id>/<model-id> (for example, minimax/Minimax-M2.7). If you selected it as the default, it is marked accordingly. To make it your default model, run:
openclaw models set minimax/Minimax-M2.7
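The model reference combines the endpoint ID and model ID with a slash. A small helper to split such a reference, under the assumption that the first slash separates the two parts (since upstream model IDs may themselves contain slashes):

```python
def split_model_ref(ref: str) -> tuple:
    """Split '<endpoint-id>/<model-id>' into its two parts at the first slash."""
    endpoint_id, _, model_id = ref.partition("/")
    if not endpoint_id or not model_id:
        raise ValueError(f"expected '<endpoint-id>/<model-id>', got: {ref!r}")
    return endpoint_id, model_id

print(split_model_ref("minimax/Minimax-M2.7"))  # → ('minimax', 'Minimax-M2.7')
```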

Optional: Understand the underlying config

OpenClaw stores custom providers in the model configuration. A custom provider entry typically includes fields like:
  • baseUrl
  • apiKey
  • api
  • models
For example:
{
  "models": {
    "mode": "merge",
    "providers": {
      "custom-proxy": {
        "baseUrl": "http://localhost:4000/v1",
        "apiKey": "YOUR_API_KEY",
        "api": "openai-completions",
        "models": [
          {
            "id": "llama-3.1-8b",
            "name": "Llama 3.1 8B"
          }
        ]
      }
    }
  }
}
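If you hand-edit this configuration, a quick structural check can catch typos before OpenClaw reads it. The sketch below validates only the four fields shown above, and treats apiKey as optional (an assumption, since some self-hosted endpoints need no key):

```python
import json

# Assumed minimum: apiKey is left optional for keyless self-hosted endpoints.
REQUIRED_FIELDS = {"baseUrl", "api", "models"}

def validate_custom_providers(config_text: str) -> list:
    """Return error messages for malformed custom provider entries."""
    errors = []
    providers = json.loads(config_text).get("models", {}).get("providers", {})
    for name, entry in providers.items():
        missing = REQUIRED_FIELDS - entry.keys()
        if missing:
            errors.append(f"{name}: missing fields {sorted(missing)}")
        for model in entry.get("models", []):
            if "id" not in model:
                errors.append(f"{name}: model entry without an 'id'")
    return errors

example = """
{"models": {"mode": "merge", "providers": {"custom-proxy": {
  "baseUrl": "http://localhost:4000/v1", "apiKey": "YOUR_API_KEY",
  "api": "openai-completions",
  "models": [{"id": "llama-3.1-8b", "name": "Llama 3.1 8B"}]}}}}
"""
print(validate_custom_providers(example))  # → []
```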
“OpenAI-compatible” does not always mean full compatibility with every OpenClaw feature. Some providers only support basic chat-completions payloads and may not fully support tools, reasoning controls, or advanced response formats.

Troubleshooting tips

  • Confirm that the base URL includes the correct API path, often /v1
  • Make sure the model ID exactly matches the provider catalog
  • Recheck whether the endpoint is OpenAI-compatible or Anthropic-compatible
  • Verify that your API key has active billing or access
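The model-ID check in particular is easy to script. A hedged sketch that compares your configured model ID against a provider catalog and flags case mismatches (the catalog list here is a placeholder; a real list would come from your provider's model listing):

```python
from typing import Optional

def find_model(model_id: str, catalog: list) -> Optional[str]:
    """Return an exact catalog match, or a near-miss that differs only in case."""
    if model_id in catalog:
        return model_id
    by_lower = {m.lower(): m for m in catalog}
    return by_lower.get(model_id.lower())

catalog = ["Minimax-M2.7", "Minimax-M2"]  # placeholder catalog, not a real provider listing
print(find_model("minimax-m2.7", catalog))  # → Minimax-M2.7 (case mismatch found)
```

A None result means the ID is not in the catalog at all; a result that differs from your input means the ID exists but with different capitalization.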

What to do next

OpenAI API

Use the built-in OpenAI flow if you do not need a custom endpoint.

Claude API

Use the Anthropic flow if you want the native Claude provider setup.