Deploy OpenClaw with Docker Compose and MiniMax-M2.1 (via XAI Router)

Posted February 9, 2026 by The XAI Tech Team · 3 min read

OpenClaw + XAI Router

This guide shows how to deploy OpenClaw Gateway with Docker Compose and use MiniMax-M2.1 via XAI Router.

  • Template repo: https://github.com/xaixagent/openclaw
  • Default protocol: OpenAI Chat Completions (/v1/chat/completions)
  • You only need: XAI_API_KEY (XAI Router is natively compatible with OpenAI API paths)

What you get

  1. A long-running openclaw-gateway container (default port 18789)
  2. An OpenAI Chat API compatible endpoint: http://<your-host>:18789/v1/chat/completions
  3. Default upstream model: xairouter/MiniMax-M2.1 (configured in the template)

Prerequisites

  • A Linux machine (local or server)
  • Docker + Docker Compose (docker --version and docker-compose --version should work; see the quick check below)
  • An XAI Router API key (sk-...)
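
Both commands should print a version string; if either fails, install or update Docker first:

docker --version
docker-compose --version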

Step 1: Download the template

git clone https://github.com/xaixagent/openclaw.git
cd openclaw

Key files in the folder:

  • docker-compose.yml: defines openclaw-gateway and openclaw-cli
  • .env.example: environment template
  • configs/openclaw.openai-chat.json: default preset (MiniMax-M2.1 + Chat Completions)
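
After cloning, you can confirm all three are present:

ls -1 docker-compose.yml .env.example configs/openclaw.openai-chat.json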

Step 2: Create and fill .env

cp .env.example .env

Note: .env contains secrets. Do not commit it to Git.

Edit .env and set at least:

  • XAI_API_KEY: your XAI Router key
  • OPENCLAW_GATEWAY_TOKEN: gateway token (used as Authorization: Bearer ...)

Generate a random OPENCLAW_GATEWAY_TOKEN with either of the following commands (32 bytes / 64 hex chars recommended):

openssl rand -hex 32

or:

python - <<'PY'
import secrets
print(secrets.token_hex(32))
PY
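
Either way, you can append the result straight to .env in one step (make sure OPENCLAW_GATEWAY_TOKEN isn't already set elsewhere in the file):

echo "OPENCLAW_GATEWAY_TOKEN=$(openssl rand -hex 32)" >> .env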

Minimal .env example:

XAI_API_KEY="sk-xxxxxxxxxxxxxxxx"
OPENCLAW_GATEWAY_TOKEN="your-random-token"
OPENCLAW_CONFIG_NAME="openclaw.openai-chat.json"
OPENCLAW_GATEWAY_PORT=18789

Tip: in Docker, the service must listen on 0.0.0.0 for port mapping to work, so keep OPENCLAW_GATEWAY_BIND=lan (the default).

Step 3: Start OpenClaw Gateway

docker-compose up -d openclaw-gateway

Check status:

docker-compose ps

Follow logs (recommended on first boot):

docker-compose logs -f openclaw-gateway
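
If you're scripting the deployment, you can wait for the port to accept connections before moving on to Step 4. A minimal sketch (any HTTP response counts as up; press Ctrl-C to abort):

until curl -s -o /dev/null "http://127.0.0.1:${OPENCLAW_GATEWAY_PORT:-18789}/"; do
  sleep 1
done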

Step 4: Verify with curl

Run on the same machine:

set -a
source .env
set +a

curl -sS http://127.0.0.1:${OPENCLAW_GATEWAY_PORT:-18789}/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","messages":[{"role":"user","content":"hello"}]}'

You should get an OpenAI Chat Completions style JSON response.
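
If you have jq installed, the same request can print just the assistant's reply:

curl -sS http://127.0.0.1:${OPENCLAW_GATEWAY_PORT:-18789}/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","messages":[{"role":"user","content":"hello"}]}' \
  | jq -r '.choices[0].message.content'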

Why is the request model openclaw?

openclaw is the Gateway's unified model name.

  • Your app sends: model: "openclaw"
  • The actual upstream model is configured in the preset (default: xairouter/MiniMax-M2.1)

To confirm, open configs/openclaw.openai-chat.json and you will see something like:

{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/MiniMax-M2.1" }
    }
  },
  "models": {
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "api": "openai-completions"
      }
    }
  }
}
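
To point the gateway at a different upstream model, change that primary field and restart. A minimal sketch with jq (xairouter/SOME-OTHER-MODEL is a placeholder, not a real model id; this assumes the gateway re-reads the preset on restart):

jq '.agents.defaults.model.primary = "xairouter/SOME-OTHER-MODEL"' \
  configs/openclaw.openai-chat.json > configs/openclaw.tmp.json
mv configs/openclaw.tmp.json configs/openclaw.openai-chat.json
docker-compose restart openclaw-gateway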

Step 5: Use it like the OpenAI Chat API (optional)

In the OpenAI SDK:

  • set baseURL to your gateway: http://<host>:18789/v1
  • set apiKey to OPENCLAW_GATEWAY_TOKEN

Node.js example (openai SDK; top-level await needs an ESM context, e.g. a .mjs file or "type": "module" in package.json):

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://127.0.0.1:18789/v1",
  apiKey: process.env.OPENCLAW_GATEWAY_TOKEN,
});

const resp = await client.chat.completions.create({
  model: "openclaw",
  messages: [{ role: "user", content: "ping" }],
});

console.log(resp.choices[0]?.message?.content);

Common maintenance commands

Stop and remove containers (data in state/ remains):

docker-compose down

Pull new images and restart:

docker-compose pull
docker-compose up -d openclaw-gateway

To fully reset (this removes session/state data):

docker-compose down
rm -rf state workspace codex
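
If you want to keep a copy of session data, archive state/ before running the rm above:

tar czf openclaw-state-$(date +%Y%m%d).tar.gz state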

More on connecting OpenClaw via XAI Router: /blog/openclaw/