Install OpenClaw on Windows (WSL2 / Docker) and Configure XAI Router (MiniMax-M2.1)

Posted February 10, 2026 by The XAI Tech Team · 5 min read

OpenClaw + XAI Router

This guide shows how to run OpenClaw on Windows and route requests to MiniMax-M2.1 via XAI Router (xairouter).

You have two options:

  1. Non-Docker (recommended: WSL2 + Ubuntu): run the OpenClaw CLI + Gateway inside WSL2 (Linux)
  2. Docker (Docker Desktop + Docker Compose): run the Gateway in a container

Either way:

  • Template repo: https://github.com/xaixagent/openclaw
  • Default protocol: OpenAI Chat Completions (/v1/chat/completions)
  • Default upstream model: xairouter/MiniMax-M2.1
  • You only need: XAI_API_KEY (XAI Router is natively compatible with OpenAI API paths)

What you get

  1. A working OpenClaw Gateway (default port 18789)
  2. An OpenAI Chat API compatible endpoint: http://127.0.0.1:18789/v1/chat/completions
  3. Default upstream model: xairouter/MiniMax-M2.1 (configured, no app changes needed)

Prerequisites

  • Windows 10/11
  • An XAI Router API key (sk-...)

Pick one:

  • Option 1 (recommended): WSL2 + Ubuntu
  • Option 2: Docker Desktop + Docker Compose

OpenClaw recommends WSL2 on Windows for better compatibility (Node/Bun/pnpm tooling, Linux binaries, skills).


Option 1: Windows + WSL2 (Ubuntu, recommended)

The core idea: run OpenClaw as a Linux app inside WSL2.

Step 1: Install WSL2 + Ubuntu

Open PowerShell (Admin recommended):

wsl --install
# Or pick a distro explicitly:
wsl --list --online
wsl --install -d Ubuntu-24.04

Reboot if Windows asks.
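
After the reboot, you can confirm the distro is registered and running under WSL 2 (not WSL 1) with standard wsl.exe flags:

wsl --list --verbose
# Expect Ubuntu-24.04 (or your chosen distro) listed with VERSION 2
wsl --status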

Step 2: Enable systemd (inside WSL2)

In your WSL terminal:

sudo tee /etc/wsl.conf >/dev/null <<'EOF'
[boot]
systemd=true
EOF

Then in PowerShell:

wsl --shutdown

Re-open Ubuntu and verify:

systemctl --user status
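
If the user manager is not available in your session yet, a system-scope check also confirms systemd is active (both are standard systemd commands):

systemctl is-system-running   # "running" or "degraded" both mean systemd booted
systemd --version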

Step 3: Install OpenClaw (inside WSL2)

Use the official installer (handles Node detection/installation):

curl -fsSL https://openclaw.ai/install.sh | bash

Verify:

openclaw --version
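
If openclaw is not found, the installer usually wires it into your PATH via your shell profile (an assumption about the openclaw.ai installer); reloading the shell is the quick fix:

exec $SHELL -l        # restart the login shell so PATH changes take effect
command -v openclaw   # prints the binary's location if it is on PATH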

Step 4: Configure XAI Router (MiniMax-M2.1)

  1. Set environment variables (these last only for the current shell session; add them to ~/.bashrc if you want them to persist):

export XAI_API_KEY="sk-..."
export OPENCLAW_GATEWAY_TOKEN="$(openssl rand -hex 32)"

  2. Create ~/.openclaw/openclaw.json:

mkdir -p ~/.openclaw

cat > ~/.openclaw/openclaw.json <<'EOF'
{
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-completions",
        "models": [
          { "id": "MiniMax-M2.1", "name": "MiniMax" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/MiniMax-M2.1" }
    }
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "${OPENCLAW_GATEWAY_TOKEN}"
    },
    "http": {
      "endpoints": {
        "chatCompletions": { "enabled": true },
        "responses": { "enabled": false }
      }
    }
  }
}
EOF

Note: this uses the OpenAI Chat Completions compatible path, so the provider api is openai-completions and you verify via /v1/chat/completions.
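
Before starting the Gateway, you can sanity-check the key directly against the upstream. This assumes XAI Router exposes the standard OpenAI /v1/models listing at the same base URL, which its OpenAI-compatible paths suggest but this guide does not verify:

curl -sS https://api.xairouter.com/v1/models \
  -H "Authorization: Bearer $XAI_API_KEY"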

Step 5: Start the Gateway

In WSL2:

openclaw gateway --bind loopback --port 18789 --force
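
This runs the Gateway in the foreground. If you want it to survive closing the terminal, one generic approach is to background it and watch the log (plain shell, nothing OpenClaw-specific):

nohup openclaw gateway --bind loopback --port 18789 --force > ~/openclaw-gateway.log 2>&1 &
tail -f ~/openclaw-gateway.log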

Step 6: Verify with curl (inside WSL2)

curl -sS http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","messages":[{"role":"user","content":"ping"}]}'

You should get an OpenAI Chat Completions style JSON response.
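
To pull just the assistant text out of that JSON, you can pipe the same request through jq (install it first with sudo apt-get install -y jq if needed):

curl -sS http://127.0.0.1:18789/v1/chat/completions \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","messages":[{"role":"user","content":"ping"}]}' \
  | jq -r '.choices[0].message.content'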


Option 2: Windows + Docker (Docker Desktop + Docker Compose)

The core idea: run the Gateway in a container. You only need to fill .env.

Step 1: Download the template

In PowerShell:

git clone https://github.com/xaixagent/openclaw.git
cd openclaw

Key files:

  • docker-compose.yml: defines openclaw-gateway and openclaw-cli
  • .env.example: environment template
  • configs/openclaw.openai-chat.json: default preset (MiniMax-M2.1 + Chat Completions)

Step 2: Create and fill .env

Copy-Item .env.example .env
notepad .env

Set at least:

  • XAI_API_KEY: your XAI Router key
  • OPENCLAW_GATEWAY_TOKEN: gateway token (used as Authorization: Bearer ...)
  • OPENCLAW_CONFIG_NAME: set to openclaw.openai-chat.json

Generate a random OPENCLAW_GATEWAY_TOKEN (32 bytes / 64 hex chars recommended):

$bytes = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($bytes)
($bytes | ForEach-Object { $_.ToString("x2") }) -join ""
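
If you have WSL or Git Bash handy, the same token can be generated the way Option 1 does it:

openssl rand -hex 32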

Minimal .env example:

XAI_API_KEY="sk-xxxxxxxxxxxxxxxx"
OPENCLAW_GATEWAY_TOKEN="your-random-token"
OPENCLAW_CONFIG_NAME="openclaw.openai-chat.json"
OPENCLAW_GATEWAY_PORT=18789
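
.env now holds your API key, so keep it out of version control. The template may already ignore it, but it is worth making sure (this appends a ".env" line to .gitignore):

Add-Content .gitignore ".env"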

Step 3: Start OpenClaw Gateway

docker compose up -d openclaw-gateway

Check status:

docker compose ps

Follow logs (recommended on first boot):

docker compose logs -f openclaw-gateway

If your environment only has docker-compose, replace docker compose with docker-compose.

Step 4: Verify MiniMax-M2.1 with curl.exe

On the same machine (in PowerShell, curl can be an alias, so use curl.exe):

# Load variables from .env into this PowerShell session
Get-Content .env | ForEach-Object {
  if ($_ -match '^\s*#' -or $_ -match '^\s*$') { return }
  $k, $v = $_ -split '=', 2
  # Trim whitespace and any surrounding quotes (the .env example above quotes its values)
  Set-Item -Path ("env:" + $k.Trim()) -Value ($v.Trim().Trim('"'))
}

curl.exe -sS "http://127.0.0.1:$env:OPENCLAW_GATEWAY_PORT/v1/chat/completions" `
  -H "Authorization: Bearer $env:OPENCLAW_GATEWAY_TOKEN" `
  -H "Content-Type: application/json" `
  -H "x-openclaw-agent-id: main" `
  -d '{"model":"openclaw","messages":[{"role":"user","content":"ping"}]}'

Why is the request model "openclaw"?

openclaw is the Gateway's unified model name.

  • Your app sends: model: "openclaw"
  • The real upstream model is configured in the preset (default: xairouter/MiniMax-M2.1)

Use it like the OpenAI Chat API (optional)

In your OpenAI SDK:

  • set baseURL to your Gateway: http://127.0.0.1:18789/v1
  • set apiKey to OPENCLAW_GATEWAY_TOKEN

Node.js example (openai SDK):

import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://127.0.0.1:18789/v1",
  apiKey: process.env.OPENCLAW_GATEWAY_TOKEN,
});

const resp = await client.chat.completions.create({
  model: "openclaw",
  messages: [{ role: "user", content: "ping" }],
});

console.log(resp.choices[0]?.message?.content);
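
To try the snippet, save it as ping.mjs (a filename chosen here for illustration) in a folder where the openai package is installed, and run it with your gateway token in the environment:

npm install openai

# bash / WSL:
OPENCLAW_GATEWAY_TOKEN="your-random-token" node ping.mjs

# PowerShell:
$env:OPENCLAW_GATEWAY_TOKEN = "your-random-token"; node ping.mjs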

Windows: LAN access (optional)

If other machines on your LAN cannot reach the Gateway, you may need to allow the port through Windows Firewall (Admin PowerShell, example port 18789):

New-NetFirewallRule -DisplayName "OpenClaw Gateway 18789" -Direction Inbound -Action Allow -Protocol TCP -LocalPort 18789
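
You can confirm the rule exists and that something is actually listening on the port (standard Windows cmdlets):

Get-NetFirewallRule -DisplayName "OpenClaw Gateway 18789"
Get-NetTCPConnection -LocalPort 18789 -State Listen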

Common maintenance commands (Docker option)

Stop and remove containers (data in state/ remains):

docker compose down

Pull new images and restart:

docker compose pull
docker compose up -d openclaw-gateway

To fully reset (this removes session/state data):

docker compose down
Remove-Item -Recurse -Force state, workspace, codex

More on connecting OpenClaw via XAI Router: /blog/openclaw/