Install OpenClaw on Windows (WSL2 / Docker) and Configure XAI Router (gpt-5.4)
Posted February 10, 2026 by The XAI Tech Team · 4 min read
OpenClaw + XAI Router
This guide shows how to run OpenClaw on Windows and route requests to gpt-5.4 via XAI Router (xairouter).
You have two options:
- Non-Docker (recommended: WSL2 + Ubuntu): run the OpenClaw CLI + Gateway inside WSL2 (Linux)
- Docker (Docker Desktop + Docker Compose): run the Gateway in a container

At a glance:
- Template repo: https://github.com/xaixagent/openclaw
- Recommended upstream protocol: OpenAI Responses (`/v1/responses`)
- Default upstream model: `xairouter/gpt-5.4`
- You only need: `XAI_API_KEY`
What you get
- A working OpenClaw Gateway (default port `18789`)
- An OpenResponses-compatible endpoint: `http://127.0.0.1:18789/v1/responses`
- Default upstream model: `xairouter/gpt-5.4`
Prerequisites
- Windows 10/11
- An XAI Router API key (`sk-...`)
Pick one:
- Option 1 (recommended): WSL2 + Ubuntu
- Option 2: Docker Desktop + Docker Compose
OpenClaw recommends WSL2 on Windows for better compatibility (Node/Bun/pnpm tooling, Linux binaries, skills).
Option 1: Windows non-Docker (WSL2 + Ubuntu, recommended)
The core idea: run OpenClaw as a Linux app inside WSL2.
Step 1: Install WSL2 + Ubuntu
Open PowerShell (Admin recommended):
```powershell
wsl --install
# Or pick a distro explicitly:
wsl --list --online
wsl --install -d Ubuntu-24.04
```

Reboot if Windows asks.
Step 2: Enable systemd (recommended)
In your WSL terminal:
```bash
sudo tee /etc/wsl.conf >/dev/null <<'EOF'
[boot]
systemd=true
EOF
```

Then in PowerShell:

```powershell
wsl --shutdown
```

Re-open Ubuntu and verify:

```bash
systemctl --user status
```

Step 3: Install OpenClaw (inside WSL2)
Use the official installer (handles Node detection/installation):
```bash
curl -fsSL https://openclaw.ai/install.sh | bash
```

Verify:

```bash
openclaw --version
```

Step 4: Configure XAI Router (gpt-5.4 via the recommended Responses path)
- Set environment variables:
```bash
export XAI_API_KEY="sk-..."
export OPENCLAW_GATEWAY_TOKEN="$(openssl rand -hex 32)"
```

- Create `~/.openclaw/openclaw.json`:

```bash
mkdir -p ~/.openclaw
cat > ~/.openclaw/openclaw.json <<'EOF'
{
  "models": {
    "mode": "replace",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          { "id": "gpt-5.4", "name": "GPT-5.4" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.4" },
      "models": {
        "xairouter/gpt-5.4": {
          "alias": "Codex",
          "params": { "transport": "sse" }
        }
      }
    }
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "${OPENCLAW_GATEWAY_TOKEN}"
    },
    "http": {
      "endpoints": {
        "responses": { "enabled": true }
      }
    }
  }
}
EOF
```

Note: this config makes OpenClaw talk to XAI Router over the upstream `openai-responses` API; there is no need to add `headers.originator`. `params.transport = "sse"` keeps traffic on HTTP `/v1/responses`; if you prefer WebSocket-first behavior, switch it to `"auto"`.
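The config above references `${XAI_API_KEY}` and `${OPENCLAW_GATEWAY_TOKEN}` as placeholders that get filled from the environment. As a rough illustration of how that kind of substitution convention typically behaves (this is not OpenClaw's actual code, just a sketch):

```python
import os
import re

def expand_env(text: str) -> str:
    # Replace each ${VAR} with its value from the environment;
    # unset variables are left untouched here (real tools may error instead).
    return re.sub(
        r"\$\{(\w+)\}",
        lambda m: os.environ.get(m.group(1), m.group(0)),
        text,
    )

os.environ["XAI_API_KEY"] = "sk-demo"
print(expand_env('"apiKey": "${XAI_API_KEY}"'))  # "apiKey": "sk-demo"
```

The practical takeaway: keep the literal `${...}` strings in the JSON file and export the real secrets in your shell, so the key never lands on disk.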
Step 5: Start the Gateway
In WSL2:
```bash
openclaw gateway --bind loopback --port 18789 --force
```

Step 6: Verify with curl (inside WSL2)
```bash
curl -sS http://127.0.0.1:18789/v1/responses \
  -H "Authorization: Bearer $OPENCLAW_GATEWAY_TOKEN" \
  -H "Content-Type: application/json" \
  -H "x-openclaw-agent-id: main" \
  -d '{"model":"openclaw","input":"ping"}'
```

You should get an OpenResponses-style JSON response.
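What does that JSON look like? The reply carries its text inside an `output` array, following the OpenAI Responses format that the gateway mirrors; treat the exact field names below as an assumption and inspect a real reply. A minimal Python sketch for pulling the text out:

```python
# Illustrative, abridged response shape -- a real reply has more fields.
sample = {
    "object": "response",
    "model": "openclaw",
    "output": [
        {
            "type": "message",
            "role": "assistant",
            "content": [{"type": "output_text", "text": "pong"}],
        }
    ],
}

def output_text(resp: dict) -> str:
    # Concatenate every output_text part found in message items.
    parts = []
    for item in resp.get("output", []):
        if item.get("type") == "message":
            for part in item.get("content", []):
                if part.get("type") == "output_text":
                    parts.append(part.get("text", ""))
    return "".join(parts)

print(output_text(sample))  # pong
```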
Option 2: Windows + Docker (Docker Desktop + Docker Compose)
The core idea: run the Gateway in a container. You only need to fill in `.env`.
Step 1: Download the template
In PowerShell:
```powershell
git clone https://github.com/xaixagent/openclaw.git
cd openclaw
```

Key files:
- `docker-compose.yml`: defines the `openclaw-gateway` and `openclaw-cli` services
- `.env.example`: environment template
- `configs/`: where you place your own OpenClaw config
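If you want a feel for the compose file before opening it, here is a simplified sketch of what the `openclaw-gateway` service might look like. The image name, volume paths, and exact options are illustrative placeholders; the template's actual `docker-compose.yml` is authoritative:

```yaml
services:
  openclaw-gateway:
    # Hypothetical image tag -- use whatever the template repo pins.
    image: ghcr.io/xaixagent/openclaw-gateway:latest
    env_file: .env
    ports:
      # Expose the gateway on the host port chosen in .env (default 18789).
      - "${OPENCLAW_GATEWAY_PORT:-18789}:18789"
    volumes:
      # Your config, selected via OPENCLAW_CONFIG_NAME, lives under configs/.
      - ./configs:/configs:ro
      - ./state:/state
    restart: unless-stopped
```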
Step 2: Create and fill .env
```powershell
Copy-Item .env.example .env
notepad .env
```

Set at least:
- `XAI_API_KEY`: your XAI Router key
- `OPENCLAW_GATEWAY_TOKEN`: gateway token (used as `Authorization: Bearer ...`)
- `OPENCLAW_CONFIG_NAME`: set to `openclaw.xairouter-codex.json`
Generate a random OPENCLAW_GATEWAY_TOKEN (32 bytes / 64 hex chars recommended):
```powershell
$bytes = New-Object byte[] 32
[System.Security.Cryptography.RandomNumberGenerator]::Create().GetBytes($bytes)
($bytes | ForEach-Object { $_.ToString("x2") }) -join ""
```

Minimal `.env` example:
```ini
XAI_API_KEY="sk-xxxxxxxxxxxxxxxx"
OPENCLAW_GATEWAY_TOKEN="your-random-token"
OPENCLAW_CONFIG_NAME="openclaw.xairouter-codex.json"
OPENCLAW_GATEWAY_PORT=18789
```

Step 2 (extra): Create configs/openclaw.xairouter-codex.json
```powershell
@'
{
  "models": {
    "mode": "replace",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          { "id": "gpt-5.4", "name": "GPT-5.4" }
        ]
      }
    }
  },
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.4" },
      "models": {
        "xairouter/gpt-5.4": {
          "alias": "Codex",
          "params": { "transport": "sse" }
        }
      }
    }
  },
  "gateway": {
    "mode": "local",
    "auth": {
      "mode": "token",
      "token": "${OPENCLAW_GATEWAY_TOKEN}"
    },
    "http": {
      "endpoints": {
        "responses": { "enabled": true }
      }
    }
  }
}
'@ | Set-Content configs/openclaw.xairouter-codex.json
```

Step 3: Start OpenClaw Gateway
```powershell
docker compose up -d openclaw-gateway
```

Check status:

```powershell
docker compose ps
```

Follow logs (recommended on first boot):

```powershell
docker compose logs -f openclaw-gateway
```

If your environment only has `docker-compose`, replace `docker compose` with `docker-compose`.
Step 4: Verify gpt-5.4 with curl.exe
On the same machine (in PowerShell, `curl` may be an alias for `Invoke-WebRequest`, so use `curl.exe` explicitly):
```powershell
# Load variables from .env into this PowerShell session
Get-Content .env | ForEach-Object {
  if ($_ -match '^\s*#' -or $_ -match '^\s*$') { return }
  $k, $v = $_ -split '=', 2
  Set-Item -Path ("env:" + $k.Trim()) -Value $v.Trim()
}

curl.exe -sS "http://127.0.0.1:$env:OPENCLAW_GATEWAY_PORT/v1/responses" `
  -H "Authorization: Bearer $env:OPENCLAW_GATEWAY_TOKEN" `
  -H "Content-Type: application/json" `
  -H "x-openclaw-agent-id: main" `
  -d '{"model":"openclaw","input":"ping"}'
```

Why is the request model "openclaw"?
`openclaw` is the Gateway's unified model name.
- Your app sends: `model: "openclaw"`
- The real upstream model is configured in the preset (default: `xairouter/gpt-5.4`)
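That indirection can be pictured as a one-line lookup. A toy Python sketch (not OpenClaw's actual code; the preset value comes from `agents.defaults.model.primary` in the config):

```python
# Value taken from agents.defaults.model.primary in openclaw.json.
PRESET_PRIMARY = "xairouter/gpt-5.4"

def resolve_model(requested: str) -> str:
    # Clients send the unified name "openclaw"; the gateway substitutes
    # the configured upstream model before forwarding the request.
    return PRESET_PRIMARY if requested == "openclaw" else requested

print(resolve_model("openclaw"))  # xairouter/gpt-5.4
```

The benefit: you can swap the upstream model in one config file without touching any client code.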
Use it like OpenAI Responses API (optional)
In your OpenAI SDK:
- set `baseURL` to your Gateway: `http://127.0.0.1:18789/v1`
- set `apiKey` to `OPENCLAW_GATEWAY_TOKEN`
Then call OpenClaw like the OpenAI Responses API.
Node.js example (openai SDK):
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://127.0.0.1:18789/v1",
  apiKey: process.env.OPENCLAW_GATEWAY_TOKEN,
});

const resp = await client.responses.create({
  model: "openclaw",
  input: "ping",
});

console.log(resp.output_text);
```

Windows: LAN access (optional)
If other machines on your LAN cannot reach the Gateway, you may need to allow the port through Windows Firewall (Admin PowerShell, example port 18789):
```powershell
New-NetFirewallRule -DisplayName "OpenClaw Gateway 18789" -Direction Inbound -Action Allow -Protocol TCP -LocalPort 18789
```

Common maintenance commands (Docker option)
Stop and remove containers (data in state/ remains):
```powershell
docker compose down
```

Pull new images and restart:

```powershell
docker compose pull
docker compose up -d openclaw-gateway
```

To fully reset (this removes session/state data):

```powershell
docker compose down
Remove-Item -Recurse -Force state, workspace, codex
```

More on connecting OpenClaw via XAI Router: /blog/openclaw/