Route OpenClaw Through XAI Router
Posted February 1, 2026 by XAI Tech Team · 3 min read
If you want the simplest, most stable Responses experience when routing OpenClaw through XAI Router, keep it to one clean setup: a custom provider + `openai-responses` + `https://api.xairouter.com/v1`.
Short Version
- Do not treat OpenClaw's built-in `openai-codex` OAuth path as the XAI Router entry point; that path is fundamentally direct OpenAI / ChatGPT sign-in.
- If you want XAI Router, use a custom provider such as `xairouter`.
- Set `api` to `openai-responses` and `baseUrl` to `https://api.xairouter.com/v1`.
- Use `xairouter/gpt-5.4` as the default model.
- If you want the steadiest HTTP Responses path, set `params.transport` to `"sse"`; if you prefer WebSocket-first behavior, switch it to `"auto"`.
Recommended Config
Save this to `~/.openclaw/openclaw.json`:

```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.4" },
      "models": {
        "xairouter/gpt-5.4": {
          "alias": "Codex",
          "params": { "transport": "sse" }
        }
      }
    }
  },
  "models": {
    "mode": "replace",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          { "id": "gpt-5.4", "name": "GPT-5.4" }
        ]
      }
    }
  }
}
```

Why This Is the Recommended Path
- This is the OpenClaw-documented custom-provider structure: `models.providers.*` + `api: "openai-responses"`.
- The current recommendation is to avoid adding `headers.originator`; let OpenClaw send standard Responses payloads to XAI Router / codex-cloud directly. It is simpler and less likely to hit the native Codex branch by mistake.
- `transport: "sse"` keeps OpenClaw on the HTTP Responses path; `"auto"` means WebSocket-first with SSE fallback.
- Compared with a CLI backend, this keeps OpenClaw's own agent / session / model pipeline intact instead of wrapping an external command.
- The current WebSocket path is still a high-fidelity relay, not byte-for-byte identical, so `"sse"` remains the safer default.
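If you do want to experiment with the WebSocket-first path, only the per-model `params` block needs to change. A minimal sketch of that one section, with the transport flipped to `"auto"` (everything else in the config above stays the same):

```json
"models": {
  "xairouter/gpt-5.4": {
    "alias": "Codex",
    "params": { "transport": "auto" }
  }
}
```

Switching back to `"sse"` is the same one-line change, which makes it easy to compare the two transports without touching the provider definition.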
Verify
```shell
export XAI_API_KEY="sk-..."
openclaw models status
openclaw agent --message "Introduce yourself in one sentence"
```

If `xairouter/gpt-5.4` is the default model and replies work, the setup is good.
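Before running the CLI checks, you can also sanity-check the JSON file offline. The sketch below is not an official OpenClaw tool; the key paths it checks (`models.providers.xairouter`, `agents.defaults.model.primary`) are taken from the example config in this guide:

```python
import json
import os

def check_openclaw_config(cfg: dict) -> list:
    """Return a list of problems found in an openclaw.json-style dict.

    The key paths checked here mirror the example config in this guide;
    they are assumptions based on that example, not an official schema.
    """
    problems = []
    provider = cfg.get("models", {}).get("providers", {}).get("xairouter")
    if provider is None:
        problems.append("missing models.providers.xairouter")
    else:
        if provider.get("api") != "openai-responses":
            problems.append('api should be "openai-responses"')
        if provider.get("baseUrl") != "https://api.xairouter.com/v1":
            problems.append("unexpected baseUrl")
    primary = (cfg.get("agents", {}).get("defaults", {})
                  .get("model", {}).get("primary"))
    if primary != "xairouter/gpt-5.4":
        problems.append("primary model is not xairouter/gpt-5.4")
    return problems

# Usage against the real file:
# with open(os.path.expanduser("~/.openclaw/openclaw.json")) as f:
#     print(check_openclaw_config(json.load(f)) or "config looks consistent")
```

Note that `${XAI_API_KEY}` is expanded by OpenClaw at runtime, so the checker deliberately ignores the `apiKey` field and only verifies structure.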
FAQ
1) Can I just use openai-codex/gpt-5.4?
You can, but that path is OpenClaw signing in directly to OpenAI via OAuth, not routing through XAI Router. If your goal is the XAI Router path recommended in this guide, do not use that route.
2) Can I still use the Codex CLI backend?
Yes, but that backend is a better fit when you explicitly want to reuse an external codex binary. If you want OpenClaw's own native model path and toolchain, the openai-responses provider in this guide is usually the better choice.
3) What if I want my local Gateway to expose /v1/responses?
That works too. With gateway.http.endpoints.responses.enabled = true, send requests to http://127.0.0.1:18789/v1/responses. For platform-specific examples, see /en/blog/openclaw-macos/ and /en/blog/openclaw-windows-xai-router/.
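Based on the key path `gateway.http.endpoints.responses.enabled` mentioned above, the gateway toggle might look like this in `openclaw.json` — the exact nesting here is an assumption inferred from that key path, not verified against OpenClaw's gateway docs:

```json
{
  "gateway": {
    "http": {
      "endpoints": {
        "responses": { "enabled": true }
      }
    }
  }
}
```

With that in place, any Responses-style client could be pointed at `http://127.0.0.1:18789/v1/responses` instead of calling XAI Router directly.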