Connect OpenClaw via XAI Router
Posted February 1, 2026 by The XAI Tech Team · 4 min read
OpenClaw + XAI Router
OpenClaw is a personal AI assistant that can connect to multiple channels. This guide covers four ways to connect it to XAI Router:
- Path A: Claude API compatible (glm-4.7)
- Path B: OpenAI API compatible (gpt-5.2)
- Path C: CLI backend (Codex CLI)
- Path D: CLI backend (Claude Code CLI)
Pick one path and configure it.
Path A: Claude API compatible (glm-4.7)
1) Set the environment variable
export XAI_API_KEY="sk-..."
2) Add an OpenClaw config
Save the following to ~/.openclaw/openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/glm-4.7" },
      "models": {
        "xairouter/glm-4.7": { "alias": "glm" }
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com",
        "apiKey": "${XAI_API_KEY}",
        "api": "anthropic-messages",
        "models": [
          { "id": "glm-4.7", "name": "GLM" }
        ]
      }
    }
  }
}
```
Key points:
- `api` must be `anthropic-messages`.
- Use `https://api.xairouter.com` as the `baseUrl`.
- The model ref is `xairouter/glm-4.7`.
3) Verify
openclaw models status
If you see xairouter/glm-4.7 as the default model, you are done.
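The `xairouter/glm-4.7` ref is simply `<provider id>/<model id>`. A minimal sketch of that naming rule (my own illustration, not OpenClaw source code), using the provider block from the config above:

```python
import json

# Provider block from openclaw.json (Path A), embedded here so the
# sketch is self-contained.
config = json.loads("""
{
  "providers": {
    "xairouter": {
      "baseUrl": "https://api.xairouter.com",
      "api": "anthropic-messages",
      "models": [{ "id": "glm-4.7", "name": "GLM" }]
    }
  }
}
""")

# Model refs are "<provider id>/<model id>".
refs = [
    f"{provider_id}/{model['id']}"
    for provider_id, provider in config["providers"].items()
    for model in provider["models"]
]
print(refs)  # ['xairouter/glm-4.7']
```

Adding more entries to the provider's `models` array yields more refs under the same `xairouter/` prefix.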
Path B: OpenAI API compatible (gpt-5.2)
1) Set the environment variable
export XAI_API_KEY="sk-..."
2) Add an OpenClaw config
Save the following to ~/.openclaw/openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.2" },
      "models": {
        "xairouter/gpt-5.2": { "alias": "GPT-5.2" }
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          { "id": "gpt-5.2", "name": "GPT-5.2" }
        ]
      }
    }
  }
}
```
Key points:
- `api` must be `openai-responses`.
- Use `https://api.xairouter.com/v1` as the `baseUrl` (requests go to `/v1/responses`).
- The model ref is `xairouter/gpt-5.2`.
3) Verify
openclaw models status
If you see xairouter/gpt-5.2 as the default model, you are done.
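The `/v1` suffix on `baseUrl` matters because the client appends the endpoint path to it (this joining behavior is my assumption about the client, not confirmed OpenClaw internals). A tiny sketch of the consequence:

```python
def responses_url(base_url: str) -> str:
    """Append the responses endpoint path to a base URL."""
    return base_url.rstrip("/") + "/responses"

# With "/v1" in baseUrl, requests land on the versioned endpoint:
print(responses_url("https://api.xairouter.com/v1"))
# https://api.xairouter.com/v1/responses

# Without "/v1", the versioned path segment is missing:
print(responses_url("https://api.xairouter.com"))
# https://api.xairouter.com/responses
```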
Path C: CLI backend (Codex CLI + /v1/responses)
1) Prepare Codex CLI
- Make sure the `codex` command works locally.
- Configure Codex CLI to use `https://api.xairouter.com/v1/responses`.
- Authenticate with `XAI_API_KEY` as a Bearer token.
Note: OpenClaw only executes Codex CLI and does not change your HTTP settings.
2) Add an OpenClaw config
Save the following to ~/.openclaw/openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "codex-cli/gpt-5.2-codex" },
      "models": {
        "codex-cli/gpt-5.2-codex": { "alias": "Codex" }
      },
      "cliBackends": {
        "codex-cli": {
          "command": "codex",
          "env": {
            "XAI_API_KEY": "${XAI_API_KEY}"
          }
        }
      }
    }
  }
}
```
Key points:
- `codex-cli` is the built-in CLI backend provider id.
- If `codex` is not on PATH, set `command` to an absolute path like `"/path/to/codex"`.
- If your Codex CLI expects different env var names, update the `env` map.
- The CLI backend is text only; OpenClaw tool calls are disabled.
3) Verify
openclaw agent --message "hi" --model codex-cli/gpt-5.2-codex
If you get a reply, the setup is working.
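Under the hood, a CLI backend amounts to spawning the configured `command` with the config's `env` map merged into the process environment. A rough sketch of that mechanic (illustrative only; OpenClaw's actual spawning logic may differ), using `sh`/`echo` as a stand-in for `codex`:

```python
import os
import subprocess

def run_cli_backend(command: str, args: list, extra_env: dict) -> str:
    """Spawn a CLI backend command with the config's env map merged in."""
    env = {**os.environ, **extra_env}  # config "env" wins on conflicts
    result = subprocess.run([command, *args], env=env,
                            capture_output=True, text=True)
    return result.stdout.strip()

# Stand-in for the real backend: print the injected variable.
out = run_cli_backend("sh", ["-c", "echo $XAI_API_KEY"],
                      {"XAI_API_KEY": "sk-demo"})
print(out)  # sk-demo
```

This is also why the `env` map is the right knob when your CLI expects different variable names: whatever keys you put there are what the child process sees.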
Path D: CLI backend (Claude Code CLI + multi model)
1) Prepare Claude Code CLI
- Make sure the `claude` command works locally.
- Configure the Claude Code CLI endpoint and auth as required by your model vendor.
- Models like GLM-4.7 can be used if Claude Code CLI supports them.
Note: OpenClaw only executes Claude Code CLI and does not change your HTTP settings.
2) Add an OpenClaw config
Save the following to ~/.openclaw/openclaw.json:
```json
{
  "agents": {
    "defaults": {
      "model": { "primary": "claude-cli/glm-4.7" },
      "models": {
        "claude-cli/glm-4.7": { "alias": "GLM 4.7" }
      },
      "cliBackends": {
        "claude-cli": {
          "command": "claude",
          "env": {
            "XAI_API_KEY": "${XAI_API_KEY}"
          },
          "modelAliases": {
            "glm-4.7": "glm-4.7"
          }
        }
      }
    }
  }
}
```
Key points:
- `claude-cli` is the built-in CLI backend provider id.
- If `claude` is not on PATH, set `command` to an absolute path like `"/path/to/claude"`.
- `modelAliases` maps OpenClaw model ids to CLI model ids.
- If your CLI expects different env var names, update the `env` map.
- The CLI backend is text only; OpenClaw tool calls are disabled.
3) Verify
openclaw agent --message "hi" --model claude-cli/glm-4.7
If you get a reply, the setup is working.
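A quick sketch of how `modelAliases` resolution works, as I read the config (not OpenClaw source): the backend prefix is stripped from the model ref, then the remaining model id is looked up in the alias map. The second example uses a hypothetical alias purely to show a non-identity mapping:

```python
def resolve_cli_model(model_ref: str, model_aliases: dict) -> str:
    """Map an OpenClaw model ref to the model id the CLI expects."""
    _backend, _, model_id = model_ref.partition("/")
    return model_aliases.get(model_id, model_id)

# Identity mapping, as in the Path D config above:
print(resolve_cli_model("claude-cli/glm-4.7", {"glm-4.7": "glm-4.7"}))
# glm-4.7

# Hypothetical non-identity mapping, if the CLI used a different id:
print(resolve_cli_model("claude-cli/glm-4.7", {"glm-4.7": "glm-4-7-latest"}))
# glm-4-7-latest
```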
FAQ
1) Can I keep other models as fallbacks?
Yes. Add them to agents.defaults.model.fallbacks and include them in agents.defaults.models.
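For example, keeping gpt-5.2 as a fallback behind glm-4.7 might look like this (a sketch based on the field names above; both models must also exist under a configured provider):

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "xairouter/glm-4.7",
        "fallbacks": ["xairouter/gpt-5.2"]
      },
      "models": {
        "xairouter/glm-4.7": { "alias": "glm" },
        "xairouter/gpt-5.2": { "alias": "GPT-5.2" }
      }
    }
  }
}
```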
Pricing: glm-4.7 is available for CNY 12 per month. Sign up at m.xairouter.com to get started.