Connect OpenClaw via XAI Router

Posted February 1, 2026 by The XAI Tech Team · 4 min read

OpenClaw + XAI Router

OpenClaw is a personal AI assistant that can connect to multiple channels. This guide covers four ways to connect it to XAI Router:

  • Path A: Claude API compatible (glm-4.7)
  • Path B: OpenAI API compatible (gpt-5.2)
  • Path C: CLI backend (Codex CLI)
  • Path D: CLI backend (Claude Code CLI)

Pick one path and configure it.


Path A: Claude API compatible (glm-4.7)

1) Set the environment variable

export XAI_API_KEY="sk-..."
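Before moving on, it can help to confirm the variable is actually visible to the shell that will launch OpenClaw (the key below is a placeholder; substitute your real key):

```shell
export XAI_API_KEY="sk-example"  # placeholder value for illustration
# quick check that the variable is set and non-empty
test -n "$XAI_API_KEY" && echo "XAI_API_KEY is set"
```

If you open a new terminal and the check fails, add the export line to your shell profile (e.g. ~/.bashrc or ~/.zshrc).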

2) Add an OpenClaw config

Save the following to ~/.openclaw/openclaw.json:

{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/glm-4.7" },
      "models": {
        "xairouter/glm-4.7": { "alias": "glm" }
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com",
        "apiKey": "${XAI_API_KEY}",
        "api": "anthropic-messages",
        "models": [
          {
            "id": "glm-4.7",
            "name": "GLM"
          }
        ]
      }
    }
  }
}

Key points:

  • api must be anthropic-messages.
  • Use https://api.xairouter.com as the baseUrl.
  • The model ref is xairouter/glm-4.7.
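To check the endpoint independently of OpenClaw, you can send a direct request. The sketch below assumes the router accepts the standard Anthropic Messages headers and body; adjust if your account docs say otherwise:

```shell
curl https://api.xairouter.com/v1/messages \
  -H "x-api-key: $XAI_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model":"glm-4.7","max_tokens":64,"messages":[{"role":"user","content":"hi"}]}'
```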

3) Verify

openclaw models status

If you see xairouter/glm-4.7 as the default model, you are done.


Path B: OpenAI API compatible (gpt-5.2)

1) Set the environment variable

export XAI_API_KEY="sk-..."

2) Add an OpenClaw config

Save the following to ~/.openclaw/openclaw.json:

{
  "agents": {
    "defaults": {
      "model": { "primary": "xairouter/gpt-5.2" },
      "models": {
        "xairouter/gpt-5.2": { "alias": "GPT-5.2" }
      }
    }
  },
  "models": {
    "mode": "merge",
    "providers": {
      "xairouter": {
        "baseUrl": "https://api.xairouter.com/v1",
        "apiKey": "${XAI_API_KEY}",
        "api": "openai-responses",
        "models": [
          {
            "id": "gpt-5.2",
            "name": "GPT-5.2"
          }
        ]
      }
    }
  }
}

Key points:

  • api must be openai-responses.
  • Use https://api.xairouter.com/v1 as the baseUrl for /v1/responses.
  • The model ref is xairouter/gpt-5.2.
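As with Path A, you can test the endpoint directly before wiring up OpenClaw. This sketch assumes the router follows the standard OpenAI Responses request shape (Bearer auth, `model` plus `input`):

```shell
curl https://api.xairouter.com/v1/responses \
  -H "Authorization: Bearer $XAI_API_KEY" \
  -H "content-type: application/json" \
  -d '{"model":"gpt-5.2","input":"hi"}'
```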

3) Verify

openclaw models status

If you see xairouter/gpt-5.2 as the default model, you are done.


Path C: CLI backend (Codex CLI + /v1/responses)

1) Prepare Codex CLI

  1. Make sure the codex command works locally.
  2. Configure Codex CLI to use https://api.xairouter.com/v1/responses.
  3. Authenticate with XAI_API_KEY as a Bearer token.

Note: OpenClaw only executes Codex CLI and does not change your HTTP settings.
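One common way to point Codex CLI at a custom provider is through its ~/.codex/config.toml. The fragment below is a sketch only; field names and supported keys can vary between Codex CLI versions, so verify against your version's documentation:

```toml
# ~/.codex/config.toml (sketch; verify field names for your Codex CLI version)
model = "gpt-5.2-codex"
model_provider = "xairouter"

[model_providers.xairouter]
name = "XAI Router"
base_url = "https://api.xairouter.com/v1"
env_key = "XAI_API_KEY"
wire_api = "responses"
```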

2) Add an OpenClaw config

Save the following to ~/.openclaw/openclaw.json:

{
  "agents": {
    "defaults": {
      "model": { "primary": "codex-cli/gpt-5.2-codex" },
      "models": {
        "codex-cli/gpt-5.2-codex": { "alias": "Codex" }
      },
      "cliBackends": {
        "codex-cli": {
          "command": "codex",
          "env": {
            "XAI_API_KEY": "${XAI_API_KEY}"
          }
        }
      }
    }
  }
}

Key points:

  • codex-cli is the built-in CLI backend provider id.
  • If codex is not on PATH, set command to an absolute path like "/path/to/codex".
  • If your Codex CLI expects different env var names, update the env map.
  • The CLI backend is text-only; OpenClaw tool calls are disabled.

3) Verify

openclaw agent --message "hi" --model codex-cli/gpt-5.2-codex

If you get a reply, the setup is working.


Path D: CLI backend (Claude Code CLI + multi model)

1) Prepare Claude Code CLI

  1. Make sure the claude command works locally.
  2. Configure Claude Code CLI endpoint and auth as required by your model vendor.
  3. Models like GLM-4.7 can be used if Claude Code CLI supports them.

Note: OpenClaw only executes Claude Code CLI and does not change your HTTP settings.
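One common way to point Claude Code CLI at a different endpoint is through its environment variables. The variable names below are assumed from Claude Code's documentation; verify them for your installed version:

```shell
# Point Claude Code CLI at the router through its standard env vars
# (variable names assumed; verify for your Claude Code version)
export ANTHROPIC_BASE_URL="https://api.xairouter.com"
export ANTHROPIC_AUTH_TOKEN="$XAI_API_KEY"
# then e.g.: claude --model glm-4.7 -p "hello"
```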

2) Add an OpenClaw config

Save the following to ~/.openclaw/openclaw.json:

{
  "agents": {
    "defaults": {
      "model": { "primary": "claude-cli/glm-4.7" },
      "models": {
        "claude-cli/glm-4.7": { "alias": "GLM 4.7" }
      },
      "cliBackends": {
        "claude-cli": {
          "command": "claude",
          "env": {
            "XAI_API_KEY": "${XAI_API_KEY}"
          },
          "modelAliases": {
            "glm-4.7": "glm-4.7"
          }
        }
      }
    }
  }
}

Key points:

  • claude-cli is the built-in CLI backend provider id.
  • If claude is not on PATH, set command to an absolute path like "/path/to/claude".
  • modelAliases maps OpenClaw model ids to CLI model ids.
  • If your CLI expects different env var names, update the env map.
  • The CLI backend is text-only; OpenClaw tool calls are disabled.

3) Verify

openclaw agent --message "hi" --model claude-cli/glm-4.7

If you get a reply, the setup is working.


FAQ

1) Can I keep other models as fallbacks?

Yes. Add them to agents.defaults.model.fallbacks and include them in agents.defaults.models.
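For example, the Path A and Path B models can be combined, with gpt-5.2 as a fallback. This sketch shows only the model section, using the model refs from the configs above:

```json
{
  "agents": {
    "defaults": {
      "model": {
        "primary": "xairouter/glm-4.7",
        "fallbacks": ["xairouter/gpt-5.2"]
      },
      "models": {
        "xairouter/glm-4.7": { "alias": "glm" },
        "xairouter/gpt-5.2": { "alias": "GPT-5.2" }
      }
    }
  }
}
```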

glm-4.7 costs CNY 12 per month. Sign up at m.xairouter.com to get started.