XAI Router -> Codex-Cloud: Minimal Claude Code Compatibility Guide

Posted March 4, 2026 by XAI Technical Team · 2 min read

If you want a stable, simple, and controllable path to run Claude Code with gpt-5.3-codex, this is the recommended architecture:

Claude Code -> XAI Router -> Codex-Cloud -> OpenAI Response API (gpt-5.3-codex)

Its core principle is straightforward: minimum changes.


Architecture At A Glance

Claude Code
  └─ ANTHROPIC_BASE_URL=https://api.xairouter.com
     └─ /v1/messages
        └─ Codex-Cloud (Claude API -> Response API)
           └─ model: gpt-5.3-codex

Think of Codex-Cloud as a focused conversion layer:

  1. It receives Claude API requests.
  2. It applies lightweight parameter mapping.
  3. It forwards traffic to Response API with minimal extra logic.
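The three steps above can be sketched in a few lines. This is an illustrative sketch only: the Response-API-side field names and the mapping function are assumptions for illustration, not Codex-Cloud's actual implementation.

```python
# Illustrative sketch of the Codex-Cloud conversion step.
# Claude-side fields follow the Anthropic Messages API shape; the
# Response-API-side field names below are assumptions.

def claude_to_response_api(claude_req: dict) -> dict:
    """Map a Claude /v1/messages request to a Response API request."""
    # effort -> reasoning-tier mapping described in this guide
    effort_map = {"max": "xhigh"}
    mapped = {
        "model": "gpt-5.3-codex",            # model is unified regardless of input
        "input": claude_req["messages"],      # pass the conversation through
        "max_output_tokens": claude_req.get("max_tokens"),
    }
    effort = claude_req.get("effort")
    if effort in effort_map:
        mapped["reasoning"] = {"effort": effort_map[effort]}
    return mapped

req = {
    "model": "claude-opus",
    "messages": [{"role": "user", "content": "hi"}],
    "max_tokens": 1024,
    "effort": "max",
}
out = claude_to_response_api(req)
print(out["model"], out["reasoning"]["effort"])  # -> gpt-5.3-codex xhigh
```

The point of the sketch is that the layer does nothing beyond receiving, mapping, and forwarding; no business logic accumulates in the middle.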

Compatibility Experience (User View)

This setup is designed to feel like “Claude Code as-is”:

  1. Keep the same Claude Code workflow and Anthropic-style configuration.
  2. POST /v1/messages works end-to-end, including streaming and tool calls.
  3. The model is unified to gpt-5.3-codex without rewriting application code.
  4. effort: "max" maps to xhigh for the highest reasoning tier.
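For reference, a minimal Anthropic-style request body sent to the /v1/messages endpoint might look like the fragment below (the model name and endpoint come from this guide; the fields follow the Anthropic Messages API):

```json
{
  "model": "gpt-5.3-codex",
  "max_tokens": 1024,
  "stream": true,
  "messages": [
    {"role": "user", "content": "Summarize this repo's build steps."}
  ]
}
```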

Enterprise Scenario: Smooth Fallback When Claude API Is Unstable

For enterprise systems, one key value is this: failover stays at the gateway layer, not in application code.

When upstream quality degrades, applications keep sending Claude API-shaped requests while XAI Router applies a backend fallback strategy (for example, switching from the primary route to a Codex-Cloud backup route). This means:

  1. No application code changes and no protocol changes.
  2. The request entrypoint remains the same (ANTHROPIC_BASE_URL).
  3. Faster switchover and simpler rollback.

One-line summary: stable frontend behavior, switchable backend routing.
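Gateway-layer fallback can be sketched as follows. The route names and the send function here are hypothetical stand-ins; in practice this logic lives in XAI Router's configuration, not in application code.

```python
# Hypothetical sketch of gateway-layer fallback: the caller keeps one
# entrypoint while the gateway retries against a backup route.

PRIMARY = "claude-primary"
BACKUP = "codex-cloud-backup"

def send(route: str, request: dict) -> dict:
    # Stand-in for the real upstream call; the primary route fails here
    # to demonstrate the fallback path.
    if route == PRIMARY:
        raise ConnectionError("upstream degraded")
    return {"route": route, "ok": True}

def handle(request: dict) -> dict:
    """Gateway logic: try the primary route, then fall back to the backup."""
    for route in (PRIMARY, BACKUP):
        try:
            return send(route, request)
        except ConnectionError:
            continue
    raise RuntimeError("all routes failed")

result = handle({"messages": []})
print(result["route"])  # -> codex-cloud-backup
```

Because the retry loop sits behind the single entrypoint, switchover and rollback are gateway configuration changes, exactly the property the list above describes.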


Minimal Local Config (Ready To Reuse)

Below is a reusable local config template.

1) Shell environment variables

export ANTHROPIC_BASE_URL="https://api.xairouter.com"
export ANTHROPIC_AUTH_TOKEN="sk-***"
export ANTHROPIC_DEFAULT_OPUS_MODEL="gpt-5.3-codex"
export ANTHROPIC_DEFAULT_SONNET_MODEL="gpt-5.3-codex"
export ANTHROPIC_DEFAULT_HAIKU_MODEL="gpt-5.3-codex"

2) ~/.claude/settings.json

{
  "model": "gpt-5.3-codex",
  "skipDangerousModePermissionPrompt": true,
  "effort": "max",
  "permissionMode": "bypassPermissions"
}

What this gives you:

  1. Default model is gpt-5.3-codex.
  2. Highest reasoning tier (xhigh) via effort: "max".
  3. A more direct local permission mode with fewer interruptions.
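A quick way to sanity-check the settings file is to load it and confirm the fields. The snippet below inlines the template from this guide so it is self-contained; on a real machine you would read ~/.claude/settings.json instead.

```python
import json

# Settings template from this guide, inlined for a self-contained check.
settings = json.loads("""
{
  "model": "gpt-5.3-codex",
  "skipDangerousModePermissionPrompt": true,
  "effort": "max",
  "permissionMode": "bypassPermissions"
}
""")

assert settings["model"] == "gpt-5.3-codex"
assert settings["effort"] == "max"  # maps to the xhigh reasoning tier
print("settings OK")  # -> settings OK
```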

30-Second Smoke Test

1) Run Claude Code directly

claude "Write a minimal HTTP server in Rust and explain each module briefly."

If responses are stable and tool calls work, the full path is ready.


FAQ

Q1: Can I still use the highest reasoning mode through this conversion path?

Yes. Set "effort": "max" in ~/.claude/settings.json, and it maps to xhigh.


One-line takeaway: XAI Router + Codex-Cloud keeps Claude API compatibility while enabling fast onboarding and enterprise-grade smooth fallback.