OpenCode + XAI Router: Use Codex in opencode

Posted January 10, 2026 by The XAI Tech Team · 3 min read

OpenCode (opencode) is a developer-friendly coding assistant for the terminal and beyond. This guide shows you how to route opencode through XAI Router (xairouter) and use Codex models like gpt-5.2-codex reliably.

Tip: If you only need OpenAI-compatible Chat/Completions models (e.g. gpt-4o / gpt-4.1), jump to the "Simplified config for non-Codex models" section.

Prerequisites

  1. An XAI Router account: Sign up at m.xairouter.com and create an API Key.
  2. opencode installed locally.
  3. The model ID you want to use (e.g. gpt-5.2-codex).

Step 1: Create an API Key in XAI Router

  1. Log in to m.xairouter.com.
  2. Go to API Keys and create a new key (use a label like opencode).
  3. Copy the key. You'll use it as OPENAI_API_KEY (you can verify it with the quick check below).
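
Optional quick check: before touching opencode, you can confirm the router accepts your key. This is a sketch that assumes XAI Router exposes the standard OpenAI-compatible /v1/models listing (the same base path used by the simplified config at the end of this guide); replace sk-xxx with your key.

# List the models visible to your key (assumes an OpenAI-compatible /v1/models route)
curl -s https://api.xairouter.com/v1/models \
  -H "Authorization: Bearer sk-xxx"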

Step 2: Set the environment variable

We recommend using OPENAI_API_KEY so it matches opencode's OpenAI-compatible provider setup.

macOS / Linux:

export OPENAI_API_KEY="sk-xxx"

Windows PowerShell:

$env:OPENAI_API_KEY="sk-xxx"
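
Note that the export above only lasts for the current shell. To persist it, append it to your shell profile (a sketch for bash/zsh; pick the file your shell actually reads):

# Persist the key for future shells (bash/zsh)
echo 'export OPENAI_API_KEY="sk-xxx"' >> ~/.zshrc   # or ~/.bashrc
source ~/.zshrc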

Step 3: Configure opencode (Codex models)

Create or overwrite ~/.config/opencode/opencode.json:

cat > ~/.config/opencode/opencode.json << 'EOF'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.2-codex",
  "small_model": "openai/gpt-5.2-codex",
  "provider": {
    "openai": {
      "name": "XAI Router",
      "env": ["OPENAI_API_KEY"],
      "whitelist": ["gpt-5.2", "gpt-5.2-codex"],
      "options": {
        "baseURL": "https://api.xairouter.com"
      },
      "models": {
        "gpt-5.2-codex": {
          "id": "gpt-5.2-codex",
          "name": "gpt-5.2-codex",
          "tool_call": true,
          "reasoning": true
        }
      }
    }
  },
  "share": "disabled"
}
EOF

Note: We use the openai provider with baseURL pointing to https://api.xairouter.com so opencode calls the Responses API, which Codex requires. small_model and whitelist prevent fallback to other small models (e.g. gpt-5-nano).
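
If you want to confirm routing before launching opencode, you can call the Responses API through the router by hand. This is a sketch under the assumption that XAI Router also serves the OpenAI /v1/responses route; note the instructions field and store=false, which mirror what Codex expects (see Step 4):

# Manual Responses API call through the router (assumes /v1/responses is proxied as-is)
curl -s https://api.xairouter.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.2-codex",
    "instructions": "You are a coding assistant.",
    "input": "Reply with the single word: ok",
    "store": false
  }'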


Step 4: Enable Codex compatibility mode (required)

The Codex Responses API does not allow system messages and requires an instructions field plus store=false. opencode's Codex mode handles this automatically.

Two simple steps:

  1. Write a dummy OAuth entry to trigger Codex mode:
cat > ~/.local/share/opencode/auth.json << 'EOF'
{
  "openai": {
    "type": "oauth",
    "refresh": "dummy",
    "access": "dummy",
    "expires": 0
  }
}
EOF
chmod 600 ~/.local/share/opencode/auth.json
  2. Start opencode with default plugins disabled (this prevents rewrites to the official ChatGPT endpoint):
OPENCODE_DISABLE_DEFAULT_PLUGINS=1 opencode
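
To avoid forgetting the flag on future launches, a small shell alias is one option (a sketch for bash/zsh; the name opencode-codex is arbitrary):

# Convenience alias so Codex mode always starts with default plugins disabled
alias opencode-codex='OPENCODE_DISABLE_DEFAULT_PLUGINS=1 opencode'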

Step 5: Validate

opencode debug config
opencode models openai

You should see:

  • model = openai/gpt-5.2-codex
  • baseURL = https://api.xairouter.com
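
For an end-to-end smoke test, you can send a one-off prompt through the router. This sketch assumes your opencode build ships the non-interactive run subcommand; if it does not, just start the interactive session instead:

# One-off prompt through XAI Router (assumes `opencode run` is available)
OPENCODE_DISABLE_DEFAULT_PLUGINS=1 opencode run "Reply with the single word: ok"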

Common errors and fixes

  • "Instructions are required"
  • "Store must be set to false"
  • "System messages are not allowed"
  • "Unsupported parameter: max_output_tokens"

These usually mean Codex compatibility mode is not enabled. Re-check Step 4 and make sure you start opencode with OPENCODE_DISABLE_DEFAULT_PLUGINS=1.
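
Before digging further, a quick pre-flight check of local state can rule out the usual suspects. This sketch only inspects the paths and variables introduced in Steps 2 and 4:

# Pre-flight check for Codex compatibility mode
[ -n "$OPENAI_API_KEY" ] || echo "OPENAI_API_KEY is not set (Step 2)"
[ -f ~/.local/share/opencode/auth.json ] || echo "auth.json stub is missing (Step 4)"
[ "$OPENCODE_DISABLE_DEFAULT_PLUGINS" = "1" ] || echo "OPENCODE_DISABLE_DEFAULT_PLUGINS is not set to 1 (Step 4)"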


Simplified config for non-Codex models (optional)

If you only use Chat/Completions models (e.g. gpt-4o / gpt-4.1), switch to the OpenAI-compatible provider. Note that this config reads the key from XAI_API_KEY rather than OPENAI_API_KEY, so export that variable with your router key:

{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai/gpt-4o-mini",
  "provider": {
    "xai": {
      "name": "XAI Router",
      "npm": "@ai-sdk/openai-compatible",
      "env": ["XAI_API_KEY"],
      "options": {
        "baseURL": "https://api.xairouter.com/v1"
      }
    }
  }
}
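
To verify this path independently of opencode, a plain chat completion call against the same base URL works. The sketch assumes the router forwards the standard /v1/chat/completions endpoint and that XAI_API_KEY holds your router key:

# Manual chat completion through the router (assumes /v1/chat/completions is proxied)
curl -s https://api.xairouter.com/v1/chat/completions \
  -H "Authorization: Bearer $XAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Reply with the single word: ok"}]
  }'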

With the setup above, opencode runs Codex models through XAI Router, with centralized key management, observability, and cost control.