Configure OpenCode with XAI Router (Responses-first + Chat)

Posted January 10, 2026 by The XAI Tech Team · 2 min read

OpenCode (opencode) can be routed through XAI Router via one unified endpoint. For gpt-5.4, the recommended setup keeps OpenCode on its native Responses request shape and lets XAI Router apply only the minimal compatibility shim needed for that client. Regular OpenAI-compatible models continue to use the standard compatibility API. This guide covers two production-ready profiles:

  1. gpt-5.4 (Responses API)
  2. MiniMax-M2.5 (Chat API)

Prerequisites

  1. You already created an API key at m.xairouter.com.
  2. opencode is installed locally.

Config file location

  1. Linux/macOS: ~/.config/opencode/opencode.jsonc
  2. Windows: %USERPROFILE%\.config\opencode\opencode.jsonc

Keep only one profile at a time. To switch profiles, overwrite opencode.jsonc.
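One simple way to follow the overwrite-based switching above is to keep each profile in its own file and copy the one you want over the live config. A minimal sketch; the profile-a.jsonc / profile-b.jsonc names are illustrative, not an OpenCode convention, and the demo runs in a temp dir (use ~/.config/opencode in practice):

```shell
# Demo in a temp dir; in practice CONFIG_DIR is ~/.config/opencode.
CONFIG_DIR="$(mktemp -d)"

# One file per profile; in practice each holds the full Profile A / Profile B
# config from this guide (here shortened to the top-level "model" key).
printf '{ "model": "openai/gpt-5.4" }\n'        > "$CONFIG_DIR/profile-a.jsonc"
printf '{ "model": "xai-chat/MiniMax-M2.5" }\n' > "$CONFIG_DIR/profile-b.jsonc"

# Switching profiles = overwriting the live config with the chosen file.
cp "$CONFIG_DIR/profile-a.jsonc" "$CONFIG_DIR/opencode.jsonc"
grep '"model"' "$CONFIG_DIR/opencode.jsonc"
```

Run `opencode debug config` after each switch to confirm OpenCode picked up the new file.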


Important notes

  1. This guide uses OpenCode's openai provider and points baseURL to https://api.xairouter.com/v1.
  2. For gpt-5.4, explicitly add headers.originator = "opencode" on the openai/gpt-5.4 model. This lets codex-cloud recognize OpenCode traffic and apply the minimal Responses shim only for that client.
  3. Without that header, requests fall back to the generic Responses conversion path. It may still work, but it is not the recommended OpenCode-targeted setup.
  4. If you enable OpenCode's built-in Codex OAuth plugin, it connects directly to chatgpt.com/backend-api/codex/responses by default and does not go through XAI Router. If you want traffic to go through XAI Router, use the openai provider config in this guide.
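Since a missing originator header silently falls back to the generic Responses path (note 3), it can be worth sanity-checking the config file before running OpenCode. A minimal sketch using plain grep; it assumes the header lines look exactly as written in Profile A below, and the demo writes a stand-in config to a temp file (check ~/.config/opencode/opencode.jsonc in practice):

```shell
# Stand-in config in a temp dir; in practice this is the Profile A file below.
CONFIG="$(mktemp -d)/opencode.jsonc"
cat > "$CONFIG" << 'JSON'
{
  "provider": {
    "openai": {
      "models": {
        "gpt-5.4": { "headers": { "originator": "opencode" } }
      }
    }
  }
}
JSON

# Warn if the OpenCode-targeted shim would not be applied.
if grep -q '"originator": "opencode"' "$CONFIG"; then
  echo "originator header present"
else
  echo "originator header missing: requests will use the generic Responses path" >&2
fi
```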

Profile A: gpt-5.4 (Responses-first path)

Linux/macOS

export XAI_API_KEY="sk-xxx"
mkdir -p ~/.config/opencode
cat > ~/.config/opencode/opencode.jsonc << 'JSON'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.4",
  "small_model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "gpt-5.4": {
          "headers": {
            "originator": "opencode"
          }
        }
      }
    }
  }
}
JSON

opencode debug config
opencode run "hello"

Windows (PowerShell)

$env:XAI_API_KEY="sk-xxx"
New-Item -ItemType Directory -Path "$env:USERPROFILE\.config\opencode" -Force | Out-Null
@'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.4",
  "small_model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "gpt-5.4": {
          "headers": {
            "originator": "opencode"
          }
        }
      }
    }
  }
}
'@ | Set-Content -Path "$env:USERPROFILE\.config\opencode\opencode.jsonc" -Encoding utf8

opencode debug config
opencode run "hello"

Profile B: MiniMax-M2.5 (Chat API)

Linux/macOS

export XAI_API_KEY="sk-xxx"
mkdir -p ~/.config/opencode
cat > ~/.config/opencode/opencode.jsonc << 'JSON'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai-chat/MiniMax-M2.5",
  "small_model": "xai-chat/MiniMax-M2.5",
  "provider": {
    "xai-chat": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "MiniMax-M2.5": {}
      }
    }
  }
}
JSON

opencode debug config
opencode run "hello"

Windows (PowerShell)

$env:XAI_API_KEY="sk-xxx"
New-Item -ItemType Directory -Path "$env:USERPROFILE\.config\opencode" -Force | Out-Null
@'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai-chat/MiniMax-M2.5",
  "small_model": "xai-chat/MiniMax-M2.5",
  "provider": {
    "xai-chat": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "MiniMax-M2.5": {}
      }
    }
  }
}
'@ | Set-Content -Path "$env:USERPROFILE\.config\opencode\opencode.jsonc" -Encoding utf8

opencode debug config
opencode run "hello"

Which profile should I use?

  1. If you want gpt-5.4, use Profile A and keep headers.originator = "opencode".
  2. If you want MiniMax-M2.5, use Profile B.
  3. Do not mix both profiles in the same opencode.jsonc.
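To confirm which profile a given opencode.jsonc selects, reading the top-level "model" key is usually enough. A minimal sketch; the file path and case labels are illustrative, and the demo writes a stand-in config to a temp file:

```shell
# Stand-in config; in practice point CONFIG at ~/.config/opencode/opencode.jsonc.
CONFIG="$(mktemp -d)/opencode.jsonc"
printf '{ "model": "openai/gpt-5.4" }\n' > "$CONFIG"

# The first "model" entry in the file is the active top-level model.
case "$(grep -o '"model": "[^"]*"' "$CONFIG" | head -n 1)" in
  *openai/gpt-5.4*)        echo "Profile A (Responses API)" ;;
  *xai-chat/MiniMax-M2.5*) echo "Profile B (Chat API)" ;;
  *)                       echo "Unknown profile" ;;
esac
```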