Configure OpenCode with XAI Router (Responses-first + Chat)
Posted January 10, 2026 by The XAI Tech Team · 2 min read
OpenCode (`opencode`) can be routed through XAI Router via one unified endpoint. For `gpt-5.4`, the recommended setup keeps OpenCode on its own Responses request shape and lets XAI Router apply only the minimal compatibility shim that client needs. Regular OpenAI-compatible models continue to use the standard compatibility API. This guide covers two production-ready profiles:
- `gpt-5.4` (Responses API)
- `MiniMax-M2.5` (Chat API)
Prerequisites
- You have already created an API key at m.xairouter.com.
- `opencode` is installed locally.
Config file location
- Linux/macOS: `~/.config/opencode/opencode.jsonc`
- Windows: `%USERPROFILE%\.config\opencode\opencode.jsonc`
Keep only one profile active at a time. To switch profiles, overwrite `opencode.jsonc`.
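Because switching profiles overwrites the live config, it can be worth snapshotting the current file first. A minimal sketch; the `.bak.<timestamp>` naming is our own convention, not an OpenCode feature:

```shell
#!/bin/sh
# Copy the live OpenCode config aside before a profile switch.
# The ".bak.<timestamp>" suffix is a hypothetical convention.
backup_opencode_config() {
  config="$HOME/.config/opencode/opencode.jsonc"
  if [ -f "$config" ]; then
    cp "$config" "$config.bak.$(date +%Y%m%d%H%M%S)"
    echo "backed up $config"
  else
    echo "no existing config to back up"
  fi
}
```

Run `backup_opencode_config` before each overwrite; restore by copying the backup back over `opencode.jsonc`.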
Important notes
- This guide uses OpenCode's `openai` provider and points `baseURL` to `https://api.xairouter.com/v1`.
- For `gpt-5.4`, explicitly add `headers.originator = "opencode"` on the `openai/gpt-5.4` model. This lets `codex-cloud` recognize OpenCode traffic and apply the minimal Responses shim only for that client.
- Without that header, requests fall back to the generic Responses conversion path. That path may still work, but it is not the recommended OpenCode-targeted setup.
- If you enable OpenCode's built-in Codex OAuth plugin, it connects directly to `chatgpt.com/backend-api/codex/responses` by default and does not go through XAI Router. To route that traffic through XAI Router, use the `openai` provider config in this guide.
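After writing the Profile A file you can sanity-check it before launching OpenCode. A small sketch, assuming `python3` is on `PATH` and the file is comment-free JSON (both profiles in this guide are; a `.jsonc` file with comments would need a JSONC-aware parser):

```shell
#!/bin/sh
# Verify a Profile A config: valid JSON, the expected baseURL, and the
# originator header that the Responses shim keys on.
check_profile_a() {
  python3 - "$1" <<'PY'
import json, sys

cfg = json.load(open(sys.argv[1]))
opts = cfg["provider"]["openai"]["options"]
assert opts["baseURL"] == "https://api.xairouter.com/v1", opts["baseURL"]
hdr = cfg["provider"]["openai"]["models"]["gpt-5.4"]["headers"]["originator"]
assert hdr == "opencode", hdr
print("profile A looks good")
PY
}
```

Usage: `check_profile_a ~/.config/opencode/opencode.jsonc`.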
Profile A: gpt-5.4 (Responses-first path)
Linux/macOS
```shell
export XAI_API_KEY="sk-xxx"
mkdir -p ~/.config/opencode
cat > ~/.config/opencode/opencode.jsonc << 'JSON'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.4",
  "small_model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "gpt-5.4": {
          "headers": {
            "originator": "opencode"
          }
        }
      }
    }
  }
}
JSON
opencode debug config
opencode run "hello"
```

Windows (PowerShell)
```powershell
$env:XAI_API_KEY="sk-xxx"
New-Item -ItemType Directory -Path "$env:USERPROFILE\.config\opencode" -Force | Out-Null
@'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "openai/gpt-5.4",
  "small_model": "openai/gpt-5.4",
  "provider": {
    "openai": {
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "gpt-5.4": {
          "headers": {
            "originator": "opencode"
          }
        }
      }
    }
  }
}
'@ | Set-Content -Path "$env:USERPROFILE\.config\opencode\opencode.jsonc" -Encoding utf8
opencode debug config
opencode run "hello"
```

Profile B: MiniMax-M2.5 (Chat API)
Linux/macOS
```shell
export XAI_API_KEY="sk-xxx"
mkdir -p ~/.config/opencode
cat > ~/.config/opencode/opencode.jsonc << 'JSON'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai-chat/MiniMax-M2.5",
  "small_model": "xai-chat/MiniMax-M2.5",
  "provider": {
    "xai-chat": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "MiniMax-M2.5": {}
      }
    }
  }
}
JSON
opencode debug config
opencode run "hello"
```

Windows (PowerShell)
```powershell
$env:XAI_API_KEY="sk-xxx"
New-Item -ItemType Directory -Path "$env:USERPROFILE\.config\opencode" -Force | Out-Null
@'
{
  "$schema": "https://opencode.ai/config.json",
  "model": "xai-chat/MiniMax-M2.5",
  "small_model": "xai-chat/MiniMax-M2.5",
  "provider": {
    "xai-chat": {
      "npm": "@ai-sdk/openai-compatible",
      "options": {
        "baseURL": "https://api.xairouter.com/v1",
        "apiKey": "{env:XAI_API_KEY}"
      },
      "models": {
        "MiniMax-M2.5": {}
      }
    }
  }
}
'@ | Set-Content -Path "$env:USERPROFILE\.config\opencode\opencode.jsonc" -Encoding utf8
opencode debug config
opencode run "hello"
```

Which profile should I use?
- If you want `gpt-5.4`, use Profile A and keep `headers.originator = "opencode"`.
- If you want `MiniMax-M2.5`, use Profile B.
- Do not mix both profiles in the same `opencode.jsonc`.
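Since only one profile may live in `opencode.jsonc` at a time, a workable pattern is to keep each profile in its own file and copy the chosen one into place. A sketch; the `profile-a.jsonc` / `profile-b.jsonc` filenames are our own convention, not anything OpenCode looks for:

```shell
#!/bin/sh
# Activate one of two stored profiles by copying it over opencode.jsonc.
# Usage: switch_profile a   (Profile A)  or  switch_profile b  (Profile B)
switch_profile() {
  dir="$HOME/.config/opencode"
  case "$1" in
    a|b) cp "$dir/profile-$1.jsonc" "$dir/opencode.jsonc" ;;
    *)   echo "usage: switch_profile a|b" >&2; return 1 ;;
  esac
}
```

Save Profile A as `profile-a.jsonc` and Profile B as `profile-b.jsonc` in the config directory, then run `switch_profile a` or `switch_profile b` followed by `opencode debug config` to confirm.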