OpenAI
GPT-5.1 Codex Max
OpenAI's 2025 flagship code-specialized model, delivering its most capable code understanding and generation to date, with a massive context window for complex coding tasks
Pricing & Specifications
💰 Pricing
Input: $1.25 / M tokens
Output: $10 / M tokens
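As a rough cost sketch based on the prices above (the request sizes below are purely hypothetical), per-request cost can be estimated like this:
# Hypothetical per-request cost estimate using the listed prices
INPUT_PRICE_PER_M = 1.25   # USD per 1M input tokens
OUTPUT_PRICE_PER_M = 10.0  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single request."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: 120K prompt tokens + 8K completion tokens (hypothetical sizes)
print(f"${estimate_cost(120_000, 8_000):.4f}")  # -> $0.2300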
⚙️ Specifications
Context Length: 512K tokens
Model Type: Flagship Code-Specialized Model
Key Features: Complex System Architecture, Code Review, Deep Refactoring, Performance Optimization
Caching Support: Supports Prompt Caching Acceleration
Reasoning Capability: Enhanced Code Reasoning and Problem Solving
Knowledge Cutoff: June 2025
API Compatibility: OpenAI API, Codex CLI
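Prompt caching savings are reported in the usage block of each response; the sketch below assumes the router forwards the OpenAI-style usage.prompt_tokens_details.cached_tokens field unchanged:
from openai import OpenAI

# Sketch: check how many prompt tokens were served from the prompt cache.
# Assumes the router passes through OpenAI-style usage details.
client = OpenAI(api_key="your-api-key", base_url="https://api.xairouter.com/v1")
response = client.chat.completions.create(
    model="gpt-5.1-codex-max",
    messages=[{"role": "user", "content": "Review this module for thread-safety issues"}],
)
details = response.usage.prompt_tokens_details
print("cached prompt tokens:", (details.cached_tokens or 0) if details else 0)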
API Call Examples
Python (OpenAI SDK)
from openai import OpenAI
client = OpenAI(
api_key="your-api-key",
base_url="https://api.xairouter.com/v1"
)
response = client.chat.completions.create(
model="gpt-5.1-codex-max",
messages=[
{"role": "user", "content": "Design a high-performance distributed cache system architecture"}
]
)
print(response.choices[0].message.content)
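For long outputs such as architecture write-ups, the same client can stream the reply as it is generated (stream=True is a standard OpenAI SDK parameter; that the router passes streaming through is an assumption):
# Streaming sketch: print tokens as they arrive instead of waiting for the full reply
stream = client.chat.completions.create(
    model="gpt-5.1-codex-max",
    messages=[
        {"role": "user", "content": "Design a high-performance distributed cache system architecture"}
    ],
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)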
cURL (OpenAI API)
curl https://api.xairouter.com/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer YOUR_API_KEY" \
-d '{
"model": "gpt-5.1-codex-max",
"messages": [
{"role": "user", "content": "Design a high-performance distributed cache system architecture"}
]
}'
Codex CLI (Developer Assist)
# Configure ~/.codex/config.toml
cat > ~/.codex/config.toml << 'EOF'
# Route requests through the "xai" provider defined in [model_providers.xai] below
model_provider = "xai"
model = "gpt-5.1-codex-max"
model_reasoning_effort = "high"
model_reasoning_summary = "detailed"
approval_policy = "never"            # never pause to ask for command approval
sandbox_mode = "danger-full-access"  # no sandboxing; commands run with full host access
network_access = true
preferred_auth_method = "apikey"     # authenticate with an API key rather than ChatGPT login
[shell_environment_policy]
inherit = "all"
ignore_default_excludes = false
[model_providers.xai]
name = "xai"
base_url = "https://api.xairouter.com"
wire_api = "responses"        # use the Responses API wire format
requires_openai_auth = false
env_key = "OPENAI_API_KEY"    # environment variable that supplies the API key
[features]
web_search_request = true
[notice]
hide_gpt5_1_migration_prompt = true
hide_rate_limit_model_nudge = false
EOF
# Configure ~/.codex/auth.json
cat > ~/.codex/auth.json << 'EOF'
{
"OPENAI_API_KEY": "sk-Xvs..."
}
EOF
# Launch Codex CLI
codex -m gpt-5.1-codex-max