
GPT-5.1 Codex

OpenAI's 2025 code-specialized model, focused on code understanding, generation, debugging, and optimization, with prompt caching support

Category: Text
Model ID: gpt-5.1-codex
Context Length: 256K tokens
Model Type: Code-Specialized Model
Key Features: Code Understanding, Generation, Debugging, Optimization

Pricing and Specifications

💰 Pricing

Input: $1.25 / M tokens
Output: $10 / M tokens
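
For budgeting, per-request cost can be estimated directly from these rates. The snippet below is a minimal sketch; the token counts would come from the `usage` object of a Chat Completions response (standard OpenAI field names assumed).

# Minimal cost-estimate sketch based on the listed rates:
# $1.25 per 1M input tokens, $10 per 1M output tokens.
INPUT_PER_M = 1.25   # USD per million input tokens
OUTPUT_PER_M = 10.0  # USD per million output tokens

def estimate_cost(prompt_tokens: int, completion_tokens: int) -> float:
    """Estimated USD cost of a single request at the listed rates."""
    return (prompt_tokens / 1_000_000) * INPUT_PER_M + \
           (completion_tokens / 1_000_000) * OUTPUT_PER_M

# Example: 2,000 input tokens and 500 output tokens
# -> 0.002 * 1.25 + 0.0005 * 10 = $0.0075
print(estimate_cost(2_000, 500))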

⚙️ Specifications

Context Length: 256K tokens
Model Type: Code-Specialized Model
Key Features: Code Understanding, Generation, Debugging, Optimization
Caching Support: Supports prompt caching acceleration (see the example after this list)
Knowledge Cutoff: June 2025
API Compatibility: OpenAI API, Codex CLI
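
Prompt caching reuses identical prompt prefixes across requests, so long system prompts or shared code context should be kept byte-for-byte the same between calls. The sketch below shows one way to check how many prefix tokens were served from cache; it assumes the gateway forwards the standard OpenAI usage.prompt_tokens_details field, which may not always be the case.

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.xairouter.com/v1"
)

# Keep long, stable prefixes (system prompt, shared code context) identical
# across requests so the cached prefix can be reused.
response = client.chat.completions.create(
    model="gpt-5.1-codex",
    messages=[
        {"role": "system", "content": "You are a senior Python code reviewer."},
        {"role": "user", "content": "Review this quicksort implementation for bugs."}
    ]
)

# In the standard OpenAI API, cached prefix tokens are reported under
# usage.prompt_tokens_details.cached_tokens; the field may be absent
# if the gateway does not forward it.
details = getattr(response.usage, "prompt_tokens_details", None)
cached = getattr(details, "cached_tokens", 0) if details is not None else 0
print(f"Cached prompt tokens: {cached}")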

API Call Examples

Python (OpenAI SDK)

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.xairouter.com/v1"
)

response = client.chat.completions.create(
    model="gpt-5.1-codex",
    messages=[
        {"role": "user", "content": "Write a quicksort function in Python"}
    ]
)

print(response.choices[0].message.content)

cURL (OpenAI API)

curl https://api.xairouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gpt-5.1-codex",
    "messages": [
      {"role": "user", "content": "Write a quicksort function in Python"}
    ]
  }'

Developer Assist (Codex CLI)

# Configure ~/.codex/config.toml
cat > ~/.codex/config.toml << 'EOF'
model_provider = "xai"
model = "gpt-5.1-codex"
model_reasoning_effort = "high"
model_reasoning_summary = "detailed"
approval_policy = "never"
sandbox_mode = "danger-full-access"
network_access = true
preferred_auth_method = "apikey"

[shell_environment_policy]
inherit = "all"
ignore_default_excludes = false

[model_providers.xai]
name = "xai"
base_url = "https://api.xairouter.com"
wire_api = "responses"
requires_openai_auth = true
env_key = "OPENAI_API_KEY"

[tools]
web_search = true
EOF

# Configure ~/.codex/auth.json
cat > ~/.codex/auth.json << 'EOF'
{
  "OPENAI_API_KEY": "sk-Xvs..."
}
EOF

# Launch Codex CLI
codex