
GPT-5.3 Codex Spark

An ultra-low-latency code model released by OpenAI in 2026, built for real-time coding collaboration and rapid iteration, with cache acceleration support.

Type: Text model
Model ID: gpt-5.3-codex-spark

Pricing and Specifications

💰 Pricing

Input: $1.75 / M tokens
Output: $14.00 / M tokens
Cached input: $0.175 / M tokens

⚙️ Specifications

Context length: 128K tokens
Model type: Ultra-low-latency code model (small)
Key capabilities: Real-time coding, rapid iteration, targeted code edits
Generation speed: 1000+ tokens/s (official demo)
Input/Output: Text input, text output
Release status: Research preview (February 12, 2026)
API compatibility: OpenAI, Codex
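
Based on the pricing above, the cost of a single request can be estimated from its token counts. A minimal Python sketch using the listed rates; the token counts in the example are hypothetical:

# Rough per-request cost estimate using the rates listed above ($ per million tokens).
INPUT_RATE = 1.75 / 1_000_000    # uncached input tokens
CACHED_RATE = 0.175 / 1_000_000  # cached input tokens
OUTPUT_RATE = 14.0 / 1_000_000   # output tokens

def estimate_cost(input_tokens: int, output_tokens: int, cached_tokens: int = 0) -> float:
    """Return the estimated cost in USD; cached prompt tokens are billed at the cache rate."""
    uncached = input_tokens - cached_tokens
    return uncached * INPUT_RATE + cached_tokens * CACHED_RATE + output_tokens * OUTPUT_RATE

# Example (hypothetical): 20K prompt tokens, 15K of them served from cache, 2K completion tokens
print(f"${estimate_cost(20_000, 2_000, cached_tokens=15_000):.4f}")  # $0.0394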

API Call Examples

Python (OpenAI SDK)

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.xairouter.com/v1"
)

response = client.chat.completions.create(
    model="gpt-5.3-codex-spark",
    messages=[
        {"role": "user", "content": "请帮我写一个 Python 快速排序函数"}
    ]
)

print(response.choices[0].message.content)
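
Given the model's focus on low latency and fast generation, streaming the response is usually preferable to waiting for the full completion. A minimal streaming sketch, assuming the gateway supports the OpenAI SDK's standard stream=True option:

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.xairouter.com/v1"
)

# Print tokens as they arrive instead of waiting for the complete reply.
stream = client.chat.completions.create(
    model="gpt-5.3-codex-spark",
    messages=[
        {"role": "user", "content": "Please write a Python quicksort function"}
    ],
    stream=True,
)

for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()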

cURL (OpenAI API)

curl https://api.xairouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gpt-5.3-codex-spark",
    "messages": [
      {"role": "user", "content": "请帮我写一个 Python 快速排序函数"}
    ]
  }'
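
The cached-input rate in the pricing table applies to prompt tokens served from the prompt cache, which is what makes repeated requests that share a long identical prefix cheaper. A sketch for checking how many prompt tokens were cached, assuming the gateway passes through the standard OpenAI usage.prompt_tokens_details.cached_tokens field:

from openai import OpenAI

client = OpenAI(
    api_key="your-api-key",
    base_url="https://api.xairouter.com/v1"
)

# Requests that repeat a long, identical prefix are the most likely to hit the cache.
response = client.chat.completions.create(
    model="gpt-5.3-codex-spark",
    messages=[
        {"role": "user", "content": "Please write a Python quicksort function"}
    ]
)

usage = response.usage
details = getattr(usage, "prompt_tokens_details", None)
cached = (getattr(details, "cached_tokens", None) or 0) if details else 0
print(f"prompt: {usage.prompt_tokens}, cached: {cached}, completion: {usage.completion_tokens}")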

Developer Tooling (Codex CLI)

# Write the Codex CLI config at ~/.codex/config.toml
cat > ~/.codex/config.toml << 'EOF'
model_provider = "xai"
model = "gpt-5.3-codex-spark"
approval_policy = "never"
sandbox_mode = "danger-full-access"
network_access = true
preferred_auth_method = "apikey"

[shell_environment_policy]
inherit = "all"
ignore_default_excludes = false

[model_providers.xai]
name = "xai"
base_url = "https://api.xairouter.com"
wire_api = "responses"
requires_openai_auth = false
env_key = "OPENAI_API_KEY"
web_search = true
EOF

# Set the environment variable (add it to ~/.bashrc or ~/.zshrc)
export OPENAI_API_KEY="sk-Xvs..."

# Launch Codex
codex
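
Before launching Codex, it can help to confirm that the key and gateway resolve correctly. A minimal check with the OpenAI SDK; whether this gateway exposes a model listing endpoint is an assumption:

import os
from openai import OpenAI

# Reuses the OPENAI_API_KEY exported above; the base_url matches the gateway.
client = OpenAI(
    api_key=os.environ["OPENAI_API_KEY"],
    base_url="https://api.xairouter.com/v1"
)

# List available models and check that gpt-5.3-codex-spark is among them (assumes /v1/models is supported).
model_ids = [m.id for m in client.models.list()]
print("gpt-5.3-codex-spark available:", "gpt-5.3-codex-spark" in model_ids)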