Mistral AI
Mistral Large 3
Mistral AI's flagship open-source MoE model with 675B total parameters, multimodal capabilities, and a 256K context window
Pricing & Specifications
💰 Pricing
Input: $0.5 / M tokens
Output: $1.5 / M tokens
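For a quick estimate of what a single request costs at these rates, the sketch below computes the price from token counts. The helper name and the example token counts are illustrative only, not part of the API.

Python
# Rates as listed above: input $0.5 / M tokens, output $1.5 / M tokens
INPUT_PRICE_PER_M = 0.5
OUTPUT_PRICE_PER_M = 1.5

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of one request."""
    return (input_tokens * INPUT_PRICE_PER_M +
            output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# Example: a 2,000-token prompt with a 500-token completion costs about $0.00175
print(f"${estimate_cost(2000, 500):.5f}")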
⚙️ Specifications
Context Length: 256K tokens
Architecture: MoE (41B/675B)
License: Apache 2.0
Special Capabilities: Multimodal · Multilingual
API Call Examples
Python
from openai import OpenAI

# The router exposes an OpenAI-compatible endpoint, so the standard client works
client = OpenAI(
    base_url="https://api.xairouter.com/v1",
    api_key="your-api-key"
)

response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[
        {"role": "user", "content": "Hello!"}
    ]
)

print(response.choices[0].message.content)

cURL
curl https://api.xairouter.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "mistral-large-latest",
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
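Since the model is listed as multimodal, image input can presumably be sent through the same chat completions endpoint. The sketch below assumes the router accepts OpenAI-style content parts (mixed "text" and "image_url" items) for this model; the image URL is a placeholder.

Python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.xairouter.com/v1",
    api_key="your-api-key"
)

# Assumption: the router supports OpenAI-style multimodal content parts for this model
response = client.chat.completions.create(
    model="mistral-large-latest",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image."},
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/photo.jpg"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)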