Mistral Small 3.1 24B
Mistral Small 3.1 24B — pricing, 128k context window, API cost calculator, and alternatives.
By Mistral
Specs
- Provider
- Mistral
- Slug
- mistralai/mistral-small-3-1-24b-instruct
- Capabilities
- vision
Pricing freshness
- Tier
- standard
- Currency
- USD
- As of
- 2026-05-08 17:08 UTC
Pricing history
Tracking Mistral Small 3.1 24B pricing since 2026-05-08. We'll plot a chart here once the price changes.
Quickstart — call Mistral Small 3.1 24B from your app
curl https://api.mistral.ai/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $MISTRAL_API_KEY" \
  -d '{
    "model": "mistral-small-3-1-24b-instruct",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
Official docs: https://docs.mistral.ai/api/
# pip install openai
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.mistral.ai/v1",
    api_key=os.environ["MISTRAL_API_KEY"],
)
resp = client.chat.completions.create(
    model="mistral-small-3-1-24b-instruct",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
// npm install openai
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.mistral.ai/v1",
  apiKey: process.env.MISTRAL_API_KEY,
});

const resp = await client.chat.completions.create({
  model: "mistral-small-3-1-24b-instruct",
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(resp.choices[0].message.content);
Related models
Similar capabilities, context window, and price tier — drawn from across the catalog so you can compare alternatives in one click.
Frequently asked questions
What is Mistral Small 3.1 24B?
Mistral Small 3.1 24B is a large language model API from Mistral with a 128k-token context window. It costs $0.35 per 1M input tokens and $0.56 per 1M output tokens.
How much does Mistral Small 3.1 24B cost?
Mistral Small 3.1 24B is priced at $0.35 per 1M input tokens and $0.56 per 1M output tokens via the Mistral API. A 50/50 input/output workload of 1M total tokens costs about $0.455.
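The arithmetic behind that figure can be sketched as a small calculator. The rates below are the prices listed on this page (USD per 1M tokens); verify them against the live pricing before relying on the estimate.

```python
# Hedged cost sketch for Mistral Small 3.1 24B, using this page's listed rates.
INPUT_RATE = 0.35   # USD per 1M input tokens (as listed here; verify)
OUTPUT_RATE = 0.56  # USD per 1M output tokens (as listed here; verify)

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost for one workload, pro-rated per token."""
    return (input_tokens / 1_000_000) * INPUT_RATE + (output_tokens / 1_000_000) * OUTPUT_RATE

# The FAQ's example: a 50/50 split of 1M total tokens.
print(f"${estimate_cost(500_000, 500_000):.3f}")  # → $0.455
```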
What is the context window of Mistral Small 3.1 24B?
Mistral Small 3.1 24B supports up to 128k tokens of context per request — roughly 256 pages of English text or 16000 lines of code at a typical density.
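The page/line figures above follow from simple density assumptions. The densities used here (500 tokens per page of prose, 8 tokens per line of code) are the approximations the answer implies, not measured values — actual tokenization varies by content.

```python
# Back-of-envelope: how much text fits in a 128k-token context window.
CONTEXT_TOKENS = 128_000
TOKENS_PER_PAGE = 500  # assumed density for English prose
TOKENS_PER_LINE = 8    # assumed density for code

print(CONTEXT_TOKENS // TOKENS_PER_PAGE)  # → 256 (pages of prose)
print(CONTEXT_TOKENS // TOKENS_PER_LINE)  # → 16000 (lines of code)
```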
Does Mistral Small 3.1 24B support vision, tool use, or JSON mode?
Mistral Small 3.1 24B supports image input (vision). It does not support tool/function calling or structured JSON mode.
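Since vision is the listed capability, here is a minimal sketch of a vision request payload, assuming the OpenAI-compatible `image_url` content-part shape; the URL is a placeholder, and the exact schema should be confirmed against the official docs (https://docs.mistral.ai/api/).

```python
# Hedged sketch: building a vision message for Mistral Small 3.1 24B in the
# OpenAI-compatible content-part format. The image URL is a placeholder.
def build_vision_messages(prompt: str, image_url: str) -> list:
    """Return a messages list pairing a text prompt with an image reference."""
    return [
        {
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": image_url}},
            ],
        }
    ]

messages = build_vision_messages("What is in this image?", "https://example.com/photo.jpg")
# Pass `messages` to client.chat.completions.create(
#     model="mistral-small-3-1-24b-instruct", messages=messages)
print(messages[0]["content"][1]["type"])  # → image_url
```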
Who makes Mistral Small 3.1 24B?
Mistral Small 3.1 24B is built and operated by Mistral. Pricing, context window, and capabilities on this page are refreshed nightly from Mistral's public catalog.
Can I self-host Mistral Small 3.1 24B?
Mistral released the Mistral Small 3.1 24B weights under the Apache 2.0 license, so it can be self-hosted; the pricing on this page applies to Mistral's hosted API.