Llama 3.1 Euryale 70B v2.2
Sao10K Llama 3.1 Euryale 70B v2.2 — pricing, 131k context window, API cost calculator and alternatives.
By Sao10K
Specs
- Provider: Sao10K
- Slug: sao10k/l3-1-euryale-70b
- Capabilities: tools, json_mode
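The json_mode capability means you can ask the API to return valid JSON by setting the OpenAI-compatible `response_format` field. A minimal sketch using only the standard library, assuming the OpenRouter endpoint shown in the quickstart below (the request is only sent when an API key is present):

```python
import json
import os
import urllib.request

# Request body asking the model to reply with a JSON object.
payload = {
    "model": "sao10k/l3-1-euryale-70b",
    "messages": [{"role": "user", "content": "List three colors as a JSON object."}],
    "response_format": {"type": "json_object"},
}

api_key = os.environ.get("OPENROUTER_API_KEY")
if api_key:  # skip the network call when no key is configured
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
        # The content string should parse as JSON when json_mode is honored.
        print(body["choices"][0]["message"]["content"])
```

With `response_format` set, the returned `content` string is itself parseable with `json.loads`.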
Pricing freshness
- Tier: standard
- Currency: USD
- As of: 2026-05-08 17:08 UTC
Pricing history
Tracking Llama 3.1 Euryale 70B v2.2 pricing since 2026-05-08. Once the price changes, we'll plot its history here.
Quickstart — call Llama 3.1 Euryale 70B v2.2 from your app
curl https://openrouter.ai/api/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENROUTER_API_KEY" \
-d '{
"model": "sao10k/l3-1-euryale-70b",
"messages": [{"role": "user", "content": "Hello!"}]
}'
Official docs: https://openrouter.ai/docs
# pip install openai
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)
)
resp = client.chat.completions.create(
model="sao10k/l3-1-euryale-70b",
messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
// npm install openai
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://openrouter.ai/api/v1",
apiKey: process.env.OPENROUTER_API_KEY,
});
const resp = await client.chat.completions.create({
model: "sao10k/l3-1-euryale-70b",
messages: [{ role: "user", content: "Hello!" }],
});
console.log(resp.choices[0].message.content);
Related models
Similar capabilities, context window, and price tier — drawn from across the catalog so you can compare alternatives in one click.
Frequently asked questions
What is Llama 3.1 Euryale 70B v2.2?
Llama 3.1 Euryale 70B v2.2 is a large language model from Sao10K, fine-tuned from Meta's Llama 3.1 70B, with a 131k-token context window. It costs $0.85 per 1M input tokens and $0.85 per 1M output tokens.
How much does Llama 3.1 Euryale 70B v2.2 cost?
Llama 3.1 Euryale 70B v2.2 is priced at $0.85 per 1M input tokens and $0.85 per 1M output tokens via the Sao10K API. A 50/50 input/output workload of 1M total tokens costs about $0.85.
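The arithmetic behind that estimate is straightforward; here is a small cost helper using the rates listed on this page:

```python
# Prices from this page: $0.85 per 1M tokens for both input and output.
INPUT_USD_PER_M = 0.85
OUTPUT_USD_PER_M = 0.85

def cost_usd(input_tokens: int, output_tokens: int) -> float:
    """Estimate a request's cost in USD at this model's listed rates."""
    return (input_tokens * INPUT_USD_PER_M
            + output_tokens * OUTPUT_USD_PER_M) / 1_000_000

# A 50/50 workload of 1M total tokens:
print(round(cost_usd(500_000, 500_000), 4))  # 0.85
```

Because input and output rates are identical here, the input/output split does not change the total; for models with asymmetric pricing it would.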
What is the context window of Llama 3.1 Euryale 70B v2.2?
Llama 3.1 Euryale 70B v2.2 supports up to 131k tokens of context per request — roughly 262 pages of English text or 16384 lines of code at a typical density.
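Those page and line figures follow from the raw token count. A quick check, assuming "131k" means 131,072 tokens and the page's typical densities of roughly 500 tokens per page and 8 tokens per line of code:

```python
CONTEXT_TOKENS = 131_072   # assumed expansion of the "131k" figure
TOKENS_PER_PAGE = 500      # assumed density for English prose
TOKENS_PER_LINE = 8        # assumed density for source code

print(CONTEXT_TOKENS // TOKENS_PER_PAGE)  # 262 pages
print(CONTEXT_TOKENS // TOKENS_PER_LINE)  # 16384 lines of code
```

Actual token counts vary by tokenizer and content, so treat these as order-of-magnitude estimates.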
Does Llama 3.1 Euryale 70B v2.2 support vision, tool use, or JSON mode?
Llama 3.1 Euryale 70B v2.2 supports tool/function calling and structured JSON mode. It does not support image input (vision).
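Tool calling uses the OpenAI-compatible `tools` field: you describe functions as JSON Schema, and the model may reply with `tool_calls` instead of plain text. A minimal sketch with the standard library, where `get_weather` is a hypothetical function name chosen for illustration (the request is only sent when an API key is present):

```python
import json
import os
import urllib.request

# Hypothetical tool schema; the name and parameters are illustrative only.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

payload = {
    "model": "sao10k/l3-1-euryale-70b",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
}

api_key = os.environ.get("OPENROUTER_API_KEY")
if api_key:  # skip the network call when no key is configured
    req = urllib.request.Request(
        "https://openrouter.ai/api/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        message = json.load(resp)["choices"][0]["message"]
        # When the model decides to call the tool, the reply carries
        # tool_calls (with JSON-encoded arguments) instead of content.
        print(message.get("tool_calls") or message.get("content"))
```

Your application executes the named function itself and sends the result back as a `tool` role message for the model to finish the answer.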
Who makes Llama 3.1 Euryale 70B v2.2?
Llama 3.1 Euryale 70B v2.2 is built and operated by Sao10K. Pricing, context window, and capabilities on this page are refreshed nightly from Sao10K's public catalog.
Can I self-host Llama 3.1 Euryale 70B v2.2?
Yes, with caveats: Sao10K publishes the Euryale v2.2 weights openly on Hugging Face as a Llama 3.1 70B fine-tune, so self-hosting is possible if you have the substantial GPU memory a 70B model demands. The hosted API remains the quickest way to start.