Llama 3 Euryale 70B v2.1
Sao10K Llama 3 Euryale 70B v2.1 — pricing, 8k context window, API cost calculator and alternatives.
By Sao10K
Specs
- Provider
- Sao10K
- Slug
- sao10k/l3-euryale-70b
- Capabilities
- tools
Pricing freshness
- Tier
- standard
- Currency
- USD
- As of
- 2026-05-08 17:08 UTC
Pricing history
Tracking Llama 3 Euryale 70B v2.1 pricing since 2026-05-08. We'll plot the chart here once it changes.
Quickstart — call Llama 3 Euryale 70B v2.1 from your app
curl https://openrouter.ai/api/v1/chat/completions \
-H "Content-Type: application/json" \
-H "Authorization: Bearer $OPENROUTER_API_KEY" \
-d '{
"model": "sao10k/l3-euryale-70b",
"messages": [{"role": "user", "content": "Hello!"}]
}'
Official docs: https://openrouter.ai/docs
# pip install openai
import os
from openai import OpenAI
client = OpenAI(
base_url="https://openrouter.ai/api/v1",
api_key=os.environ["OPENROUTER_API_KEY"],
)
resp = client.chat.completions.create(
model="sao10k/l3-euryale-70b",
messages=[{"role": "user", "content": "Hello!"}],
)
print(resp.choices[0].message.content)
Official docs: https://openrouter.ai/docs
// npm install openai
import OpenAI from "openai";
const client = new OpenAI({
baseURL: "https://openrouter.ai/api/v1",
apiKey: process.env.OPENROUTER_API_KEY,
});
const resp = await client.chat.completions.create({
model: "sao10k/l3-euryale-70b",
messages: [{ role: "user", content: "Hello!" }],
});
console.log(resp.choices[0].message.content);
Official docs: https://openrouter.ai/docs
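Since the capabilities list includes tools, here is a hedged sketch of what a tool-calling request body could look like. The `get_weather` tool and its schema are made-up examples, not part of this model's catalog entry; POST the payload to the same endpoint as in the curl quickstart above.

```python
import json

# Sketch of a chat.completions request body with a hypothetical tool schema.
# Send it to https://openrouter.ai/api/v1/chat/completions as in the curl example.
payload = {
    "model": "sao10k/l3-euryale-70b",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example tool
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
}
print(json.dumps(payload, indent=2))
```

If the model decides to invoke the tool, the response's `message.tool_calls` field carries the function name and JSON arguments for your app to execute.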
Related models
Similar capabilities, context window, and price tier — drawn from across the catalog so you can compare alternatives in one click.
Frequently asked questions
What is Llama 3 Euryale 70B v2.1?
Llama 3 Euryale 70B v2.1 is a large language model API from Sao10K with an 8k-token context window. It costs $1.48 per 1M input tokens and $1.48 per 1M output tokens.
How much does Llama 3 Euryale 70B v2.1 cost?
Llama 3 Euryale 70B v2.1 is priced at $1.48 per 1M input tokens and $1.48 per 1M output tokens via the Sao10K API. A 50/50 input/output workload of 1M total tokens costs about $1.48.
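The per-request arithmetic behind that figure, as a minimal sketch using the listed rates:

```python
# Rough cost estimate at the listed rates for Llama 3 Euryale 70B v2.1.
INPUT_PER_M = 1.48   # USD per 1M input tokens
OUTPUT_PER_M = 1.48  # USD per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the approximate USD cost of one request."""
    return (input_tokens / 1e6) * INPUT_PER_M + (output_tokens / 1e6) * OUTPUT_PER_M

# 50/50 split of 1M total tokens:
print(f"${estimate_cost(500_000, 500_000):.2f}")  # → $1.48
```

Because input and output are priced identically here, only the total token count matters; for models with asymmetric rates the input/output split changes the result.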
What is the context window of Llama 3 Euryale 70B v2.1?
Llama 3 Euryale 70B v2.1 supports up to 8k tokens of context per request — roughly 16 pages of English text or 1024 lines of code at a typical density.
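The page-count figure above follows from common rule-of-thumb ratios. A back-of-envelope sketch, assuming roughly 4 characters per token and 2,000 characters per page (both heuristics, not exact tokenizer properties):

```python
# Back-of-envelope conversion from the 8,192-token context window.
CONTEXT_TOKENS = 8192
CHARS_PER_TOKEN = 4    # rough heuristic for English text
CHARS_PER_PAGE = 2000  # rough heuristic for a printed page

pages = CONTEXT_TOKENS * CHARS_PER_TOKEN / CHARS_PER_PAGE
print(round(pages))  # → 16
```

Actual capacity varies with the tokenizer and the density of the text; code and non-English text typically consume more tokens per character.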
Does Llama 3 Euryale 70B v2.1 support vision, tool use, or JSON mode?
Llama 3 Euryale 70B v2.1 supports tool/function calling. It does not support image input (vision) or structured JSON mode.
Who makes Llama 3 Euryale 70B v2.1?
Llama 3 Euryale 70B v2.1 is built and operated by Sao10K. Pricing, context window, and capabilities on this page are refreshed nightly from Sao10K's public catalog.
Can I self-host Llama 3 Euryale 70B v2.1?
Llama 3 Euryale 70B v2.1 is API-only — its weights are not publicly distributed by Sao10K, so it cannot be self-hosted today.