
Llama 3.1 70B Hanami x1

Sao10K Llama 3.1 70B Hanami x1 — pricing, 16k context window, API cost calculator and alternatives.

By Sao10K

Context window: maximum tokens (input + output) the model can process in a single request.
16,000 tokens
Input price: what the model charges for tokens you send (prompt + context).
$3.0000 per 1M tokens
Output price: what the model charges for tokens it generates (usually 3–5× pricier than input).
$3.0000 per 1M tokens

Specs

Provider
Sao10K
Slug
sao10k/l3-1-70b-hanami-x1
Capabilities
text-only

Pricing freshness

Tier
standard
Currency
USD
As of
2026-05-08 17:08 UTC

Pricing history

Tracking Llama 3.1 70B Hanami x1 pricing since 2026-05-08. We'll plot the chart here once it changes.

Quickstart — call Llama 3.1 70B Hanami x1 from your app

curl https://openrouter.ai/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENROUTER_API_KEY" \
  -d '{
    "model": "sao10k/l3-1-70b-hanami-x1",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'

Official docs: https://openrouter.ai/docs
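The same request can be made from Python. A minimal sketch using only the standard library, assuming `OPENROUTER_API_KEY` is set in your environment (the helper function name is ours, not part of any SDK):

```python
import json
import os
import urllib.request

API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str):
    """Assemble the URL, headers, and JSON body for one chat completion call."""
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return API_URL, headers, json.dumps(body).encode("utf-8")

if __name__ == "__main__":
    url, headers, data = build_chat_request(
        "sao10k/l3-1-70b-hanami-x1", "Hello!", os.environ["OPENROUTER_API_KEY"]
    )
    req = urllib.request.Request(url, data=data, headers=headers, method="POST")
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
    # The assistant's text lives in the first choice of the response.
    print(reply["choices"][0]["message"]["content"])
```

Splitting payload construction from the network call makes the request easy to inspect or test before you spend tokens.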


Frequently asked questions

What is Llama 3.1 70B Hanami x1?

Llama 3.1 70B Hanami x1 is a large language model API from Sao10K with a 16k-token context window. It costs $3 per 1M input tokens and $3 per 1M output tokens.

How much does Llama 3.1 70B Hanami x1 cost?

Llama 3.1 70B Hanami x1 is priced at $3 per 1M input tokens and $3 per 1M output tokens via the Sao10K API. A 50/50 input/output workload of 1M total tokens costs about $3.
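The arithmetic behind that estimate is simple per-million-token pricing. A sketch, with the prices hardcoded from this page (not fetched live):

```python
INPUT_PRICE = 3.00   # USD per 1M input tokens, per this page
OUTPUT_PRICE = 3.00  # USD per 1M output tokens, per this page

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD for one request at this model's listed prices."""
    return (input_tokens * INPUT_PRICE + output_tokens * OUTPUT_PRICE) / 1_000_000

# A 50/50 split of 1M total tokens, as in the FAQ answer:
print(request_cost(500_000, 500_000))  # 3.0
```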

What is the context window of Llama 3.1 70B Hanami x1?

Llama 3.1 70B Hanami x1 supports up to 16k tokens of context per request — roughly 32 pages of English text or 2000 lines of code at a typical density.
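A quick way to check whether a prompt fits the 16k window is the common rule of thumb of roughly 4 characters per English token. This is a heuristic, not the model's actual tokenizer, and the function names below are illustrative:

```python
CONTEXT_WINDOW = 16_000  # tokens, per this page

def estimate_tokens(text: str) -> int:
    """Rough token estimate: ~4 characters per token for English text."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, reserved_for_output: int = 1_000) -> bool:
    """True if the prompt likely leaves room for the reserved output tokens."""
    return estimate_tokens(prompt) + reserved_for_output <= CONTEXT_WINDOW
```

For exact counts, tokenize with the model's own tokenizer before sending; the 4-characters heuristic can be off by 20% or more on code or non-English text.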

Does Llama 3.1 70B Hanami x1 support vision, tool use, or JSON mode?

Llama 3.1 70B Hanami x1 is a text-only model — it does not support vision, tool use, or structured JSON mode.

Who makes Llama 3.1 70B Hanami x1?

Llama 3.1 70B Hanami x1 is built and operated by Sao10K. Pricing, context window, and capabilities on this page are refreshed nightly from Sao10K's public catalog.

Can I self-host Llama 3.1 70B Hanami x1?

Llama 3.1 70B Hanami x1 is API-only — its weights are not publicly distributed by Sao10K, so it cannot be self-hosted today.
