LLM Cloud Hub
Vendor comparison

MiniMax vs Sao10K

Every MiniMax and Sao10K model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream sources.

MiniMax

8 models
Model                Capabilities      Context  In $/1M  Out $/1M
MiniMax M1           tools             1000k    0.4000   2.2000
MiniMax M2           tools, json_mode  197k     0.2550   1.0000
MiniMax M2-her       -                 66k      0.3000   1.2000
MiniMax M2.1         tools, json_mode  197k     0.2900   0.9500
MiniMax M2.5         tools, json_mode  197k     0.1500   1.1500
MiniMax M2.5 (free)  tools, json_mode  197k     0.0000   0.0000
MiniMax M2.7         tools, json_mode  197k     0.2600   1.2000
MiniMax-01           vision            1000k    0.2000   1.1000

Sao10K

5 models
Model                       Capabilities      Context  In $/1M  Out $/1M
Llama 3 8B Lunaris          json_mode         8k       0.0400   0.0500
Llama 3 Euryale 70B v2.1    tools             8k       1.4800   1.4800
Llama 3.1 70B Hanami x1     -                 16k      3.0000   3.0000
Llama 3.1 Euryale 70B v2.2  tools, json_mode  131k     0.8500   0.8500
Llama 3.3 Euryale 70B       json_mode         131k     0.6500   0.7500
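
The per-million-token prices in the tables above convert to a per-request dollar cost by scaling each token count. A minimal sketch, assuming simple linear pricing with no caching or volume discounts; the `request_cost` helper and the sample token counts are illustrative, not part of the site:

```python
def request_cost(in_tokens: int, out_tokens: int,
                 in_per_m: float, out_per_m: float) -> float:
    """USD cost of one request, given prices per 1M input/output tokens."""
    return (in_tokens / 1_000_000) * in_per_m + (out_tokens / 1_000_000) * out_per_m

# Example with MiniMax M2.5 prices from the table ($0.15 in, $1.15 out per 1M):
# 200k input + 50k output tokens.
cost = request_cost(200_000, 50_000, 0.15, 1.15)
print(round(cost, 6))  # → 0.0875
```

Because input and output are priced separately, a chat-heavy workload (long prompts, short replies) and a generation-heavy one (short prompts, long replies) can cost very differently on the same model.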
