
Baidu Qianfan vs Sao10K

Every Baidu Qianfan and Sao10K LLM model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

Baidu Qianfan

8 models
Model                        Capabilities     Context   In $/1M   Out $/1M
CoBuddy (free)               tools            131k      0.0000    0.0000
ERNIE 4.5 21B A3B            tools            120k      0.0700    0.2800
ERNIE 4.5 21B A3B Thinking   -                131k      0.0700    0.2800
ERNIE 4.5 300B A47B          -                123k      0.2800    1.1000
ERNIE 4.5 VL 28B A3B         vision, tools    30k       0.1400    0.5600
ERNIE 4.5 VL 424B A47B       vision           123k      0.4200    1.2500
Qianfan-OCR-Fast             vision           66k       0.6800    2.8100
Qianfan-OCR-Fast (free)      vision           66k       0.0000    0.0000

Sao10K

5 models
Model                        Capabilities       Context   In $/1M   Out $/1M
Llama 3 8B Lunaris           json_mode          8k        0.0400    0.0500
Llama 3 Euryale 70B v2.1     tools              8k        1.4800    1.4800
Llama 3.1 70B Hanami x1      -                  16k       3.0000    3.0000
Llama 3.1 Euryale 70B v2.2   tools, json_mode   131k      0.8500    0.8500
Llama 3.3 Euryale 70B        json_mode          131k      0.6500    0.7500
