LLM Cloud Hub
Vendor comparison

Baidu Qianfan vs Cohere

Every Baidu Qianfan and Cohere LLM model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

Baidu Qianfan

8 models
Model                        Capabilities    Context  In $/1M  Out $/1M
CoBuddy (free)               tools           131k      0.0000    0.0000
ERNIE 4.5 21B A3B            tools           120k      0.0700    0.2800
ERNIE 4.5 21B A3B Thinking   -               131k      0.0700    0.2800
ERNIE 4.5 300B A47B          -               123k      0.2800    1.1000
ERNIE 4.5 VL 28B A3B         vision, tools    30k      0.1400    0.5600
ERNIE 4.5 VL 424B A47B       vision          123k      0.4200    1.2500
Qianfan-OCR-Fast             vision           66k      0.6800    2.8100
Qianfan-OCR-Fast (free)      vision           66k      0.0000    0.0000

Cohere

4 models
Model                   Capabilities       Context  In $/1M  Out $/1M
Command A               json_mode          256k      2.5000   10.0000
Command R (08-2024)     tools, json_mode   128k      0.1500    0.6000
Command R+ (08-2024)    tools, json_mode   128k      2.5000   10.0000
Command R7B (12-2024)   json_mode          128k      0.0375    0.1500
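The prices above are quoted in dollars per million tokens, so the cost of a single request is a simple pro-rata calculation over input and output tokens. A minimal sketch, using a few prices from the tables above (the token counts are illustrative):

```python
# Estimate the dollar cost of one request from per-million-token prices.
# The pairs below are the (In $/1M, Out $/1M) figures from the tables above.
PRICES = {
    "Command R (08-2024)": (0.15, 0.60),
    "Command R7B (12-2024)": (0.0375, 0.15),
    "ERNIE 4.5 300B A47B": (0.28, 1.10),
}

def request_cost(model: str, in_tokens: int, out_tokens: int) -> float:
    """Cost in USD for one request: tokens are billed pro rata per million."""
    in_price, out_price = PRICES[model]
    return (in_tokens / 1_000_000) * in_price + (out_tokens / 1_000_000) * out_price

# A 2,000-token prompt with a 500-token completion on Command R (08-2024):
cost = request_cost("Command R (08-2024)", 2_000, 500)
print(f"${cost:.6f}")  # → $0.000600
```

Note that at these scales the input/output asymmetry dominates: output tokens on most models cost roughly four times as much as input tokens, so long completions drive the bill more than long prompts.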
