LLM Cloud Hub
Vendor comparison

DeepSeek vs Z.ai

Every DeepSeek and Z.ai model side by side: pricing per million tokens, context window, and capabilities. Refreshed nightly from upstream.
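The per-million-token prices in the tables below convert to a per-request cost with simple arithmetic. A minimal Python sketch, using DeepSeek V3.2's listed prices ($0.2520 in, $0.3780 out) and an assumed 50k-input / 2k-output request:

```python
def request_cost(input_tokens: int, output_tokens: int,
                 in_per_million: float, out_per_million: float) -> float:
    """Cost in USD for one request, given per-1M-token prices."""
    return (input_tokens / 1_000_000) * in_per_million \
         + (output_tokens / 1_000_000) * out_per_million

# DeepSeek V3.2 prices from the table; request size is an assumption.
cost = request_cost(50_000, 2_000, 0.2520, 0.3780)
print(f"${cost:.4f}")  # → $0.0134
```

The same function works for any row: only the two price arguments change.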

DeepSeek

14 models
Model                     Capabilities        Context   In $/1M   Out $/1M
DeepSeek V3               tools, json_mode    164k      0.3200    0.8900
DeepSeek V3 0324          tools, json_mode    164k      0.2000    0.7700
DeepSeek V3.1             tools, json_mode    164k      0.2100    0.7900
DeepSeek V3.1 Terminus    tools, json_mode    164k      0.2700    0.9500
DeepSeek V3.2             tools, json_mode    131k      0.2520    0.3780
DeepSeek V3.2 Exp         tools, json_mode    164k      0.2700    0.4100
DeepSeek V3.2 Speciale    json_mode           164k      0.2870    0.4310
DeepSeek V4 Flash         tools, json_mode    1049k     0.1260    0.2520
DeepSeek V4 Flash (free)  tools               1049k     0.0000    0.0000
DeepSeek V4 Pro           tools, json_mode    1049k     0.4350    0.8700
R1                        tools               64k       0.7000    2.5000
R1 0528                   tools, json_mode    164k      0.5000    2.1500
R1 Distill Llama 70B      json_mode           131k      0.7000    0.8000
R1 Distill Qwen 32B       json_mode           33k       0.2900    0.2900

Z.ai

13 models
Model               Capabilities               Context   In $/1M   Out $/1M
GLM 4 32B           tools                      128k      0.1000    0.1000
GLM 4.5             tools, json_mode           131k      0.6000    2.2000
GLM 4.5 Air         tools                      131k      0.1300    0.8500
GLM 4.5 Air (free)  tools                      131k      0.0000    0.0000
GLM 4.5V            vision, tools              66k       0.6000    1.8000
GLM 4.6             tools, json_mode           203k      0.4300    1.7400
GLM 4.6V            vision, tools              131k      0.3000    0.9000
GLM 4.7             tools, json_mode           203k      0.4000    1.7500
GLM 4.7 Flash       tools, json_mode           203k      0.0600    0.4000
GLM 5               tools, json_mode           203k      0.6000    1.9200
GLM 5 Turbo         tools, json_mode           203k      1.2000    4.0000
GLM 5.1             tools, json_mode           203k      0.9800    3.0800
GLM 5V Turbo        vision, tools, json_mode   203k      1.2000    4.0000
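Catalogs like the two above lend themselves to programmatic filtering, e.g. "cheapest model with tools and json_mode and at least 200k context". A minimal sketch: the rows are transcribed from the tables, but the catalog is truncated to four entries and the blended-price weighting (80% input / 20% output) is an assumption, not something the site computes.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Model:
    name: str
    context_k: int    # context window, thousands of tokens
    in_per_m: float   # input price, $/1M tokens
    out_per_m: float  # output price, $/1M tokens
    caps: frozenset

# A few rows transcribed from the tables above (not the full catalog).
CATALOG = [
    Model("DeepSeek V3.2", 131, 0.2520, 0.3780, frozenset({"tools", "json_mode"})),
    Model("DeepSeek V4 Flash", 1049, 0.1260, 0.2520, frozenset({"tools", "json_mode"})),
    Model("GLM 4.7 Flash", 203, 0.0600, 0.4000, frozenset({"tools", "json_mode"})),
    Model("GLM 5", 203, 0.6000, 1.9200, frozenset({"tools", "json_mode"})),
]

def cheapest(required_caps: set, min_context_k: int, in_ratio: float = 0.8):
    """Cheapest model meeting the requirements, ranked by a blended
    per-1M price weighted in_ratio input : (1 - in_ratio) output."""
    candidates = [m for m in CATALOG
                  if required_caps <= m.caps and m.context_k >= min_context_k]
    return min(candidates,
               key=lambda m: in_ratio * m.in_per_m + (1 - in_ratio) * m.out_per_m,
               default=None)

print(cheapest({"tools", "json_mode"}, min_context_k=200).name)  # → GLM 4.7 Flash
```

With an input-heavy weighting the 200k-context filter drops DeepSeek V3.2, and GLM 4.7 Flash's $0.06/1M input price wins; an output-heavy workload would rank differently.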
