LLM Cloud Hub
Vendor comparison

xAI vs Z.ai

Every xAI and Z.ai model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

xAI

11 models
Model                  Capabilities              Context  In $/1M  Out $/1M
Grok 3                 tools, json_mode          131k     3.00     15.00
Grok 3 Beta            tools, json_mode          131k     3.00     15.00
Grok 3 Mini            tools, json_mode          131k     0.30     0.50
Grok 3 Mini Beta       tools, json_mode          131k     0.30     0.50
Grok 4                 vision, tools, json_mode  256k     3.00     15.00
Grok 4 Fast            vision, tools, json_mode  2000k    0.20     0.50
Grok 4.1 Fast          vision, tools, json_mode  2000k    0.20     0.50
Grok 4.20              vision, tools, json_mode  2000k    1.25     2.50
Grok 4.20 Multi-Agent  vision, json_mode         2000k    2.00     6.00
Grok 4.3               vision, tools, json_mode  1000k    1.25     2.50
Grok Code Fast 1       tools, json_mode          256k     0.20     1.50

Z.ai

13 models
Model               Capabilities              Context  In $/1M  Out $/1M
GLM 4 32B           tools                     128k     0.10     0.10
GLM 4.5             tools, json_mode          131k     0.60     2.20
GLM 4.5 Air         tools                     131k     0.13     0.85
GLM 4.5 Air (free)  tools                     131k     0.00     0.00
GLM 4.5V            vision, tools             66k      0.60     1.80
GLM 4.6             tools, json_mode          203k     0.43     1.74
GLM 4.6V            vision, tools             131k     0.30     0.90
GLM 4.7             tools, json_mode          203k     0.40     1.75
GLM 4.7 Flash       tools, json_mode          203k     0.06     0.40
GLM 5               tools, json_mode          203k     0.60     1.92
GLM 5 Turbo         tools, json_mode          203k     1.20     4.00
GLM 5.1             tools, json_mode          203k     0.98     3.08
GLM 5V Turbo        vision, tools, json_mode  203k     1.20     4.00
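Since all prices above are quoted per million tokens, a single request's cost is just a token-weighted sum of the input and output rates. A minimal sketch (the example rates are Grok 4 Fast's $0.20 in / $0.50 out from the table above; the function name is illustrative, not part of any API):

```typescript
// Estimate the cost of one request from per-million-token prices.
function requestCost(
  inTokens: number,
  outTokens: number,
  inPerMillion: number,
  outPerMillion: number,
): number {
  return (
    (inTokens / 1_000_000) * inPerMillion +
    (outTokens / 1_000_000) * outPerMillion
  );
}

// 10k input + 2k output tokens at Grok 4 Fast rates ($0.20 / $0.50):
// 0.01 * 0.20 + 0.002 * 0.50 ≈ $0.003
const cost = requestCost(10_000, 2_000, 0.20, 0.50);
```

The same function works for any row in either table; only the two rates change.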

Keyboard shortcuts

?    Show this overlay
/    Focus the first form field
g h  Go to / (home)
g b  Go to /best-llm-for
g c  Go to /cost
g s  Go to /self-hosted
g x  Go to /compliance
Esc  Close any overlay

Inspired by Linear and GitHub conventions. Two-key sequences (e.g. g then h) must be completed within about one second.
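The two-key behavior described above can be sketched as a small state machine: pressing g arms a prefix, and a second key within the ~1-second window completes the sequence. This is an illustrative sketch, not the site's actual implementation; the route map mirrors the shortcuts listed above, and the function names are hypothetical.

```typescript
// Linear/GitHub-style "g then <key>" sequence handler.
const SEQUENCE_TIMEOUT_MS = 1000;

// Second-key → destination, per the shortcut list above.
const routes: Record<string, string> = {
  h: "/",
  b: "/best-llm-for",
  c: "/cost",
  s: "/self-hosted",
  x: "/compliance",
};

// Timestamp at which "g" was pressed, or null if no prefix is armed.
let prefixArmedAt: number | null = null;

// Returns the route to navigate to, or null if no sequence completed.
function handleKey(key: string, now: number): string | null {
  if (prefixArmedAt !== null && now - prefixArmedAt <= SEQUENCE_TIMEOUT_MS) {
    // Second key of an armed sequence: consume the prefix either way.
    prefixArmedAt = null;
    return routes[key] ?? null;
  }
  if (key === "g") {
    prefixArmedAt = now; // arm the prefix and wait for the second key
    return null;
  }
  prefixArmedAt = null; // any other key (or a timed-out prefix) resets state
  return null;
}
```

In a browser this would be driven from a `keydown` listener using `event.key` and `performance.now()`; passing the timestamp in explicitly just keeps the sketch testable.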