LLM Cloud Hub
Vendor comparison

Perplexity vs Z.ai

Every Perplexity and Z.ai LLM model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

Perplexity

5 models
Model                 Capabilities   Context   In $/1M   Out $/1M
Sonar                 vision         127k      1.00      1.00
Sonar Deep Research   -              128k      2.00      8.00
Sonar Pro             vision         200k      3.00      15.00
Sonar Pro Search      vision         200k      3.00      15.00
Sonar Reasoning Pro   vision         128k      2.00      8.00

Z.ai

13 models
Model                Capabilities               Context   In $/1M   Out $/1M
GLM 4 32B            tools                      128k      0.10      0.10
GLM 4.5              tools, json_mode           131k      0.60      2.20
GLM 4.5 Air          tools                      131k      0.13      0.85
GLM 4.5 Air (free)   tools                      131k      0.00      0.00
GLM 4.5V             vision, tools              66k       0.60      1.80
GLM 4.6              tools, json_mode           203k      0.43      1.74
GLM 4.6V             vision, tools              131k      0.30      0.90
GLM 4.7              tools, json_mode           203k      0.40      1.75
GLM 4.7 Flash        tools, json_mode           203k      0.06      0.40
GLM 5                tools, json_mode           203k      0.60      1.92
GLM 5 Turbo          tools, json_mode           203k      1.20      4.00
GLM 5.1              tools, json_mode           203k      0.98      3.08
GLM 5V Turbo         vision, tools, json_mode   203k      1.20      4.00
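Per-request cost follows directly from the per-million-token rates in the tables above. A minimal sketch (the function and rate names here are illustrative, not part of any vendor API):

```typescript
// Per-1M-token rates, as listed in the tables above.
interface Rates {
  inPerM: number;  // dollars per 1M input tokens
  outPerM: number; // dollars per 1M output tokens
}

// Dollar cost of one request, given token counts and a model's rates.
function requestCost(inputTokens: number, outputTokens: number, r: Rates): number {
  return (inputTokens / 1_000_000) * r.inPerM + (outputTokens / 1_000_000) * r.outPerM;
}

// Example: GLM 4.6 ($0.43 in, $1.74 out) on a 12k-in / 1.5k-out request.
const glm46: Rates = { inPerM: 0.43, outPerM: 1.74 };
console.log(requestCost(12_000, 1_500, glm46).toFixed(4)); // ≈ 0.0078
```

The same arithmetic explains why output pricing usually dominates for long generations: GLM 4.6's output rate is roughly 4x its input rate, and Sonar Pro's is 5x.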

Keyboard shortcuts

?      Show this overlay
/      Focus the first form field
g h    Go to / (home)
g b    Go to /best-llm-for
g c    Go to /cost
g s    Go to /self-hosted
g x    Go to /compliance
Esc    Close any overlay

The shortcuts follow Linear and GitHub conventions: a two-key sequence (press g, then h) completes only when the second key arrives within about one second.
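The two-key timing rule above can be sketched as a small state machine. This is an assumed implementation, not the site's actual code; the names (SequenceMatcher, TIMEOUT_MS) are hypothetical, but the routes match the shortcut list:

```typescript
// Sketch of "g then <key>" navigation with a ~1 second window.
// SequenceMatcher and TIMEOUT_MS are illustrative names, not the site's code.
const TIMEOUT_MS = 1000;

class SequenceMatcher {
  private pending: string | null = null; // "g" if a sequence is in progress
  private pendingAt = 0;                 // timestamp of the pending "g"

  // Feed each keypress with its timestamp (ms); returns a route such as
  // "/cost" when a two-key sequence completes, otherwise null.
  press(key: string, now: number): string | null {
    const routes: Record<string, string> = {
      h: "/", b: "/best-llm-for", c: "/cost", s: "/self-hosted", x: "/compliance",
    };
    if (this.pending === "g" && now - this.pendingAt <= TIMEOUT_MS && routes[key]) {
      this.pending = null;
      return routes[key];
    }
    // Start a new sequence on "g"; any other key resets the state.
    this.pending = key === "g" ? "g" : null;
    this.pendingAt = now;
    return null;
  }
}
```

For example, pressing g then c 400 ms later yields "/cost", while a 2-second gap between the keys yields nothing and the sequence must be restarted.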