LLM Cloud Hub
Vendor comparison

EssentialAI vs Perplexity

Every EssentialAI and Perplexity LLM model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

EssentialAI

1 model
Model            Capabilities       Context   In $/1M   Out $/1M
Rnj 1 Instruct   tools, json_mode   33k       0.1500    0.1500

Perplexity

5 models
Model                 Capabilities   Context   In $/1M   Out $/1M
Sonar                 vision         127k      1.0000    1.0000
Sonar Deep Research   -              128k      2.0000    8.0000
Sonar Pro             vision         200k      3.0000    15.0000
Sonar Pro Search      vision         200k      3.0000    15.0000
Sonar Reasoning Pro   vision         128k      2.0000    8.0000
