LLM Cloud Hub
Vendor comparison

Arcee AI vs Perplexity

Every Arcee AI and Perplexity LLM model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

Arcee AI

8 models
| Model | Capabilities | Context | In $/1M | Out $/1M |
|---|---|---|---|---|
| Coder Large | — | 33k | 0.5000 | 0.8000 |
| Maestro Reasoning | — | 131k | 0.9000 | 3.3000 |
| Spotlight | vision | 131k | 0.1800 | 0.1800 |
| Trinity Large Preview | tools, json_mode | 131k | 0.1500 | 0.4500 |
| Trinity Large Thinking | tools, json_mode | 262k | 0.2200 | 0.8500 |
| Trinity Large Thinking (free) | tools | 262k | 0.0000 | 0.0000 |
| Trinity Mini | tools, json_mode | 131k | 0.0450 | 0.1500 |
| Virtuoso Large | tools | 131k | 0.7500 | 1.2000 |

Perplexity

5 models
| Model | Capabilities | Context | In $/1M | Out $/1M |
|---|---|---|---|---|
| Sonar | vision | 127k | 1.0000 | 1.0000 |
| Sonar Deep Research | — | 128k | 2.0000 | 8.0000 |
| Sonar Pro | vision | 200k | 3.0000 | 15.0000 |
| Sonar Pro Search | vision | 200k | 3.0000 | 15.0000 |
| Sonar Reasoning Pro | vision | 128k | 2.0000 | 8.0000 |
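Per-million-token rates translate to per-request cost by scaling input and output token counts separately. A minimal sketch of that arithmetic, using two rates from the tables above (the model keys, `RATES` dict, and `estimate_cost` function are illustrative, not a vendor API):

```python
# USD per 1M tokens, copied from the comparison tables above.
RATES = {
    "sonar-pro": {"in": 3.00, "out": 15.00},     # Perplexity Sonar Pro
    "trinity-mini": {"in": 0.045, "out": 0.15},  # Arcee AI Trinity Mini
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request: tokens scaled by per-1M rates."""
    r = RATES[model]
    return (input_tokens * r["in"] + output_tokens * r["out"]) / 1_000_000

# Example: a 20k-token prompt with a 1k-token completion on Sonar Pro.
cost = estimate_cost("sonar-pro", 20_000, 1_000)  # → 0.075 (7.5 cents)
```

Input and output rates often differ by 5x or more (Sonar Pro charges $3 in vs. $15 out), so long completions dominate cost even when prompts are much larger.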
