LLM Cloud Hub
Vendor comparison

Arcee AI vs xAI

Every Arcee AI and xAI model side by side: price per million tokens, context window, and capabilities. Refreshed nightly from upstream.
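Rates in the tables below are quoted per million tokens, so the cost of one request is tokens divided by 1,000,000 times the listed rate, summed over input and output. A minimal sketch (the helper name is ours; the example rates are Trinity Mini's from the Arcee AI table):

```python
def request_cost(tokens_in: int, tokens_out: int,
                 in_per_million: float, out_per_million: float) -> float:
    """Cost in USD for one request, given per-million-token rates."""
    return (tokens_in / 1e6 * in_per_million
            + tokens_out / 1e6 * out_per_million)

# Trinity Mini: $0.045 in / $0.15 out per 1M tokens.
cost = request_cost(tokens_in=20_000, tokens_out=2_000,
                    in_per_million=0.045, out_per_million=0.15)
print(f"${cost:.4f}")  # → $0.0012
```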

Arcee AI

8 models
Model                          Capabilities              Context   In $/1M   Out $/1M
Coder Large                    -                             33k    0.5000     0.8000
Maestro Reasoning              -                            131k    0.9000     3.3000
Spotlight                      vision                       131k    0.1800     0.1800
Trinity Large Preview          tools, json_mode             131k    0.1500     0.4500
Trinity Large Thinking         tools, json_mode             262k    0.2200     0.8500
Trinity Large Thinking (free)  tools                        262k    0.0000     0.0000
Trinity Mini                   tools, json_mode             131k    0.0450     0.1500
Virtuoso Large                 tools                        131k    0.7500     1.2000

xAI

11 models
Model                          Capabilities              Context   In $/1M   Out $/1M
Grok 3                         tools, json_mode             131k    3.0000    15.0000
Grok 3 Beta                    tools, json_mode             131k    3.0000    15.0000
Grok 3 Mini                    tools, json_mode             131k    0.3000     0.5000
Grok 3 Mini Beta               tools, json_mode             131k    0.3000     0.5000
Grok 4                         vision, tools, json_mode     256k    3.0000    15.0000
Grok 4 Fast                    vision, tools, json_mode    2000k    0.2000     0.5000
Grok 4.1 Fast                  vision, tools, json_mode    2000k    0.2000     0.5000
Grok 4.20                      vision, tools, json_mode    2000k    1.2500     2.5000
Grok 4.20 Multi-Agent          vision, json_mode           2000k    2.0000     6.0000
Grok 4.3                       vision, tools, json_mode    1000k    1.2500     2.5000
Grok Code Fast 1               tools, json_mode             256k    0.2000     1.5000
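Because output rates often diverge sharply from input rates (Grok Code Fast 1 charges 7.5x more per output token than per input token), ranking models for a given workload depends on its input/output mix. A sketch under stated assumptions: the rates are copied from the tables above for a small subset of models, and `monthly_cost` is an illustrative helper, not part of any vendor API.

```python
# Per-million-token (input, output) rates copied from the tables above.
PRICES: dict[str, tuple[float, float]] = {
    "Trinity Mini":     (0.045, 0.15),
    "Virtuoso Large":   (0.75, 1.20),
    "Grok 3 Mini":      (0.30, 0.50),
    "Grok 4 Fast":      (0.20, 0.50),
    "Grok Code Fast 1": (0.20, 1.50),
}

def monthly_cost(tokens_in: float, tokens_out: float) -> list[tuple[str, float]]:
    """Rank models by USD cost for a monthly token budget, cheapest first."""
    return sorted(
        ((name, tokens_in / 1e6 * i + tokens_out / 1e6 * o)
         for name, (i, o) in PRICES.items()),
        key=lambda pair: pair[1],
    )

# Example workload: 50M input / 5M output tokens per month.
for name, usd in monthly_cost(50e6, 5e6):
    print(f"{name:18s} ${usd:8.2f}")
```

For this mix, Trinity Mini comes out cheapest ($3.00/month); note that Grok 3 Mini and Grok Code Fast 1 tie despite very different output rates, because the workload is input-heavy.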
