LLM Cloud Hub
Vendor comparison

Meta vs xAI

Every Meta and xAI model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

Meta

14 models
Model                          Capabilities              Context   In $/1M   Out $/1M
Llama 3 70B Instruct           -                         8k        0.5100    0.7400
Llama 3 8B Instruct            -                         8k        0.0400    0.0400
Llama 3.1 70B Instruct         tools, json_mode          131k      0.4000    0.4000
Llama 3.1 8B Instruct          tools, json_mode          16k       0.0200    0.0500
Llama 3.2 11B Vision Instruct  vision, json_mode         131k      0.2450    0.2450
Llama 3.2 1B Instruct          -                         60k       0.0270    0.2000
Llama 3.2 3B Instruct          -                         80k       0.0510    0.3400
Llama 3.2 3B Instruct (free)   -                         131k      0.0000    0.0000
Llama 3.3 70B Instruct         tools, json_mode          131k      0.1000    0.3200
Llama 3.3 70B Instruct (free)  tools                     66k       0.0000    0.0000
Llama 4 Maverick               vision, json_mode         1049k     0.1500    0.6000
Llama 4 Scout                  vision, tools, json_mode  328k      0.0800    0.3000
Llama Guard 3 8B               -                         131k      0.4800    0.0300
Llama Guard 4 12B              vision, json_mode         164k      0.1800    0.1800

xAI

11 models
Model                  Capabilities              Context   In $/1M   Out $/1M
Grok 3                 tools, json_mode          131k      3.0000    15.0000
Grok 3 Beta            tools, json_mode          131k      3.0000    15.0000
Grok 3 Mini            tools, json_mode          131k      0.3000    0.5000
Grok 3 Mini Beta       tools, json_mode          131k      0.3000    0.5000
Grok 4                 vision, tools, json_mode  256k      3.0000    15.0000
Grok 4 Fast            vision, tools, json_mode  2000k     0.2000    0.5000
Grok 4.1 Fast          vision, tools, json_mode  2000k     0.2000    0.5000
Grok 4.20              vision, tools, json_mode  2000k     1.2500    2.5000
Grok 4.20 Multi-Agent  vision, json_mode         2000k     2.0000    6.0000
Grok 4.3               vision, tools, json_mode  1000k     1.2500    2.5000
Grok Code Fast 1       tools, json_mode          256k      0.2000    1.5000
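
The rates above are quoted per million tokens, so the cost of a request is (input tokens / 1,000,000) x the "In" rate plus (output tokens / 1,000,000) x the "Out" rate. A minimal sketch of that arithmetic (the `estimateCost` helper is hypothetical, not part of this site; the rates in the example are taken from the Llama 3.3 70B Instruct row):

```javascript
// Estimate request cost in dollars from per-million-token rates.
// inPerM / outPerM are the "In $/1M" and "Out $/1M" columns above.
function estimateCost(inTokens, outTokens, inPerM, outPerM) {
  return (inTokens / 1e6) * inPerM + (outTokens / 1e6) * outPerM;
}

// A 4,000-token prompt with a 1,000-token completion on
// Llama 3.3 70B Instruct ($0.10 in, $0.32 out):
const cost = estimateCost(4000, 1000, 0.10, 0.32);
// (4000/1e6) * 0.10 + (1000/1e6) * 0.32 = 0.00072 dollars
```

The same helper makes the vendor gap concrete: the identical request on Grok 4 ($3.00 in, $15.00 out) would cost roughly 0.027 dollars, about 37x more.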

Keyboard shortcuts

?      Show this overlay
/      Focus the first form field
g h    Go to / (home)
g b    Go to /best-llm-for
g c    Go to /cost
g s    Go to /self-hosted
g x    Go to /compliance
Esc    Close any overlay

Inspired by Linear and GitHub conventions. Two-key sequences (g, then h) must be completed within about one second.
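
A minimal sketch of how such a two-key sequence handler can work (this is an assumed implementation, not the site's actual code; `createSequenceHandler`, the `routes` map, and the injectable clock are illustrative): pressing the leader key "g" arms a pending state, and a follow-up key within the timeout triggers navigation.

```javascript
// Linear/GitHub-style "g then <key>" navigation sketch.
const SEQUENCE_TIMEOUT_MS = 1000; // the ~1 second window noted above

// Follow-up key -> destination path (mirrors the shortcut list).
const routes = {
  h: "/",
  b: "/best-llm-for",
  c: "/cost",
  s: "/self-hosted",
  x: "/compliance",
};

// navigate: callback invoked with the target path.
// now: injectable clock (defaults to Date.now) so the timeout is testable.
function createSequenceHandler(navigate, now = Date.now) {
  let armedAt = null; // timestamp when "g" was pressed, or null
  return function onKey(key) {
    const t = now();
    // If "g" was pressed recently and this key completes a known route, go.
    if (armedAt !== null && t - armedAt <= SEQUENCE_TIMEOUT_MS && routes[key]) {
      armedAt = null;
      navigate(routes[key]);
      return true;
    }
    // Otherwise (re)arm on "g", or reset on any other key.
    armedAt = key === "g" ? t : null;
    return false;
  };
}
```

In a browser this would typically be wired to `document.addEventListener("keydown", e => onKey(e.key))`, skipping events whose target is a form field so typing "gh" in a search box doesn't navigate.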