LLM Cloud Hub

DeepSeek vs Meta

Every DeepSeek and Meta LLM side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.

DeepSeek

14 models

| Model | Capabilities | Context | In $/1M | Out $/1M |
|---|---|---|---|---|
| DeepSeek V3 | tools, json_mode | 164k | 0.3200 | 0.8900 |
| DeepSeek V3 0324 | tools, json_mode | 164k | 0.2000 | 0.7700 |
| DeepSeek V3.1 | tools, json_mode | 164k | 0.2100 | 0.7900 |
| DeepSeek V3.1 Terminus | tools, json_mode | 164k | 0.2700 | 0.9500 |
| DeepSeek V3.2 | tools, json_mode | 131k | 0.2520 | 0.3780 |
| DeepSeek V3.2 Exp | tools, json_mode | 164k | 0.2700 | 0.4100 |
| DeepSeek V3.2 Speciale | json_mode | 164k | 0.2870 | 0.4310 |
| DeepSeek V4 Flash | tools, json_mode | 1049k | 0.1260 | 0.2520 |
| DeepSeek V4 Flash (free) | tools | 1049k | 0.0000 | 0.0000 |
| DeepSeek V4 Pro | tools, json_mode | 1049k | 0.4350 | 0.8700 |
| R1 | tools | 64k | 0.7000 | 2.5000 |
| R1 0528 | tools, json_mode | 164k | 0.5000 | 2.1500 |
| R1 Distill Llama 70B | json_mode | 131k | 0.7000 | 0.8000 |
| R1 Distill Qwen 32B | json_mode | 33k | 0.2900 | 0.2900 |

Meta

14 models

| Model | Capabilities | Context | In $/1M | Out $/1M |
|---|---|---|---|---|
| Llama 3 70B Instruct | - | 8k | 0.5100 | 0.7400 |
| Llama 3 8B Instruct | - | 8k | 0.0400 | 0.0400 |
| Llama 3.1 70B Instruct | tools, json_mode | 131k | 0.4000 | 0.4000 |
| Llama 3.1 8B Instruct | tools, json_mode | 16k | 0.0200 | 0.0500 |
| Llama 3.2 11B Vision Instruct | vision, json_mode | 131k | 0.2450 | 0.2450 |
| Llama 3.2 1B Instruct | - | 60k | 0.0270 | 0.2000 |
| Llama 3.2 3B Instruct | - | 80k | 0.0510 | 0.3400 |
| Llama 3.2 3B Instruct (free) | - | 131k | 0.0000 | 0.0000 |
| Llama 3.3 70B Instruct | tools, json_mode | 131k | 0.1000 | 0.3200 |
| Llama 3.3 70B Instruct (free) | tools | 66k | 0.0000 | 0.0000 |
| Llama 4 Maverick | vision, json_mode | 1049k | 0.1500 | 0.6000 |
| Llama 4 Scout | vision, tools, json_mode | 328k | 0.0800 | 0.3000 |
| Llama Guard 3 8B | - | 131k | 0.4800 | 0.0300 |
| Llama Guard 4 12B | vision, json_mode | 164k | 0.1800 | 0.1800 |
