LLM Cloud Hub
Vendor comparison

AllenAI vs MiniMax

Every AllenAI and MiniMax model side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream provider data.

AllenAI

1 model
Model             Capabilities  Context  In $/1M  Out $/1M
Olmo 3 32B Think  json_mode     66k      0.1500   0.5000

MiniMax

8 models
Model                Capabilities      Context  In $/1M  Out $/1M
MiniMax M1           tools             1000k    0.4000   2.2000
MiniMax M2           tools, json_mode  197k     0.2550   1.0000
MiniMax M2-her       -                 66k      0.3000   1.2000
MiniMax M2.1         tools, json_mode  197k     0.2900   0.9500
MiniMax M2.5         tools, json_mode  197k     0.1500   1.1500
MiniMax M2.5 (free)  tools, json_mode  197k     0.0000   0.0000
MiniMax M2.7         tools, json_mode  197k     0.2600   1.2000
MiniMax-01           vision            1000k    0.2000   1.1000
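To turn the $/1M-token rates above into a per-request dollar figure, divide each token count by one million and multiply by the matching rate. A minimal sketch (the helper name and the 10k-in / 2k-out request size are illustrative assumptions; the rates are taken from the MiniMax M2.5 row):

```python
def request_cost(in_tokens: int, out_tokens: int,
                 in_per_m: float, out_per_m: float) -> float:
    """Dollar cost of one request at the given $/1M-token rates."""
    return (in_tokens / 1_000_000) * in_per_m + (out_tokens / 1_000_000) * out_per_m

# Example: 10k input + 2k output tokens on MiniMax M2.5 ($0.15 in, $1.15 out)
cost = request_cost(10_000, 2_000, 0.15, 1.15)
print(f"${cost:.4f}")  # $0.0038
```

The same helper works for any row in either table; for the free tier both rates are 0.0000, so the result is always $0.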
