LiquidAI vs Nous
Every LiquidAI and Nous LLM side by side: pricing per million tokens, context windows, and capabilities. Refreshed nightly from upstream.
LiquidAI
3 models

| Model | Context | In $/1M | Out $/1M |
|---|---|---|---|
| LFM2-24B-A2B | 33k | 0.0300 | 0.1200 |
| LFM2.5-1.2B-Instruct (free) | 33k | 0.0000 | 0.0000 |
| LFM2.5-1.2B-Thinking (free) | 33k | 0.0000 | 0.0000 |
Nous
6 models

| Model | Context | In $/1M | Out $/1M | Features |
|---|---|---|---|---|
| Hermes 2 Pro - Llama-3 8B | 8k | 0.1400 | 0.1400 | json_mode |
| Hermes 3 405B Instruct | 131k | 1.0000 | 1.0000 | json_mode |
| Hermes 3 405B Instruct (free) | 131k | 0.0000 | 0.0000 | |
| Hermes 3 70B Instruct | 131k | 0.3000 | 0.3000 | json_mode |
| Hermes 4 405B | 131k | 1.0000 | 3.0000 | json_mode |
| Hermes 4 70B | 131k | 0.1300 | 0.4000 | json_mode |
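Since all prices above are quoted per million tokens, the cost of a single request is just a weighted sum of input and output token counts. A minimal sketch (the `request_cost` helper is illustrative, not part of any API):

```python
def request_cost(in_tokens: int, out_tokens: int,
                 in_price: float, out_price: float) -> float:
    """USD cost of one request, given token counts and $/1M-token rates."""
    return (in_tokens * in_price + out_tokens * out_price) / 1_000_000

# Example using Hermes 4 405B rates from the table: $1.00 in, $3.00 out.
cost = request_cost(2_000, 500, 1.00, 3.00)
print(f"${cost:.4f}")  # 2,000 input + 500 output tokens -> $0.0035
```

The same arithmetic applies to any row, including the free tiers, where both rates are 0.0000 and the cost is always zero.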