LLM Cloud Hub
Glossary

Input vs. output tokens

Input tokens are what you send to the model (the prompt plus any context, such as system instructions or retrieved documents). Output tokens are what the model generates in response. Output tokens are usually 3–5× more expensive than input tokens, because each output token requires its own forward pass through the model during autoregressive generation, whereas input tokens are processed together in a single pass.
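A minimal sketch of how this pricing asymmetry shows up in a per-request cost estimate. The prices below are illustrative placeholders (output at 5× input), not any provider's actual rates:

```python
# Illustrative per-million-token prices (assumed, not a real price sheet).
PRICE_IN = 3.00    # $ per 1M input tokens
PRICE_OUT = 15.00  # $ per 1M output tokens (5x input)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the dollar cost of one request from its token counts."""
    return (input_tokens * PRICE_IN + output_tokens * PRICE_OUT) / 1_000_000

# Even a short answer to a long prompt can be output-dominated:
# 2,000 input tokens cost $0.006, while 800 output tokens cost $0.012.
cost = request_cost(input_tokens=2_000, output_tokens=800)
```

Because of this ratio, capping `max_tokens` on the response often reduces spend more than trimming the prompt.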

