Tokens are the basic units that AI models use to process and generate text. A token roughly equals 4 characters or about 0.75 words in English. Understanding tokens helps you estimate costs, manage context windows, and craft efficient prompts for AI-assisted development.
Tokens are the currency of AI interactions. Everything you send to AI and everything it generates is measured in tokens, which affects both cost and capability.
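To make the conversions above concrete, here is a minimal sketch of a back-of-the-envelope token estimator. It uses the ~4 characters and ~0.75 words per token rules of thumb from this section, so it gives ballpark figures only; real counts vary by model and tokenizer.

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the rules of thumb above, not a real tokenizer."""
    by_chars = len(text) / 4              # ~4 characters per token
    by_words = len(text.split()) / 0.75   # ~0.75 words per token
    # Average the two heuristics; an actual tokenizer gives exact counts.
    return round((by_chars + by_words) / 2)

prompt = "Refactor this function to use async/await and add error handling."
print(estimate_tokens(prompt))  # a ballpark figure, not an exact count
```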
Cost: Most AI APIs charge per token, typically with separate rates for input (prompt) tokens and output (completion) tokens; output tokens usually cost more.
Context limits: Every model has a maximum context window, the total number of tokens it can handle in one exchange, counting both your prompt and the model's response.
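A sketch of per-request cost estimation under those assumptions. The $3 and $15 per-million-token rates below are placeholder figures for illustration, not any provider's actual pricing.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_rate_per_m: float, output_rate_per_m: float) -> float:
    """Cost in dollars, given per-million-token rates for input and output."""
    return (input_tokens / 1_000_000) * input_rate_per_m \
         + (output_tokens / 1_000_000) * output_rate_per_m

# Hypothetical rates: $3 per 1M input tokens, $15 per 1M output tokens.
print(f"${estimate_cost(50_000, 2_000, 3.0, 15.0):.3f}")  # $0.180
```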
Quick mental math (see the sketch after this list):

- For cost: divide the character count by 4 to approximate tokens, then multiply by the provider's per-token rate (usually quoted per million tokens).
- For context: divide the word count by 0.75 (or multiply by about 1.3) to approximate tokens, then compare the result against the model's context window.
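The "for context" arithmetic as a small sketch: convert a word count into an approximate token count and check it against a window, leaving headroom for the model's reply. The 0.75 words-per-token ratio and the 4K reserve are assumptions, so treat the result as an estimate, not a guarantee.

```python
def fits_in_context(word_count: int, context_window: int,
                    reserved_for_output: int = 4_000) -> bool:
    """Check whether `word_count` words likely fit, leaving room for the reply."""
    approx_tokens = round(word_count / 0.75)  # ~0.75 words per token
    return approx_tokens + reserved_for_output <= context_window

# A ~60,000-word document against a 128K-token window:
print(fits_in_context(60_000, 128_000))  # True: ~80K tokens + 4K reply < 128K
```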
| Model | Context Window |
|---|---|
| GPT-4o | 128K tokens |
| Claude 3.5 | 200K tokens |
| Gemini 1.5 | 1M+ tokens |
These limits continue expanding, making larger codebases accessible to AI.
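To make the table concrete, a sketch that estimates a local codebase's token count from its character count and reports which of the windows above it would fit in. The `./src` path, the file extensions, and the 4-characters-per-token ratio are assumptions for illustration.

```python
from pathlib import Path

WINDOWS = {"GPT-4o": 128_000, "Claude 3.5": 200_000, "Gemini 1.5": 1_000_000}

def codebase_tokens(root: str, exts=(".py", ".ts", ".md")) -> int:
    """Estimate total tokens across source files using ~4 characters per token."""
    chars = sum(len(p.read_text(errors="ignore"))
                for p in Path(root).rglob("*")
                if p.is_file() and p.suffix in exts)
    return chars // 4

total = codebase_tokens("./src")  # hypothetical project directory
for model, window in WINDOWS.items():
    verdict = "fits within" if total <= window else "exceeds"
    print(f"{model}: ~{total:,} tokens {verdict} the {window:,}-token window")
```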