AI & LLM Glossary

What is Context Window?

Context Window: The maximum amount of text (measured in tokens) that an AI model can process at one time, including both input and output.

Definition & Explanation

The context window is the total amount of information—including the system prompt, conversation history, code files, and outputs—that an LLM can hold in its "memory" during a single session. Larger context windows allow AI coding tools to understand more of your codebase at once. For example, Claude has a 200K token context window, enough to process an entire medium-sized codebase, or substantial portions of a large one, in a single session. Context window size is a key factor when choosing an AI coding tool.
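Because input and output share the same window, a practical concern is budgeting: will your prompt, files, and an expected reply all fit? Below is a minimal sketch of that calculation. The ~4 characters per token ratio is a common rule of thumb for English text and code, not an exact figure—real counts depend on the model's tokenizer, and the function names here are illustrative.

```python
# Rough context-window budgeting sketch.
# Assumption: ~4 characters per token (varies by tokenizer and content).

CHARS_PER_TOKEN = 4  # heuristic, not exact

def estimate_tokens(text: str) -> int:
    """Approximate the token count of a string."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_window(texts: list[str], window: int = 200_000,
                   reserved_for_output: int = 8_000) -> bool:
    """Check whether the combined inputs leave room for the reply,
    since input and output share the same context window."""
    used = sum(estimate_tokens(t) for t in texts)
    return used + reserved_for_output <= window

print(fits_in_window(["x" * 400_000]))    # ~100K tokens of input -> True
print(fits_in_window(["x" * 1_000_000]))  # ~250K tokens of input -> False
```

Reserving tokens for the output matters: a prompt that uses the entire window leaves the model no room to respond.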


Frequently Asked Questions

What is a context window in AI?

A context window is the maximum amount of text an AI model can process at once, including your prompt, the model's response, and any prior conversation history.

How big are context windows for AI coding tools?

Context windows vary significantly: GPT-4 Turbo supports 128K tokens, Claude supports up to 200K tokens, and some specialized models offer even larger windows. A larger context window lets the AI take in more of your codebase at once.

Why does context window size matter for coding?

A larger context window means the AI can read more of your code at once—more files, more functions, more context. This leads to more accurate suggestions and better understanding of complex codebases.
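When a codebase is larger than the window, an AI coding tool has to choose which files to send. One simple strategy is to rank files by relevance and include them greedily until the token budget is spent—a minimal sketch below, where the file names, token counts, and the greedy approach itself are all illustrative assumptions, not a specific tool's implementation.

```python
# Sketch: greedy file selection under a token budget.
# Assumes files are already ordered by relevance; token counts are estimates.

def select_files(files: dict[str, int], budget: int) -> list[str]:
    """files maps filename -> estimated token count; include files in
    order until the remaining budget can't accommodate the next one."""
    chosen, used = [], 0
    for name, tokens in files.items():
        if used + tokens <= budget:
            chosen.append(name)
            used += tokens
    return chosen

# Hypothetical repo: one generated file is far too big for the budget.
repo = {"main.py": 3_000, "utils.py": 1_500,
        "huge_generated.py": 150_000, "tests.py": 2_000}
print(select_files(repo, budget=10_000))
# -> ['main.py', 'utils.py', 'tests.py']
```

Real tools use more sophisticated retrieval (embeddings, symbol graphs), but the constraint is the same: everything sent must fit inside the context window.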

