Technique Guide
Context Engineering
Prompt engineering is about what you say. Context engineering is about what information you put in front of the model at all. It's the discipline behind RAG, agent memory management, and high-quality AI systems at scale.
What is Context Engineering?
Every AI model has a context window — the total amount of text it can process at once. Context engineering is the practice of deliberately managing what goes into that window: what documents to retrieve, what conversation history to include, what system instructions to prioritize, and what to leave out.
The insight: model quality is often more sensitive to what information is available than to the exact phrasing of the prompt. Giving a model the right context transforms a weak response into a strong one; giving it the wrong context makes even a perfectly phrased prompt fail.
Context typically comes from four sources:

Write context: instructions, persona, and background knowledge.
Retrieved context: documents, database results, and search results.
Conversation context: relevant prior messages and decisions.
Tool context: results from previous tool calls and actions.
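The four sources above can be sketched as one assembly step. This is a minimal, illustrative sketch: the function and section labels are invented for this example and are not part of any model's API.

```python
def build_context(instructions, retrieved_docs, history, tool_results, task):
    """Assemble the four context sources into one prompt string.

    Arguments are plain strings or lists of strings; the section
    labels are illustrative, not from any particular framework.
    """
    sections = [
        ("Instructions", instructions),                        # write context
        ("Reference documents", "\n\n".join(retrieved_docs)),  # retrieved context
        ("Conversation so far", "\n".join(history)),           # conversation context
        ("Tool results", "\n".join(tool_results)),             # tool context
        ("Task", task),
    ]
    # Skip empty sections so window space isn't wasted on bare headers.
    return "\n\n".join(f"## {name}\n{body}" for name, body in sections if body)

prompt = build_context(
    instructions="You are a support assistant.",
    retrieved_docs=["Refund policy: refunds within 30 days."],
    history=["User: I bought a widget last week."],
    tool_results=[],
    task="Answer: can the user get a refund?",
)
```

Note that the empty tool-results section is dropped entirely; deciding what to leave out is as much a part of context engineering as deciding what to include.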
Core Context Engineering Techniques
Retrieval-Augmented Generation (RAG)
Instead of asking the model to recall facts, retrieve relevant documents and inject them as context. The most widely used technique for grounding AI responses in real data.
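The retrieve-then-inject pattern can be sketched in a few lines. This toy version scores documents by naive word overlap as a stand-in for real embedding similarity; the function names are illustrative.

```python
def retrieve(query, documents, k=2):
    """Rank documents by shared-word count with the query (a crude
    stand-in for embedding similarity) and return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_rag_prompt(query, documents):
    # Inject retrieved text as context instead of relying on model recall.
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "The warranty covers hardware defects for two years.",
    "Our office is closed on public holidays.",
    "Shipping takes three to five business days.",
]
prompt = build_rag_prompt("How long does the warranty cover defects?", docs)
```

A production system would swap the overlap scorer for a vector index, but the shape is the same: retrieve first, then ground the question in what was retrieved.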
Context compression
Summarize or compress earlier context to free up space for new content. For long conversations or document pipelines, raw context grows too large — compression keeps the essential information without hitting limits.
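A minimal sketch of this idea, where a crude first-sentence extractor stands in for a model-generated summary (all names here are hypothetical):

```python
def first_sentence(text):
    """Stand-in for a model-generated summary: keep the first sentence."""
    return text.split(". ")[0].rstrip(".") + "."

def compress_history(messages, keep_last=2):
    """Collapse all but the last `keep_last` messages into one summary
    line, freeing window space while preserving the essentials."""
    old, recent = messages[:-keep_last], messages[-keep_last:]
    if not old:
        return messages
    summary = "Summary of earlier conversation: " + " ".join(
        first_sentence(m) for m in old
    )
    return [summary] + recent

messages = [
    "User asked about pricing. We listed three tiers.",
    "User chose the middle tier. We confirmed the choice.",
    "User: does the middle tier include support?",
    "Assistant: yes, email support is included.",
]
compressed = compress_history(messages, keep_last=2)
```

In practice the summarizer is itself a model call, and compression can run recursively as the conversation grows.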
Selective history inclusion
Don't include all conversation history — include only the relevant parts. For agents, this means summarizing earlier steps and keeping only the most recent N exchanges in full.
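One way to sketch selective inclusion: keep the last N messages in full, and pull in only those earlier messages that relate to the current query. The word-overlap relevance check below is a placeholder for a real retriever, and the names are illustrative.

```python
def select_history(messages, query, recent=2, max_extra=2):
    """Keep the last `recent` messages verbatim, plus up to `max_extra`
    earlier messages that share words with the current query."""
    q_words = set(query.lower().split())
    head, tail = messages[:-recent], messages[-recent:]
    # Crude relevance filter: any shared word with the query.
    relevant = [m for m in head if q_words & set(m.lower().split())]
    return relevant[-max_extra:] + tail

messages = [
    "User: my order number is 4417.",
    "Assistant: thanks, noted.",
    "User: also, what are your opening hours?",
    "Assistant: we are open 9 to 5 on weekdays.",
    "User: back to my order, please.",
]
selected = select_history(messages, query="order status for 4417")
```

The off-topic detour about opening hours is dropped, while the order number from the very start of the conversation survives because it matters to the current query.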
Context window positioning
Where information appears in the context matters. Most models attend better to content at the beginning and end than the middle. Place critical instructions at the start; place key retrieved content near the task.
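That placement rule can be expressed as an ordering step when assembling the window. A minimal sketch, with invented names; the ordering, not the API, is the point:

```python
def position_context(instructions, retrieved, filler, task):
    """Order the window to match typical attention patterns:
    critical instructions first, bulk content in the least-attended
    middle, key retrieved content adjacent to the task at the end."""
    parts = (
        [instructions]          # start: strongly attended
        + filler                # middle: least-attended region
        + retrieved             # end, right before the task
        + [f"Task: {task}"]
    )
    return "\n\n".join(parts)

prompt = position_context(
    instructions="Follow the style guide strictly.",
    retrieved=["Key fact: the API limit is 100 requests/min."],
    filler=["(older, less relevant notes)"],
    task="Summarize the rate limits.",
)
```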
Articles
Core Concept
What is Context Engineering?
The emerging discipline that goes beyond prompt engineering — deciding what information the model should have access to at each step.
Technique
How RAG Works
Retrieval-augmented generation is the most common context engineering technique — inject relevant knowledge into the context at query time.
Foundations
What is Prompt Engineering?
The foundation: understanding how the text you provide shapes model behavior.