Lance Martin

TL;DR

Agents need context to perform tasks. Context engineering is the art and science of filling the context window with just the right information at each step of an agent's trajectory. In this post, I group context engineering into a few common strategies seen across many popular agents today.

Context Engineering

As Andrej Karpathy puts it, LLMs are like a new kind of operating system. The LLM is like the CPU and its context window is like the RAM, serving as the model's working memory. Just like RAM, the LLM context window has limited capacity to handle various sources of context. And just as an operating system curates what fits into a CPU's RAM, "context engineering" plays a similar role. Karpathy summarizes this well:

[Context engineering is the] "…delicate art and science of filling the context window with just the right information for the next step."

What are the types of context that we need to manage when building LLM applications? Context engineering is an umbrella that applies across a few different context types:

- Instructions – prompts, memories, few-shot examples, tool descriptions, etc.
- Knowledge – facts, memories, etc.
- Tools – feedback from tool calls

Context Engineering for Agents

This year, interest in agents has grown tremendously as LLMs get better at reasoning and tool calling. Agents interleave LLM invocations and tool calls, often for long-running tasks (a minimal sketch of this loop appears at the end of this section). However, long-running tasks and accumulating feedback from tool calls mean that agents often use a large number of tokens. This can cause numerous problems: it can exceed the size of the context window, balloon cost and latency, or degrade agent performance. Drew Breunig nicely outlined a number of specific ways that longer context can cause performance problems, including:

- Context Poisoning: when a hallucination makes it into the context
- Context Distraction: when the context overwhelms the training
- Context Confusion: when superfluous context influences the response
- Context Clash: when parts of the context disagree

With this in mind, Cognition called out the importance of context engineering:

"Context engineering" … is effectively the #1 job of engineers building AI agents.

Anthropic also laid it out clearly: Agents often e...
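To make the token-accumulation point concrete, here is a minimal sketch of an agent loop in Python. The `call_llm` stub, the message format, and the `tools` dictionary are illustrative assumptions, not any particular framework's API; the only point is that every LLM reply and every tool result gets appended to the message list, so the context grows with each step of the trajectory.

```python
from typing import Callable


def call_llm(messages: list[dict]) -> dict:
    """Hypothetical LLM call; a real implementation would hit a chat-model API."""
    return {"role": "assistant", "content": "done", "tool_call": None}


def run_agent(task: str, tools: dict[str, Callable[[str], str]], max_turns: int = 10) -> list[dict]:
    # The message list is what fills the context window; it is resent on every turn.
    messages = [{"role": "user", "content": task}]
    for _ in range(max_turns):
        reply = call_llm(messages)
        messages.append(reply)
        if reply.get("tool_call") is None:
            # The model produced a final answer instead of requesting a tool.
            break
        name, args = reply["tool_call"]
        observation = tools[name](args)
        # Tool feedback is appended to the context, so tokens accumulate
        # with every step of a long-running trajectory.
        messages.append({"role": "tool", "name": name, "content": observation})
    return messages


if __name__ == "__main__":
    history = run_agent("Summarize today's arXiv papers", tools={})
    print(f"{len(history)} messages accumulated in context")
```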