Despite the hype, Model Context Protocol (MCP) isn't magic or revolutionary. But it's simple, well-timed, and well-executed. At Stainless, we're betting it's here to stay.

"MCP helps you build agents and complex workflows on top of LLMs." If you've paid attention, you know we've been here before. There have been numerous past attempts at connecting the world to an LLM in a structured, automatic way:

- Function/tool calling: Write a JSON schema, the model picks a function. But you had to manually wire each function into every request and assume most of the responsibility for implementing retry logic.
- ReAct / LangChain: Let the model emit an "Action:" string, then parse it yourself. Often flaky and hard to debug.
- ChatGPT plugins: Fancy, but gated. You had to host an OpenAPI server and obtain approval.
- Custom GPTs: Lower barrier to entry, but still stuck inside OpenAI's runtime.
- AutoGPT, BabyAGI: Agents with ambition, but a mess of configuration, loops, and error cascades.

Heck, even MCP itself isn't new: Anthropic released the spec in November 2024, but it suddenly blew up in February 2025, three months later.

[Figure: Interest over time for MCP in Google Search Trends]

Why is MCP seemingly ascendant, while previous attempts fell short?

Why is MCP eating the world?

1. The models finally got good enough

Early tool use was messy, at best, due to unreliable models. Even basic functionality required extensive error handling: retries, validation, and detailed error messages were necessary just to get complex workflows running.

Tool use in an agentic setting requires a high standard of robustness. Anyone who used earlier coding agents knows the perils of context poisoning: one nonsensical output from your agent sends the rest of the conversation into an inescapable spiral. These dangers only multiply as you add more tools.

Newer LLMs are good enough that they don't get sucked into pits of despair, and they can usually recover from mistakes.
Of course, models and agents are still far from perfect...
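To make the "manual wiring" complaint concrete, here is a minimal sketch of the pre-MCP function-calling workflow. The shape loosely follows OpenAI-style tool definitions; the names (`get_weather`, `build_request`, `"some-model"`) are illustrative, not from any real SDK:

```python
# Sketch of pre-MCP function calling: you hand-write a JSON schema per tool
# and re-attach the full tool list to every single request yourself.

# Step 1: a hand-written JSON schema describing one tool.
get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

def build_request(messages: list) -> dict:
    """Assemble a chat request. There is no registry or discovery:
    the tool list is repeated, by hand, on every call."""
    return {
        "model": "some-model",  # placeholder model name
        "messages": messages,
        "tools": [get_weather_tool],
    }

req = build_request([{"role": "user", "content": "Weather in Oslo?"}])
```

Retries, schema validation, and parsing the model's chosen function call were all left to the caller, which is exactly the boilerplate MCP's client/server split absorbs.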