I am Philip, an Engineer at Hypr MCP, where we help companies connect their internal applications to LLM-based workflows with the power of MCP servers. Join our waitlist or book a demo to learn more.

Every time we showcase our Hypr MCP platform, this is the most frequently asked question: how did we manage to get the prompt analytics? In this blog post, I want to show you how and why we built prompt analytics into our MCP server Gateway.

# Introduction

The Model Context Protocol (MCP) allows LLM clients and agents to dynamically add context to the prompt and even perform method calls. Typical use cases for dynamically adding context to LLM prompts can be found in the engineering domain. MCP servers like Context7 and GitMCP can provide dynamic documentation based on prompts, while MCP servers from specific software vendors like Stack Auth (https://mcp.stack-auth.com/) can directly add relevant information to the prompts if a tool description matches a prompt's problem. On the other side, MCP servers such as the GitHub or HubSpot MCP server can be used to let LLMs instruct LLM clients to perform actions on third-party systems.

# MCP Server Analytics: MCP Servers Often Run in the Dark

Previously, MCP servers mostly ran on the client side, with stdio as the default transport for exchanging JSON-RPC messages with the client. A benefit of this setup has been simplicity: MCP server developers didn't need to care about runtime and connectivity constraints, since the user was responsible for starting the server program. With the migration to remote MCP servers, enabled by the streamable HTTP transport for JSON-RPC messages, new analytics methods become possible. In the next sections, we will focus exclusively on remote MCP servers.

# Application Layer Analytics for MCP Servers

Application layer analytics means adding a logging or metrics library directly into your MCP server's application code. As remote MCP servers follow the same principles as tr...
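To make the idea concrete, here is a minimal sketch of application layer analytics, assuming the official TypeScript MCP SDK (`@modelcontextprotocol/sdk`). The server name `docs-server`, the `search_docs` tool, and the `logToolCall` helper are illustrative placeholders, not part of the Hypr MCP platform; the helper stands in for whatever logging or metrics library you already use.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical analytics sink -- replace with your own logging/metrics backend.
function logToolCall(event: {
  tool: string;
  args: unknown;
  durationMs: number;
  ok: boolean;
}) {
  console.log(JSON.stringify({ type: "tool_call", ...event }));
}

const server = new McpServer({ name: "docs-server", version: "1.0.0" });

// Register a tool and instrument its handler so every invocation is recorded.
server.tool(
  "search_docs",
  "Search the internal documentation",
  { query: z.string() },
  async ({ query }) => {
    const start = Date.now();
    try {
      const text = `Results for: ${query}`; // placeholder for a real lookup
      logToolCall({ tool: "search_docs", args: { query }, durationMs: Date.now() - start, ok: true });
      return { content: [{ type: "text", text }] };
    } catch (err) {
      logToolCall({ tool: "search_docs", args: { query }, durationMs: Date.now() - start, ok: false });
      throw err;
    }
  }
);

// stdio transport shown for brevity; a remote MCP server would use the
// streamable HTTP transport instead.
await server.connect(new StdioServerTransport());
```

Every tool invocation then emits a structured event that can be shipped to your metrics backend. The obvious drawback is that this instrumentation lives inside each server's code, which is exactly the gap a gateway-level approach is meant to close.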