We can look at early adopters for evidence. The official GitHub MCP server provides a wide array of tools, which results in massive context consumption. Its initial implementation exposed 100+ tools whose definitions consumed 64,000 tokens, before the agent did any work at all. This issue was [improved](https://github.com/github/github-mcp-server/discussions/1182) over time, but even today the server still uses 30,000 tokens on load. For reference, the context window of Claude Opus 4.5, one of the best frontier models, is limited to 200,000 tokens.
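To put those figures in perspective, a quick back-of-the-envelope calculation (using only the token counts quoted above) shows how much of the context window the tool definitions alone eat up:

```python
# Token figures from the text above: GitHub MCP server tool definitions
# loaded before the agent does any work.
initial_tool_tokens = 64_000   # original implementation, 100+ tools
current_tool_tokens = 30_000   # after the linked improvements
context_window = 200_000       # Claude Opus 4.5 context window

# Share of the context window consumed by tool definitions alone.
initial_share = initial_tool_tokens / context_window
current_share = current_tool_tokens / context_window

print(f"initial: {initial_share:.0%} of context")  # 32%
print(f"current: {current_share:.0%} of context")  # 15%
```

Even after the improvements, roughly one token in seven of the model's entire working memory is spent before the first user message arrives.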