For agentic systems to succeed, two levels of interoperability must be supported. Agents will need access to external information and the ability to collaborate with external tools and agents. The grey box at the bottom right of this image is a critical part of agentic architecture. No information or AI platform will stand alone. This article is more technically focused than most, but I’ll make it easy to understand and leave out the code.
It’s critical for nontechnical audiences to get exposure to technical topics. As more of the business’s strategy relies on technology, understanding that technology becomes critical for running the business.
It’s equally critical for technical audiences to see their implementations from the business’s perspective. What problems does this solve, and what’s the value of solving them? That must be explained alongside every technical tutorial and introduction.
The business is a graph, and so are agentic platforms. An AI platform’s most efficient architecture matches the shape of the thing it supports. Businesses don’t understand themselves well enough to define the structure of their graphs rigorously. That’s why most companies are dreadfully inefficient and (shameless plug incoming!!!) why I teach courses on transforming the business from opaque to transparent.
Data and AI are the only technologies that enable the business to understand and define its business and operating model graphs. Data and AI strategy are essential because they define those graphs and align the business, operating, and technology models.
However, the assumption that gets baked into business, operating, and technical architectures is that everything will remain centralized. It’s inefficient to build everything internally and irrational to think a single company will scale to serve an entire marketplace independently. Decentralization is inevitable, and therefore, interoperability is mandatory.
Why Must We Fill In The Grey Box?
Model Context Protocol (MCP) is an open standard introduced by Anthropic to bridge the gap between advanced foundational models and external data sources and tools. It doesn’t support information interoperability; that’s a much larger problem and one I will explain in a later article. In many current AI applications and agentic platforms, models operate in isolation from live or proprietary data, relying on their training data and the prompt alone.
This often leads to outdated or irrelevant responses when the model lacks context. A lack of tooling limits the model’s agency, its ability to take action in digital or physical settings. For Amazon Alexa’s ambitions to be realized, it must work with other AI platforms like Uber’s. Building ecosystem business models around AI platforms has featured heavily in my courses for years. However, strategy can only get you so far. Ecosystems and partnerships require technical implementations and interoperability.
MCP addresses this by providing a standard for AI apps, platforms, and systems to securely connect with content repositories, databases, APIs, and other tools where up-to-date data lives. The goal is to make AI assistants more context-aware and capable of retrieving or manipulating external information in real time, which in turn helps them produce better, more relevant responses.
MCP is built to solve the integration problem that arises when using multiple AI models, apps, and data sources. Before MCP, developers had to create custom integrations for each combination of model and data source or use framework-specific plugins, leading to a combinatorial “M×N” challenge (M models × N tools and data sources). Without a standard, you could write one integration to connect GPT-4o to a database and a completely different integration to connect Claude to the same database.
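The M×N arithmetic above is worth seeing concretely. This is a minimal sketch, not MCP itself; the model and tool names are placeholders I invented for illustration:

```python
# Without a shared protocol, every (model, tool) pair needs its own adapter.
models = ["gpt-4o", "claude", "gemini"]            # M = 3
tools = ["postgres", "github", "slack", "gdrive"]  # N = 4

# Point-to-point: one bespoke integration per combination.
point_to_point = [(m, t) for m in models for t in tools]
print(len(point_to_point))  # M x N = 12 custom integrations

# With a shared standard like MCP, each side implements the protocol once:
# M clients plus N servers.
standardized = len(models) + len(tools)
print(standardized)  # M + N = 7 implementations
```

Adding a fourth model to the point-to-point world means four new integrations; under a shared standard it means exactly one.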
This fragmentation made it hard for AI systems and platforms to access diverse information and maintain context across different tools. MCP is intended to serve as a universal translator between models, tools, and data sources so that any AI agent can connect to any MCP-compatible resource using the same protocol. By standardizing these connections, MCP reduces the need for one-off adapters and allows AI platforms to seamlessly tap into external knowledge or functions on demand.
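The “universal translator” idea can be sketched in a few lines of Python. To be clear, this is a hypothetical toy, not the real MCP protocol (which exchanges structured messages between clients and servers); the `ToolServer` class and its method names are my own invention to show the shape of the idea: one uniform interface, many interchangeable backends.

```python
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical stand-in for an MCP-style server: it exposes named tools
# behind one uniform interface instead of a bespoke, one-off API.
@dataclass
class ToolServer:
    name: str
    tools: dict[str, Callable[..., Any]]

    def list_tools(self) -> list[str]:
        return sorted(self.tools)

    def call_tool(self, tool: str, **kwargs: Any) -> Any:
        return self.tools[tool](**kwargs)

# Any agent that speaks this one interface can use ANY compliant server;
# no per-model, per-tool adapter is required.
def agent_run(server: ToolServer, tool: str, **kwargs: Any) -> Any:
    if tool not in server.list_tools():
        raise ValueError(f"{server.name} does not expose {tool!r}")
    return server.call_tool(tool, **kwargs)

# Two very different backends, one protocol.
db = ToolServer("database", {"query": lambda sql: f"rows for: {sql}"})
calendar = ToolServer("calendar", {"next_event": lambda: "standup at 9am"})

print(agent_run(db, "query", sql="SELECT 1"))  # rows for: SELECT 1
print(agent_run(calendar, "next_event"))       # standup at 9am
```

The point of the sketch is the asymmetry it removes: the agent code never changes when a new tool arrives, and a new tool never needs to know which model will call it.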