Direct Answer
The Model Context Protocol (MCP) is an open protocol published by Anthropic in November 2024 that standardises how AI assistants access external data sources and tools. Before MCP, connecting an AI assistant to an external system, such as a database, document store, or business application, required custom integration code for each AI tool. MCP provides a common standard: any MCP-compatible AI assistant (Claude, Copilot, Gemini, Agentforce, ChatGPT Enterprise) can connect to any MCP-compliant server and query it using a standard protocol. For enterprise architecture teams, MCP is significant because it enables Sparx EA repositories to become first-class AI context sources through a single integration, EA GraphLink Interface B, rather than requiring bespoke integrations per AI tool. The EA repository stops being invisible to AI and starts being the governed intelligence layer that AI assistants draw from.
Key Takeaways
- MCP was published by Anthropic in November 2024 as an open, vendor-neutral protocol.
- MCP standardises how AI assistants discover and query external data sources, replacing custom, per-tool integrations.
- Architecture: MCP Server (exposes resources) + MCP Client (the AI assistant) + MCP Protocol (handles discovery and queries).
- Compatible AI clients: Claude, Microsoft Copilot, Google Gemini, Salesforce Agentforce, ChatGPT Enterprise, Cursor, and any MCP-adopting tool.
- For EA teams: the EA repository becomes AI-queryable via the Sparx EA MCP Server in EA GraphLink Interface B.
- MCP is not the same as function calling or plugins; its dynamic discovery model makes it more capable than either.
- The MDG quality gate in Sparx EA determines the quality of AI answers served from the MCP Server.
Origin and Purpose
Anthropic published the Model Context Protocol specification in November 2024 as an open standard. The motivation was straightforward: large language models (LLMs) are only as useful as the context they have access to. Without access to current, relevant data, an AI assistant answers from its training data, which may be stale, incorrect, or insufficiently specific for enterprise questions.
The problem was that connecting AI assistants to data sources required different custom integrations for every AI tool. An integration built for Claude did not work with Copilot. An integration for Copilot did not work with Gemini. Each AI vendor had its own API format, authentication model, and capability description mechanism. The result was an integration proliferation problem that slowed enterprise AI adoption.
MCP addressed this by providing a vendor-neutral standard. If a data source implements MCP and an AI assistant implements MCP, they can interoperate without custom integration. The protocol handles discovery, capability description, authentication negotiation, and query handling in a standardized way.
The open-source release of the MCP specification and reference implementations was followed by rapid adoption: Microsoft, Google, Salesforce, OpenAI, and dozens of enterprise software vendors adopted MCP in 2024–2025, making it the de facto standard for AI context integration.
The MCP Architecture
MCP has three components:
MCP Server: A software component that exposes data sources and tools to AI assistants. An MCP Server:
- Declares its capabilities (what resources it exposes, what tools it provides)
- Responds to standardized discovery requests from AI clients
- Handles queries and returns structured results
- Manages its own authentication and access controls
MCP Client: The AI assistant or AI orchestration layer that connects to MCP Servers. A client:
- Discovers available MCP Servers (registered via configuration)
- Understands what each server offers without custom code
- Decides when to query a server based on conversation context
- Incorporates server responses into its generated answers
MCP Protocol: The standardized communication layer that governs discovery messages, capability declarations, query formats, and response structures. The protocol handles the handshake between client and server, making them interoperable regardless of vendor.
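On the wire, MCP is built on JSON-RPC 2.0: the client initialises a session, asks the server what it offers, and only then issues queries. The sketch below constructs illustrative messages using the protocol's standard method names (`initialize`, `resources/list`, `tools/list`, `tools/call`); the tool name and arguments in the final message are hypothetical placeholders, not actual EA GraphLink identifiers.

```python
import json

# MCP runs over JSON-RPC 2.0. The client opens a session with "initialize",
# then discovers what the server offers before making any queries.
discovery = [
    {"jsonrpc": "2.0", "id": 1, "method": "initialize",
     "params": {"protocolVersion": "2024-11-05",
                "clientInfo": {"name": "example-client", "version": "0.1"},
                "capabilities": {}}},
    {"jsonrpc": "2.0", "id": 2, "method": "resources/list"},  # data the AI can read
    {"jsonrpc": "2.0", "id": 3, "method": "tools/list"},      # actions the AI can invoke
]

# Once the server has described its capabilities, the client can call a tool.
# The tool name below is a hypothetical placeholder; real names come from the
# server's tools/list response.
query = {"jsonrpc": "2.0", "id": 4, "method": "tools/call",
         "params": {"name": "search_elements",
                    "arguments": {"type": "ApplicationComponent"}}}

for msg in discovery + [query]:
    print(json.dumps(msg))
```

The key point is the ordering: nothing about the server's capabilities is hard-coded in the client; everything after `initialize` is learned at runtime from the listing responses.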
In practice, the flow for an EA query works like this:
- User asks Copilot: “Which applications support our Supply Chain capabilities?”
- Copilot (MCP Client) recognizes this as a query that the EA MCP Server can answer
- Copilot sends a standardized MCP query to the EA MCP Server
- The EA MCP Server queries the Sparx EA repository via EA GraphLink
- The EA MCP Server returns the relevant elements and relationships
- Copilot synthesizes a natural-language answer and presents it to the user
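The six steps above can be sketched as a client-side flow. Everything here is a hypothetical stand-in: the tool name, argument schema, and result shape are invented for illustration, since the real EA MCP Server defines its own interface.

```python
# Minimal sketch of the query flow. All names are hypothetical stand-ins,
# not the actual EA GraphLink tool names or response schema.

def ea_mcp_query(tool: str, arguments: dict) -> list[dict]:
    """Stand-in for sending an MCP tools/call request to the EA MCP Server."""
    # In reality this would travel over the MCP transport; here we return
    # canned rows shaped like ArchiMate elements and relationships.
    return [
        {"element": "Order Management System", "type": "ApplicationComponent",
         "relationship": "Realization", "target": "Supply Chain Planning"},
        {"element": "Warehouse Control", "type": "ApplicationComponent",
         "relationship": "Realization", "target": "Inventory Management"},
    ]

def answer_capability_question(capability_domain: str) -> str:
    # Steps 2-4: the client decides the EA server can answer and sends a query.
    rows = ea_mcp_query("find_realizing_applications",
                        {"capability_domain": capability_domain})
    # Steps 5-6: the client synthesises a natural-language answer.
    apps = sorted({r["element"] for r in rows})
    return f"Applications supporting {capability_domain}: " + ", ".join(apps)

print(answer_capability_question("Supply Chain"))
```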
How MCP Differs from Function Calling and Plugins
Function calling / tool use: An earlier AI integration pattern where AI assistants could invoke specific, pre-defined functions with structured parameters. Each function had to be individually defined in the AI assistant’s configuration. Adding a new capability meant adding a new function definition. This was per-AI, per-function effort.
AI plugins: Plugin architectures (like the early OpenAI plugin system) allowed AI assistants to call external APIs. Each plugin required custom development and was registered with the specific AI assistant’s plugin framework. Not portable across AI vendors.
MCP: A protocol that goes further than either predecessor:
- Discovery is dynamic: The AI client discovers what the MCP Server offers without each capability being individually pre-defined
- Vendor-neutral: One MCP Server works with any MCP-compatible client: Claude, Copilot, Gemini, Agentforce, all at once
- Richer capability model: MCP supports resources (data that AI can read), tools (actions the AI can invoke), and prompts (pre-built query templates): a richer set than simple function calling
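The three primitives can be pictured as the shape of a server's listing responses. The field names below follow the MCP listing methods (`uri`/`name`/`description` for resources, `name`/`inputSchema` for tools, `name`/`arguments` for prompts); the specific entries are invented EA-flavoured examples, not actual EA GraphLink capabilities.

```python
# Sketch of what an MCP server might advertise for each of the three
# primitives. Entries are illustrative, not a real server's catalogue.
server_offering = {
    "resources": [  # data the AI can read
        {"uri": "ea://applications", "name": "Application portfolio",
         "description": "All Application Components in the repository"},
    ],
    "tools": [  # actions the AI can invoke, with a JSON Schema for arguments
        {"name": "search_elements",
         "inputSchema": {"type": "object",
                         "properties": {"type": {"type": "string"}}}},
    ],
    "prompts": [  # pre-built query templates the AI can fill in
        {"name": "capability_coverage",
         "arguments": [{"name": "domain", "required": True}]},
    ],
}
```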
What the Sparx EA MCP Server Exposes
EA GraphLink Interface B is Sparx Services’ MCP Server implementation for Sparx EA. Connected AI assistants can query:
Elements by type: All ArchiMate element types (Business Process, Application Component, Technology Node, Capability, etc.), queryable as first-class typed objects rather than generic records.
Relationships: The connections between elements, such as which applications realize which capabilities, which capabilities are used by which processes, and which nodes host which artefacts.
Tagged values: The governance attributes attached to elements, including owner, lifecycle status, business domain, technology tier, health rating, and any custom attributes defined in MDG Technology extensions.
Diagrams and views: Diagram names and metadata (not images; AI assistants receive data, not rendered diagrams).
Documentation: Element notes and documentation fields stored in the repository.
Package structure: The organisational hierarchy of the repository, showing which elements belong to which domains and sub-domains.
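Putting these categories together, a single repository element might appear to a connected AI assistant as one structured record. The schema below is purely illustrative of the categories listed above; it is not the actual EA GraphLink response format, and the element itself is invented.

```python
# Hypothetical shape of one element as structured data: typed element,
# tagged values, relationships, documentation, and package path.
# Schema and content are invented for illustration.
element = {
    "name": "Customer Portal",
    "type": "ApplicationComponent",          # element by type
    "taggedValues": {"Owner": "Digital Channels",
                     "Lifecycle Status": "Active"},
    "relationships": [{"kind": "Realization", "target": "Online Sales"}],
    "package": ["Enterprise", "Applications", "Customer Facing"],
    "notes": "Primary self-service channel for retail customers.",
}
```

Because each of these fields is structured rather than free text, an AI client can filter, group, and traverse them instead of parsing prose.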
Why MDG Technology Is the Quality Gate
The Sparx EA MCP Server serves data through a semantic layer that is MDG-aware. This means:
- An ArchiMate Application Component is queryable as "Application Component", not as a generic UML Class with a stereotype attribute that needs parsing
- Custom MDG stereotypes defined by your organization (e.g., SaaS Application, Core Platform, Strategic Initiative) are first-class queryable dimensions
- Tagged value definitions from MDG extensions (e.g., Lifecycle Status with values Active, End of Life, Decommissioned) are structured enumeration dimensions that AI can filter and group
If elements in the repository use the correct MDG stereotypes consistently, the AI receives precise, semantically rich data. If elements are inconsistently typed (generic UML classes instead of ArchiMate elements, ad-hoc tagged values instead of governed MDG extensions), the AI receives imprecise data and its answers are correspondingly imprecise.
This is the MDG quality gate: the quality of EA repository governance directly determines the quality of AI-generated architecture intelligence. It is not a technical constraint but a governance requirement. Sparx Services addresses this through the Amplify engagement (MDG governance design) as a prerequisite or parallel workstream to the Connect engagement (EA GraphLink deployment).
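The quality gate is easy to see with a toy query. Below, the same filter runs against governed records (consistent stereotypes and MDG tagged values) and ungoverned ones (generic UML classes, ad-hoc tags); the records and field names are invented for the example.

```python
# Illustration of the MDG quality gate: identical query, governed vs
# ungoverned data. All records here are invented.
governed = [
    {"name": "CRM", "type": "ApplicationComponent",
     "Lifecycle Status": "Active"},
    {"name": "Legacy ERP", "type": "ApplicationComponent",
     "Lifecycle Status": "End of Life"},
]
ungoverned = [
    {"name": "CRM", "type": "Class"},                 # generic UML, no governed tags
    {"name": "Legacy ERP", "type": "Class", "status": "old"},  # ad-hoc tag
]

def end_of_life_apps(records):
    """Find applications tagged End of Life via the governed dimensions."""
    return [r["name"] for r in records
            if r.get("type") == "ApplicationComponent"
            and r.get("Lifecycle Status") == "End of Life"]

print(end_of_life_apps(governed))    # prints ['Legacy ERP']
print(end_of_life_apps(ungoverned))  # prints [] -- the data cannot answer
```

The ungoverned repository contains the same systems, but the question is unanswerable because the semantics were never modelled; no amount of AI sophistication recovers governance that is not there.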
Which AI Tools Are MCP-Compatible
As of 2026, MCP-compatible AI assistants include:
- Anthropic Claude (all current versions)
- Microsoft Copilot (M365 Copilot, Copilot Studio)
- Google Gemini (Workspace and cloud versions)
- Salesforce Agentforce
- OpenAI ChatGPT Enterprise
- Cursor (AI-enhanced IDE)
- Azure OpenAI (with MCP client configuration)
- Any AI assistant built on frameworks that support MCP (including open-source LLM frameworks)
The MCP standard continues to be adopted by new tools. The EA MCP Server, once deployed, automatically becomes available to any new MCP-compatible tool that your organization adds, with no additional integration work.
The Strategic Significance for EA Practices
Before MCP, EA repositories were invisible to AI. AI assistants could not query your architecture data without custom integration, and most EA teams lacked the development resources to build and maintain custom integrations per AI tool.
With MCP, the EA repository becomes a governed AI context source that any MCP-compatible AI in the organization can draw from. The strategic implication is significant:
Architecture intelligence is now available where work happens. Executives get architecture answers in Teams or Slack without navigating a separate portal. Project teams get application dependency answers in Copilot while planning workstreams. Operations teams get infrastructure coverage answers in service management tools that integrate with MCP.
The EA team’s governance work directly shapes AI outputs across the organization. The architecture decisions, ownership assignments, and lifecycle classifications that the EA team maintains now inform AI answers used across the business. The value of EA governance is no longer abstract; it is visible in the quality of AI responses.
Frequently Asked Questions
Q: Is MCP an Anthropic-specific technology? No. MCP was created by Anthropic but published as an open standard. Anthropic released the specification and reference implementations under an open license. Microsoft, Google, Salesforce, OpenAI, and many other vendors have adopted MCP as a standard integration mechanism. Claude, Copilot, Gemini, and Agentforce are all MCP-compatible. MCP is a vendor-neutral protocol.
Q: How is MCP different from a regular API? A regular API (REST or GraphQL) is a fixed contract: you define specific endpoints, call them with specific parameters, and receive structured responses. The calling system must know exactly what to request. MCP adds a discovery layer: the MCP Client (AI assistant) asks the MCP Server what it offers, and the server describes its capabilities dynamically. The AI then decides what to query based on the conversation context, without each query being pre-programmed. MCP is appropriate when the integrating “client” is an AI making context-driven decisions, not a deterministic application calling pre-defined endpoints.
Q: Can the EA MCP Server write back to Sparx EA, or is it read-only? The EA MCP Server implementation in EA GraphLink Interface B is currently read-only. Connected AI assistants can query and retrieve EA data but cannot modify the Sparx EA repository via MCP. This is an appropriate constraint for most use cases: AI-assisted query and reporting, not AI-driven model modification. Write-back capability is under development for future EA GraphLink releases.
Q: How is security managed for MCP connections? MCP connections are secured at the transport layer (HTTPS/TLS) and at the authentication layer. EA GraphLink Interface B supports API key and OAuth 2.0 authentication. Only authenticated, authorized AI assistants can connect to the EA MCP Server. Within the server, access can be restricted to specific packages or element types if needed. Network-level controls (private endpoints, VPN requirements) can further restrict connectivity to the MCP Server. Sparx Services configures security as part of the Connect engagement.
Q: Does MCP require specific versions of Sparx EA or Pro Cloud Server? EA GraphLink Interface B (MCP Server) requires the current version of EA GraphLink and a compatible Pro Cloud Server deployment. Sparx Services advises on version requirements during the Connect engagement. If your Sparx EA or Pro Cloud Server installation requires an upgrade before EA GraphLink can be deployed, this is identified and addressed in the engagement scope.
Q: Can multiple AI assistants connect to the same MCP Server at once? Yes. MCP Servers support concurrent connections from multiple MCP clients. If your organization uses both Microsoft Copilot and Salesforce Agentforce, both can be simultaneously connected to the EA MCP Server and query the repository independently. Each AI assistant accesses the same underlying EA data via the same MCP Server; there is no need to deploy separate MCP Servers per AI tool.
Ready to Deploy the Sparx EA MCP Server?
Sparx Services’ Connect engagement deploys EA GraphLink Interface B, registers your EA repository as an MCP Server, and connects it to your organization’s AI assistants, whether Claude, Copilot, Gemini, Agentforce, or others.
We include MDG quality assessment, authentication configuration, and initial query testing to ensure your AI assistants receive accurate, governed architecture intelligence from day one.