Direct Answer
REST APIs and the Model Context Protocol (MCP) are not alternatives to each other: they operate at different layers and solve different problems. REST APIs are point-to-point integration contracts: you define an endpoint, call it with specific parameters, and receive a structured response. MCP is a standardized protocol that enables AI assistants to discover and query context sources dynamically, without custom integration code per tool. MCP does not replace REST APIs underneath; it is a discovery and interaction layer above them. For EA teams, the practical significance is this: before MCP, connecting an AI assistant to your Sparx EA repository required custom API integration work for each AI tool. With MCP (via EA GraphLink Interface B), any MCP-compatible AI assistant discovers your EA repository and can query it without bespoke integration. The integration work is done once, not repeated per tool.
Key Takeaways
- REST APIs = pull-based, point-to-point, specific endpoint contracts. You call a URL; you get a response.
- MCP = a standardized discovery-and-query protocol for AI assistants. The AI discovers what’s available and queries it using natural language context.
- MCP does not replace REST: it operates above REST at the AI interaction layer.
- Before MCP: each AI tool required custom API integration to access EA data.
- With MCP (EA GraphLink Interface B): any MCP-compatible AI assistant connects once via a standard protocol.
- Supported MCP clients: Claude, Microsoft Copilot, Google Gemini, Salesforce Agentforce, ChatGPT Enterprise, Cursor, and any tool adopting the MCP standard.
- The MDG quality of your Sparx EA repository determines the quality of MCP-served answers.
What REST APIs Are and How EA Teams Have Used Them
REST (Representational State Transfer) is the dominant style for web API design. A REST API exposes resources at URLs (endpoints) and uses HTTP methods (GET, POST, PUT, DELETE) to interact with them. A response is typically JSON.
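To make the REST pattern concrete, here is a minimal sketch in Python. The endpoint, base URL, and parameter names are hypothetical, chosen only to illustrate the point-to-point contract; they are not a real Sparx EA API.

```python
import json
from urllib.parse import urlencode

# Hypothetical REST wrapper around an EA repository. The base URL and
# query parameters are illustrative, not a real Sparx EA endpoint.
BASE_URL = "https://ea.example.com/api/applications"

def build_request_url(base_url: str, **params: str) -> str:
    """Construct the GET URL a client would call for a specific query."""
    return f"{base_url}?{urlencode(params)}"

# The caller decides the endpoint and the parameters up front --
# the defining trait of point-to-point REST integration.
url = build_request_url(BASE_URL, lifecycle="end-of-life")

# A response body like the server might return (JSON is typical for REST).
response_body = '[{"name": "Legacy CRM", "lifecycle": "end-of-life"}]'
applications = json.loads(response_body)

print(url)
print(applications[0]["name"])
```

The client hard-codes both the URL shape and the response parsing; swapping in a different consuming system means writing this integration again.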
For EA data access, REST APIs have traditionally worked like this:
- Export and transform. Sparx EA data is exported (CSV, XML, or via the automation API) and loaded into another system. This is a batch process: the data is stale the moment the export completes.
- Custom API wrapper. Organizations build a REST API wrapper around the Sparx EA database, exposing specific queries as endpoints. An application calls `GET /api/applications?lifecycle=end-of-life` and receives a JSON list. This requires custom development and ongoing maintenance.
- PowerShell or Python scripts. Scheduled scripts pull data from Sparx EA and load it into downstream systems. This is point-to-point automation: each integration is its own custom script.
The common thread: REST API integration with EA data has historically been custom, point-to-point, and bespoke per consuming system. Adding a new AI tool meant writing a new integration. Adding a new AI assistant meant another integration project.
What MCP Is and Why It Exists
The Model Context Protocol (MCP) was published by Anthropic in November 2024 as an open protocol. Its purpose is to solve a specific problem: AI assistants (Claude, Copilot, Gemini, etc.) need to access external data sources to be useful beyond their training data, but every integration was custom: each AI tool had its own API format, authentication model, and capability description mechanism.
MCP standardises this. An MCP Server:
- Exposes resources and tools in a standardized, discoverable format
- Describes its capabilities so an AI client knows what it can ask and how
- Handles queries from any MCP-compatible client using a common protocol
An MCP Client (the AI assistant):
- Discovers what MCP Servers are connected to it
- Understands what each server offers without custom configuration
- Queries context sources naturally: the AI decides when and how to use MCP context based on the conversation
The key difference from REST: with a REST API, you decide when to call the endpoint and with what parameters. With MCP, the AI decides when to query the context source and formulates the query based on the conversation context. This is appropriate for AI-driven workflows where the AI is the integrating agent rather than a human-coded application.
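The discovery-then-query pattern can be sketched with a simulated in-memory server. The message shapes below loosely follow MCP's JSON-RPC framing (`tools/list`, `tools/call`), but the tool name, fields, and responses are illustrative assumptions, not the real EA GraphLink schema.

```python
# Highly simplified sketch of the MCP pattern: a server describes its
# tools, and the client discovers them instead of hard-coding endpoints.
TOOLS = {
    "query_elements": {
        "name": "query_elements",
        "description": "Find repository elements by type and filter.",
        "inputSchema": {
            "type": "object",
            "properties": {"element_type": {"type": "string"}},
        },
    },
}

def handle(request: dict) -> dict:
    """Dispatch a JSON-RPC-style request the way an MCP server would."""
    if request["method"] == "tools/list":
        result = {"tools": list(TOOLS.values())}
    elif request["method"] == "tools/call":
        # A real server would execute the named tool against the repository;
        # here we return a canned result to keep the sketch self-contained.
        result = {"content": [{"type": "text", "text": "2 elements found"}]}
    else:
        result = {"error": "unknown method"}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}

# Discovery first: the client learns what is available at runtime.
listing = handle({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
tool_names = [t["name"] for t in listing["result"]["tools"]]

# Then the AI client decides when and how to call a discovered tool.
call = handle({"jsonrpc": "2.0", "id": 2, "method": "tools/call",
               "params": {"name": "query_elements",
                          "arguments": {"element_type": "ApplicationComponent"}}})
print(tool_names, call["result"]["content"][0]["text"])
```

The crucial line is the `tools/list` call: nothing about the server's capabilities is compiled into the client, which is what lets one server serve every MCP-compatible assistant.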
How Sparx EA’s MCP Server Works
EA GraphLink Interface B is Sparx Services’ MCP Server implementation for Sparx EA. Here is what it does:
Exposes the repository as an MCP context source. The MCP Server makes available: all element types and instances, relationships between elements, tagged values, diagram metadata, package structure, and documentation. The schema is MDG-aware: ArchiMate element types (Business Process, Application Component, Technology Node) are first-class queryable types, not generic UML elements.
Registers with MCP-compatible AI assistants. Once EA GraphLink Interface B is deployed, it is registered as an MCP Server with the AI tools in your environment. Claude knows it has access to your EA repository; Copilot and Gemini know they can query it. Each AI assistant receives a description of what the EA MCP Server contains and what kinds of queries it can handle.
Handles natural-language queries. When a user asks Claude “which applications support our Customer Onboarding capability?”, Claude queries the EA MCP Server using the appropriate protocol call. The MCP Server returns the relevant elements and relationships from the Sparx EA repository. Claude synthesizes this into a natural-language answer.
Governs with MDG. The MCP Server’s query layer is MDG-aware. If your ArchiMate elements are consistently stereotyped with the correct MDG types, the AI receives semantically precise data. If elements are inconsistently tagged or use ad-hoc stereotypes, the AI’s answers reflect that imprecision. The MDG quality gate is the AI quality gate.
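The "MDG quality gate" point can be shown with a toy filter. The element records and stereotype names below are invented for illustration; the mechanism (typed queries silently missing inconsistently tagged elements) is the point.

```python
# Illustrative sketch of why MDG consistency matters for query precision.
# Element records and stereotype strings are made up for the example.
elements = [
    {"name": "Billing Service", "stereotype": "ArchiMate_ApplicationComponent"},
    {"name": "CRM", "stereotype": "ArchiMate_ApplicationComponent"},
    {"name": "Old Portal", "stereotype": "app-component (custom)"},  # ad-hoc tag
]

def query_by_type(elements: list, stereotype: str) -> list:
    """The kind of typed filter an MDG-aware query layer applies."""
    return [e["name"] for e in elements if e["stereotype"] == stereotype]

found = query_by_type(elements, "ArchiMate_ApplicationComponent")
# "Old Portal" is silently missed because its stereotype is ad hoc --
# exactly the imprecision the AI's answer inherits.
print(found)
```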
The Practical Difference: A Before-and-After
Before MCP (REST API era):
> An architect wants to connect Claude to the Sparx EA repository.
> This requires: building a REST API wrapper around the EA database, writing custom prompt templates that reference the REST endpoints, implementing authentication, handling pagination, and building the integration once for Claude.
> When Copilot is added to the organization, the same integration is rebuilt for Copilot's format. When Gemini is piloted, it is rebuilt again.
After MCP (EA GraphLink Interface B):
> EA GraphLink Interface B is deployed. It registers as an MCP Server.
> Claude, Copilot, Gemini, Agentforce, and ChatGPT Enterprise (any MCP-compatible client) automatically discover the EA MCP Server and can query the repository.
> No per-tool custom integration. One deployment serves all MCP-compatible AI assistants.
This is why MCP matters for EA teams: under REST, integration cost scales linearly with the number of AI tools; under MCP, it is effectively fixed.
REST APIs Still Matter: They Are Not Going Away
MCP does not replace REST APIs for all EA integration patterns. REST remains the right approach for:
Application-to-application integration. If a business application (service management tool, project management system, CMDB) needs to synchronize data with the Sparx EA repository, REST API integration (or direct database integration) is appropriate. These are machine-to-machine integrations where the consuming system knows exactly what to request.
Reporting pipelines. Scheduled data pulls from EA into BI platforms, data warehouses, or reporting systems use REST-style API patterns via the EA GraphLink GraphQL endpoint (GraphQL is a query language that operates over HTTP and is closely related to REST patterns).
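A reporting pipeline's GraphQL request is just an HTTP POST with a JSON body. The query fields and variable names below are hypothetical, not the actual Interface A schema.

```python
import json

# Sketch of the REST-style GraphQL request a reporting pipeline might
# POST to a GraphQL endpoint on a schedule. The query shape and field
# names are illustrative assumptions, not the real EA GraphLink schema.
query = """
query ApplicationsByLifecycle($stage: String!) {
  applications(lifecycle: $stage) { name owner lifecycle }
}
"""

payload = json.dumps({
    "query": query,
    "variables": {"stage": "end-of-life"},
})

# A pipeline would POST this body over HTTPS (e.g. with urllib.request
# or any HTTP client) and load the JSON result into a BI platform.
decoded = json.loads(payload)
print(decoded["variables"]["stage"])
```

Note the contrast with MCP: here the consuming system knows exactly what to request and encodes it at build time, which is appropriate for deterministic, scheduled pipelines.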
Export and archiving. Point-in-time exports for regulatory or audit purposes are batch REST operations.
MCP’s domain is AI assistant context provision: situations where the integrating “client” is an AI assistant rather than a deterministic application.
Why This Matters for EA Strategy
The rise of MCP changes the EA team's role in integration planning. Previously, connecting AI tools to EA data required significant custom integration development, which meant EA teams either built it themselves (high effort, bespoke maintenance) or went without AI connectivity.
With MCP, the integration is standardized. The effort moves from building integrations to governing the repository. An EA team with a well-governed Sparx EA model (consistent MDG, complete tagged values, reliable ownership data) can offer AI connectivity to every MCP-compatible tool in the organization by deploying EA GraphLink Interface B once.
The governance quality of the EA repository, which is the domain of the EA team, directly determines the quality of what every AI assistant in the organization can say about enterprise architecture. That is a significant elevation of the EA practice's strategic value.
Frequently Asked Questions
Q: Is MCP a replacement for GraphQL? No. GraphQL is a query language for APIs: it is how EA GraphLink Interface A exposes the repository to BI tools like Power BI and Tableau. MCP is an AI assistant interaction protocol: it is how EA GraphLink Interface B exposes the repository to AI assistants like Claude and Copilot. They are complementary: GraphQL serves the BI analytics layer; MCP serves the AI intelligence layer. Both use the same underlying EA repository data.
Q: How is MCP different from function calling in AI assistants? Function calling (also called tool use) is an earlier pattern where AI assistants could invoke specific, pre-defined functions with structured inputs and outputs. MCP is a more sophisticated evolution: instead of defining each function individually, MCP Servers describe their full capabilities dynamically. The AI assistant discovers what a server offers rather than having each capability hard-coded. MCP also standardises the protocol across AI vendors: function calling implementations differ between OpenAI, Anthropic, and Google; MCP is vendor-neutral.
Q: Does EA GraphLink Interface B require internet access? No. EA GraphLink Interface B (MCP Server) can be deployed in an on-premises or private cloud environment. The AI assistants that connect to it need network connectivity to the MCP Server, but the MCP Server's own connection to the Sparx EA repository does not require internet access. The specific network configuration depends on whether your AI assistants are cloud-hosted (like Copilot in M365) or locally deployed.
Q: Can we use MCP with an AI assistant our organization builds internally? Yes. MCP is an open protocol. Any AI assistant built on a framework that supports MCP (including Azure OpenAI with MCP support, open-source LLM frameworks, and custom AI applications) can connect to the EA MCP Server. This is one of MCP’s architectural advantages: it is not locked to commercial AI products.
Q: What authentication does EA GraphLink Interface B use for MCP connections? EA GraphLink Interface B supports standard authentication mechanisms including API keys and OAuth 2.0. The specific authentication configuration is set during deployment and depends on the AI assistants being connected. Sparx Services configures authentication as part of the Connect engagement, ensuring appropriate access controls are in place before AI assistants are granted repository access.
Q: How does the AI know what to query in the EA repository? The MCP Server provides a schema description to the AI client: this describes the element types, relationship types, tagged value dimensions, and query patterns available. When a user asks the AI a question about enterprise architecture, the AI uses this schema description to formulate an appropriate query. The MDG Technology configuration in Sparx EA directly shapes this schema description: well-named, well-typed elements with clear MDG stereotypes produce a schema that the AI can reason about accurately.
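The schema-description idea can be sketched as a small descriptor plus a validity check. The field names and types below are invented for illustration; a real MCP schema description would be richer and shaped by the repository's MDG configuration.

```python
# Sketch of the kind of schema description an MCP server might advertise,
# and a minimal check of a query against it. Names are illustrative.
schema = {
    "element_types": ["BusinessProcess", "ApplicationComponent", "TechnologyNode"],
    "relationship_types": ["Serving", "Realization"],
    "tagged_values": ["lifecycle", "owner"],
}

def is_valid_query(query: dict, schema: dict) -> bool:
    """Reject queries that reference types the schema does not describe."""
    return (query["element_type"] in schema["element_types"]
            and query.get("relationship") in (None, *schema["relationship_types"]))

ok = is_valid_query({"element_type": "ApplicationComponent",
                     "relationship": "Serving"}, schema)
bad = is_valid_query({"element_type": "Server"}, schema)  # not a described type
print(ok, bad)
```

An AI client reasoning over a descriptor like this can only formulate queries as precise as the descriptor itself, which is why well-typed MDG stereotypes translate directly into better answers.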
Q: Is MCP secure for sensitive architecture data? MCP connections are secured at the transport layer (HTTPS/TLS) and at the authentication layer. The EA MCP Server can be configured with read-only access to the repository, limiting what the connected AI can do (query data) vs what it cannot (modify the model). For sensitive architecture data, network-level controls (private endpoints, VPN access requirements) can be applied. Sparx Services addresses security configuration as part of the Connect engagement.
Q: How do REST and MCP coexist in the EA GraphLink architecture? EA GraphLink exposes two interfaces from the same underlying Sparx EA repository: Interface A is GraphQL over HTTP (used by BI tools; effectively a REST-style query interface) and Interface B is the MCP Server (used by AI assistants). Both interfaces coexist and can be active simultaneously. BI dashboards and AI assistant queries can run concurrently from the same repository via their respective interfaces.
Ready to Deploy EA GraphLink Interface B?
Sparx Services’ Connect engagement deploys EA GraphLink, including the MCP Server interface, and connects it to your organization’s AI assistants. We handle the authentication configuration, schema optimisation, and MDG quality assessment that determine how well your AI assistants can query your EA repository.