Building an EA Center of Excellence: The 12-Month Roadmap

By Ryan Schmierer  ·  August 26, 2025

An EA CoE isn’t a team with a good tool. It’s five dimensions built together: platform, governance, stakeholder relationships, architect capability, and AI integration. Building only one or two dimensions produces a team that looks like a CoE but cannot deliver like one. This article provides the 12-month roadmap — what to build when, what the critical dependencies are, and what failure looks like before it becomes irreversible.

Five Dimensions of an Effective EA CoE

Most EA CoE programs fail because they treat CoE-building as a single-threaded activity — usually tool configuration or framework training. An effective EA CoE has five dimensions that must be developed together. They are not sequential; they are parallel, interdependent workstreams.

Dimension 1: Platform

The platform dimension is Sparx EA — configured, governed, and available to the architects who need it. This is not just installation. It means: MDG profiles configured for the frameworks in use, package structure aligned to the CoE’s scope, access controls appropriate to the team structure, and integration with adjacent systems (SharePoint, ServiceNow, Confluence, or whichever systems the organization actually uses for governance).

A platform that is technically installed but poorly configured is actively harmful — it generates inconsistent data and trains architects in bad habits.

Dimension 2: Governance

Governance is the set of rules, processes, and controls that make the EA repository a reliable source of truth. It includes: naming conventions, MDG enforcement of element types and tagged values, a model review process, a change management process for the repository itself, and a data quality standard with measurable criteria.

Without governance, the repository degrades over time. The more architects use it without governance, the harder it becomes to govern.
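A "data quality standard with measurable criteria" can be made concrete with a small validation script. The sketch below, a minimal illustration only, checks element names against a naming convention and verifies required tagged values; the regex, the tag names, and the plain-dict element format standing in for a repository export are all assumptions, not part of any Sparx EA schema.

```python
import re

# Illustrative governance rules -- the actual conventions and required
# tagged values are whatever the CoE's governance policy defines.
NAME_PATTERN = re.compile(r"^[A-Z][A-Za-z0-9 ]+$")     # e.g. "Customer Portal"
REQUIRED_TAGS = {"Owner", "Lifecycle", "Criticality"}  # assumed tag names

def check_element(element: dict) -> list[str]:
    """Return governance violations for one repository element.

    `element` is a plain dict standing in for an exported element:
    {"name": str, "type": str, "tags": {tag_name: value}}.
    """
    violations = []
    if not NAME_PATTERN.match(element["name"]):
        violations.append(f"name '{element['name']}' breaks naming convention")
    for tag in sorted(REQUIRED_TAGS - element["tags"].keys()):
        violations.append(f"missing required tagged value '{tag}'")
    return violations

def quality_score(elements: list[dict]) -> float:
    """Fraction of elements with zero violations -- one measurable criterion."""
    if not elements:
        return 1.0
    clean = sum(1 for e in elements if not check_element(e))
    return clean / len(elements)
```

A score like this, run on a schedule, turns "the repository degrades over time" from an impression into a trend line the governance body can act on.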

Dimension 3: Stakeholder Relationships

EA produces value when its outputs influence decisions. That requires stakeholder relationships — with the CIO, with business unit leaders, with project delivery teams, and with IT governance functions. Building these relationships takes longer than building the repository. It requires consistent delivery of useful intelligence, not just framework compliance reports.

The stakeholder relationships dimension is where most EA CoEs fail silently — the repository grows, the frameworks are compliant, but the business does not use the architecture function’s outputs to make decisions.

Dimension 4: Architect Capability

The capabilities of the architecture team — ArchiMate fluency, TOGAF literacy, Sparx EA proficiency, stakeholder communication skill, and judgment about when to use which framework — determine the quality of everything the CoE produces. Capability gaps here produce repositories that are technically populated but architecturally shallow.

Dimension 5: AI Integration

In 2026 and beyond, AI integration is not optional for a competitive EA CoE. EA GraphLink, Kernaro AI Hub, and Microsoft Copilot integration transform a well-built EA repository into an organizational intelligence asset. But AI integration is the last dimension to build — it requires the other four to be in place first. A poorly governed repository with low MDG quality produces misleading AI outputs, which undermines trust faster than any other failure mode.


The 12-Month Roadmap

Period | Phase | Key Activities
Month 1-3 | Foundation | Platform configuration, MDG profiles, package structure, governance policy, initial team training
Month 3-5 | First Value | First complete domain architecture (typically Business Architecture + Application Portfolio), initial stakeholder briefings
Month 5-8 | Stakeholder Engagement | Regular architecture briefings, integration with project delivery gate reviews, CIO reporting established
Month 8-11 | Maturity | Full TOGAF alignment, cross-domain models, architecture-driven decisions tracked and reported, team capability assessed and gaps addressed
Month 11-14 | AI Integration | EA GraphLink deployment, Power BI portfolio dashboards, Copilot integration pilot, Kernaro AI Hub planning

Month 1-3: Foundation

This is the phase most organizations rush. Month 1-3 is where the platform is properly configured, MDG profiles are built for the frameworks in use, the package structure is designed to support the CoE’s scope, and the governance model is documented and agreed. It is also where the first architect training happens — not just “how to use Sparx EA” but “how we use Sparx EA here, with these MDG profiles, following these conventions.”

The temptation to skip ahead and start producing architecture deliverables in Week 2 is the most common early failure. Architecture deliverables produced before governance is in place will need to be reworked.

Month 3-5: First Value

By Month 3, the foundation is in place and the team is ready to build architecture content. First value should be a complete domain architecture — typically Business Architecture (capability map, process inventory, stakeholder register) and Application Portfolio (all applications modeled as ArchiMate Application Components with required tagged values).

These two outputs are the most frequently requested by CIOs and business unit leaders. Delivering them by Month 5 establishes the CoE’s credibility.
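One useful completeness check on these first two deliverables is whether every application in the portfolio is linked to at least one business capability. The sketch below, a simplified illustration, computes that coverage from plain lists and dicts standing in for repository exports; the data shapes and names are assumptions, not a Sparx EA API.

```python
def portfolio_coverage(applications: list[str],
                       capability_map: dict[str, list[str]]) -> tuple[float, list[str]]:
    """Report how much of the application portfolio maps to business capabilities.

    `applications` is the full Application Portfolio; `capability_map` maps
    capability name -> applications realizing it. Both are illustrative
    stand-ins for repository exports. Returns (coverage ratio, applications
    not linked to any capability).
    """
    mapped = {app for apps in capability_map.values() for app in apps}
    unmapped = sorted(a for a in applications if a not in mapped)
    coverage = 1.0 if not applications else (
        (len(applications) - len(unmapped)) / len(applications))
    return coverage, unmapped
```

The list of unmapped applications is itself a stakeholder-ready output: it names exactly which systems the business architecture does not yet explain.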

Month 5-8: Stakeholder Engagement

Architecture that does not reach decision-makers does not influence decisions. Month 5-8 is where stakeholder engagement is built into the CoE’s operating model: regular architecture briefings (monthly or quarterly), integration with project gate reviews, and a reporting cycle that connects architecture output to the CIO’s agenda.

This is also where the architecture-driven decisions metric is established. Start tracking which decisions the EA CoE influenced, what the architecture input was, and what the outcome was with and without architecture involvement. This dataset becomes the CoE’s business case in Year 2.
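The decision log does not need heavy tooling to start; a flat record with a handful of fields is enough. The sketch below illustrates one possible shape, with field names that are assumptions rather than a prescribed standard, plus a summary function for the reporting cycle.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DecisionRecord:
    """One entry in the architecture-driven decisions log (fields are illustrative)."""
    decided_on: date
    decision: str
    architecture_input: str   # what the EA CoE contributed
    outcome_changed: bool     # did the input change (vs. merely confirm) the outcome?

def decisions_in_period(log: list[DecisionRecord], start: date, end: date) -> dict:
    """Summarize the metric for one reporting period."""
    in_period = [d for d in log if start <= d.decided_on <= end]
    return {
        "architecture_driven_decisions": len(in_period),
        "outcome_changed": sum(d.outcome_changed for d in in_period),
        "outcome_confirmed": sum(not d.outcome_changed for d in in_period),
    }
```

Distinguishing "changed" from "confirmed" outcomes matters for the Year 2 business case: both count as influence, but changed outcomes are the easier story to tell.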

Month 8-11: Maturity

By Month 8-11, the CoE should have: a complete multi-domain architecture, an active governance process, regular stakeholder engagement, and a capability assessment that identifies where the team needs development. This phase also deepens TOGAF alignment — connecting the repository structure to formal ADM phase deliverables for programs that require it.

Month 11-14: AI Integration

EA GraphLink deployment transforms the repository into a live data source for Power BI and Microsoft Copilot. This phase assumes MDG quality is sufficient — if it is not, the AI Integration phase must wait while governance is improved. Deploying AI integration against a low-quality repository produces low-quality intelligence, which damages trust in both the EA function and the AI tools.
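The shape of the repository-to-BI handoff can be sketched without any product specifics: flatten governed elements into tabular data a dashboard can ingest. The example below is a minimal illustration using plain dicts and CSV; the field names are assumptions, and products like EA GraphLink provide live connections rather than file exports.

```python
import csv
import io

def portfolio_to_csv(elements: list[dict]) -> str:
    """Flatten repository elements into a CSV string a BI tool can ingest.

    `elements` are plain dicts standing in for exported Application
    Components; field names are illustrative. Note how ungoverned gaps
    surface as "Unknown"/"Unassigned" in every downstream dashboard.
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "lifecycle", "owner"])
    writer.writeheader()
    for e in elements:
        writer.writerow({"name": e["name"],
                         "lifecycle": e.get("lifecycle", "Unknown"),
                         "owner": e.get("owner", "Unassigned")})
    return buf.getvalue()
```

This is also why governance must precede AI integration: every missing tagged value becomes a visible hole in the intelligence layer.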


The Metric That Matters Most

Architecture-driven decisions: the count of significant IT or business decisions in the reporting period that had explicit architecture input that changed or confirmed the outcome.

This is the only metric that connects EA activity to organizational value. Repository population, framework compliance scores, diagram counts, and TOGAF phase completion are activity metrics — they say nothing about whether architecture influenced anything that mattered. Architecture-driven decisions is an outcome metric. Build the process to capture it from Month 8 onward.


Four Common Failure Modes

Tool-first failure. The organization buys Sparx EA, installs it, and expects the tool to create an EA practice. Tools do not create practices. MDG configuration, governance policy, and architect capability create practices. Tools support them.

Framework compliance failure. The EA CoE becomes a TOGAF certification exercise. The repository fills with TOGAF-compliant deliverables that nobody outside the EA team reads. Stakeholders stop engaging because the output is written in framework language, not business intelligence. The CoE is eventually defunded as “theoretical.”

Capability gap failure. The architects using Sparx EA lack the modeling depth to produce high-quality architecture content. ArchiMate layer confusion, inconsistent notation, and shallow models accumulate. By the time the capability gap is recognized, the repository requires significant remediation.

Sponsorship failure. The EA CoE has a sponsor at the start who moves on, is replaced, or loses budget authority. Without executive sponsorship, the CoE cannot get into the decision-making processes that generate architecture-driven decisions. The practice continues to operate but has no organizational impact.


FAQ

What is an EA Center of Excellence? An EA Center of Excellence is an organizational structure — typically a team — responsible for building and maintaining enterprise architecture capability in the organization. This includes the EA tooling, repository governance, architect capability development, stakeholder engagement, and the methodologies and frameworks in use. An effective EA CoE delivers architecture intelligence that influences organizational decisions.

How long does it take to build an effective EA CoE? A functional EA CoE — one that is producing reliable architecture content and influencing decisions — takes 10-14 months from initial setup. The foundational phases (platform, governance, first content) take 3-5 months. Stakeholder engagement and maturity take the next 6 months. AI integration (Month 11-14) adds another 3-4 months, often running into Year 2, for practices with sufficient MDG quality.

What governance structure should an EA CoE have? The minimum governance structure includes: an Architecture Review Board (ARB) or equivalent decision body with business and IT representation, a repository governance role (often the Lead Architect), defined architecture principles, a change management process for the repository, and a model quality standard. The ARB provides the organizational legitimacy for architecture-driven decisions.

What metrics should an EA CoE track? Architecture-driven decisions (primary), repository completeness by domain (secondary), model quality score (secondary), stakeholder engagement frequency (secondary), and architect capability assessment (periodic). Do not report repository population or diagram counts as primary metrics — they measure activity, not value.

What is the most common reason EA CoEs fail? Sponsorship failure combined with framework compliance focus. The EA CoE produces technically correct architecture deliverables that decision-makers do not read, sponsored by an executive who eventually loses interest. Without stakeholder engagement that connects architecture outputs to decisions, the practice cannot demonstrate value, and cannot sustain its budget.

How does AI integration change what an EA CoE can deliver? AI integration via EA GraphLink, Kernaro AI Hub, and Microsoft Copilot transforms the EA CoE from a team that produces architecture deliverables into a team that provides always-available architecture intelligence. Stakeholders can query the architecture portfolio, ask capability questions, and get answers from the repository without scheduling an architecture review. This dramatically increases the surface area of architecture influence — and the architecture-driven decisions metric — for practices that have the repository quality to support it.


Build the Practice, Then the Intelligence Layer

The Sparx Services Discover engagement is designed for organizations starting the EA CoE journey — assessing what is in place, identifying the gaps across all five dimensions, and producing the roadmap that avoids the four common failure modes.

Talk to Sparx Services about Discover →
