What Senior EA Teams Look for When Hiring: Skills, Tools and Mindset in 2026
Senior EA teams are not primarily looking for TOGAF certificates. They are looking for architects who can do the work: produce governed models in the right notation, navigate a shared repository without damaging it, communicate architectural decisions to stakeholders who push back, and think in terms of what problems the architecture actually solves. In 2026, the shortlist of valued skills has grown: AI integration literacy — understanding what MDG quality means for AI output, and whether a candidate has worked with tools like Kernaro Assist or EA GraphLink — is now a genuine differentiator. This article describes what hiring managers assess, what they find lacking, and what strong candidates do differently.
Key Takeaways
- Portfolio evidence (real models, governed repository contributions) outweighs certification in hiring assessment
- Repository governance — MDG discipline, naming standards, element type consistency — is assessed in technical interviews
- Stakeholder communication ability is evaluated directly, not inferred from credentials
- AI integration literacy is a 2026 differentiator: architects who understand MDG quality for AI systems are increasingly valued
- Mindset matters: the question “what problem does this architecture solve?” separates consulting-oriented architects from documentation producers
What Senior EA Teams Actually Need
To understand what hiring managers look for, start with what they need: someone who can contribute to a governed EA practice without being managed at the level of a junior. That means:
- Taking a domain brief from a stakeholder and producing architecture content that is correct, typed, governed, and communicable — without being told what elements to use or how to relate them
- Navigating the shared repository without introducing structural problems: using MDG profile elements, following naming conventions, maintaining package discipline
- Presenting architecture to a mixed audience — technical and non-technical — and fielding questions without deferring everything to the lead architect
- Understanding what the architecture is for: not “I was asked to document the application layer” but “the business needs to understand its application redundancy before the consolidation programme — this view answers that question”
When senior architects describe their hiring mistakes, they consistently describe the same profile: candidates who have the credentials and present well in interviews, but who produce slides instead of architecture and cannot maintain repository discipline when working independently.
Technical Skills: What Is Assessed
Modelling Language Fluency
For most enterprise EA roles, ArchiMate fluency is the baseline. In technical assessments, fluency means more than “I know what an Application Component is.” It means:
- Correct layer placement: an Application Component is in the Application Layer. A Node is in the Technology Layer. A Business Service is in the Business Layer. Confident placement signals genuine understanding.
- Relationship selection: “I used Realisation here because the Application Service realises the Business Service — they are different abstractions of the same function.” Architects who can explain relationship choices distinguish themselves from those who use Association for everything they are unsure of.
- View construction: taking a business scenario and constructing a view that answers a specific question — with the right elements, the right relationships, and the right level of abstraction for the intended audience.
For roles with a systems engineering component — particularly in defence, aerospace, or complex infrastructure — SysML proficiency replaces or supplements ArchiMate as the core notation requirement.
Sparx EA Proficiency
In Sparx EA environments (which describes most large EA practices outside North America), Sparx EA proficiency is assessed functionally. Can the candidate:
- Navigate a structured repository and find architecture content across multiple packages?
- Create and modify elements using the MDG profile correctly — not by creating custom types?
- Build a multi-layer ArchiMate diagram with typed relationships?
- Generate a report or structured view for stakeholder output?
- Describe the role of Pro Cloud Server in a multi-user environment?
These are practical questions. Candidates who have used Sparx EA in real projects can answer them immediately. Candidates who have only used it for self-study exercises often falter on the specifics of the functionality.
MDG Governance Understanding
Repository governance understanding is the differentiator at mid-to-senior level. Hiring managers look for candidates who understand:
- What MDG Technology does and why it matters — not as a conceptual answer but with practical examples (“MDG restricts element creation to defined types, which means when someone creates a node in the Business Layer, the tool flags it — this is what keeps the repository consistent enough to be queryable”)
- How to detect governance drift: what does a poorly governed repository look like, and how do you identify it?
- How to maintain governance discipline when working alongside less experienced architects
Candidates who describe MDG as “the settings menu” or “the profile thing” have surface familiarity. Candidates who describe it as a metamodel governance mechanism and can discuss specific configuration choices are demonstrating genuine depth.
Scripting and Automation Awareness
Not all EA roles require scripting, but awareness of Sparx EA’s scripting capability (JScript, VBScript automation, the Sparx EA Automation Interface) is valued in senior roles. For architects who will configure or extend the tool, hands-on scripting experience is expected. For those who will not configure it themselves, understanding what automation is possible is a useful signal of platform depth.
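For architects who will not script the tool themselves, the useful signal is knowing what such automation looks like. The sketch below is illustrative only: it mirrors the Package-to-Element traversal that the Automation Interface exposes (Repository.Models, Package.Elements), but runs on plain dictionaries so nothing depends on EA being installed, and the stereotype names are assumptions rather than any specific MDG profile.

```python
# Illustrative governance sweep: find elements that carry no MDG stereotype,
# the "informal elements" a repository assessment would flag as drift.
# Data shape mimics the Automation Interface's Package/Element hierarchy;
# all names and stereotypes here are invented for the example.

def find_untyped_elements(package):
    """Recursively collect (package name, element name) pairs for
    elements with no stereotype, i.e. outside the MDG profile."""
    drift = []
    for element in package.get("elements", []):
        if not element.get("stereotype"):
            drift.append((package["name"], element["name"]))
    for child in package.get("packages", []):
        drift.extend(find_untyped_elements(child))
    return drift

repo = {
    "name": "Application Architecture",
    "elements": [
        {"name": "Billing Engine", "stereotype": "ArchiMate_ApplicationComponent"},
        {"name": "Payments box", "stereotype": ""},  # informal element: drift
    ],
    "packages": [
        {"name": "Integration", "packages": [], "elements": [
            {"name": "API Gateway", "stereotype": "ArchiMate_ApplicationComponent"},
        ]},
    ],
}

print(find_untyped_elements(repo))  # -> [('Application Architecture', 'Payments box')]
```

In a real environment the same loop would iterate live Automation Interface objects rather than dictionaries, but the governance logic is identical.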
Practice Skills: What Is Evaluated
Repository Governance Behaviour
Portfolio review in senior interviews often includes a repository walk-through — either the candidate’s own work or a deliberately flawed repository they are asked to assess. The questions: What is the package structure? How are elements named? Are the MDG element types used consistently? Are there informal elements that should be typed? Are cross-layer relationships used correctly?
Candidates who can conduct this assessment fluently — and who have opinions about what good looks like — are demonstrating practice maturity, not just theoretical knowledge.
Stakeholder Communication
Most technical interview processes for senior EA roles include a stakeholder communication exercise: present an architecture view to a panel that includes non-technical assessors, and handle challenge. The assessment is not whether the architecture is perfect — it is whether the candidate can explain their choices in business terms, handle questions without becoming defensive, and adjust the level of technical detail to the audience.
Architects who have only presented to other architects frequently struggle here. Those who have genuinely engaged with business stakeholders — explaining why the application layer matters for a capability rationalisation, or why the technology architecture view changes a proposed timeline — demonstrate this fluency naturally.
Architecture Governance Participation
Evidence of architecture governance participation is valued: have you attended or contributed to an architecture review board? Have you written architecture decision records? Have you managed an architecture contract on a programme? These are signals that the candidate has operated in a serious EA practice rather than a notional one.
Decision Documentation
The ability to document an architectural decision clearly — what was decided, what alternatives were considered, what the rationale was, what constraints apply — is assessed in senior roles. This is often done via a short written exercise. Strong candidates produce concise, reasoned documents. Weak candidates produce long, descriptive ones that avoid taking a position.
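One widely used shape for such a record is the lightweight architecture decision record; a skeleton along these lines (the headings and identifier are illustrative, not a mandated template):

```markdown
# ADR-014: <decision title>

Status: Proposed | Accepted | Superseded
Context: The question or constraint that forced a decision now.
Decision: The position taken, stated in one or two sentences.
Alternatives considered: Each option, with the reason it was rejected.
Consequences: What becomes easier, harder, or constrained as a result.
```

The exercise tests the Decision line above all: strong candidates commit to a position and defend it; weak candidates describe the landscape and leave the decision implicit.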
2026 Additions: AI Integration Literacy
AI integration literacy is a genuine differentiator in 2026 hiring for EA roles at organisations that have deployed or are deploying AI-augmented architecture practice.
What is being assessed:
MDG quality for AI output. Does the candidate understand that EA GraphLink and similar AI connectivity tools query the repository using element types and relationships — and that an ungoverned repository produces low-quality AI output? This is not a deep technical question. It is: “What would you need to be true about the repository for an AI system to reliably answer ‘which applications support capability X?’”
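The point can be made concrete with a minimal, hypothetical sketch: the question below is only answerable because every element and relationship carries a type. All names and stereotype strings are invented for illustration, not taken from any specific MDG profile or tool.

```python
# Why typing determines answer quality: relationships as (source, type, target)
# triples plus an element-type map. With untyped elements or vague Associations,
# the query below has no reliable basis. All names here are illustrative.

relationships = [
    ("Billing Engine", "ArchiMate_Realization", "Invoice Management"),
    ("CRM Suite", "ArchiMate_Realization", "Customer Management"),
    ("Billing Engine", "ArchiMate_Association", "Finance Team"),  # vague link: ignored
]
element_types = {
    "Billing Engine": "ApplicationComponent",
    "CRM Suite": "ApplicationComponent",
    "Invoice Management": "Capability",
    "Customer Management": "Capability",
    "Finance Team": "BusinessActor",
}

def applications_supporting(capability):
    """Answerable only because sources are typed ApplicationComponents and the
    supporting relationship is a specific type, not a catch-all Association."""
    return [
        src for (src, rel, tgt) in relationships
        if tgt == capability
        and rel == "ArchiMate_Realization"
        and element_types.get(src) == "ApplicationComponent"
    ]

print(applications_supporting("Invoice Management"))  # -> ['Billing Engine']
```

Strip the stereotypes from this toy model and the query degenerates to string matching over untyped links, which is exactly the failure mode an ungoverned repository hands to an AI layer.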
Kernaro Assist or similar tool exposure. Has the candidate used or configured an AI-assisted architecture querying tool? Can they describe the interaction between repository quality and AI response quality? Even familiarity-level exposure is valued — it signals that the candidate has engaged with the direction the practice is moving.
MCP integration awareness. Do they know what MCP (Model Context Protocol) is in the context of EA tooling — and why it matters for connecting AI agents to a Sparx EA repository? This is a new and specific area of knowledge that differentiates candidates who are following EA tool evolution from those who are not.
Mindset: The Differentiator That Is Hardest to Teach
The mindset distinction that most consistently separates strong EA candidates from credentialled ones:
Consulting orientation. Strong architects think “what problem does this architecture solve?” before they think “what does this architecture look like?” They arrive in an interview and can describe the business outcome their architecture enabled, not just the modelling choice they made. “We built this application portfolio map because the consolidation programme had no agreed list of applications to rationalise — the map gave them a decision-making baseline they used for the first phase gate” is a consulting-oriented answer. “We built an application portfolio map using ArchiMate Application Components with tagged values” is a documentation-oriented answer.
The former demonstrates understanding of architecture’s purpose. The latter demonstrates knowledge of a tool. Hiring managers who are building practices that deliver value — not practices that produce documents — consistently select for the former.
What Strong Candidates Do Differently in Applications and Interviews
They bring a portfolio. Not a printed slide deck of diagrams — a working repository or screenshots of real governance work. “Here is a capability map I built for a financial services client; here is the package structure it lives in; here is the MDG profile we used to govern it; here is the stakeholder feedback it generated.”
They talk in outcomes. “This view answered the question of which applications were candidates for decommissioning” rather than “this is an ArchiMate Application Layer view.”
They acknowledge limitations. Strong architects know what they do not know. They can say “I have not used Kernaro Assist directly but I understand what it does and I have been following the tooling developments” rather than either claiming expertise they do not have or dismissing the question.
They ask about the practice. Strong candidates ask about the repository governance, the MDG profiles in use, the team’s approach to stakeholder communication, and the direction of AI integration. These questions signal practice maturity and genuine interest in doing the work well.
FAQ
Should I include TOGAF certification prominently in my EA job application? Include it, but do not lead with it. Certification signals baseline process knowledge and professional seriousness. It does not differentiate you because many candidates have it. Lead with what differentiates you: specific modelling work, repository governance experience, stakeholder communication outcomes, and any AI integration exposure. Put TOGAF certification in the credentials section where it belongs, not in the opening summary.
What should an EA portfolio include? Real models — ArchiMate views built for real or realistic scenarios, with the architecture decisions documented. Repository governance evidence — package structure, naming conventions, MDG profile configuration choices. Stakeholder deliverables — views designed for a specific audience question, with context on what question they answered. Process artifacts if you have them — architecture decision records, governance review outputs, compliance assessments. The portfolio should demonstrate that you build architecture, not that you know what architecture is.
How do hiring managers assess repository governance experience if I have been working in a governed environment? By asking specific questions: What MDG profiles were in use in your team’s repository? What naming conventions did you follow and why? What happened when someone introduced informal elements — how was that identified and corrected? What would you change about the governance approach if you were setting it up fresh? Architects who have genuinely lived in a governed environment answer these fluently. Those who have been adjacent to one — or worked in an ungoverned repository — show uncertainty on the specifics.
How important is Sparx EA proficiency for roles that are advertised as “tool-agnostic”? Even roles advertised as tool-agnostic often involve Sparx EA in practice, particularly in large enterprise, government, and infrastructure contexts. If the role description mentions MDG, PCS, or EA repository governance, it is almost certainly a Sparx EA environment. The term “tool-agnostic” in a job description often means “we want architects who understand the practice, not just one vendor’s tool” — not “we do not use any specific tool.” Sparx EA proficiency is rarely a disadvantage and is frequently decisive.
What is the most common gap between senior candidate expectations and actual ability? Repository governance discipline. Senior candidates with strong credentials often overestimate how well they have understood and practiced MDG governance. When presented with a repository assessment exercise, many cannot identify governance drift, explain why informal elements are a problem, or describe what a correctly configured MDG profile enables. This is the most consistent gap between stated experience and demonstrated capability.
How does the Architect Development program prepare candidates for senior EA hiring? Sparx Services Architect Development is explicitly designed to build the skills and portfolio evidence that hiring managers value: ArchiMate modelling depth, MDG repository governance, Sparx EA administration, stakeholder communication, and AI integration literacy. Participants leave with real governed repository work, documented architecture decisions, and the ability to demonstrate practice maturity in interview — not just certification.
Ready to Build the Skills That Senior EA Teams Are Looking For?
Sparx Services Architect Development builds your EA capability across the dimensions that hiring managers actually assess: modelling fluency, MDG governance, Sparx EA depth, stakeholder communication, and AI integration literacy. Develop the portfolio evidence that differentiates you.