IVI Framework Viewer

Architecture Roles and Engagement

A2

Define the roles and responsibilities required for enterprise architecture management. Determine the depth of architect coverage based on the organization's needs, and describe the type and engagement of architects within the organization's context.

Improvement Planning

Practices-Outcomes-Metrics (POM)

Representative POMs are described for Architecture Roles and Engagement at each level of maturity.

2 Basic
  • Practice
    Define roles for enterprise architecture staff.
    Outcome
    Clearly defined roles enable the development of staff and help streamline enterprise architecture activities.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Define architecture handover interfaces for business and technical architecture (e.g. where the data architecture layer ends and data modellers take over, applying more detailed data flow and data structure analysis and design techniques).
    Outcome
    Scope conflicts are avoided, and handovers between the architecture layers or views of interest are smooth and effective.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    As a minimum, provide job-specific training for enterprise architecture and related roles.
    Outcome
    Staff know how to do their jobs, and performance improves.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage (see the sketch after this list).
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
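To make the "basic counters and trends" metric concrete, the following minimal Python sketch tallies artefact creation and usage events per month. It is illustrative only: the event fields (artefact_id, action, month) and the action vocabulary are assumptions, not part of the framework.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class ArtefactEvent:
    """One recorded interaction with an architecture artefact (assumed record)."""
    artefact_id: str
    action: str   # "created" or "used" (assumed vocabulary)
    month: str    # e.g. "2024-03"


def monthly_counters(events: list[ArtefactEvent]) -> dict[str, Counter]:
    """Count artefact creations and usages per month."""
    counters: dict[str, Counter] = {}
    for e in events:
        counters.setdefault(e.month, Counter())[e.action] += 1
    return counters


def usage_trend(counters: dict[str, Counter]) -> list[tuple[str, int]]:
    """Month-over-month usage counts, ordered by month, for a dashboard trend line."""
    return [(month, counters[month]["used"]) for month in sorted(counters)]


if __name__ == "__main__":
    log = [
        ArtefactEvent("EA-001", "created", "2024-01"),
        ArtefactEvent("EA-001", "used", "2024-02"),
        ArtefactEvent("EA-002", "created", "2024-02"),
        ArtefactEvent("EA-001", "used", "2024-03"),
        ArtefactEvent("EA-002", "used", "2024-03"),
    ]
    print(usage_trend(monthly_counters(log)))  # [('2024-01', 0), ('2024-02', 1), ('2024-03', 2)]
```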
3 Intermediate
  • Practice
    Ensure that only competent staff are assigned enterprise architecture roles.
    Outcome
    Architecture quality is enhanced.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time to close, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on the awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
  • Practice
    Develop career paths for staff, and a variety of ways to progress along those paths.
    Outcome
    Staff are more easily recruited and/or selected for training.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time to close, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on the awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
  • Practice
    Facilitate interaction between enterprise architecture staff and the users for whom the architecture guidance is provided.
    Outcome
    Teamwork becomes the norm and cooperation enhances utility and benefits realization.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time to close, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth (see the sketch after this list).
    • Surveys on the awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
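As a hedged illustration of the open-to-close and metadata-enabled aggregate metrics above, the sketch below computes open-to-close durations for architecture issues and averages them per department. The Issue record and its fields are assumed for the example and are not prescribed by the framework.

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean


@dataclass
class Issue:
    """An architecture issue record; field names are assumed for illustration."""
    department: str
    severity: str
    opened: date
    closed: date | None = None


def open_to_close_days(issues: list[Issue]) -> list[int]:
    """Elapsed days from open to close for resolved issues."""
    return [(i.closed - i.opened).days for i in issues if i.closed is not None]


def mean_close_time_by(issues: list[Issue], key) -> dict[str, float]:
    """Metadata-enabled aggregation: mean open-to-close time per metadata value."""
    groups: dict[str, list[int]] = {}
    for i in issues:
        if i.closed is not None:
            groups.setdefault(key(i), []).append((i.closed - i.opened).days)
    return {k: mean(v) for k, v in groups.items()}


if __name__ == "__main__":
    issues = [
        Issue("Finance", "high", date(2024, 1, 3), date(2024, 1, 10)),
        Issue("Finance", "low", date(2024, 1, 5), date(2024, 2, 4)),
        Issue("Sales", "high", date(2024, 2, 1), date(2024, 2, 8)),
        Issue("Sales", "low", date(2024, 2, 2)),  # still open, excluded
    ]
    print(open_to_close_days(issues))                          # [7, 30, 7]
    print(mean_close_time_by(issues, lambda i: i.department))  # {'Finance': 18.5, 'Sales': 7}
```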
4 Advanced
  • Practice
    Cross-train staff so that they have a good understanding of each other's roles (e.g. the artefact supplier fully understands how the artefact is to be used, and the user fully understands how it was developed and how it is intended to be used).
    Outcome
    Enhanced cooperation and facilitation are enabled, and a ‘get it right first time, every time’ attitude emerges.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approval cycles.
    • Counts and trends for issues fixed first time and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making (see the sketch after this list).
    • Frequency and duration of tool feature use.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations, taken before an architecturally significant change is implemented, so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
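The Pareto of missing information types can be sketched as follows: given a log of decision delays, each tagged with the information type whose absence caused the delay, report the few types that account for most of the delays (80% by default). The log format and the coverage threshold are assumptions for illustration only.

```python
from collections import Counter


def pareto(missing_info_events: list[str], coverage: float = 0.8) -> list[tuple[str, int]]:
    """Return the most frequent missing-information types that together account
    for `coverage` (e.g. 80%) of recorded decision delays, in descending order."""
    counts = Counter(missing_info_events)
    total = sum(counts.values())
    result, running = [], 0
    for info_type, n in counts.most_common():
        result.append((info_type, n))
        running += n
        if running / total >= coverage:
            break
    return result


if __name__ == "__main__":
    # Each entry records the information type whose absence delayed a decision (assumed log).
    delays = ["capacity data", "cost model", "capacity data", "security posture",
              "capacity data", "cost model", "capacity data", "vendor roadmap",
              "cost model", "capacity data"]
    print(pareto(delays))  # [('capacity data', 5), ('cost model', 3)]
```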
5 Optimized
  • Practice
    Develop an architecture improvement programme to regularly address the handovers at each architecture layer.
    Outcome
    All potential contributors and users of business and technical enterprise architecture artefacts are enabled and empowered.
    Metrics
    • Pareto of architecture guidance principles explicitly used in decision-making.
    • Count of decisions adjusted by the enterprise architecture governance committee.
    • % of decisions getting enterprise architecture approval without modification (see the sketch after this list).
    • Cost to projects of technology debt.
    • Complexity metrics (e.g. # of decisions in process, divergent process path counts, count of tasks requiring expert/consultant-level staff).
    • Satisfaction rating surveys that are value-focused (i.e. how valuable the enterprise architecture function is to stakeholders).
    • Benefits realization metrics from project portfolios, programmes, and the benefits realization function.
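The two governance metrics above (count of decisions adjusted by the committee, and % of decisions approved without modification) could be derived from a decision log as in the minimal sketch below. The record fields and the outcome vocabulary are assumptions, not part of the framework.

```python
from dataclasses import dataclass


@dataclass
class GovernanceDecision:
    """Outcome of one decision reviewed by the EA governance committee (assumed record)."""
    decision_id: str
    outcome: str  # "approved", "approved_with_modification", or "rejected" (assumed vocabulary)


def governance_summary(decisions: list[GovernanceDecision]) -> dict[str, float]:
    """Count of decisions adjusted by the committee and the percentage approved unmodified."""
    total = len(decisions)
    adjusted = sum(d.outcome == "approved_with_modification" for d in decisions)
    unmodified = sum(d.outcome == "approved" for d in decisions)
    return {
        "decisions_adjusted": adjusted,
        "pct_approved_without_modification": 100.0 * unmodified / total if total else 0.0,
    }


if __name__ == "__main__":
    log = [
        GovernanceDecision("D-101", "approved"),
        GovernanceDecision("D-102", "approved_with_modification"),
        GovernanceDecision("D-103", "approved"),
        GovernanceDecision("D-104", "rejected"),
    ]
    print(governance_summary(log))
    # {'decisions_adjusted': 1, 'pct_approved_without_modification': 50.0}
```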