IVI Framework Viewer

Architecture Framework

C1

Provide the overarching framework of standards, templates, and specifications for organizing and presenting a description of the business and technical architectures.

Improvement Planning

Practices-Outcomes-Metrics (POM)

Representative POMs are described for Architecture Framework at each level of maturity.

2 Basic
  • Practice
    Evaluate, rate, and rank frameworks for their suitability.
    Outcome
    Better framework selection decisions are reached.
    Metrics
    • Survey results that identify where organization support and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Evaluate frameworks across the full life cycle of conceptualization, creation, modification, use, and retirement.
    Outcome
    Better framework selection decisions are reached.
    Metrics
    • Survey results that identify where organization support and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Evaluate both start-up features and advanced functions (with external advice if necessary).
    Outcome
    Technical debt in the management of architecture frameworks is avoided.
    Metrics
    • Survey results that identify where organization support and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Choose, acquire, and implement the organization's enterprise architecture framework.
    Outcome
    Understanding the available choices helps focus stakeholders on what is crucial for the organization's enterprise architecture function.
    Metrics
    • Survey results that identify where organization support and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
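Several of the basic metrics above reduce to counters and trends over a time series of artefact-usage counts. As a minimal sketch (the function name and data shape are illustrative assumptions, not part of the framework), an average and a simple linear trend could be computed as:

```python
from statistics import mean

def usage_trend(monthly_counts):
    """Return (average, least-squares slope) for a series of monthly
    artefact-usage counts. A positive slope indicates growing usage."""
    xs = range(len(monthly_counts))
    x_bar = mean(xs)
    y_bar = mean(monthly_counts)
    slope = sum((x - x_bar) * (y - y_bar)
                for x, y in zip(xs, monthly_counts)) \
        / sum((x - x_bar) ** 2 for x in xs)
    return y_bar, slope

# Example: artefact views per month, trending upward.
avg, slope = usage_trend([10, 12, 15, 19])
```

The same pair (level plus trend) can be reported for creation counts, budgets, or resource usage.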
3 Intermediate
  • Practice
    Design, develop, and advocate the use of architecture templates for descriptions of organization structure, process, and technology architecture artefacts.
    Outcome
    Reduced variance and reduced training and education costs are evident.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness of, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
    • Ratio of technology debt reduction budget to technology debt size.
  • Practice
    Develop framework usage to encompass planning of future states that achieve business and technical objectives (e.g. business or technology transformation, or smaller objectives such as lean or job-size agility).
    Outcome
    The framework is expanded from a modelled or conceptualized set of supported features to include ‘what-if’ analysis and ‘to-be’ state designing and modelling.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness of, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
    • Ratio of technology debt reduction budget to technology debt size.
  • Practice
    Build domain knowledge and expertise covering the business and technology options, processes, and tools, and the competency levels available to the organization, to enable better planning.
    Outcome
    Enhanced domain knowledge and expertise improve the quality and utility of enterprise architecture artefacts.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness of, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
    • Ratio of technology debt reduction budget to technology debt size.
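Two of the intermediate metrics above are straightforward to operationalize: open-to-close time for issues and the technology-debt reduction ratio. A minimal sketch (function names, the issue data shape, and the example figures are illustrative assumptions):

```python
from datetime import date

def mean_open_to_close_days(issues):
    """Mean open-to-close time in days for closed issues.
    Each issue is an (opened, closed) pair of dates; closed may be
    None for issues still open, which are excluded."""
    spans = [(closed - opened).days
             for opened, closed in issues if closed is not None]
    return sum(spans) / len(spans)

def debt_reduction_ratio(reduction_budget, debt_size):
    """Ratio of the technology-debt reduction budget to the
    estimated size of the technology debt (same currency units)."""
    return reduction_budget / debt_size

mean_days = mean_open_to_close_days([
    (date(2024, 1, 1), date(2024, 1, 11)),  # closed after 10 days
    (date(2024, 2, 1), date(2024, 2, 5)),   # closed after 4 days
    (date(2024, 3, 1), None),               # still open, excluded
])
ratio = debt_reduction_ratio(200_000, 1_000_000)
```

Tracking the ratio over time shows whether the budget keeps pace with debt growth.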
4 Advanced
  • Practice
    Use advanced features of the framework to enable the organization to realize its vision and strategic objectives.
    Outcome
    The enterprise architecture capability is enhanced and is better at dealing with adaptability and complexity.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approvals cycles.
    • Counts and trends for first-time fixes and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making.
    • Frequency and time in use of tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change implementation so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
  • Practice
    Develop end-to-end views and a set of holistic approaches to problem-solving.
    Outcome
    The development of end-to-end views enables better analysis of the organization and its activities.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approvals cycles.
    • Counts and trends for first-time fixes and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making.
    • Frequency and time in use of tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change implementation so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
  • Practice
    Develop a set of optimization approaches to value-stream management, process management, tool utilization, and so forth.
    Outcome
    Optimization and automation improve efficiencies and reduce enterprise architecture errors.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approvals cycles.
    • Counts and trends for first-time fixes and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making.
    • Frequency and time in use of tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change implementation so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
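The Pareto metric above ranks the missing-information types behind decision-making delays so attention goes to the few causes that account for most delays. A minimal sketch (the function name, cutoff, and example cause labels are illustrative assumptions):

```python
from collections import Counter

def pareto_of_delay_causes(delay_causes, cutoff=0.8):
    """Rank missing-information types by how often they delayed a
    decision, and return (full ranking, the smallest leading set of
    types that together account for at least `cutoff` of delays)."""
    counts = Counter(delay_causes)
    total = sum(counts.values())
    ranked = counts.most_common()
    top, covered = [], 0
    for cause, n in ranked:
        top.append(cause)
        covered += n
        if covered / total >= cutoff:
            break
    return ranked, top

# One entry per delayed decision, labelled with what was missing.
ranked, top = pareto_of_delay_causes(
    ["cost data"] * 6 + ["interface specs"] * 3 + ["ownership"] * 1
)
```

Here two of three cause types cover 90% of delays, so remediation effort can concentrate there.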
5 Optimized
  • Practice
    Leverage the latest research and industry best practice to ensure the organization is using the best available framework for its enterprise architecture needs.
    Outcome
    The organization uses a framework in which it has confidence and is achieving its enterprise architecture objectives consistently.
    Metrics
    • Pareto of architecture guidance principles explicitly used in decision-making.
    • Count of decisions adjusted by enterprise architecture governance committee.
    • % of decisions receiving enterprise architecture approval without modification.
    • Cost to projects of technology debt.
    • Complexity metrics (e.g. # of decisions in process, divergent process path counts, count of tasks requiring expert/consultant-level staff).
    • Satisfaction rating surveys that are value-focused (i.e. how valuable the enterprise architecture function is to stakeholders).
    • Benefits realization metrics from project portfolios, programmes, and the benefits realization function.
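The approval-percentage metric above is a simple proportion over governance outcomes. A minimal sketch (the function name and the status labels are illustrative assumptions; an organization would map its own governance outcomes onto them):

```python
def approval_rate(decisions):
    """Percentage of decisions that received enterprise architecture
    approval without modification. Each decision is recorded as a
    status string: 'approved', 'adjusted', or 'rejected'."""
    approved = sum(1 for status in decisions if status == "approved")
    return 100.0 * approved / len(decisions)

rate = approval_rate(
    ["approved", "approved", "adjusted", "approved", "rejected"]
)
```

A rising rate over successive review cycles suggests teams are internalizing the architecture guidance rather than relying on the governance committee to correct course.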