
Architecture Funding (A3)

Develop approaches to funding enterprise architecture management and architecture improvement initiatives.

Improvement Planning

Practices-Outcomes-Metrics (POM)

Representative POMs are described for Architecture Funding at each level of maturity.

2 Basic
  • Practice
    Begin cross-project funding for a limited number of architecture capabilities.
    Outcome
    Some cross-project investments are made, typically in infrastructure (e.g. shared storage, virtualization).
    Metrics
    • Survey results that identify where organizational support for, and resistance to, architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage (see the sketch after this list).
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
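
As a concrete illustration of the "basic counters and trends" metric above, the following minimal Python sketch counts artefact-creation events per month and fits a simple linear trend. The event log and its fields are hypothetical; real data would come from the architecture repository.

```python
# Minimal sketch (hypothetical data): monthly counters and a linear trend
# for architecture artefact creation.
from collections import Counter
from datetime import date

# Hypothetical artefact-creation log: (artefact_id, creation_date).
events = [
    ("ADR-001", date(2024, 1, 15)),
    ("ADR-002", date(2024, 1, 28)),
    ("VIEW-01", date(2024, 2, 3)),
    ("ADR-003", date(2024, 3, 9)),
    ("ADR-004", date(2024, 3, 21)),
]

# Counter: artefacts created per month.
per_month = Counter(d.strftime("%Y-%m") for _, d in events)

# Trend: least-squares slope of the monthly counts.
months = sorted(per_month)
ys = [per_month[m] for m in months]
n = len(ys)
x_bar = (n - 1) / 2
y_bar = sum(ys) / n
slope = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(ys)) / sum(
    (x - x_bar) ** 2 for x in range(n)
)

print(dict(per_month))  # {'2024-01': 2, '2024-02': 1, '2024-03': 2}
print(f"trend: {slope:+.2f} artefacts/month")  # +0.00 -> flat creation rate
```
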
3 Intermediate
  • Practice
    Put funding mechanisms in place for a central enterprise architecture function and architecture management practices.
    Outcome
    Enterprise architecture planning begins to take a longer-term view and to leverage synergies across projects.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. elapsed time, total cost of fix); see the sketch after this list.
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
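
The "open-to-close" and "variance from target" metrics above can be computed from an ordinary issue log. The sketch below is a minimal illustration; the issue records, the 5-day closure target, and the severity labels are all assumptions, not framework prescriptions.

```python
# Minimal sketch (hypothetical data): open-to-close times, variance from a
# target, and issue counts by severity.
from collections import Counter
from datetime import date
from statistics import mean, pvariance

# Hypothetical issue log: (issue_id, opened, closed, severity).
issues = [
    ("I-101", date(2024, 4, 1), date(2024, 4, 4), "high"),
    ("I-102", date(2024, 4, 2), date(2024, 4, 10), "low"),
    ("I-103", date(2024, 4, 5), date(2024, 4, 7), "high"),
]

# Open-to-close elapsed time in days for each issue.
cycle_days = [(closed - opened).days for _, opened, closed, _ in issues]

TARGET_DAYS = 5  # assumed closure target, not a framework value
print(f"mean days to close: {mean(cycle_days):.2f}")                   # 4.33
print(f"variance from target: {mean(cycle_days) - TARGET_DAYS:+.2f}")  # -0.67
print(f"spread around the mean: {pvariance(cycle_days):.2f}")          # 6.89

# Issue counts by severity (the "by severity and urgency" breakdown).
print(Counter(sev for *_, sev in issues))  # Counter({'high': 2, 'low': 1})
```
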
  • Practice
    Provide separate funding for architecturally significant projects.
    Outcome
    Architecture maintenance costs for such projects are adequately funded.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. elapsed time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
  • Practice
    Put funding in place for a target architecture landscape (in principle for a 3-year vision, with confirmation for the current funding cycle).
    Outcome
    An architecture vision and a roadmap are funded and enabled.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. elapsed time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
  • Practice
    Include architecture components in the project portfolio so that they receive appropriate prioritization and funding; see the scoring sketch after the metrics below.
    Outcome
    Enterprise architecture activity is subjected to and passes project selection/authorization criteria.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. elapsed time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
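
One way to subject architecture components to the same selection/authorization criteria as other portfolio items is a weighted scoring model. The sketch below is a minimal illustration; the criteria, weights, and candidate components are hypothetical.

```python
# Minimal sketch (hypothetical criteria and scores): weighted scoring of
# architecture components against portfolio selection criteria.

# Assumed criteria weights (must sum to 1.0).
WEIGHTS = {"strategic_fit": 0.4, "risk_reduction": 0.3, "cost_benefit": 0.3}

# Candidate architecture components scored 1-5 per criterion.
candidates = {
    "shared-identity-service": {"strategic_fit": 5, "risk_reduction": 4, "cost_benefit": 3},
    "canonical-data-model":    {"strategic_fit": 4, "risk_reduction": 3, "cost_benefit": 4},
    "legacy-adapter-layer":    {"strategic_fit": 2, "risk_reduction": 5, "cost_benefit": 2},
}

def weighted_score(scores: dict[str, int]) -> float:
    """Weighted sum of criterion scores, same scheme as other portfolio items."""
    return sum(WEIGHTS[c] * s for c, s in scores.items())

# Rank candidates; funding flows to components that pass the same bar
# as ordinary projects.
for name in sorted(candidates, key=lambda n: weighted_score(candidates[n]), reverse=True):
    print(f"{name}: {weighted_score(candidates[name]):.2f}")
```

A component whose weighted score clears the portfolio's usual funding threshold is then authorized on the same basis as any other project.
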
4 Advanced
  • Practice
    Expand funding mechanisms to support a larger suite of architecture capabilities, enabling innovation and value across the whole organization.
    Outcome
    Leveraging the enterprise architecture knowledge base enhances the return on investment.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approval cycles.
    • Counts and trends for first-time fixes and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making (see the sketch after this list).
    • Frequency and time in use of tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations taken before an architecturally significant change is implemented, so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
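
The "Pareto of missing information types" metric ranks the information gaps that delayed decisions and identifies the smallest set of gap types accounting for roughly 80% of delays. A minimal sketch, with hypothetical gap categories and counts:

```python
# Minimal sketch (hypothetical data): Pareto of missing information types
# that delayed decision-making.
from collections import Counter

# Hypothetical log: one entry per delayed decision, tagged with the
# information type that was missing.
delays = (
    ["cost-data"] * 14 + ["dependency-map"] * 9 + ["capacity-forecast"] * 4
    + ["security-posture"] * 2 + ["vendor-roadmap"] * 1
)

counts = Counter(delays).most_common()  # descending frequency
total = sum(c for _, c in counts)

cumulative = 0
for info_type, count in counts:
    cumulative += count
    share = cumulative / total
    print(f"{info_type:18s} {count:3d}  cumulative {share:5.1%}")
    if share >= 0.8:  # classic 80% Pareto cut-off
        break
```
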
5 Optimized
  • Practice
    Implement a lifetime total cost of ownership (TCO) model to fund services beyond the lifetime of individual platforms; see the TCO sketch after the metrics below.
    Outcome
    Technology debt from unfunded end-of-life events is avoided.
    Metrics
    • Pareto of architecture guidance principles explicitly used in decision-making.
    • Count of decisions adjusted by enterprise architecture governance committee.
    • % of decisions getting enterprise architecture approval without modification.
    • Cost to projects of technology debt.
    • Complexity metrics (e.g. number of decisions in a process, divergent process path counts, count of tasks requiring expert/consultant-level staff).
    • Value-focused satisfaction rating surveys (i.e. how valuable the enterprise architecture function is to its stakeholders).
    • Benefits realization metrics from project portfolios, programmes, and the benefits realization function.
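
A lifetime TCO model, as named in the practice above, sums acquisition, run, and end-of-life costs across successive platform generations and discounts them to present value, so that decommissioning is costed and funded up front rather than left as unfunded technology debt. The sketch below is a minimal illustration; all figures and the discount rate are assumptions.

```python
# Minimal sketch (hypothetical figures): lifetime TCO of a service across
# successive platform generations, with end-of-life costs funded up front.

DISCOUNT_RATE = 0.05  # assumed annual discount rate

def present_value(amount: float, year: int) -> float:
    """Discount a future cost back to year 0."""
    return amount / (1 + DISCOUNT_RATE) ** year

# Each platform generation: (start_year, acquisition, annual_run, years, end_of_life).
generations = [
    (0, 400_000, 120_000, 4, 80_000),  # platform v1
    (4, 350_000, 100_000, 4, 80_000),  # platform v2 replaces v1
]

tco = 0.0
for start, acquire, run, years, eol in generations:
    tco += present_value(acquire, start)
    tco += sum(present_value(run, start + y) for y in range(years))
    tco += present_value(eol, start + years)  # end-of-life funded, not deferred

print(f"Lifetime service TCO (present value): {tco:,.0f}")
```
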
  • Practice
    Start collaborative investments with customers, partners, and/or suppliers to build architectural capabilities in support of the business ecosystem (e.g. a common data model or an electronic marketplace); see the data-model sketch after the metrics below.
    Outcome
    Shared capabilities are developed which are visible to customers, partners, or suppliers (e.g. establishment of open industry standard data models for data exchange).
    Metrics
    • Pareto of architecture guidance principles explicitly used in decision-making.
    • Count of decisions adjusted by enterprise architecture governance committee.
    • % of decisions getting enterprise architecture approval without modification.
    • Cost to projects of technology debt.
    • Complexity metrics (e.g. number of decisions in a process, divergent process path counts, count of tasks requiring expert/consultant-level staff).
    • Value-focused satisfaction rating surveys (i.e. how valuable the enterprise architecture function is to its stakeholders).
    • Benefits realization metrics from project portfolios, programmes, and the benefits realization function.
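
A common data model, one of the ecosystem capabilities named in the practice above, amounts to a versioned record format that the organization, its partners, and its suppliers all agree to produce and consume. The sketch below is a hypothetical illustration, not an actual industry-standard schema.

```python
# Minimal sketch (hypothetical schema): one message type in a shared,
# ecosystem-wide data model for data exchange.
from dataclasses import dataclass, asdict
import json

SCHEMA_VERSION = "1.0"  # version negotiated across the ecosystem

@dataclass(frozen=True)
class ShipmentNotice:
    """One agreed message type in the common data model."""
    schema_version: str
    sender_id: str  # globally unique partner identifier
    order_ref: str
    quantity: int
    unit: str       # agreed unit vocabulary, e.g. "pallet"

# Any participant can serialize and parse the same wire format.
notice = ShipmentNotice(SCHEMA_VERSION, "PARTNER-0042", "PO-2024-118", 6, "pallet")
wire = json.dumps(asdict(notice))
assert json.loads(wire)["order_ref"] == "PO-2024-118"
print(wire)
```
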