IVI Framework Viewer

Architecture Tools and Techniques

C4

Determine and select the tools and techniques required to ensure consistent and repeatable delivery of architecture across the organization.

Improvement Planning

Practices-Outcomes-Metrics (POM)

Representative POMs are described for Architecture Tools and Techniques at each level of maturity.

2 Basic
  • Practice
    Match a toolset to the architecture approach and frameworks selected by the organization.
    Outcome
    An architecture framework is better integrated and easier to implement.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    In budget planning, acquire funds for the acquisition and deployment of the toolset, associated training, process and procedure creation or updates, and the ongoing maintenance and provision of the toolset.
    Outcome
    A budget is available for an architecture toolset.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Select training providers for enterprise architecture toolsets.
    Outcome
    The use of approved training and education providers ensures standards and consistency.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage.
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
  • Practice
    Procure and make training on toolsets available.
    Outcome
    Staff have access to enterprise architecture toolset training.
    Metrics
    • Survey results that identify where organizational support for and resistance to architecture projects lie.
    • Satisfaction ratings for the architecture function.
    • Basic counters and trends for architecture artefact creation and usage (see the sketch after this list).
    • Budgets, costs, and resource usage.
    • Aggregated project portfolio, programme, and project metrics for dashboards, and project phase and task metrics for diagnosis.
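
Most of the Basic metrics above are simple aggregates over an event log. The following is a minimal Python sketch of the counters-and-trends metric, assuming the architecture toolset can export artefact events as (date, action) records; the sample events and labels are illustrative assumptions, not output from any particular product.

# Basic counters and trends for architecture artefact creation and usage,
# computed from a hypothetical (date, action) event export.
from collections import Counter
from datetime import date

events = [
    (date(2024, 1, 15), "created"),
    (date(2024, 1, 20), "used"),
    (date(2024, 2, 3), "created"),
    (date(2024, 2, 11), "used"),
    (date(2024, 2, 25), "used"),
    (date(2024, 3, 7), "created"),
    (date(2024, 3, 9), "used"),
    (date(2024, 3, 21), "used"),
]

# Count events per (month, action) pair.
monthly = Counter(((d.year, d.month), action) for d, action in events)

# Naive trend: month-over-month change in usage counts.
months = sorted({(d.year, d.month) for d, _ in events})
usage = [monthly[(m, "used")] for m in months]
trend = [later - earlier for earlier, later in zip(usage, usage[1:])]

for m, u in zip(months, usage):
    print(f"{m[0]}-{m[1]:02d}: created={monthly[(m, 'created')]} used={u}")
print("month-over-month usage change:", trend)

In practice the event feed would come from the toolset's own reporting or export facility rather than hard-coded records.
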
3 Intermediate
  • Practice
    Ensure that staff assigned to architecture roles are trained and certified as necessary in the use of the enterprise architecture toolset.
    Outcome
    Staff are capable of delivering a competent professional architecture service.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on the awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
    • Ratio of technology debt reduction budget to technology debt size.
  • Practice
Begin identifying advanced toolset features that the organization should use, and sources of training for them.
    Outcome
    Training on advanced toolset features is identified in advance of the need.
    Metrics
    • Counts, averages, variances, and associated trends of errors and rework.
    • Variance from targets and from the mean.
    • Issue counts and trends (by severity and urgency).
    • Open-to-close metrics (e.g. time, total cost of fix).
    • Metadata-enabled aggregate metrics by department, function, business unit, geospatial region, product, service, and so forth.
    • Surveys on the awareness, usability, and use of architecture artefacts and guidance.
    • Project portfolio, programme, and project life-cycle metrics.
    • Ratio of technology debt reduction budget to technology debt size (see the sketch after this list).
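
Two of the Intermediate metrics lend themselves to a worked example: open-to-close metrics and the technology debt ratio. The sketch below uses hypothetical issue records and budget figures; the field layout and all numbers are assumptions for illustration only.

# Open-to-close metrics and the debt reduction ratio, from assumed inputs.
from datetime import date
from statistics import mean, pvariance

# Hypothetical issue records: (opened, closed, cost_of_fix).
issues = [
    (date(2024, 1, 2), date(2024, 1, 9), 1200.0),
    (date(2024, 1, 5), date(2024, 1, 25), 4300.0),
    (date(2024, 2, 1), date(2024, 2, 4), 800.0),
]

# Open-to-close metrics: elapsed days and total cost of fixes.
days_to_close = [(closed - opened).days for opened, closed, _ in issues]
print("mean days to close:", mean(days_to_close))
print("variance in days to close:", pvariance(days_to_close))
print("total cost of fixes:", sum(cost for _, _, cost in issues))

# Ratio of technology debt reduction budget to technology debt size
# (both figures are assumed inputs, e.g. from finance and the backlog).
debt_reduction_budget = 250_000.0
technology_debt_size = 2_000_000.0
print("debt reduction ratio:", debt_reduction_budget / technology_debt_size)
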
4 Advanced
  • Practice
    Provide training on advanced features of the enterprise architecture toolset.
    Outcome
    Staff are capable of effectively using advanced toolset features for enterprise architecture.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approval cycles.
    • Counts and trends for issues fixed first time and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making.
    • Frequency of use and time spent using tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change is implemented, so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
  • Practice
    Mandate the use of advanced toolset features on some projects.
    Outcome
    The organization gains the benefit of advanced toolset features on appropriate projects.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approval cycles.
    • Counts and trends for issues fixed first time and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making.
    • Frequency of use and time spent using tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change is implemented, so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
  • Practice
    Automate where appropriate.
    Outcome
    Cost and quality improvements are evident.
    Metrics
    • Counts and trends of governance compliance and governance exceptions.
    • Elapsed time and trends for governance steps such as approval cycles.
    • Counts and trends for issues fixed first time and for issues re-opened.
    • Pareto of missing information types that cause delays in decision-making (see the sketch after this list).
    • Frequency of use and time spent using tool features.
    • Cost, cycle-time, and resource utilization impact of automation and architecture-driven process changes.
    • Baseline measurements of business (unit) operations before an architecturally significant change is implemented, so that its impact can subsequently be determined more accurately.
    • Comprehensive project portfolio, programme, and project metrics.
    • Stakeholder surveys on communications, architecture engagement, and architecture artefact utility.
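
A Pareto of missing information types is an ordered frequency table with cumulative percentages, and the baseline metric is a simple before/after comparison. The Python sketch below illustrates both; the delay log, its categories, and the cycle-time figures are assumed for illustration.

# Pareto of missing information types, plus an automation impact check
# against a pre-change baseline measurement.
from collections import Counter

# Hypothetical log: each entry names the information type whose absence
# delayed a governance decision.
delays = [
    "cost model", "capacity data", "cost model", "security posture",
    "cost model", "capacity data", "ownership", "cost model",
]

# Pareto table: categories by descending frequency with cumulative share.
counts = Counter(delays).most_common()
total = sum(count for _, count in counts)
cumulative = 0
for info_type, count in counts:
    cumulative += count
    print(f"{info_type:16s} {count:3d}  cumulative {cumulative / total:5.1%}")

# Impact of an automation change measured against the pre-change baseline.
baseline_cycle_days = 14.0  # measured before the change (assumed figure)
current_cycle_days = 9.0    # measured after the change (assumed figure)
reduction = (baseline_cycle_days - current_cycle_days) / baseline_cycle_days
print(f"cycle-time reduction: {reduction:.0%}")
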
5 Optimized
  • Practice
Develop and sustain a programme of continuous improvement for enterprise architecture tool provisioning based on emerging research ideas, vendor advocacy, and stakeholder feedback.
    Outcome
The organization has an excellent enterprise architecture toolset available and can use it proficiently.
    Metrics
    • Pareto of architecture guidance principles explicitly used in decision-making.
    • Count of decisions adjusted by the enterprise architecture governance committee.
    • % of decisions receiving enterprise architecture approval without modification (see the sketch after this list).
    • Cost of technology debt to projects.
    • Complexity metrics (e.g. number of decisions in process, divergent process path counts, count of tasks requiring expert- or consultant-level staff).
    • Satisfaction rating surveys that are value-focused (i.e. how valuable the enterprise architecture function is to stakeholders).
    • Benefits realization metrics from project portfolios, programmes, and the benefits realization function.
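
The governance-decision metrics at this level reduce to counting decision outcomes. Below is a minimal sketch, assuming a hypothetical decision log in which each record carries an outcome label; the records and labels are illustrative assumptions, not part of the framework.

# Count of committee-adjusted decisions and the share approved unchanged,
# from a hypothetical decision log.
decisions = [
    {"id": "D-101", "outcome": "approved"},
    {"id": "D-102", "outcome": "approved-with-changes"},
    {"id": "D-103", "outcome": "approved"},
    {"id": "D-104", "outcome": "rejected"},
    {"id": "D-105", "outcome": "approved"},
]

adjusted = sum(1 for d in decisions if d["outcome"] == "approved-with-changes")
approved_unchanged = sum(1 for d in decisions if d["outcome"] == "approved")
print("decisions adjusted by committee:", adjusted)
print(f"approved without modification: {approved_unchanged / len(decisions):.0%}")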