IVI Framework Viewer

Assessment Execution

B2

Conduct the assessment, including such activities as running awareness campaigns, employing assessment tools, conducting evaluation interviews, and gathering information about existing practices in the organization.

Practices-Outcomes-Metrics (POM)

Representative POMs are described for Assessment Execution at each level of maturity.

1 Initial
  • Practice
    Use custom-made tools to run the assessments.
    Outcome
    Some assessment data can be gathered through use of custom-made tools.
    Metric
    % of IT capability assessments conducted with custom-made tools.
  • Practice
    Practices for collecting evidence/data that accurately reflects current-state performance are ad hoc, occurring on a best-endeavour basis.
    Outcome
    _
    Metric
    _
2 Basic
  • Practice
    Use standardized, manual assessment tools (e.g. standardized questionnaires).
    Outcome
    Costs are reduced through scale effects.
    Metric
    % of IT capability assessments conducted with custom-made tools.
    % of IT capability assessments conducted with standardized tools.
  • Practice
    Capture some key current-state data from stakeholders in IT.
    Outcome
    Time-series comparison and benchmarking over time become possible.
    Metric
    % of requested respondents providing current-state data for assessments.
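The tool-usage percentages listed as metrics above can be computed from a simple assessment log. A minimal sketch, assuming a hypothetical list of assessment records tagged by tool type (the record fields are illustrative, not part of the framework):

```python
# Hypothetical assessment log: each record notes which kind of tool was used.
assessments = [
    {"id": "A-2023-01", "tool": "custom"},
    {"id": "A-2023-02", "tool": "standardized"},
    {"id": "A-2023-03", "tool": "standardized"},
    {"id": "A-2023-04", "tool": "custom"},
]

def tool_usage_pct(records, tool_type):
    """% of IT capability assessments conducted with the given tool type."""
    if not records:
        return 0.0
    matching = sum(1 for r in records if r["tool"] == tool_type)
    return 100.0 * matching / len(records)

print(tool_usage_pct(assessments, "custom"))        # 50.0
print(tool_usage_pct(assessments, "standardized"))  # 50.0
```

Tracking both percentages per assessment cycle gives the shift from custom-made toward standardized tooling that the Basic level targets.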
3 Intermediate
  • Practice
    Execute assessments based on the assessment plan and include standard elements of the selected framework (such as kick-off meetings, distribution of pre-assessment information to participants, etc.).
    Outcome
    Consistent assessment execution results in scale effects and increases buy-in.
    Metric
    % of assessment adherence to a process/blueprint for conducting IT capability assessments.
  • Practice
    Set up tools leveraging some automation techniques such as scripting (e.g. data collection via spreadsheets that are then analysed in different spreadsheets).
    Outcome
    Scale effects from automated tools are experienced.
    Metric
    % of IT capability assessments conducted with custom-made tools.
    % of IT capability assessments conducted with standardized tools.
  • Practice
    Capture current-state data from all relevant stakeholders within IT and some key stakeholders in other business units.
    Outcome
    Necessary assessment data/evidence is collected from all key IT stakeholders and some stakeholders in other business units. Time-series comparison and benchmarking over time become more reliable.
    Metric
    % of requested respondents providing current-state data for assessments.
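The scripted data-collection practice above, combined with the response-rate metric, might look like the following sketch. The CSV layout and column names are assumptions for illustration only:

```python
import csv
from io import StringIO

# Hypothetical export listing requested respondents and whether each
# returned current-state data for the assessment.
RESPONSES_CSV = """respondent,unit,provided_data
alice,IT Ops,yes
bob,IT Ops,no
carol,Finance,yes
dave,IT Dev,yes
"""

def response_rate(csv_text):
    """% of requested respondents providing current-state data."""
    rows = list(csv.DictReader(StringIO(csv_text)))
    if not rows:
        return 0.0
    provided = sum(1 for r in rows if r["provided_data"] == "yes")
    return 100.0 * provided / len(rows)

print(round(response_rate(RESPONSES_CSV), 1))  # 75.0
```

In practice the text would come from spreadsheet exports gathered during the assessment; the point of the Intermediate-level practice is that collection and analysis are scripted rather than re-keyed by hand.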
4 Advanced
  • Practice
    Introduce an integrated, central assessment tool.
    Outcome
    There is a high degree of automation across the assessment process.
    Metric
    % of IT capability assessments conducted with custom-made tools.
    % of IT capability assessments conducted with standardized tools.
  • Practice
    Capture all necessary current-state data and evidence from all key stakeholders organization-wide.
    Outcome
    The data gathered from organization-wide stakeholders is valuable in setting future improvement targets.
    Metric
    % of requested respondents providing current-state data for assessments.
5 Optimized
  • Practice
    Regularly review and improve the assessment tool following feedback from assessments undertaken.
    Outcome
    Regular improvement of the assessment tool supports effective organization-wide capability assessments.
    Metric
    % of IT capability assessments conducted with custom-made tools.
    % of IT capability assessments conducted with standardized tools.
    Yes/No indicator of the existence of a regular assessment tool review and improvement cycle.
  • Practice
    Proactively secure the availability of data sources in the business ecosystem for upcoming assessments, and optimize this process through feedback and learning from previous assessments.
    Outcome
    Data from the business ecosystem is always available on request and is valuable in informing the setting of future improvement targets.
    Metric
    % of requested respondents providing current-state data for assessments.