MAF 2022 to 2023 Innovation Management Methodology

Innovation Area of Management – Context

As part of the 2022-23 Management Accountability Framework (MAF) assessment, the Innovation AoM (previously the Experimentation AoM) centres on management excellence in innovation for Canadians and public servants. The AoM aligns with the 2016 Experimentation Direction for Deputy Heads, which encourages organizations to solve persistent problems using rigorous methods of comparison. The AoM supports sound innovation management by encouraging organizations to generate rigorous evidence and use iterative improvements to find what works. It focuses on three strategic themes: 1) Committing Resources, 2) Generating Rigorous Evidence, and 3) Evidence-Informed Decisions.

The Government of Canada has a long history of innovation and evidence-informed policymaking. The focus of this AoM is on further strengthening this culture of innovation by ensuring there is a strong link between problem-solving, evidence generation and management decision-making. Rigorously testing our innovations in real-world settings ensures we continue to achieve value for money, while improving social, environmental, and economic outcomes for Canadians and public servants.

MAF results will provide Deputy Heads and the Treasury Board of Canada Secretariat (TBS) with an overall view of organizational performance across the three strategic themes. Results will also inform the Innovation Community on management excellence, best practices, and areas for improvement.

How the Innovation AoM is assessed

Organizations respond to four questions, each worth a maximum of three points:

  • Q1: To what extent is senior leadership committing resources towards generating evidence to support innovation? (3 points)
  • Q2: To what extent did the organization use rigorous methods of comparison to support innovation? (3 points)
  • Q3: For the projects in Q2, to what extent was there potential for high impact for Canadians or public servants? (3 points)
  • Q4: For the projects in Q2, to what extent were the results used to inform timely decision-making at an executive-level governance body? (3 points)

Maximum number of points for the Innovation AoM: 12 points

Organizations will receive an overall maturity level based on total points scored:

  • 0 points: No activity reported
  • 1-3 points: Developing innovation management
  • 4-6 points: Established innovation management
  • 7-9 points: Excelling at innovation management
  • 10-12 points: Leader in innovation management
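To make the scoring arithmetic concrete, the following is a minimal illustrative sketch in Python. It is not part of the MAF methodology itself; the function name and input format are hypothetical, but the thresholds follow the maturity table above.

```python
def maturity_level(question_scores: list[int]) -> str:
    """Map four question scores (0-3 each) to an overall maturity level.

    Illustrative only; thresholds follow the MAF maturity table above.
    """
    if len(question_scores) != 4 or any(s < 0 or s > 3 for s in question_scores):
        raise ValueError("Expected four scores between 0 and 3")
    total = sum(question_scores)  # 0 to 12 points
    if total == 0:
        return "No activity reported"
    if total <= 3:
        return "Developing innovation management"
    if total <= 6:
        return "Established innovation management"
    if total <= 9:
        return "Excelling at innovation management"
    return "Leader in innovation management"

# Example: 3 + 2 + 2 + 1 = 8 points across Q1-Q4
print(maturity_level([3, 2, 2, 1]))  # -> Excelling at innovation management
```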

Strategic Theme 1 – Committing Resources

Strategic Theme Overview

The Committing Resources theme encourages organizations to foster a culture of innovation and measurement by funding plans to test new approaches and learn what works. Organizations increase their ability to rigorously compare solutions and generate evidence on departmental and government-wide priorities when they are supported through senior leadership approval and funding. The theme enables Deputy Heads and TBS to determine to what extent evidence-driven innovation has been embedded into an organization’s management practices.

Question 1 (New)

To what extent is senior leadership committing resources towards generating evidence to support innovation?

  • A great extent. Evidence generation was funded in all three places (Treasury Board submission, Major initiative, AND Innovation fund)
  • A moderate extent. Evidence generation was funded in two of the three places
  • A minimal extent. Evidence generation was funded in one of the three places
  • Not present. The organization does not have evidence of committing resources

See Q1 Annex C for more information and the AoM Glossary for definitions.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Question 1 focuses on steps that senior leadership can take to ensure that innovations are supported through early planning and resource commitments in Treasury Board submissions, major initiatives, and innovation funds. The 2016 Experimentation Direction encourages organizations to instill a culture of innovation and measurement for testing what works when designing and delivering new initiatives. This ensures high quality, real-world evidence is available to support informed decision-making, improved performance, and results for Canadians.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

A great extent

Assessed Organizations

Large Departments and Agencies

Period of Assessment

2021-22 fiscal year

Calculation Method

  • Three points for having committed resources towards evidence generation in all three places (Treasury Board submission, Major initiative, AND Innovation fund) (maximum score)
  • Two points for having committed resources towards evidence generation in two of the three places
  • One point for having committed resources towards evidence generation in one of the three places
  • Zero points for no evidence of committing resources

How this question is assessed

Organizations must show that they have committed resources towards generating evidence. This question is worth three points. To receive the maximum score, organizations must provide one example of committing resources in each place:

  1. A Treasury Board submission (TB submission)
  2. A Major initiative
  3. An Innovation fund

Resources can include financial resources and/or human resources (full-time equivalents, or FTEs).

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data Collection Method

  • TBS provided coversheet (in MAF Portal)
  • Additional evidence submitted in MAF portal

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No (Question was changed, but will be compared going forward)

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B

Experimentation Direction for Deputy Heads

Annex C

Treasury Board submission

Organizations must show that they have included plans to generate evidence in one TB submission that serves Canadians or public servants. These plans are usually found in the Design, Delivery and Implementation section and must clearly demonstrate a commitment to use structured comparisons to learn what works, but precise methods or details are not expected at this stage. The TB submission must have been approved during the 2021-22 Fiscal Year (FY). Note that the focus of this question is on evidence generation. While it is important to reference past evaluations, studies, or experiments in a TB submission, these are not eligible examples for this question.

Examples of eligible TB submissions can include commitments to:

  • Iterative design and testing of new or multiple policy options
  • Being upfront about evidence gaps or unknowns that need to be explored
  • Real-world testing that will de-risk new policy, program or service delivery models
  • Rapid or iterative innovation through pilots or testing before full implementation

Major initiative

Organizations must show that they have plans to generate evidence to inform a significant change to one major departmental program or service (see glossary). A major program or service refers to a high-spending area or an area that affects a large number of Canadians or public servants based on the departmental context. The commitment of resources must have occurred during the 2021-22 FY. Note that the program or service can focus on serving Canadians (public facing) or serving government (internal services).

Examples of significant change can include:

  • A new approach that pushes beyond the status quo
  • Redesigning policy, program, or service delivery models
  • De-risking bold public sector innovations
  • Challenging the underlying assumptions of policies
  • Changing core business processes

Innovation fund

Organizations must show that they have included plans to generate evidence within a department-wide innovation fund. The fund could support special projects, employee-led innovations, or other departmental priorities. The fund could include financial resources or dedication of FTEs towards innovation projects. The fund must include a process for rigorously testing some or all of the innovations in real-world settings.

Strategic Theme 2 – Generating Rigorous Evidence

Strategic Theme Overview

The Generating Rigorous Evidence theme encourages organizations to use rigorous methods of comparison to support innovation across a diversity of organizational functions and where there is a potential for high impact for Canadians or public servants. By generating rigorous evidence on what works, we can de-risk innovations, support sound fiscal management, and achieve better results for Canadians. The theme enables Deputy Heads and TBS to monitor the extent to which organizations are testing their innovations and measuring outcomes in real-world settings.

Question 2 (New)

To what extent did the organization use rigorous methods of comparison to support innovation?

  • A great extent. Three projects used rigorous methods of comparison across three organizational functions
  • A moderate extent. Two projects used rigorous methods of comparison across two organizational functions
  • A minimal extent. One project used rigorous methods of comparison in one organizational function
  • Not present. The organization does not have evidence of using rigorous methods of comparison

See Q2 Annex C for more information and the AoM Glossary for definitions.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Question 2 focuses on the use of rigorous methods of comparison to support innovation when serving Canadians or government. The 2016 Experimentation Direction encourages organizations to solve persistent problems by using rigorous methods to learn what works. Real-world testing contributes to sound fiscal management and results for Canadians by finding more efficient, effective, and scalable solutions.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

A great extent

Assessed Organizations

Large Departments and Agencies

Period of Assessment

2021-22 fiscal year

Calculation Method

  • Three points for three projects that used rigorous methods of comparison across three organizational functions (maximum score)
  • Two points for two projects that used rigorous methods of comparison across two organizational functions
  • One point for one project that used rigorous methods of comparison in one organizational function
  • Zero points for no evidence of having used rigorous methods of comparison

How this question is assessed

Organizations must show that they have used rigorous methods of comparison to support innovation. This question is worth three points. To receive the maximum score, organizations must submit three examples of projects that used rigorous methods of comparison, each of which must have supported a different organizational function (see the lists below). There is no requirement to provide projects under both categories of serving Canadians and serving government.

Note that feasibility and technical studies are not eligible. The indicators must directly measure social, environmental or economic outcomes for Canadians or public servants (see glossary for more details).

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data Collection Method

  • TBS provided coversheet (in MAF Portal)
  • Additional evidence submitted in MAF portal

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No (Question was changed, but will be compared going forward)

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Experimentation Direction for Deputy Heads

Annex C

Rigorous methods of comparison

In this AoM, we have identified three types of rigorous methods of comparison that are eligible for the MAF assessment, all of which include structured and intentional comparisons:

  1. Randomized methods of comparison, such as randomized controlled trials or A/B testing. Must involve randomly assigning participants to at least two groups, such as an intervention group and control group.
  2. Non-randomized methods of comparison, such as quasi-experimental designs or pretest-posttest control group designs. Must include structured comparisons of the same indicators between at least two groups, such as the intervention group and comparison group.
  3. Innovation methods with structured comparisons, such as before/after studies or iterative pilots with repeated measures. Must include structured comparisons of the same indicators for one group over time (e.g., against a baseline or business-as-usual). Note that trying new things in isolation without a strong element of comparison is not eligible.

Definition of rigour

Given considerations such as ethics, timeliness, feasibility, equity and resources, organizations should strive to use the most rigorous method that is appropriate to their context. To meet a minimum standard of rigour, projects must include the following two features:

  1. Structured comparison: The same variables (quantitative or qualitative) are measured in the same way multiple times.
  2. Sufficient sample size: Sample is large enough to produce robust results and yield useful information for the context.

Given the diversity of mandates and operating contexts, it is up to organizations to explain how their chosen projects meet these two features. See FAQ in MAF portal for more detailed guidance and examples.
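To illustrate the two minimum features, here is a minimal sketch of the first method type (a randomized comparison). All names and data are hypothetical and simulated, not drawn from the AoM; a real project would measure actual outcomes for Canadians or public servants. The same indicator (task completion time) is measured the same way in both groups, and the sample size is reported with the result.

```python
import random
import statistics

random.seed(42)

# Hypothetical sample: 200 participants randomly assigned to two groups.
participants = list(range(200))
random.shuffle(participants)
intervention, control = participants[:100], participants[100:]
in_intervention = set(intervention)

# Simulated measurements of the SAME indicator (completion time, in minutes),
# standing in for real-world data collected the same way in both groups.
outcome = {p: random.gauss(12.0 if p in in_intervention else 15.0, 3.0)
           for p in participants}

int_times = [outcome[p] for p in intervention]
ctl_times = [outcome[p] for p in control]

# Structured comparison: same variable, same measurement, two groups.
print(f"n = {len(participants)} (100 per group)")
print(f"Intervention mean: {statistics.mean(int_times):.1f} min")
print(f"Control mean:      {statistics.mean(ctl_times):.1f} min")
print(f"Difference:        {statistics.mean(ctl_times) - statistics.mean(int_times):.1f} min")
```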

Timeframe for projects

  • Previously, projects that used rigorous methods of comparison were accepted in the MAF based on whether their intervention (i.e., the testing or comparing) occurred in the year of assessment. This is no longer the case.
  • Projects are eligible for inclusion based on the year that their findings are made available, whether internally or externally.
  • Organizations must submit evidence that findings were made available in the 2021-22 FY (e.g., through a presentation, research report, or email of results)

Organizational functions

Serving Canadians (public facing)

  • Programs
  • Service delivery
  • Grants and contributions
  • Policies
  • Regulations/legislation
  • Other

Serving Government (internal services)

  • Human resources
  • Communications
  • Information management/technology
  • Financial management
  • Acquisitions and assets
  • Legal services
  • Other

Strategic Theme 2 – Generating Rigorous Evidence (continued)

Question 3 (New)

For the projects in Q2, to what extent was there potential for high impact for Canadians or public servants?

  • A great extent. Three projects tested a significant innovation OR solutions to a large-scale problem
  • A moderate extent. Two projects tested a significant innovation OR solutions to a large-scale problem
  • A minimal extent. One project tested a significant innovation OR solutions to a large-scale problem
  • Not present. The organization does not have evidence of projects with a potential for high impact or did not have eligible projects in Q2

See Q3 Annex C for more information and the AoM Glossary for definitions.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Question 3 focuses on testing innovations where there is a potential for high impact for Canadians or public servants. The 2016 Experimentation Direction encourages organizations to innovate and measure outcomes so that public resources are invested where they are likely to have the greatest impact. Management excellence in innovation involves not only using rigorous methods of comparison but also applying them to de-risk significant innovations and tackle large-scale problems in an iterative manner.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

A great extent

Assessed Organizations

Large Departments and Agencies

Period of Assessment

2021-22 fiscal year

Calculation Method

  • Three points for three projects that had potential for high impact (maximum score)
  • Two points for two projects that had potential for high impact
  • One point for one project that had potential for high impact
  • Zero points for no evidence of projects with potential for high impact or no eligible projects in Q2

How this question is assessed

Organizations must show that the projects submitted in Q2 either tested a significant innovation OR solutions to a large-scale problem. This question is worth three points. To receive the maximum score, organizations must demonstrate that all three projects had the potential for high impact for Canadians or public servants. There is no requirement to show that a project tested both a significant innovation and solutions to a large-scale problem; projects that have both features will not receive additional points. There is also no requirement to show that the project found positive results. The focus is on generating evidence where there is the potential for high impact, not on the realized impact of the innovation or solution (i.e., the results of submitted projects can be positive, negative, neutral, or inconclusive).

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data Collection Method

  • TBS provided coversheet (in MAF Portal)
  • No additional evidence required in MAF portal

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No (New question, but will be compared going forward)

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Experimentation Direction for Deputy Heads

Annex C

Significant innovation

Must have tested a fundamental aspect of a departmental program or service, which may be serving Canadians (public facing) or serving government (internal services). Examples of significant innovation can include:

  • A new approach that pushes beyond the status quo
  • Redesigning program or service delivery models
  • De-risking bold or transformative initiatives
  • Challenging the underlying assumptions of policies
  • Changing core business processes

Solutions to a large-scale problem

Must have tested solutions or improvements that 1) directly address problems faced by a large number of Canadians (millions) or public servants (hundreds), or a significant proportion of a target demographic (50% or more), OR 2) relate to a high-spending area (over $5 million).

Examples of large-scale problems can include:

  • Persistent, complex or wicked problems
  • Areas facing uncertainty, where the right solution is not immediately obvious
  • Challenges identified in mandate letter commitments

While significant innovation often involves testing one big change, solutions to large-scale problems can involve testing a smaller change (e.g., a behaviourally informed ‘nudge’) OR making iterative changes that, in aggregate, aim to address a large-scale problem.

Strategic Theme 3 – Evidence-Informed Decisions

Strategic Theme Overview

The Evidence-Informed Decisions theme encourages organizations to use the results of rigorous comparisons to inform timely decision-making. An evidence-driven decision-making process ensures new ideas are sufficiently tested before implementation and existing programs and services are course corrected as needed. The theme enables Deputy Heads and TBS to determine to what extent organizations are making timely decisions at executive-level governance bodies and instilling a culture of innovation and measurement.

Question 4 (New)

For the projects in Q2, to what extent were the results used to inform timely decision-making at an executive-level governance body?

  • A great extent. Results from three projects were used to inform decision-making within 6 to 18 months
  • A moderate extent. Results from two projects were used to inform decision-making within 6 to 18 months
  • A minimal extent. Results from one project were used to inform decision-making within 6 to 18 months
  • Not present. The organization does not have evidence of results informing decision-making within 6 to 18 months or did not have eligible projects in Q2

See Q4 Annex C for more information and the AoM Glossary for definitions.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Question 4 focuses on making timely, evidence-informed decisions at executive-level governance bodies. The 2016 Experimentation Direction encourages evidence-based policymaking by mobilizing the results from ethical and rigorous testing of innovations. Management excellence in innovation entails grounding executive decision-making in the best available evidence, so that programs and services are continually improving and meeting the needs of Canadians.

Category

  • Policy Compliance
  • Performance
  • Baseline

Expected Results

A great extent

Assessed Organizations

Large Departments and Agencies

Period of Assessment

2021-22 fiscal year

Calculation Method

  • Three points for three projects whose findings were used to inform decision-making (maximum score)
  • Two points for two projects whose findings were used to inform decision-making
  • One point for one project whose findings were used to inform decision-making
  • Zero points for no evidence of findings informing decision-making

How this question is assessed

Organizations must show that they have used the results of the projects submitted in Q2 to inform timely decision-making at an executive-level governance body. This question is worth three points. To receive the maximum score, organizations must demonstrate that the results from all three projects were used to inform decision-making within 6 to 18 months of the findings being made available.

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Data Collection Method

  • TBS provided coversheet (in MAF Portal)
  • Additional evidence submitted in MAF portal

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No (Question was changed compared to the previous cycle, but will be compared going forward)

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A

N/A

Annex B (if applicable)

Experimentation Direction for Deputy Heads

Annex C

Decision-Making

The following three types of decisions are accepted as evidence of decision-making:

  1. Course correcting an existing program or service
  2. Scaling or implementing a new program or service
  3. Requesting additional evidence generation

For example, if your project tested artificial intelligence (AI) to automate processes for human resources, an eligible decision could be:

  • To course correct your human resources procedures by updating them to add the new AI process
  • To scale-up by implementing the AI process across all human resource teams OR in another area (e.g., legal team)
  • To request additional evidence generation or a new pilot because the AI results were mixed or inconclusive
  • Not eligible: A decision to end the AI pilot or trial.

Executive-level governance body

Results must have informed decision-making at an executive-level governance body that includes members at the Assistant Deputy Minister level or above.

Timeframe for decisions

Timely decisions must be made within 6 to 18 months from the date the project findings were made available. This timeframe enables organizations to submit decisions made in the 2021-22 FY or up until the MAF draft submission deadline. In other words, organizations have an additional 6-month grace period after the end of the fiscal year (March 31, 2022) but before the MAF draft submission deadline in which they can show a decision was made for their project. The 6-to-18-month timeframe was chosen to be inclusive of projects whose findings were made available early in the 2021-22 FY (up to 18 months to show a decision) or late in the 2021-22 FY (a 6-month grace period to show a decision).
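As an illustration of the window arithmetic only, the sketch below computes how much time a project has to show a decision based on when its findings were made available. The deadline date is a placeholder assumption; the actual MAF draft submission deadline is set by TBS.

```python
from datetime import date

# Placeholder assumption: the real MAF draft submission deadline is set by TBS.
MAF_DRAFT_DEADLINE = date(2022, 9, 30)

def months_available(findings_available: date) -> float:
    """Approximate months between findings availability and the deadline."""
    return (MAF_DRAFT_DEADLINE - findings_available).days / 30.44

def decision_is_timely(findings_available: date, decision_made: date) -> bool:
    """A decision counts if made after findings were available and by the deadline."""
    return findings_available <= decision_made <= MAF_DRAFT_DEADLINE

# Findings available early in FY 2021-22: roughly 18 months to show a decision.
print(round(months_available(date(2021, 4, 1))))   # -> 18
# Findings available at the end of FY 2021-22: the 6-month grace period.
print(round(months_available(date(2022, 3, 31))))  # -> 6
print(decision_is_timely(date(2022, 3, 31), date(2022, 8, 15)))  # -> True
```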

Innovation AoM Glossary

Fiscal Year (FY)
The reporting period for this AoM is based on the fiscal year, i.e., from April 1, 2021, to March 31, 2022.
Rigorous evidence
The term rigorous evidence refers to the evidence generated from using rigorous methods of comparison. We have identified three types of rigorous methods of comparison (see Question 2), all of which feature structured and intentional comparisons. The level of rigour varies among these methods. Given considerations such as ethics, timeliness, feasibility, equity and resources, organizations should strive to use the most rigorous method that is appropriate to their context.
Program or service
This AoM uses the term program or service to include the full spectrum of government activities, whether they serve Canadians or public servants. Rigorous methods of comparison can be applied to internal services that support the corporate obligations of an organization. They can also be applied to public facing programs which are individual or groups of services, activities, or combinations thereof that are managed together within the department.
Projects
This AoM uses the term project to refer to examples that organizations submit of using rigorous methods of comparison. It does not refer to the specific definition of a project as per the Directive on the Management of Projects and Programmes.
Real-world settings
This term refers to testing innovations in practice with Canadians or public servants (e.g., field experiments) or in close to real-world conditions (e.g., sandbox environments and simulations).
Feasibility and technical studies
Science and technology organizations undertake primary research, laboratory testing, and feasibility studies as part of their core mandate. This work often involves the use of rigorous methods of comparison. While this work is critical, the specific focus of the Innovation AoM is on the application of these same methods to learn what works for policies, program design and implementation, and other organizational functions.

For example:

  • Not an eligible project for the Innovation AoM: A feasibility study that compared three types of drones to find which ones work in extreme cold weather. (This is a technical study that does not directly measure outcomes for Canadians or public servants)
  • Eligible project for the Innovation AoM: A study that shows drones are half as expensive and twice as fast compared to traditional inspection approaches. (This is a study that measures and compares two key indicators between business-as-usual and the new innovation: 1) value-for-money, an outcome for Canadian taxpayers and 2) faster performance, an outcome for public servant inspectors)
