MAF 2020 to 2021 Results Management Methodology
Methodology Overview
Government programs should be effective, achieve tangible results and provide good value for Canadians. Sound performance measurement is a hallmark of good management practice and ultimately affects the level of results achieved. Performance measurement enables organizations to identify business needs, articulate objectives and measure the degree to which those objectives are being met. It also indicates areas where improvements can be made.
The 2020-21 Results Management Methodology is aligned with the Policy on Results as well as the Experimentation Direction to Deputy Heads. In this vein, it continues to examine the quality and availability of performance information as well as departmental experimentation efforts. In addition, one question has been added to better understand how departments collected actual results data during the pandemic.
Use of MAF Results
The 2020-21 Results Management MAF methodology will provide the following information to the four key stakeholder groups listed below.
Deputy heads:
- Understand the quality of Performance Information Profiles in order to effect improvements where needed; and
- Support the availability of performance information for use in decision-making.
Results Management functional community:
- Provide an understanding of the state of results management in departments and agencies and potential areas for improvement.
Experimentation communities of practice:
- Provide an understanding of the state of experimentation in departments and agencies and potential areas for improvement.
Treasury Board of Canada Secretariat:
- Improve TBS’s insight into Program performance management and strengthen its challenge function on Results Appendices in Treasury Board submissions by communicating general areas of weakness to departments and monitoring improvements; and
- Develop outreach and training for departments to build capacity in areas for improvement.
Period of assessment
While the period of assessment for each indicator may vary depending on the information required, the overall timeframe for this year’s assessment falls within November 2018 to October 2020. The timeframe for each question can be found below in the methodology for that question.
Impact on Departments
Below is a summary of the impact on departments in terms of the number of questions and the submission of evidence, compared to 2019-20. Overall, the impact is neutral, as one assessment question was taken out and replaced by a survey question for information purposes only.
| Results Management | 2019-20 | 2020-21 |
| --- | --- | --- |
| Total number of questions | 6 (4 Results & 2 experimentation) | 6 (4 Results & 2 experimentation) |
| Total number of questions which require the submission of evidence | 2 | 2 (1 experimentation & 1 COVID survey question) |
- Three core questions have been preserved from 2019-20 to provide trend analysis on the quality of Performance Information Profiles, the completeness of data collection information, and the availability of results data.
- There is also one new question on whether and how the COVID-19 pandemic has affected the way departments collect actual results data.
Overall outcomes
The methodology will continue to identify areas of strength and opportunities for improvement in the implementation of the Policy on Results, including within the context of a pandemic. It will also provide insights about how departments are experimenting to advance organizational and program objectives.
MAF 2020-21 Results Management Questionnaire
Question #1 Indicators Preserved from MAF 2019-20
What is the quality of the Performance Information Profile (PIP) measurement information?
Based on a scale of 1 to 5, where:
- 1 = unacceptable
- 5 = excellent
Note: Departments will be given an exact score out of 5.
Rationale
The majority of departmental performance information is held in PIPs, which are a key management tool for the measurement of Program performance.
Assessing the quality of PIPs allows the Treasury Board Secretariat (TBS) to understand the department’s capacity to produce quality Program performance measurement information and enables the department to make improvements for better Program performance measurement.
This indicator relates to the Policy on Results’ expected result 3.2.1, “Departments are clear on what they are trying to achieve and how they assess success.”
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
A random sample of one third of each department’s PIPs, with a minimum of 3 and a maximum of 10, will be collected by TBS.
Assessments will be based on the following weighted elements (a sketch of the sampling and scoring calculation follows the criteria):
Quality of Outcomes – 30%
Criteria:
- Are outcomes clear? (Is the language easy to understand? Can the concepts be easily interpreted? Is the language an outcome statement?)
- Are outcomes appropriate relative to the program’s resources, stated goals, etc.?
Quality of indicators – 30%
Criteria:
- Are indicators valid measures of the outcomes? (i.e., do they actually measure the outcomes they are intended to measure?)
- Are indicators comprehensive – i.e. do they measure all significant aspects of the outcomes?
A clearly articulated program logic – 15%
Criteria:
- Are the essential elements included, e.g. activities, outputs and outcomes?
- Is there a clear logical progression of outcomes?
- Is there a clear explanation of the linkages between the outputs and various outcomes? (Does the program logic clearly establish relevant links between immediate, intermediate and long-term outcomes, i.e., how the achievement of one immediate outcome contributes to the achievement of one or more intermediate outcomes and so on?)
Complete attributes of the indicators are included (data source, data collection frequency, target value, and date to achieve the target) – 25%
Criteria:
- For each indicator is the data source and/or frequency included?
- For each indicator is the target and/or date to achieve target included?
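For illustration, the sketch below shows one way the sampling rule and the weighted scoring described above could be computed. Only the one-third/3-to-10 sampling bounds and the 30/30/15/25 weights come from the methodology; the element names, the example scores and the use of rounding up for the sample size are assumptions.

```python
import math

# Weights of the four assessment elements, taken from the criteria above.
WEIGHTS = {
    "quality_of_outcomes": 0.30,
    "quality_of_indicators": 0.30,
    "program_logic": 0.15,
    "indicator_attributes": 0.25,
}

def pip_sample_size(total_pips: int) -> int:
    """One third of the department's PIPs, bounded to a minimum of 3
    and a maximum of 10. Rounding up is an assumption."""
    return min(10, max(3, math.ceil(total_pips / 3)))

def pip_quality_score(element_scores: dict[str, float]) -> float:
    """Weighted average of the four element scores, each on a 1-to-5 scale."""
    return sum(WEIGHTS[name] * element_scores[name] for name in WEIGHTS)

# Hypothetical example: a PIP rated 4, 3, 5 and 2 on the four elements.
example = {
    "quality_of_outcomes": 4,
    "quality_of_indicators": 3,
    "program_logic": 5,
    "indicator_attributes": 2,
}
print(pip_sample_size(24))                   # 8 (one third of 24 PIPs)
print(round(pip_quality_score(example), 2))  # 3.35
```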
Evidence source and document limit
TBS to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: Program Information Profiles
Date of data extraction: October 2020
Data collection method: Documentary evidence
Evidence: TBS will provide departments with a randomly generated list of Programs from their Program Inventory following the formal launch of the MAF cycle. Departments will submit their PIPs for those Programs by the end of October 2020.
The PIPs will be submitted to TBS via email, i.e. outside of the MAF Portal. TBS will request a departmental contact for the exchange of the PIPs.
Document limit: N/A
Period of assessment: Up to October 15, 2020.
Department or agency to answer
Reference materials
Treasury Board policy reference or Government of Canada priority
Question #2 Indicators Preserved from MAF 2019-20
What percentage of performance indicators included in approved Treasury Board Submission Results Appendices has both data collection frequency and data source information?
Rationale
Performance measurement depends on having the data needed to calculate and/or collect actual results. Therefore, having data source and collection frequency information are prerequisites for measuring performance.
Results Appendices in Treasury Board Submissions are intended to provide details on the outcomes and performance indicators used to ensure that results are being achieved. Accordingly, performance indicators should have complete data collection frequency and data source information to ensure it is feasible to collect results information for them.
This indicator measures the extent to which departments have well-conceived and feasible indicators in Results Appendices. This indicator relates to the Policy on Results’ expected result 3.2.1: “Departments are clear on what they are trying to achieve and how they assess success.”
Note: This question seeks the same information as in MAF 2019-20; however, the language has been modified slightly to add clarity. Last year the question was “What percentage of performance indicators included in approved Treasury Board Submission Results Appendices is missing data collection frequency and/or data source information?”
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
Number of performance indicators in a department’s approved Treasury Board submission Results Appendices with both data collection frequency and data source information, divided by the total number of performance indicators included in those Appendices.
Note: This question is based on all indicators included in Treasury Board submission Results Appendices approved between November 1, 2018 and June 30, 2019. The database created for this question is also used in Question 3 below and is based on the previous year to allow for a minimum of 12 months for the first data collection.
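As a minimal sketch of this calculation, the Python below computes the percentage of indicators with both pieces of information. The Indicator record and its field names are hypothetical; the methodology does not prescribe a data structure.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    """Hypothetical record for one performance indicator from an
    approved Treasury Board submission Results Appendix."""
    name: str
    data_source: Optional[str] = None
    data_collection_frequency: Optional[str] = None

def pct_with_complete_info(indicators: list[Indicator]) -> float:
    """Percentage of indicators that have both a data source and a
    data collection frequency."""
    if not indicators:
        return 0.0
    complete = sum(
        1 for ind in indicators
        if ind.data_source and ind.data_collection_frequency
    )
    return 100 * complete / len(indicators)

# Hypothetical example: two of three indicators have complete information.
sample = [
    Indicator("A", "admin data", "annual"),
    Indicator("B", "survey", None),
    Indicator("C", "admin data", "quarterly"),
]
print(pct_with_complete_info(sample))  # ~66.7
```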
Evidence source and document limit
TBS to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: Treasury Board Submission Results Appendices
Date of data extraction: July 31, 2019
Data collection method: Documentary evidence
Evidence: TBS will collect indicator data collection frequency and data source information directly from approved Treasury Board Submission Results Appendices, submitted between November 1, 2018 and June 30, 2019.
Document limit: N/A
Period of assessment: N/A
Department or agency to answer
Reference materials
Treasury Board policy reference or Government of Canada priority
Question #3 Indicators Preserved from MAF 2019-20
What percentage of performance indicators included in approved Treasury Board submission Results Appendices has actual results data available on schedule?
Rationale
The availability of indicator actual results data is a prerequisite to using the data for decision-making.
This indicator measures the extent to which departments have actual results data available for managing their programs, as committed to in their Treasury Board submissions.
This indicator relates to the Policy on Results’ expected result 3.2.2 “Departments measure and evaluate their performance, using the resulting information to manage and improve programs, policies and services.”
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
Number of performance indicators which have actual results data on schedule, divided by the total number of performance indicators (in a sample; see below) which were scheduled to have actual results data.
A random sample of one fourth of indicators from approved Results Appendices, with a minimum of 10 and a maximum of 50, will be used in the calculation to reduce the burden on departments (a sketch of the sampling and calculation follows the note below). This sample will be randomly selected from all available indicators included in Treasury Board submission Results Appendices approved between September 1, 2016 and June 30, 2020 that should have actual results.
In instances where it was not possible for a department to have data (e.g., the program became active several months after the Treasury Board submission was approved), TBS will work with departments and agencies to exclude those indicators from the sample.
Note:
The random sample for this question is based on two sets of data:
- All indicators from TB Submissions’ Results Appendices approved from September 1, 2016 to December 31, 2018 (these may contain indicators with a first, second, third or fourth data point at this stage).
- All indicators with data collection frequency from TB Submissions’ Results Appendices approved from January 1, 2018 to June 30, 2019. The database created for this question is based on the previous years’ submissions to allow for a minimum of 12 months for the first data collection after TB approval. It is understood that in some cases 12 months are not sufficient for the first data collection due to a lag between the approval of a TB submission and program implementation.
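The sketch below illustrates the sampling rule (one fourth of eligible indicators, bounded between 10 and 50) and the resulting on-schedule percentage. The field names, rounding up for the sample size, and capping at the number of eligible indicators are assumptions.

```python
import math
import random

def q3_sample(eligible: list[str]) -> list[str]:
    """Random sample of one fourth of eligible indicators, bounded to a
    minimum of 10 and a maximum of 50. Rounding up and capping at the
    number of eligible indicators are assumptions."""
    n = min(50, max(10, math.ceil(len(eligible) / 4)))
    n = min(n, len(eligible))  # cannot sample more indicators than exist
    return random.sample(eligible, n)

def pct_on_schedule(sampled: list[str], on_schedule: dict[str, bool]) -> float:
    """Indicators with actual results data on schedule, divided by all
    sampled indicators that were scheduled to have actual results data."""
    if not sampled:
        return 0.0
    return 100 * sum(on_schedule[i] for i in sampled) / len(sampled)

# Hypothetical example with 120 eligible indicators.
eligible = [f"indicator_{i}" for i in range(120)]
chosen = q3_sample(eligible)
print(len(chosen))  # 30 (one fourth of 120, within the 10-to-50 bounds)
```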
Evidence source and document limit
TBS to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: Treasury Board Submission Results Appendices
Date of data extraction: October 2020
Data collection method: Documentary evidence
Evidence: TBS will provide departments, by early November, with a pre-populated template containing a sample of indicators for which departments and agencies committed to have actual results data. Departments and agencies will enter the actual results data for the sampled indicators in a blank column provided in the pre-populated template.
The completed template will be submitted to TBS by email, i.e. outside of the MAF Portal. TBS will request a departmental contact for the exchange of the template.
Document limit: N/A
Period of assessment: N/A
Department or agency to answer
Reference materials
Treasury Board policy reference or Government of Canada priority
Question #4 Indicators Preserved from MAF 2019-20
Has the department or agency been experimenting, in the 2020 calendar year, to find ways to address persistent problems and improve outcomes for Canadians? (Select all that apply)
- Yes - by committing resources (financial, HR) to experimentation
- Yes - by running experiments (i.e. randomized, quasi-experimental, or structured pre-post methodology)
- Yes - by developing a coordinated department-wide framework or approach to experimentation (e.g. a strategy, plan, statement etc.)
- Yes - by integrating experimentation into departmental governance bodies
- Yes - by building departmental capacity or awareness to do experimentation in ways different from the approaches noted above (please explain)
- No - (please explain)
Rationale
Experimentation encourages the testing of new and old approaches to learn what works and what does not by using a rigorous method (as set out in the 2016 Experimentation Direction to Deputy Heads). Collecting this information allows the Treasury Board of Canada Secretariat to understand the different types of experimental methodologies being used and to support departments and agencies in developing their experimentation capacity.
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
Provide specific examples of how the organization supported experimentation. A definition and examples of experimentation are provided in the Glossary.
Evidence source and document limit
TBS to answer
Department or agency to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: N/A
Date of data extraction: N/A
Data collection method: Documentary evidence (MAF Portal)
Evidence:
- A short description outlining how the department supports experimentation and/or how the examples provided are evidence of supporting experimentation (500 words, maximum);
- Specific examples (from existing non-secret documents, reports or presentations) outlining how the department supports experimentation.
Document limit: Maximum of two documents for each “yes” selection
Period of assessment: 2020 Calendar Year
Reference materials
Treasury Board policy reference or Government of Canada priority
Question #5 Indicators Preserved from MAF 2019-20
If your department/agency has run experiments, are the results of these experiments being used to inform decision making in 2020?
- Yes. The initiative was scaled up/expanded because it was working
- Yes. The initiative was shut down/scaled back because it wasn't working
- Yes. The results allowed for course correction (i.e., changes to the existing initiative)
- Yes. The results were disseminated (at least department-wide) for decision-making or used to improve the design of another initiative
- Yes. Other (please specify)
- No.
Rationale
Experimentation encourages the testing of new and old approaches to learn what works and what does not work by using a rigorous method (as set out in the 2016 Experimentation Direction to Deputy Heads). Collecting this information allows the Treasury Board Secretariat to support departments and agencies in understanding the possible impact experimentation has had.
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
Departments and agencies will self-assess their answers to this question.
Definition and examples of experimentation are provided in the Glossary.
Evidence source and document limit
TBS to answer
Department or agency to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: N/A
Date of data extraction: N/A
Data collection method: Self assessment
Evidence: No evidence required
Document limit: N/A
Period of assessment: 2020 Calendar Year
Reference materials
Treasury Board policy reference or Government of Canada priority
Question #6 New Indicators for MAF 2020-21
In response to the COVID-19 pandemic, did your organization have to adjust its existing approaches to collecting actual results data?
- Yes
- No
Rationale
This question is designed to gain insight into the steps departments took to be able to collect actual results data during the pandemic. It is not intended to assess performance but to gather insights and, potentially, to share innovative and good practices among departments.
Category
- Policy compliance
- Performance
- Other
Target (where applicable)
N/A
Calculation method (where applicable)
A template will be distributed as follows:
In response to the COVID-19 pandemic, did your organization have to adjust its existing approaches to collecting actual results data? Yes/No
If the answer is yes, please succinctly identify the changes made, in the form of a list:
- First change:
  - Title (for example, “Focused only on indicators with the highest value”)
  - Explanation: 100 words or less
- Second change:
  - Title
  - Explanation: 100 words or less
Evidence source and document limit
TBS to answer
Department or agency to answer
This question applies to the following organizations:
- Large departments and Agencies
Data source: N/A
Date of data extraction: N/A
Data collection method: Documentary evidence (MAF Portal)
Evidence: Organizations will receive the template at the same time as the actual results data template for Question 3. Departments can fill out the template and return it to TBS.
Document limit: 1
Period of assessment: 2020 Calendar Year
Reference materials
Treasury Board policy reference or Government of Canada priority
Glossary
- Actual results data:
- Performance indicator results data.
- Data collection frequency:
- Refers to the frequency at which a department or agency collects actual results data related to an indicator.
- Data source:
- Refers to the source from which a department or agency collects actual results data related to an indicator.
- Experimentation:
Activities which seek to first explore, then test and compare, the effects and impacts of policies and interventions in order to inform evidence-based decision-making and improve outcomes for Canadians, by learning what works and what doesn’t. Experimentation is related to, but distinct from innovation (the trying of new things), because it involves a rigorous comparison of results.
For example, using a new website to communicate with Canadians could be an innovation, while systematically testing that new website against existing outreach tools (or even an old website) to see which one leads to more engagement is an experiment.
- On schedule:
- Refers to the timeline committed to in a Treasury Board submission Results Appendix based on the data frequency and/or the date to achieve target.
- Performance Information Profile:
- The document that contains the performance information for each Program from the departmental Program Inventory.
- Results Appendix:
- Mandatory Treasury Board submission appendix in which organizations provide details on the immediate, intermediate and ultimate outcomes associated with funding proposals, the performance indicators for these outcomes as well as any current baseline data and future targets. Organizations also explain how performance information will be used to monitor progress to ensure that the intended results are being achieved.
For additional definitions of the terms and expressions used in this questionnaire, please refer to the Results Portal.