MAF 2018 to 2019 results management methodology

Methodology Overview

Objectives

Sound performance measurement and evaluation are hallmarks of good management practices that ultimately impact the level of results achieved. Performance measurement and evaluation enable organizations to:

  • identify business needs
  • articulate objectives
  • measure the degree to which objectives are being met
  • indicate areas where improvements can be made

The overall objectives of the Results Management methodology of the Management Accountability Framework (MAF) for the 2018 to 2019 fiscal year are to ensure that:

  • departments and agencies have quality performance information and evaluations in order to make better decisions
  • the Treasury Board has quality information on performance measurement and evaluation to effectively scrutinize new funding requests

Given these objectives, the indicators of this methodology have been grouped into two key areas:

  • availability of quality performance information for decision-making
  • use of performance information to improve programs

In addition, as an interim measure, a question on experimentation has been included in the methodology this year.

Use of Management Accountability Framework (MAF) results

The 2018 to 2019 Results Management MAF results will provide information to the following three key audiences:

  • deputy heads will use the information to:
    • understand the quality of appendices on results in Treasury Board submissions in order to make improvements where needed
    • ensure that information on performance is available for decision-making
    • ensure that evaluation recommendations are being implemented
  • the results management functional community will use the information to:
    • gain an understanding of the state of results management in departments and agencies and potential areas for improvement
  • the Treasury Board of Canada Secretariat (TBS) will use the information to:
    • improve TBS’s challenge function on results appendices in Treasury Board submissions by:
      • communicating general areas of weakness to departments and agencies
      • monitoring improvements in new appendices
    • develop outreach and training for departments and agencies to build capacity in areas for improvement

Questionnaire

Availability of Performance Information for Decision-Making

Outcome(s):

  • Proposed spending in Treasury Board submissions has high-quality performance measures to determine the results achieved
  • Departments and agencies collect data on actual results on schedule for timely availability of results achieved

Question 1: What is the quality of results appendices in the first draft of Treasury Board submissions received from the department or agency?

  • High
  • Medium
  • Low
  • Unable to Assess
Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

Results appendices provide insights into:

  • what outcomes departments and agencies are trying to achieve for the proposed funding
  • how departments and agencies intend to measure success

Assessing the quality of draft results appendices:

  • allows TBS to understand the department’s or agency’s capacity to produce high-quality information on performance measurement
  • enables a department or agency to improve the quality of the draft before submitting the final version to the Treasury Board

This indicator relates to the Policy on Results expected result 3.2.1: “Departments are clear on what they are trying to achieve and how they assess success.”

Target (where applicable)

n/a

Calculation method

TBS uses a set of criteria to support this assessment. The criteria are found in the “Delivery and Expected Results: Deputy Head Commitment” section of Detailed Guidance for Writers: Writing a Treasury Board Submission.
“Delivery and Expected Results: Deputy Head Commitment” is a mandatory appendix to a Treasury Board submission and should include the following:

  • quality of outcome(s) or result(s)
  • performance indicator(s)
  • target(s)
  • data strategy (data sources and frequency of data collection)

Assessments of each of these criteria will be combined to determine the overall quality rating. TBS will assess all draft Treasury Board submissions that:

  • have results appendices
  • were approved by the Treasury Board in the 2018 calendar year
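The combination of per-criterion assessments into an overall rating can be sketched as follows. The four criteria and the High/Medium/Low/Unable to Assess scale come from this methodology, but the numeric scores and cut-offs below are illustrative assumptions, not TBS's actual rubric.

```python
# Hypothetical sketch: combine per-criterion assessments into an overall
# quality rating. Scoring weights and thresholds are assumptions for
# illustration only.

CRITERIA = ["outcomes", "indicators", "targets", "data_strategy"]
SCORES = {"High": 3, "Medium": 2, "Low": 1}

def overall_rating(assessments: dict) -> str:
    """Return the overall quality rating for one results appendix."""
    # If any criterion could not be assessed, the whole appendix cannot be.
    if any(assessments.get(c) is None for c in CRITERIA):
        return "Unable to Assess"
    avg = sum(SCORES[assessments[c]] for c in CRITERIA) / len(CRITERIA)
    if avg >= 2.5:
        return "High"
    if avg >= 1.5:
        return "Medium"
    return "Low"

print(overall_rating({
    "outcomes": "High", "indicators": "Medium",
    "targets": "High", "data_strategy": "Medium",
}))  # → High (average score 2.5)
```

Any real implementation would follow the criteria in Detailed Guidance for Writers: Writing a Treasury Board Submission rather than these invented weights.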
Evidence source and document limit
TBS to answer

Data source: First draft of Treasury Board submissions that have results appendices

Date of data extraction:

Department or agency to answer

Evidence: n/a

Document limit: n/a

Period of assessment: 2018 calendar year

Treasury Board policy reference or Government of Canada priority

Policy on Results, subsection 4.3.10

Question 2: What is the quality of results appendices in final Treasury Board submissions received from the department or agency?

  • High
  • Medium
  • Low
  • Unable to Assess
Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

Results appendices provide insights into:

  • what outcomes departments and agencies are trying to achieve for the proposed funding
  • how departments and agencies intend to measure success

Assessing the quality of final results appendices provides insights into the quality of information on performance measurement that will be available to demonstrate the results achieved. The assessment:

  • allows TBS to impose requirements for improvements through the Treasury Board submission process, if needed
  • provides insights into a department’s or agency’s capacity to make improvements to information on performance measurement from the first draft version to the final version

This indicator relates to the Policy on Results’ expected result 3.2.1: “Departments are clear on what they are trying to achieve and how they assess success.”

Target (where applicable)

n/a

Calculation method

TBS uses a set of criteria to support this assessment. The criteria are found in the “Delivery and Expected Results: Deputy Head Commitment” section of Detailed Guidance for Writers: Writing a Treasury Board Submission.
“Delivery and Expected Results: Deputy Head Commitment” is a mandatory appendix to a Treasury Board submission and includes the following:

  • quality of outcome(s) or result(s)
  • performance indicator(s)
  • target(s)
  • data strategy (data sources and frequency of data collection)

Assessments of each criterion are combined to determine the overall quality rating. TBS will assess all final Treasury Board submissions that:

  • have results appendices
  • were approved by the Treasury Board in the 2018 calendar year
Evidence source and document limit
TBS to answer

Data source: Results appendices of final Treasury Board submissions

Date of data extraction:

Department or agency to answer

Evidence: n/a

Document limit: n/a

Period of assessment: 2018 calendar year

Treasury Board policy reference or Government of Canada priority

Question 3: What percentage of performance indicators included in approved Treasury Board submissions that have results appendices are missing information on the frequency of data collection and/or a data source?

Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

Performance measurement depends on having the data needed to calculate or collect actual results. Knowing the data source and frequency of collection is required in order to measure performance.

Performance indicators with missing data collection frequency or data source may not be well-conceived or feasible measures of performance.

This indicator measures the extent to which departments and agencies have well-conceived and feasible indicators in results appendices. This indicator relates to the Policy on Results’ expected result 3.2.1: “Departments are clear on what they are trying to achieve and how they assess success.”

Target (where applicable)

n/a

Calculation method

Number of performance indicators in the results appendices of a department’s or agency’s approved Treasury Board submission that do not indicate the frequency of data collection and/or the source of data
÷  Total number of performance indicators included in those appendices

Note: Question 3 is based on all indicators included in the results appendices of Treasury Board submissions approved between , and . The database created for this question is:

  • used in Question 4
  • based on the previous year to allow for a minimum of 12 months for the first collection of data
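The Question 3 ratio can be sketched as below. The indicator records and field names are invented examples; the calculation simply counts indicators missing a data-collection frequency and/or a data source, divided by the total.

```python
# Illustrative sketch of the Question 3 calculation: the percentage of
# performance indicators in approved results appendices that are missing
# a data-collection frequency and/or a data source. Records are invented.

def pct_missing(indicators: list) -> float:
    """Percentage of indicators missing frequency and/or data source."""
    missing = sum(
        1 for i in indicators
        if not i.get("frequency") or not i.get("data_source")
    )
    return 100 * missing / len(indicators)

sample = [
    {"name": "A", "frequency": "annual", "data_source": "admin data"},
    {"name": "B", "frequency": None,     "data_source": "survey"},
    {"name": "C", "frequency": "annual", "data_source": None},
    {"name": "D", "frequency": "annual", "data_source": "survey"},
]
print(pct_missing(sample))  # → 50.0 (2 of 4 indicators are incomplete)
```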
Evidence source and document limit
TBS to answer

Data source: TBS will collect indicators of the frequency of data collection and information on data sources directly from results appendices of approved Treasury Board submissions that were submitted between , and

Date of data extraction:

Department or agency to answer

Evidence: n/a

Document limit: n/a

Period of assessment: 2017 calendar year

Treasury Board policy reference or Government of Canada priority

Question 4: What percentage of performance indicators included in the results appendices of approved Treasury Board submissions have actual results data available on schedule?

Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

The availability of actual results data is required in order to use the data for decision-making.

This indicator will show the extent to which departments and agencies have actual results data available for managing their programs, as committed to in their Treasury Board submissions.

This indicator relates to the Policy on Results’ expected result 3.2.2: “Departments measure and evaluate their performance, using the resulting information to manage and improve programs, policies and services.”

Target (where applicable)

n/a

Calculation method

Number of performance indicators that have actual results data on schedule
÷  Total number of performance indicators that were scheduled to have data on actual results

To reduce burden on departments and agencies, a random sample of indicators from approved results appendices will be used in the calculation. This sample will be randomly selected from all available indicators included in results appendices of Treasury Board submissions approved between , and . Such submissions should have actual results.

In instances where it was not possible for a department or agency to have data (such as where the program became active several months after the Treasury Board submission was approved), TBS will work with departments and agencies to exclude such instances from the sample.

Note: The random sample for Question 4 is based on two sets of data:

  1. Results appendices of Treasury Board submissions approved between September 2016 and December 2016 that:
    • were used in last year’s MAF assessments
    • contain indicators that may have a first or second data point at this stage
  2. Results appendices of Treasury Board submissions approved between , and

The database created for these questions is based on the previous year to allow for a minimum of 12 months for the first collection of data after Treasury Board approval. It is understood that, in some cases, a period of 12 months is not sufficient for the first collection of data due to a lag between the approval of a Treasury Board submission and program implementation.
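The Question 4 ratio, with the exclusions described above, can be sketched as follows. Field names and records are assumptions for illustration; indicators agreed with TBS as unable to have data (for example, where the program started late) are removed from the sample before the percentage is taken.

```python
# Minimal sketch of the Question 4 calculation: among sampled indicators
# scheduled to have actual results data, the percentage with data on
# schedule. Records and field names are invented for illustration.

def pct_on_schedule(sample: list) -> float:
    """Percentage of eligible sampled indicators with actual results data."""
    # Exclude indicators that TBS agreed could not have data yet.
    eligible = [i for i in sample if not i.get("excluded")]
    on_time = sum(1 for i in eligible if i.get("actual_result") is not None)
    return 100 * on_time / len(eligible)

sample = [
    {"id": 1, "actual_result": 42.0},
    {"id": 2, "actual_result": None},                    # data not collected
    {"id": 3, "actual_result": None, "excluded": True},  # program started late
    {"id": 4, "actual_result": 7.5},
]
print(round(pct_on_schedule(sample), 1))  # → 66.7 (2 of 3 eligible)
```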

Evidence source and document limit
TBS to answer

Data source: TBS will provide departments and agencies with a pre-populated template of a sample of indicators for which departments and agencies committed to have actual results data by . Departments and agencies will complete the template by providing the actual results data for the sample indicators.

The completed template will be submitted to TBS outside the MAF Portal.

Date of data extraction:

Department or agency to answer

Evidence: n/a

Document limit: n/a

Period of assessment: 2017 calendar year

Treasury Board policy reference or Government of Canada priority

Use of Performance Information to Improve Programs

Outcome(s): Departments and agencies make improvements to programs based on performance information and evaluation findings.

Question 5: What percentage of evaluation management action plan items due to be implemented in the 2017 to 2018 fiscal year were actually implemented?

Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

Evaluations provide in-depth insights into:

  • program relevance
  • program success
  • areas for improvement

Management action plan items set out the improvements that program management commits to implement.

This indicator measures the extent to which programs are being improved by tracking the percentage of action items implemented according to the commitment.

This indicator relates to the Policy on Results expected result 3.2.2: “Departments measure and evaluate their performance, using the resulting information to manage and improve programs, policies and services.”

Target (where applicable)

100%

Calculation method

Number of management action plan items scheduled to be implemented in the 2017 to 2018 fiscal year that have been fully implemented
÷ Total number of management action plan items scheduled for implementation in the 2017 to 2018 fiscal year

Note: Obsolete recommendations are excluded from the calculation.
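The Question 5 ratio, with obsolete recommendations excluded, can be sketched as below. The action plan items and field names are invented examples.

```python
# Illustrative sketch of the Question 5 calculation: the percentage of
# management action plan items due in the 2017 to 2018 fiscal year that
# were fully implemented. Obsolete items are excluded, per the note above.

def pct_implemented(items: list) -> float:
    """Percentage of non-obsolete action plan items fully implemented."""
    counted = [i for i in items if not i.get("obsolete")]
    done = sum(1 for i in counted if i["implemented"])
    return 100 * done / len(counted)

items = [
    {"item": "Update logic model",   "implemented": True},
    {"item": "Revise data strategy", "implemented": True},
    {"item": "Retire old indicator", "implemented": False, "obsolete": True},
    {"item": "Add service standard", "implemented": False},
]
print(round(pct_implemented(items), 1))  # → 66.7; the target is 100%
```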

Evidence source and document limit
TBS to answer

Data source: Capacity Survey on Results Functions (CSR), 2017 to 2018

Date of data extraction:

Department or agency to answer

Evidence: n/a

Document limit: n/a

Period of assessment: 2017 to 2018 fiscal year

Treasury Board policy reference or Government of Canada priority

Experimentation

Outcome(s):

  • TBS creates a baseline of departmental and agency experimentation activities against which it can measure progress of the Experimentation Direction for Deputy Heads: December 2016
  • Departments and agencies explore how best to start operationalizing the Experimentation Direction for Deputy Heads: December 2016
  • Departments and agencies start improving how they generate and use evidence in decision-making

Question 6: Consistent with the Experimentation Direction for Deputy Heads: December 2016, has the department or agency been experimenting in the 2018 calendar year with:

  • new approaches to addressing persistent problems?
  • new ways of improving outcomes for Canadians? (see the definition of “experimentation” in the Glossary)

Descriptions or examples to support your answer are to be provided (see details below; select all that apply).

  • Yes, by committing resources (financial and human resources) to experimentation
  • Yes, by running experiments
  • Yes, by developing a coordinated department-wide framework or approach to experimentation
  • Yes, by integrating experimentation into departmental governance bodies or decision-making processes
  • Yes, by building departmental capacity or awareness to undertake experimentation in ways that are different from the three approaches noted above (explain)
  • No (explain)
Category
  1. Policy compliance
  2. Results or performance
  3. Service standard
  4. Baseline information
  5. Descriptive statistic
Type
  1. Core
  2. Spot-check
Rationale

The purpose of this question is to establish an initial baseline of the level of effort and the number of departments and agencies that generate and use high-quality data derived from experiments to inform evidence-based decision-making.

Target (where applicable)

n/a

Calculation method

Provide specific examples of how the department or agency supported experimentation (see the definition and examples of “experimentation” in the Glossary).

Evidence source and document limit
TBS to answer

Data source: n/a

Date of data extraction: n/a

Department or agency to answer

Evidence:

  • A short description that outlines how the department or agency supports experimentation (500 words maximum)
  • Specific examples (from existing non-secret documents, reports or presentations) that outline how the department or agency supports experimentation

Document limit: 2 (for each selection)

Period of assessment: 2018 calendar year

Treasury Board policy reference or Government of Canada priority

Glossary

actual results data
Data on the actual results achieved for a performance indicator.
data collection frequency
The frequency at which a department or agency collects data on actual results related to an indicator.
data source
The source from which a department or agency collects data on actual results related to an indicator.
experimentation
Activities that seek to explore, test and compare the effects and impacts of policies and interventions in order to inform evidence-based decision-making and improve outcomes for Canadians by learning what works and what doesn’t. Experimentation is related to but distinct from innovation (which is the trying of new things) because it involves a rigorous comparison of results.

For example, using a new website to communicate with Canadians could be an innovation, while systematically testing that website against existing outreach tools (or an old website) to see which one leads to more engagement is an experiment.
management action plan
Actions that management has committed to take to address recommendations in an evaluation report.
on schedule
Meeting the timeline committed to in a results appendix of a Treasury Board submission, based on the data collection frequency and/or the date to achieve the target.
performance information profile
A document that contains performance information for each program from the department’s or agency’s program inventory.
persistent problems
Complex policy problems that traditional government approaches have failed to solve. Persistent problems are difficult or impossible to solve because of incomplete, contradictory, and changing requirements and contexts that are often difficult to recognize. Moreover, because of complex interdependencies, the effort to solve one aspect of a persistent problem may reveal or create other problems.
results appendix
A mandatory section of a Treasury Board submission in which departments and agencies provide details on:
  • the immediate, intermediate and ultimate outcomes associated with funding proposals
  • the performance indicators for these outcomes
  • any current baseline data
  • future targets

Definitions of other terms

For definitions of additional terms and expressions used in this questionnaire, refer to the Results Portal and to the following:
