MAF 2022 to 2023 Results Management Methodology

Area of Management (AoM) Context

Purpose

The Policy on Results (PoR) and the Canadian Gender Budgeting Act (CGBA) are pillars for enhanced value and management excellence.

Overview

Questions are intended to ensure that assessed departments and agencies have a basis to support improvements to their results management practices.

Departmental Results Frameworks (DRFs) play a central role in fulfilling the PoR. However, they have never been formally assessed under the MAF. To further support this effort, the Results Division (RD) proposes that this year’s MAF exercise focus on their development and use, in alignment with guidance provided by TBS.

Further, while the Canadian Gender Budgeting Act requires that analysis of impacts of programs on gender and diversity be released annually, gaps remain in this reporting. Similar to last year, a question is proposed to assess departmental capacity to report on impacts on gender and diversity based on the prevalence of programs’ data collection plans.

The methodologies of the questions will allow the Treasury Board Secretariat to identify:

Departmental Results Frameworks Capacity

Questions 1 and 2 assess the capacity of federal organizations both to implement key policy requirements for, and to perform results-based management activities with respect to, their DRFs.

The methodology for these questions builds on results-based management (RBM) elements found in previous self-directed assessment tools such as:

Question 1 – Implementing Policy Requirements (New)

At what stage is your organization in its capacity to implement key policy requirements pertaining to the Departmental Results Framework?

*Please note that the following stages are incremental: each stage encompasses the elements of earlier stages. For example, by selecting Stage 3, your organization indicates that it features all elements of Stages 1 and 2, in addition to those in Stage 3.

Stage 1 – Design Stage/Undeveloped
Stage 2 – Early Stages of Development
Stage 3 – Established Practice
Stage 4 – Advanced Practice Implementation
Stage 5 – Best Practice

Department established, implemented and maintained a DRF that sets out the Department’s Core Responsibilities, Departmental Results, and Departmental Results Indicators (DRIs)

Obtains the Secretary of the Treasury Board’s approval for changes to the Departmental Results and Departmental Results Indicators contained in the DRF

Department does the following in an unsystematic manner:

  • keeps track of adherence to the PoR requirements related to DRF
  • develops proposals to address gaps, difficulties and compliance issues in adhering to the PoR requirements related to DRF
  • brings gaps, difficulties and compliance issues in adhering to the PoR requirements related to DRF to the attention of the Secretary of the TBS, along with proposals

Department has a sound system in place to systematically:

  • keep track of adherence to the PoR requirements related to DRF
  • ensure that, whenever appropriate, proposals are developed to address gaps, difficulties and compliance issues in adhering to the PoR requirements related to DRF
  • ensure that, whenever appropriate, gaps, difficulties and compliance issues in adhering to the PoR requirements related to DRF are brought to the attention of the Secretary of the TBS, along with proposals

Department takes note of the:

  • alignment between its DRF and the Program Inventory
  • advice provided by the Heads of Performance Measurement (HoPM) to the Performance Measurement and Evaluation Committee (PMEC) re. the availability, quality, utility and use of indicators in the DRF
  • advice provided by the Head of Evaluation (HoE) to PMEC regarding the validity and the reliability of the DRF’s DRIs – including their utility for evaluation

Department takes steps to:

  • ensure alignment between its DRF and the Program Inventory
  • act on advice provided by the HoPM to PMEC re: the availability, quality, utility and use of indicators in the DRF
  • adjust according to the advice provided by the HoE to PMEC regarding the validity and the reliability of the DRF’s DRIs – including their utility for evaluation

* Please note that for all stages (1 to 5), the organization is required to submit evidence to substantiate the stage selected. Organizations will need to identify the documents being submitted, describing the type of evidence shared, along with a rationale linking each piece (or group) of evidence provided with the elements in the description of the selected stage.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

The capacity to meet key policy requirements for Departmental Results Frameworks (DRFs) has never been assessed. Such a review would identify capacity limitations and support improvements in fulfilling oversight responsibilities. This offers departments the opportunity to enhance the conditions for improved information on results that matter to decision makers and the public.

Category

  • Policy Compliance
  • Performance
  • Baseline

Policy on Results, sections 4.3.1, 4.3.2, 4.6.1, 4.6.2, 4.6.3

Directive on Results, 4.1.1.1, 4.2.5, 4.4.2

To be in compliance with the Policy on Results and the Directive on Results, large departments and agencies must be, at a minimum, at Stage 3.

Note: All baseline questions will be included in MAF reporting products.

Expected Results

Target: Stage 4

Given the time elapsed since the implementation of the PoR, most organizations should have achieved a certain degree of maturity by now.

Assessed Organizations

The scope of this assessment will include all DRFs from large departments and agencies. A list of organizations not assessed against this Question is provided in Annex A.

Period of Assessment

Fiscal years 2021–22 and 2022–23, up to the end of the MAF assessment period on February 21, 2023.

Calculation Method

Based on the descriptions of each of the stages provided in the table accompanying the question statement, each organization will select the maturity stage that best describes the organization.

The maturity stage selected by the organization will correspond to the organization’s score. For example, if an organization selects ‘Stage 1’, it will have a score of ‘1’; if it selects ‘Stage 2’, it will receive a score of ‘2’, and so forth. An average score can then be computed across assessed organizations.

In accordance with the guidance provided, if the evidence submitted does not align with the maturity stage the organization has identified, TBS may follow up with the organization to seek additional evidence, or modify the maturity stage.
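
To illustrate the scoring described above, the following is a minimal sketch in Python, using hypothetical organization names and stage selections rather than actual MAF data. It shows how self-selected maturity stages translate directly into scores and how an average across assessed organizations can be computed.

```python
# Minimal sketch of the scoring described above (hypothetical data).
# Each organization's score equals the maturity stage it selected (1 to 5);
# an average can then be computed across assessed organizations.

selected_stages = {  # hypothetical organizations and their selected stages
    "Organization A": 3,
    "Organization B": 4,
    "Organization C": 2,
}

scores = list(selected_stages.values())  # score = selected maturity stage
average_score = sum(scores) / len(scores)

print(f"Average maturity score across assessed organizations: {average_score:.1f}")
```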

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Examples of acceptable evidence that can be submitted are: documents related to Performance Measurement and/or Evaluation Committee discussions and decisions.

Data Collection Method

Documentary evidence supporting the following:

  • How the organization ensures that ongoing requirements of the Policy on Results related to the DRF are met, including alignment between the DRF and the Program Inventory
  • How issues and proposals related to compliance with Policy on Results requirements for the DRF are identified to the Secretary of TBS
  • How advice from the HoPM to PMEC was used to help improve the availability, quality, utility and use of DRF indicators
  • How advice by the HoE to PMEC was used to help improve the validity and reliability of DRIs, including for evaluations.

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Question 2 – Performing Results-Based Management (New)

Using the maturity scale below, what stage best reflects the organization’s capacity to perform key results-based management activities pertaining to the Departmental Results Framework?

*Please note that the following stages are incremental: each stage encompasses the elements of earlier stages. For example, by selecting Stage 3, your organization indicates that it features all elements of Stages 1 and 2, in addition to those in Stage 3.

Frequency scale: Very rarely: 1%–25%; Rarely: 26%–50%; Sometimes: 51%–75%; Often: 76%–99%; Very often: 100%

Stage 1 – Design Stage/Undeveloped

DRF and its components (CRs, DRs and DRIs) are very rarely used for and/or considered when:

  • defining the organization’s goals
  • planning for or strengthening data capacity, collection and analytics
  • communicating and engaging with all levels of the department
  • managing and improving policies, programs and services
  • internal management reporting practices
  • external reporting practices

Stage 2 – Early Stages of Development

DRF and its components (CRs, DRs and DRIs) are rarely used for and/or considered when:

  • defining the organization’s goals
  • planning for or strengthening data capacity, collection and analytics
  • communicating and engaging with all levels of the department
  • managing and improving policies, programs and services
  • internal management reporting practices
  • external reporting practices

Stage 3 – Established Practice

DRF and its components (CRs, DRs and DRIs) are sometimes used for and/or considered when:

  • defining the organization’s goals
  • planning for or strengthening data capacity, collection and analytics
  • communicating and engaging with all levels of the department
  • managing and improving policies, programs and services
  • internal management reporting practices
  • external reporting practices

Stage 4 – Advanced Practice Implementation

DRF and its components (CRs, DRs and DRIs) are often used for and/or considered when:

  • defining the organization’s goals
  • planning for or strengthening data capacity, collection and analytics
  • communicating and engaging with all levels of the department
  • managing and improving policies, programs and services
  • internal management reporting practices
  • external reporting practices

DRF is updated in a timely manner on the basis of lessons learned and changing circumstances

Stage 5 – Best Practice

DRF and its components (CRs, DRs and DRIs) are very often used for and/or considered when:

  • defining the organization’s goals
  • planning for or strengthening data capacity, collection and analytics
  • communicating and engaging with all levels of the department
  • managing and improving policies, programs and services
  • internal management reporting practices
  • external reporting practices

DRF is updated in a timely manner on the basis of lessons learned and changing circumstances

* Please note that for all stages (1 to 5), the organization is required to submit evidence to substantiate the stage selected. Organizations will need to identify the documents being submitted, describing the type of evidence shared, along with a rationale linking each piece (or group) of evidence provided with the elements in the description of the selected stage.

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

Departmental Results Frameworks (DRFs) are a central component introduced by the Policy on Results, yet they have never been assessed. This question will inform the extent to which organizations can apply their DRF to support results-based management activities, as well as how the DRF has been embedded into the organization’s practices and operations. Organizations can in turn apply the results of this question to improve, and more closely consider, alignment between their DRF and their operations, which supports improved decision making and accountability.

Category

  • Policy Compliance
  • Performance
  • Baseline

Note: All baseline questions will be included in MAF reporting products.

Expected Results

Target: Stage 4

Given the time elapsed since the implementation of the PoR, it is expected that most organizations will have achieved a certain degree of maturity by now.

Assessed Organizations

The scope of this assessment will include all DRFs from large departments and agencies. A list of organizations not assessed against this Question is provided in Annex A.

Period of Assessment

Fiscal years 2021–22 and 2022–23, up to the end of the MAF assessment period on February 21, 2023.

Calculation Method

Based on the descriptions of each of the stages provided in the table accompanying the question statement, each surveyed organization will select the maturity stage that best describes the organization. The maturity stage selected by the organization will correspond to the organization’s score. For example, if an organization selects ‘Stage 1’, then it will have a score of ‘1’. If an organization selects ‘Stage 2’, it will receive a score of ‘2’, and so forth. An average score across surveyed organizations can be computed.

Of note, TBS will review the evidence provided by the organization and, if the evidence does not align with what the organization has identified, TBS may follow up with the organization to invite submission of additional evidence, or modify the maturity stage.
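
As a complement to the calculation above, the following minimal sketch in Python restates the frequency bands from the question’s maturity scale as a hypothetical helper function. It is for illustration only: the assessment itself relies on the organization’s evidence-supported selection of a stage, not on a computed percentage of use.

```python
# Illustrative restatement of the frequency bands in the maturity scale above.
# This is a hypothetical helper for clarity only: the MAF assessment relies on
# an evidence-supported self-selection of a stage, not a computed percentage.

def stage_for_usage_frequency(percent_used: float) -> int:
    """Return the stage whose frequency band contains percent_used."""
    if percent_used >= 100:
        return 5  # Very often: 100%
    if percent_used >= 76:
        return 4  # Often: 76%-99%
    if percent_used >= 51:
        return 3  # Sometimes: 51%-75%
    if percent_used >= 26:
        return 2  # Rarely: 26%-50%
    return 1      # Very rarely: 1%-25%

print(stage_for_usage_frequency(80))  # -> 4 (Often)
```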

Evidence Requirements

  • Department to provide evidence*
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

* Please note that for all stages (1 to 5), the organization is required to submit any evidence it sees fit to substantiate the stage selected. Organizations are encouraged to limit the quantity of evidence provided to a minimum. They are also encouraged to create a one-pager describing the type of evidence shared, along with a rationale (e.g., a Word document called ‘Rationale for Question 2’) linking each piece (or group) of evidence provided with the elements in the description of the selected stage.

Data Collection Method

Documentary evidence indicating how the organization’s DRF:

  • Supports defining the organization’s goals
  • Supports planning for or strengthening data capacity, collection and analytics
  • Is part of communications and engagements with all levels of the department
  • Supports managing and improving policies, programs and services
  • Supports internal management reporting
  • Supports external reporting
  • Is updated in a timely manner on the basis of lessons learned and changing circumstances.

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Gender-based Analysis Plus Reporting

The Canadian Gender Budgeting Act requires gender and diversity budgeting in the federal government’s budgetary and financial management process. This Act lays out responsibilities for the Minister of Women and Gender Equality Canada (WAGE) and the President of the Treasury Board in support of the principles of GBA Plus. WAGE provides leadership on GBA Plus across the Government of Canada by providing guidance, best practices, and expertise on the subject, whereas the President of the Treasury Board is required to make an analysis of the impacts of existing Government of Canada expenditure programs on gender and diversity available to the public each year. Departments provide this information in GBA Plus Supplementary Information Tables that are annexes to their Departmental Results Reports.

Question 3 – Collecting Diversity and Inclusion Related Data (Preserved; Q2 from 2021-22)

What is the percentage of Programs from your organization’s Program Inventory, excluding internal services, that have data collection plans for reporting on impacts on gender and diversity?

To answer:

  • TBS to answer
  • Department or Agency to answer
  • Both TBS and Department or Agency to answer

Rationale

  • Since 2018, the Canadian Gender Budgeting Act has required the President of the Treasury Board to make an analysis of the impacts of program expenditures on gender and diversity publicly available on an annual basis.
  • Since 2018-19, the Gender-based Analysis Plus (GBA Plus) Supplementary Information Tables associated with Departmental Results Reports have requested that departments report on their data collection plans.
  • Formally assessing GBA Plus data collection plans through the MAF process allows Treasury Board Secretariat to proactively identify organizations and Programs where improvements are required. It also facilitates ongoing dialogue between the Secretariat and the performance measurement and evaluation communities on common challenges more broadly.
  • The absence of either current data or a plan to collect and analyze future data on the impacts of Programs on gender and diversity is the most frequent challenge observed in providing GBA Plus analysis in support of decision-making by senior officials or Ministers.
  • As GBA Plus remains a developing capability within the Government of Canada, and consistent with last year’s approach, the assessment will be based on a binary “Yes/No” determination rather than an assessment of quality. The results from this assessment will be used to inform ongoing dialogue among the Treasury Board Secretariat, Women and Gender Equality Canada, other central agencies, and organizations about how to advance the application of GBA Plus in Programs.

Category

  • Policy Compliance
  • Performance
  • Baseline

Note: All baseline questions will be included in MAF reporting products.

Expected Results

100% of Programs have a GBA Plus data collection plan.

Assessed Organizations

The scope of this assessment will include all DRFs from large departments and agencies. A list of organizations not assessed against this Question is provided in Annex A.

Period of Assessment

2021-22 Fiscal year

Calculation Method

  • The scope of the assessment will include all Programs from the Program Inventory, excluding Internal Services.
  • Section 2 of the GBA Plus Supplementary Information Tables (“Gender and Diversity Impacts, by Program”) released on departmental and agency websites following the tabling of the 2021-22 Departmental Results Reports in Parliament will be assessed on whether or not they contain one or more of the following types of information for each Program:
    • a GBA Plus Data Collection Plan with specific notable actions and dates
    • Key Program Impacts on Gender and Diversity, i.e., detailed qualitative and/or quantitative information on the distribution of benefits or impacts on gender and diversity
    • a well-supported rationale for not providing information in the other two categories above. A well-supported rationale explains how gender and diversity considerations cannot be integrated into the management of the program.
  • Programs that provide any of the above information will be assessed as a “Yes.” Programs that do not provide any of the information will be assessed as a “No.”
  • The overall departmental result will be calculated by dividing the number of Programs assessed as “Yes” by the total number of Programs in the organization’s Program Inventory (excluding Internal Services), as illustrated in the sketch following this list.
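
To illustrate the calculation described in the list above, the following is a minimal sketch in Python, using hypothetical Program names and “Yes”/“No” outcomes rather than actual assessment data.

```python
# Minimal sketch of the departmental result calculation described above.
# Program names and "Yes"/"No" outcomes are hypothetical; Internal Services
# is assumed to be already excluded from the Program Inventory list.

program_assessments = {
    "Program 1": "Yes",
    "Program 2": "No",
    "Program 3": "Yes",
    "Program 4": "Yes",
}

yes_count = sum(1 for outcome in program_assessments.values() if outcome == "Yes")
departmental_result = yes_count / len(program_assessments) * 100

print(f"Departmental result: {departmental_result:.0f}% of Programs assessed as 'Yes'")
```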

Evidence Requirements

  • Department to provide evidence
  • TBS to provide evidence
  • Other evidence to be considered (please provide)

Section 2 of the GBA Plus Supplementary Information Tables (“Gender and Diversity Impacts, by Program”) released on departmental and agency websites following the tabling of the 2021-22 Departmental Results Reports in Parliament.

Date of data extraction:

Following the tabling of the 2021-22 DRRs in the fall of 2022, or draft GBA Plus Supplementary Information Tables if tabling occurs after December 15, 2022.

Data Collection Method

Section 2 of the GBA Plus Supplementary Information Tables (“Gender and Diversity Impacts, by Program”) released on departmental and agency websites following the tabling of the 2021-22 Departmental Results Reports in Parliament

Government-wide Comparison

  • Yes, the results of the indicator will be used for comparison across departments
  • No (please provide an explanation)

Year-Over-Year Analysis

  • Yes, with the caveat that this year departments must go beyond indicating an intent to have data collection plans and must list specific actions and dates
  • No

Departmental Results Framework (DRF) (TBS use only)

Is this indicator used in the TBS DRF?

  • Yes
  • No

Annex A: Departments and Agencies that will not be assessed against Questions 1, 2, and 3

Annex B: Examples of Evidence for Questions 1 and 2

Question 1: At what stage is your organization in its capacity to implement key policy requirements pertaining to the Departmental Results Framework?

Question 2: Using the maturity scale below, what stage best reflects the organization’s capacity to perform key results-based management activities pertaining to the Departmental Results Framework?

Departments are expected to provide documents and/or links that can serve as evidence to substantiate the self-assessment and to explain why a specific stage was selected. TBS will use the evidence to validate the response provided.

The following is a list of evidence that departments can use to answer Questions 1 and 2. This is not an exhaustive list. Departments can consider any relevant documentation that supports implementation of key policy requirements and capacity to perform key results-based management activities pertaining to the Departmental Results Framework.
