MAF 2015 to 2016 management of integrated risk, planning and performance methodology

Methodology overview

To ensure effective and efficient management of federal organizations and their programs, expenditure management decisions need to be supported by metrics and analytics. The objective of the Management of Integrated Risk, Planning and Performance Area of Management (AoM) is to capture a system-wide view of planning and performance measurement practices, and the use of performance measurement information to strengthen management of risk, planning and priority setting, program management and decision making, and reporting. Collectively, these activities form the annual integrated planning cycle.

The purpose of the annual planning cycle is to ensure that departments and agencies use quality performance information in a timely manner to manage their programs so that they achieve their expected results and advance the organization’s mandate.

This AoM will allow Deputy Heads and the Treasury Board Secretariat (TBS) to ensure that departments and agencies are positioned to create, use, and report quality performance information; to identify areas requiring further guidance or attention; and to enable organizations to share leading practices related to performance measurement.

The methodology for this AoM is organized around three key areas of focus, which together represent the integrated planning cycle:

  1. Creation of Quality Performance Information
  2. Use of Performance Information in Decision Making
  3. Reporting on Performance Internally and Externally

Approach for 2015-16 MAF

As noted in the 2014-15 MAF, the methodology for this AoM follows a phased approach. In 2014-15, many expectation-setting questions were asked and a baseline of information was gathered to enable future trend analyses. Many of those questions yielded positive system-wide results, notably: departments and agencies have attained a high level of maturity in the development and implementation of corporate risk and results-based management tools over time; Program Alignment Architectures (PAAs) are at acceptable levels of quality; and departments have been strategically refining and integrating their planning processes (e.g., Integrated Business Plans and RPPs) to increase operational efficiency.

System-wide gaps in performance measurement were also identified through the 2014-15 MAF. For example, the majority of large departments and agencies (LDAs) did not have results available to support trend analysis of program results for more than 75% of their indicators at the Program level over two consecutive years (beginning in fiscal year 2012-13). In another instance, a third of LDAs did not have performance information available for reporting at more than 90% of their lowest level programs.

As a result, the methodology for this AoM has been updated to reflect the 2014-15 findings. The evidence used to support performance information has been expanded to include performance information sources beyond Performance Measurement Frameworks (PMFs) such as Performance Measurement Strategies (PM Strategies), Corporate Risk Profiles, Audits, and Evaluations.

Questions pertaining to the mature tools of risk management and strategic planning have been refined to focus on the practices (i.e., the identification and monitoring) of risk and planned priorities. Additionally, a set of new questions has been added to determine whether and how organizations are using their existing performance measurement data to manage their programs. The intent of these questions is to understand the mechanisms organizations employ to translate their performance information into a useful tool for managing programs.

The 2015-16 MAF methodology reflects today’s context of the government-wide need for better metrics and analytics to support strengthened government management. The three new areas of inquiry noted above (Creation, Use, and Reporting) aim to ensure that departments and agencies are positioned to create quality performance information that is accessible and can be used to support program expenditure management decisions within their own organizations and in central agencies.

This methodology was developed in collaboration with departments and agencies. Unless otherwise indicated in individual questions, the reference period for the Management of Integrated Risk, Planning and Performance AoM questionnaire is to .

Following the 2015-16 MAF year, TBS will review the government-wide findings of the new and ongoing questions contained in this methodology, with a view to maintaining stability for conducting a three-year trend analysis of findings following the 2016-17 MAF. Where appropriate, TBS may revise questions to continue implementing the principles of MAF 2.0, including posing questions that provide Deputy Heads and TBS with useful information to improve management performance.

Questionnaire

Creation of quality performance information

Outcome statement: Department or agency has clear collection strategies in place to create quality performance information.

Indicators and Calculation Method (where applicable) | Policy Reference (table 1 note 1) | Evidence Source and Document Limit | Category

Table 1 Notes

Table 1 Note 1: See the Glossary at the end of this document for the long version of the abbreviated policy and policy instrument names.

  1. Is the department or agency’s PMF of record at an acceptable quality to support trend analysis and managing for results?

    • Yes
    • No

    Rationale: 

    There is an expectation that PMFs be at an acceptable quality, as assessed by TBS.

Policy on MRRS, 6.1.1.2

MRRS Instructions, Section 5

TBS to assess as part of annual review process of departmental PMFs of record, based on criteria identified in the MRRS Instructions

No Evidence to Submit

Performance

  2. For what percentage of performance indicators in the department or agency’s PMF have methodologies been developed?

    • 0-29% of indicators
    • 30-59% of indicators
    • 60-89% of indicators
    • 90-100% of indicators

    Rationale: 

    Departments and agencies are encouraged to ensure that methodologies are developed for all the performance indicators they use. The question is limited to indicators in the PMF to facilitate calculation. (A minimal sketch of this ratio calculation follows this entry.)

    Calculation of measure: 

    # of PMF indicators for which methodologies have been developed (not necessarily submitted)

    (divided by)

    The total # of PMF indicators (excluding Internal Services)

Policy on MRRS

MRRS Instructions, 6.3.1

Performance indicator methodology template and evidence of calculation by the established methodology.

If methodologies have been submitted through HYLX, a note to that effect can be used as evidence instead of the methodology template, but evidence of calculation must still be provided.

Up to Two Pieces of Evidence

Performance
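Several questions in this questionnaire (Questions 2, 4, 13, 14 and 15) reduce to the same arithmetic: a numerator divided by a denominator, mapped onto percentage response bands. The following minimal Python sketch illustrates that calculation for Question 2; the function name and sample counts are hypothetical and are not drawn from the methodology.

```python
# Minimal sketch of the Question 2 ratio, mapped onto the response bands
# listed above. The function name and sample counts are hypothetical.

def response_band(numerator: int, denominator: int) -> str:
    """Map a ratio of PMF indicators onto the questionnaire's bands."""
    if denominator <= 0:
        raise ValueError("Total number of PMF indicators must be positive.")
    pct = 100 * numerator / denominator
    if pct < 30:
        return "0-29% of indicators"
    if pct < 60:
        return "30-59% of indicators"
    if pct < 90:
        return "60-89% of indicators"
    return "90-100% of indicators"

# Example: methodologies developed for 42 of 60 PMF indicators
# (excluding Internal Services) -> 70%, i.e., the 60-89% band.
print(response_band(42, 60))
```

Questions 4, 13 and 14 follow the same pattern with their own numerators, denominators and band labels; Question 15 splits the upper range into additional bands (60-74% and 75-89%).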

  3. Have the department or agency’s PMF and PM Strategies been mapped to show alignment between the two tools?

    • Yes, Fully
    • Yes, Partially
    • No

    Rationale: 

    Departments and agencies are encouraged to document linkages between their PMF and PM Strategies, to ensure the alignment and coherence of information between the two tools for the same program, and to avoid duplication of effort.

Policy on MRRS

MRRS Instructions, 5.2

Guide to Developing PM Strategies, 6.3

Crosswalk, map or other evidence to show alignment or linkages between the two measurement frameworks.

One Piece of Evidence

Practice

  4. What percentage of the department or agency’s lowest-level PAA programs are covered by a PM Strategy that has been actively implemented?

    • 0-29% of lowest-level programs covered by a PM Strategy
    • 30-59% of lowest-level programs covered by a PM Strategy
    • 60-89% of lowest-level programs covered by a PM Strategy
    • 90-100% of lowest-level programs covered by a PM Strategy

    Rationale: 

    Departments and agencies are encouraged to ensure that sufficient performance information is being collected through their PM Strategies to support program evaluations and management decision-making for the entire PAA (even if evaluations are not aligned one-to-one with PAA programs).

    Calculation of measure: 

    # of lowest-level programs covered by a PM strategy that has been actively implemented (i.e., data as per the PM Strategy framework that is being collected as planned so that credible and reliable performance data is available to effectively support evaluation and management decision-making)

    (divided by)

    the total # of lowest-level programs (excluding Internal Services)

Policy on MRRS

MRRS Instructions, 5.2

Directive on the Evaluation Function, 6.2.1

Evidence of calculation by the established methodology

One Piece of Evidence

Performance

Use of performance information in decision-making

Outcome statement: Department or agency uses performance information to identify and monitor progress against priorities, risks and program results.

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category

  5. Does senior management use performance information on program efficiency and effectiveness to identify risks, establish priorities and/or support resource allocation decisions? (Select all that apply)

    • Yes, to identify risks
    • Yes, to establish priorities
    • Yes, to support resource allocation decisions
    • Performance information is sometimes used for some or all of the above purposes, but not in a consistent manner
    • No, performance information is not used for any of the above purposes (Go to Question 7)

    Rationale: 

    Departments and agencies are encouraged to use performance information to inform their planning and resource allocation decisions. The choices provided in Question 6 regarding the tool(s) used are to help establish a baseline.

Policy on MRRS, 5.2

MRRS Instructions, 6.4

Policy on Evaluation, 6.1.5

N/A

Practice

  6. If senior management uses performance information for any of the purposes in question 5, which results-based tool(s) serve as a source of the performance results used? (Check all that apply)

    • PMF
    • PM Strategies
    • Evaluations
    • Dashboard/Scorecard
    • Other (e.g., Audits, Corporate Risk Profile, internal databases, Project Briefs, Service Management Strategy, etc.)

N/A

Risk identification, priority setting or resource allocation exercise material, e.g., deck, record of decision, annotated agendas or similar product/tool

Up to Five Pieces of Evidence

Practice

  7. Does senior management use performance information to monitor progress in-year against risk mitigation strategies, planned priorities, and/or to make adjustments to resource allocations? (Check all that apply)

    • Yes, to monitor progress against risk mitigation strategies
    • Yes, to monitor progress against planned priorities
    • Yes, to make adjustments to resource allocations
    • Performance information is sometimes used in-year for some or all of the above purposes, but not in a consistent manner
    • No, performance information is not used for any of the above, or only annually (Go to Question 10)

    Rationale: 

    Departments and agencies are encouraged to monitor progress against planned priorities in-year (not just annually) and ensure appropriate alignment of risk mitigation strategies and allocation of resources. The choices provided in Questions 8 and 9 regarding the frequency and the tool(s) used are to help establish a baseline.

N/A

N/A

Practice

  8. If senior management uses performance information for any of the purposes in question 7, how often does senior management do this?

    (Check the most common time period that applies)

    • Monthly
    • Quarterly
    • Semi-annually
    • Other timeframe

N/A

List of meetings, agenda, record of decisions, etc.

One Piece of Evidence

Practice

  9. If senior management uses performance information for any of the purposes in question 7, which results-based management tool(s) is/are being used? (Check all that apply)

    • PMF
    • PM Strategies
    • Evaluations
    • Dashboard/Scorecard

    • Other (e.g., Audits, Corporate Risk Profile, internal databases, Project Briefs, Service Management Strategy, etc.)

N/A

Documents that show internal monitoring of risk mitigation strategies, planned priorities and resource allocations

Up to Five Pieces of Evidence

Practice

  10. Does the department or agency use performance information from its PMF, PM Strategies, evaluations and/or other results-based management tools to support proposals to Cabinet committees (i.e., Treasury Board Submissions and/or Memoranda to Cabinet)? (Check all that apply)

    • Yes, performance information from the PMF is used in all proposals
    • Yes, performance information from PM Strategies is used in all proposals
    • Yes, performance information from Evaluations and/or other results-based management tools is used in all proposals
    • Performance information is sometimes used in proposals but not in all of them
    • No, performance information is never used, or almost never used in proposals

    Rationale: 

    Departments and agencies are encouraged to use performance information stemming from their PMF, PM Strategies and Evaluations to support their proposals to Cabinet committees.

Policy on MRRS, 5.2

MRRS Instructions, 6.4

Policy on Evaluation, 7.3.1

No Evidence to Submit

Practice

Internal and external performance reporting

Outcome statement: Department or agency has performance information available for internal and external reporting.

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category

  11. Do the department or agency’s internal tracking and reporting tools align to the PAA to report on expenditures and full-time equivalents (FTEs) to lowest level programs, including the 10 Internal Services (IS) sub-programs?

    • Not aligned
    • Somewhat aligned
    • Fully aligned

    Rationale: 

    Departments and agencies are encouraged to align their internal tools to their PAA to help track and report on expenditures and FTEs for each program at all levels of the PAA.

    Definitions: 

    Not aligned
    Tracking and reporting tools are not aligned to the PAA to support reporting on both expenditures and FTEs to lowest level programs, or can support reporting on only one of them (expenditures or FTEs).
    Somewhat aligned
    Tracking and reporting tools are aligned to the PAA to support reporting on both expenditures and FTEs for some but not all lowest level programs.
    Fully aligned
    Tracking and reporting tools are aligned to the PAA to support reporting both on expenditures and FTEs for all lowest level programs.

Policy on MRRS, 6.1.6

Guide on Internal Services Expenditures, 3.2

Screenshot(s) of Departmental tracking and reporting tools

One Piece of Evidence

Practice

  12. Do the department or agency’s internal tracking and reporting tools align to the PAA to report on program results to lowest level programs, including the 10 Internal Services (IS) sub-programs?

    • Not aligned
    • Somewhat aligned
    • Fully aligned

    Rationale: 

    Departments and agencies are encouraged to align their internal tools to their PAA to help track and report on program results for each program at all levels of the PAA.

    Definitions: 

    Not aligned
    Tracking and reporting tools are not aligned to the PAA to support reporting on program results to lowest level programs.
    Somewhat aligned
    Tracking and reporting tools are aligned to the PAA to support reporting on program results for some but not all lowest level programs.
    Fully aligned
    Tracking and reporting tools are aligned to the PAA to support reporting on program results for all lowest level programs.

Policy on MRRS, 6.1.6

Guide on Internal Services Expenditures, 3.2

Screenshot(s) of Departmental tracking and reporting tools

One Piece of Evidence

Practice

  13. For what percentage of lowest-level programs in the PAA are expenditures and the number of full-time equivalents (FTEs) available and reported in 2014-15 (including for the 10 Internal Services sub-programs, if applicable)? (Footnote 1)

    • 0-29% of lowest-level programs
    • 30-59% of lowest-level programs
    • 60-89% of lowest-level programs
    • 90-100% of lowest-level programs

    Rationale: 

    Departments and agencies are encouraged to ensure that financial and human resources are available and reported to the lowest level program, including the 10 Internal Services sub-programs. (A minimal sketch of this calculation follows this entry.)

    Calculation of measure: 

    # of lowest-level PAA programs for which financial and human resources were reported in 2014-15

    (divided by)

    The total number of lowest-level PAA programs (including Internal Services sub-programs, if applicable).

Policy on MRRS, 5.2.3, 6.1.6

Guide on Internal Services Expenditures, 3.2

TBS to assess as part of submission of departmental 2014-15 Actual Financial and FTE data sheets and the 2014-15 DPR.

No Evidence to Submit

Performance
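As a minimal illustration of the Question 13 numerator, the sketch below counts lowest-level programs for which both expenditures and FTEs are reported; all program records are invented for the example.

```python
# Hypothetical illustration of the Question 13 count: lowest-level PAA
# programs (including Internal Services sub-programs, if applicable) for
# which both 2014-15 expenditures and FTEs were reported. A None value
# marks data that was not reported.

programs = [
    {"name": "Sub-program 1.1.1", "expenditures": 4_200_000, "ftes": 31},
    {"name": "Sub-program 1.1.2", "expenditures": 1_750_000, "ftes": None},
    {"name": "IS: Financial Management Services", "expenditures": 900_000, "ftes": 12},
]

# A program counts toward the numerator only if both fields are reported.
reported = sum(
    1 for p in programs
    if p["expenditures"] is not None and p["ftes"] is not None
)
pct = 100 * reported / len(programs)
print(f"{reported}/{len(programs)} lowest-level programs ({pct:.0f}%) reported both")
```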

  14. For what percentage of lowest-level programs in the PAA are program results available and reported in 2014-15 Departmental Performance Reports?

    • 0-29% programs
    • 30-59% programs
    • 60-89% programs
    • 90-100% programs

    Rationale: 

    Departments and agencies are encouraged to ensure that results against performance indicators are available and reported to the lowest level program.

    Calculation of measure: 

    # of lowest-level PAA programs for which performance results were reported in 2014-15

    (divided by)

    The total number of lowest-level PAA programs (excluding Internal Services)

Policy on MRRS, 5.2.3, 6.1.6

TBS to assess as part of submission of departmental 2014-15 Actual Results data sheets and the 2014-15 DPR.

No Evidence to Submit

Performance

  15. For what percentage of indicators from the 2014-15 PMF does the department or agency have performance data available (actual results) to support trend analysis over at least two consecutive data points, beginning in the 2013-14 fiscal year or before?

    • 0-29% indicators
    • 30-59% indicators
    • 60-74% indicators
    • 75-89% indicators
    • 90-100% indicators

    Rationale: 

    While it is recognized that some indicators may be refined from year to year to address methodology issues or changes in program mandate, intended results and methods of intervention, departments are encouraged to ensure overall stability of the PMF to support trend analysis for their programs. Indicator counts are used as a proxy unit for programs to facilitate standard calculation. (A minimal sketch of the two-consecutive-data-point check follows this entry.)

    Calculation of measure: 

    # of indicators from the 2014-15 PMF for which current and past results are available for at least two consecutive data points

    (divided by)

    The total # of indicators in the 2014-15 PMF (excluding Internal Services)

Policy on MRRS, 6.1.5

Evidence of calculation by the established methodology

(Source: 2013-14 and 2014-15 DPR)

The 2013-14 fiscal year will serve as a baseline year

One Piece of Evidence

Performance
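The Question 15 numerator requires results for at least two consecutive fiscal years, not merely two years in total. The sketch below, with invented indicator data, shows one way to apply that check; real results would come from the 2013-14 and 2014-15 DPRs.

```python
# Hypothetical sketch of the Question 15 check: does an indicator have
# actual results for at least two consecutive fiscal years? All data
# below is invented for illustration.

YEARS = ["2012-13", "2013-14", "2014-15"]  # chronological order

def supports_trend_analysis(results_by_year: dict) -> bool:
    """True if results exist for two consecutive fiscal years."""
    have = [year in results_by_year for year in YEARS]
    return any(a and b for a, b in zip(have, have[1:]))

indicators = {
    "IND-01": {"2013-14": 72, "2014-15": 78},  # consecutive: counts
    "IND-02": {"2014-15": 61},                 # single point: does not count
    "IND-03": {"2012-13": 40, "2014-15": 55},  # gap year: does not count
}

eligible = sum(supports_trend_analysis(r) for r in indicators.values())
pct = 100 * eligible / len(indicators)
print(f"{eligible}/{len(indicators)} indicators ({pct:.0f}%) support trend analysis")
```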

  16. What percentage of the department or agency’s 2014-15 Evaluation reports were posted online in a timely fashion as required by the respective policy?

    • 0-49% reports
    • 50-74% reports
    • 75-89% reports
    • 90-99% reports
    • 100% reports

    Rationale: 

    Departments and agencies must ensure that reports containing information on important aspects of their performance, such as Evaluation reports, are made easily available online to Canadians in a timely manner. (A minimal sketch of the timeliness calculation for this question and Question 17 follows this entry.)

    Calculation of measure: 

    Number of Evaluation reports posted online in a timely fashion (within 120 days of approval by the Deputy Head)

    (divided by)

    The total number of Evaluation reports completed in 2014-15.

Policy on Evaluation

Directive on the Evaluation Function, 6.1.5(2), 6.3.1(8)

Evidence of calculation by the established methodology

One Piece of Evidence

Performance
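A minimal sketch of the timeliness calculation used in this question and in Question 17, assuming only that each report has an approval date and a posting date; all dates are invented. Question 16 uses a 120-day window and Question 17 a 90-day window.

```python
# Hypothetical sketch of the Questions 16 and 17 timeliness calculation:
# the share of reports posted online within the required window after
# deputy head approval (120 days for evaluations, 90 for internal audits).
# All dates are invented for illustration.

from datetime import date

def timely_share(reports, window_days: int) -> float:
    """Percentage of (approved, posted) date pairs within the window."""
    timely = sum(
        1 for approved, posted in reports
        if (posted - approved).days <= window_days
    )
    return 100 * timely / len(reports)

evaluations = [
    (date(2014, 5, 12), date(2014, 8, 1)),   # 81 days: timely
    (date(2014, 11, 3), date(2015, 4, 20)),  # 168 days: late
]
print(f"Evaluation reports posted on time: {timely_share(evaluations, 120):.0f}%")
# For internal audit reports (Question 17), call timely_share(reports, 90).
```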

  17. What percentage of the department or agency’s 2014-15 Internal Audit reports were posted online in a timely fashion as required by the respective policy?

    • 0-49% reports
    • 50-74% reports
    • 75-89% reports
    • 90-99% reports
    • 100% reports

    Rationale: 

    Departments and agencies must ensure that reports containing information on important aspects of their performance, such as Internal Audit reports, are promptly made available online. Ninety days is the timeframe used in the OCG guidance “Submitting Internal Audit Products to the Office of the Comptroller General (OCG)”.

    Calculation of measure: 

    Number of Internal Audit reports (Footnote 2) posted online in a timely fashion (within 90 days of approval by the Deputy Head)

    (divided by)

    The total number of Internal Audit reports completed in 2014-15.

    If no Internal Audit report was completed in 2014-15, please submit a note to that effect in place of the evidence of calculation.

Policy on Internal Audit, 6.1.5

Guidance: Submitting Internal Audit Products to the Office of the Comptroller General (OCG)

Internal Auditing Standards for the Government of Canada, 4.2

Evidence of calculation by the established methodology

One Piece of Evidence

Performance

Glossary

Business planning
The process of planning initiatives and activities to achieve the future direction and priorities of an organization, typically over the next twelve to eighteen months.
Evaluation
In the Government of Canada, evaluation is the systematic collection and analysis of evidence on the outcomes of programs to make judgments about their relevance, performance and alternative ways to deliver them or to achieve the same results.
Performance Measurement Framework (PMF)
Information that supports the PAA, comprising program expected results, performance indicators, targets, full-time equivalents (FTEs), actual results, etc.
Performance Measurement Strategy (PM Strategy)
A PM Strategy is the selection, development and ongoing use of performance measures to guide program or corporate decision making. In the Guide to Developing PM Strategies, the recommended components of the PM Strategy are the program profile, logic model, Performance Measurement Strategy Framework and Evaluation Strategy.
Program Alignment Architecture (PAA)
Information on program alignment, program titles and program descriptions.
Strategic planning
The process of setting the future direction and priorities of an organization, typically over the next three to five years.

The following terms are considered synonymous: integrated business plan, corporate plan, business plan, annual plan, and operational plan.

Abbreviated policy and policy instrument names used in the “Policy Reference” column:

Policy on MRRS
Policy on Management, Resources and Results Structures
MRRS Instructions
Instructions to Departments for Developing a Management, Resources and Results Structure
Guide on Internal Services Expenditures
Guide on Internal Services Expenditures: Recording, Reporting and Attributing
Guide to Developing PM Strategies
Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies

For additional definitions of the terms and expressions used in this questionnaire, please refer to Appendix B of the MRRS Instructions and Annex A of the Policy on Evaluation.
