Final Audit Report - Audit of Performance Reporting

June 2013

Executive summary

The focus of the audit is departmental performance reporting. Federal departments are required to report plans and performance results to demonstrate accountability for the funds received vis-à-vis the programs and services delivered. Quality reporting of both financial and non-financial information allows informed consideration in Parliament, provides an important input into management's decision-making processes and strengthens public accountability. Expectations for performance reporting are anchored in the Treasury Board Policy on Management, Resources and Results Structures, which requires that each department and agency put in place a common framework for the collection, use and reporting of financial and non-financial information on programs.

The objective of the audit is to provide assurance that the governance, risk management and controls related to Health Canada performance reporting are adequate and effective. In the professional judgment of the Chief Audit Executive, sufficient and appropriate procedures were performed and evidence gathered to support the accuracy of the audit conclusion.

Overall, Health Canada does have an adequate and effective management control framework to support performance reporting requirements. In particular, it was noted that performance reporting roles and responsibilities related to positions and governance bodies' accountabilities are well understood and are applied as intended. The process that exists in branches for the review and scrutiny of performance results is supported by senior management; however, the quality assurance practices for reviewing source data vary from branch to branch. Also, while departmental guidance exists in various forms, some updates are needed to more fully outline expectations for all key types of reporting.

The audit found that performance reports are balanced but could be better integrated. More specifically, it was noted that while management considers all aspects of performance, evidence shows that performance reports are presented and discussed separately.

Expected results committed to in the Report on Plans and Priorities and Departmental Operational Plan are being reported as planned. However, performance results specified in the mid-year review at the branch-level were difficult to understand, substantiate or trace back to the performance metrics in the branch operational plans.

Various departmental frameworks outline how risks as well as financial and non-financial results are to be used in decision-making, but the frameworks could be strengthened by aligning the guidance more closely and making it more comprehensive. The audit noted that the budget process is not formally informed by risk and non-financial results, although risk, financial and non-financial information are considered in the resource re-allocation exercises.

Management agrees with the three recommendations and has provided an action plan to improve performance reporting in the Department.

A. Introduction

1. Background

The Treasury Board (TB) Policy on Management, Resources and Results Structures (MRRS) requires that each department and agency put in place a common framework for the identification of programs and for the collection, use and reporting of financial and non-financial information relative to those programs. The MRRS is designed to strengthen public sector management and accountability in that it provides a common basis for reporting on performance to Canadians and to Parliament. In addition to the policy requirements outlined in the MRRS, the Government of Canada has established some key reporting principles which state that departmental performance reporting should: present credible, reliable, balanced information; associate performance with plans, priorities and expected results; explain changes and apply lessons learned; and link resources to results.

The departmental strategic outcomes are outlined in the Program Alignment Architecture (PAA). Moreover, the departmental Performance Measurement Framework (PMF) contains performance indicators related to these outcomes. As such, the PMF provides the basis for assessing and reporting on performance. The Report on Plans and Priorities (RPP) includes both the expected results and performance indicators and related targets identified in the PMF as well as the high level plans to meet organizational priorities over the three-year planning period covered by the report. The Departmental Performance Report (DPR) reports on the progress achieved.

Internally, the Departmental Operational Plan (DOP) is a management tool that serves as the bridge between the detailed branch plans and the strategic commitments made in the RPP. It identifies the top goals and priorities that the Department will implement over the coming fiscal year to support its program activity commitments, along with the resulting deliverables and related performance targets. Progress against targets is monitored through a formal mid-year review process. The PMF provides the basis for evaluating performance towards program expected results in the Department's performance report.

In addition to these reports, the Departmental Dashboard is an important internal reporting instrument that captures key performance results. It is a tactical operational monitoring tool for senior management (see Appendix C).

2. Audit objective

The objective of the audit is to provide assurance that the governance, risk management and controls related to performance reporting at Health Canada are adequate and effective to ensure that information for reporting is: effectively gathered and scrutinized; in alignment with the PAA/PMF; integrated and balanced; accessible; and used in support of resource allocation decisions and oversight practices.

3. Audit scope

The audit covers the reporting periods of 2011-12 and 2012-13. The scope of the audit was Department-wide, focusing on the following key performance reports and the governance, risk management and control processes that support their production.

Audit scope

External reporting: 2011-12 DPR tabled against the 2011-12 PMF and RPP

Internal reporting: 2012-13 departmental dashboards; 2011-12 annual review against the 2011-12 DOP; 2011-12 mid-year reviews against the 2011-12 branch operational plans

All branches within the Department are involved in the performance reporting process; however, for the purposes of the audit, only the Healthy Environments and Consumer Safety Branch, the First Nations and Inuit Health Branch, the Health Products and Food Branch, the Chief Financial Officer Branch and the Corporate Services Branch were examined in detail. The audit examined the governance and processes that enable timely access to the key informational inputs used to produce the key reports. The audit provided coverage of those sections of the Policy on Management, Resources and Results Structures that relate directly to performance reporting and are considered higher risk. For internal reporting, the audit focused on the production and scrutiny of the Departmental Dashboard and the mid-year review and reporting that is done against the DOP and branch operational plans.

The audit did not examine the appropriateness of the expected results and related performance indicators, nor did it examine the processes and controls associated with their development or selection.

4. Audit approach

The audit examined the design and operation of Health Canada's governance, risk management and control practices in this area against a set of pre-defined audit criteria. The audit approach included, but was not limited to: a review of documentation, policies, standards, guidelines and frameworks; interviews and observation; inquiry; and testing and analysis.

The audit criteria, outlined in Appendix A, have been drawn from key sources including the TB Policy on Management, Resources and Results Structures and the Office of the Comptroller General Internal Audit Sector's Audit Criteria Related to the Management Accountability Framework: A Tool for Internal Auditors.

5. Statement of conformance

In the professional judgment of the Chief Audit Executive, sufficient and appropriate procedures were performed and evidence gathered to support the accuracy of the audit conclusion. The audit findings and conclusion are based on a comparison of the conditions that existed as of the date of the audit, against established criteria that were agreed upon with management. Further, the evidence was gathered in accordance with the Internal Auditing Standards for the Government of Canada and the International Standards for the Professional Practice of Internal Auditing. The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the quality assurance and improvement program.

B. Findings, recommendations and management responses

1. Governance

1.1 Roles, responsibilities and accountabilities

Audit criterion: Roles, responsibilities and accountabilities for performance reporting are defined and effectively communicated. Senior management is held accountable according to these expectations.

Clarity of roles, responsibilities and accountabilities for performance reporting is essential where inputs to key reports are varied and data collection, analysis and reporting requirements are complex.

A review of the terms of reference for the Executive Committee, Health Canada's most senior decision-making body, and the various branch executive committees, noted clarity around roles and responsibilities. Several interviews with departmental officials confirmed that roles, responsibilities and accountabilities with respect to performance reporting are generally well understood. Guidance documents such as the Integrated Planning Framework typically outline the expectations related to roles and responsibilities, although the framework does not outline expectations for the preparation of the Departmental Dashboard.

Interviews and documentary evidence demonstrated a consistent understanding of accountabilities, including accountability for verifying the accuracy of source data, for reviewing the reasonableness of the messaging, and ultimately for performance and non-performance. Evidence shows that the emphasis on accountability has increased considerably over the past few years. The audit also assessed the terms of reference and minutes of branch and departmental governance bodies to obtain assurance that expectations are clearly outlined and that the bodies are operating as intended. Roles and responsibilities were clearly articulated, and minutes and records of decisions from senior committees showed evidence that these responsibilities are followed. Interviews with senior management indicated that management is held accountable for results and for the preparation and presentation of their respective performance reports.

1.2 Governance regime

Audit criterion: There is a well-defined and applied governance regime that enables the review, challenge and oversight of the Department's key internal and external performance reports.

Deputy heads are responsible for the effective, ongoing implementation of the Policy on Management, Resources and Results Structures. A well-defined and applied governance regime is essential to support the collection, management and reporting of financial and non-financial information relative to Health Canada programs.

The audit used interviews and process mapping to identify the processes by which governance bodies reviewed and scrutinized the key planning and reporting documents as part of their oversight responsibilities. Interviews and a review of minutes were used to determine the effectiveness of the governance function. The current governance regime within branches and at the departmental level is robust and appropriately focused on the review and challenge of performance reports. In addition, accountability for sign-off responsibilities is understood. Each of the branches was found to have a similar review and approval process for the Departmental Performance Report, the mid-year and year-end reports and the branch contribution to the Departmental Dashboard. The oversight process is outlined below.

Oversight process

Text description

Data is collected monthly from many sources of record by program area, directorate and branch. The responsibility for verification of data rests with the program managers and directors. Data quality assurance practices vary amongst the different branches. Program officials in planning and reporting units consolidate data from the originators and conduct a preliminary analysis of the data for accuracy and integrity. The repeatability of the process, the program-specific knowledge that resides in these consolidating units and the use of analytical steps such as period-to-period comparisons contribute to the quality of the review. Directors and directors general undertake a secondary analysis of the results, generally at the program management table. Audit evidence suggests that the dialogue on results is relatively robust. Results are formally approved by the program Director General prior to being sent for branch-wide consolidation and analysis.

Program results are then consolidated at the branch level, with branch planning units performing limited analysis. Branch executive committees review the results and assistant deputy ministers sign off. Reasonableness tests are conducted where trends and anomalies are noted prior to routing to the Chief Financial Officer Branch for department-wide consolidation, production and review by departmental governance bodies. The Chief Financial Officer Branch aggregates branch-level results for distribution to and review by the Executive Committee.

While the extent and nature of quality assurance varies across programs and branches, there are adequate compensatory measures in the form of governance, review and sign-off of reports (see section 3.4). Moreover, senior management indicated that the managerial culture has shifted towards one where monitoring of performance is increasingly institutionalized. It has also been noted that the climate is such that there is a willingness to discuss positive and negative results thereby creating more balanced reporting.

2. Risk management

2.1 Consolidated, comprehensive reporting

Audit criterion: Departmental and branch performance reports consolidate risk information with financial and non-financial performance information.

Performance and risk are closely related. Where performance relates to the achievement of results, risk describes the factors that may undermine achievement. In light of this relationship and of the value of both elements for decision-making, the degree to which key performance reports consolidate risk information with financial and non-financial performance information is important.

Health Canada's Integrated Risk Management Framework states that risk management is an integral part of decision-making to help improve the quality of decisions and the use of resources. Through an analysis of the key performance reports, it was determined that risk information is not uniformly presented. The 2011-12 Departmental Performance Report contains a risk management summary which provides a selection of activities to demonstrate the range of risks Health Canada manages and the concrete interventions made to reduce the impacts or probabilities of adverse risks, or to enhance the impacts or probabilities of positive risks. The report also outlines key activities being planned and presents how they relate to and help manage risks identified in the Corporate Risk Profile (CRP). An examination of the mid-year review against the branch operational plans and the year-end review against the Departmental Operational Plan indicated little evidence of integration between risk and performance monitoring.

In the case of the branch operational plans, activities are also linked to risks from the CRP. However, branches do not present risk consistently at the key activity level. Similarly, mid-year and year-end reviews do not reference risk associated with activities. Without risk information presented at this level across branches, the monitoring of risk and performance at the departmental level is challenging.

The Departmental Dashboard is an internal operational monitoring tool which enables senior management to identify potential risks and operational backlogs. The very tracking of these indicators on a periodic basis allows for trends to be noted, which has the potential to provide management with early warning signs of risks materializing.

While performance reports do not all contain an integrated depiction of risk alongside the financial and non-financial information, interviews and a review of minutes show that these elements are discussed together in order to support decision-making (see Recommendation 1).


3. Internal controls

3.1 Reporting frames

Audit criterion: Performance reporting frames and corresponding planning and reporting instruments that exist at different levels of the Department are aligned and consistent with one another, where appropriate.

A key control to manage the reporting process is the alignment of reporting frames. Comprised of a set of performance indicators, these frames form the basis of data collection and the backbone for reporting. If the frames of inter-related reports are aligned, data collection and report preparation are made more efficient, allowing management to measure results once and use the information obtained for multiple reporting instruments. Not only does this reduce the reporting burden, but it also provides a measure of assurance that the results story is being told consistently across various reports.

At Health Canada, reporting frames are used as the backbone for various reports and are aligned where appropriate. More specifically, both the Departmental Operational Plan (DOP) and the Departmental Performance Report (DPR) share the same reporting frame, the Performance Measurement Framework. This allows the results collected for mid-year and year-end reviews of the DOP to be rolled up, as applicable, for the DPR production.

Following an analysis of the 2011-12 branch operational plans, it was noted that only one program branch had used the Performance Measurement Framework as its measurement backbone. The program branches could benefit from better aligning the branch operational plans with the departmental Performance Measurement Framework (see Recommendation 1).

Indicators found in the Performance Measurement Framework are sometimes used in the Departmental Dashboard. The Dashboard focuses on tactical indicators of operational performance whereas the DPR reports on the program results in support of the strategic outcomes. Nonetheless, dashboard metrics are related to, and indicative of, the achievement (or not) of the departmental strategic outcomes. Thus, dashboard data should be considered as a suitable source of information in the production of the DPR.

Overall, it was observed that there is alignment of the reporting frames between the DOP and the DPR, which assists in rolling up results in an efficient way.

3.2 Common guidance and practice

Audit criterion: Common departmental guidance and practice exist to outline all expectations for performance reporting including, but not limited to, control requirements related to data capture and quality assurance.

Clear and common guidance on performance expectations is important to support consistent preparation of performance reports, thus reinforcing accountability and governance.

The nature of the guidance provided in relation to performance reporting was assessed. It was noted that while reasonable guidance material exists, aspects of it can be further clarified and made more comprehensive. First, more clarity and detail should be provided on control requirements such as data capture and quality assurance. As well, the guidance in its current form could be more cohesive. For example, a comprehensive guidance document that articulates how all the key performance reporting instruments inter-connect with the objectives of each individual report would be valuable.

The Integrated Planning Framework generally specifies how performance information should be used to "learn and adjust", but does not specify expectations for the use of performance reporting as an input into planning and resource allocation/re-allocation decisions. It was also noted that in some cases the Performance Measurement Framework and the DOP did not identify targets or target dates. Pre-defined performance targets are essential in supporting effective monitoring, allowing management to gauge gaps in performance and to make any necessary adjustments or re-allocations of resources.

Recommendation 1

It is recommended that the Chief Financial Officer update the Integrated Planning and Reporting Framework to establish and clarify common practices for performance reporting. This should include:

  • the expectations for data scrutiny and accountability;
  • a requirement for establishing targets and target dates for all key initiatives and corresponding outcomes as identified in the Performance Measurement Framework and the Departmental Operational Plan; and
  • an outline of how risk, financial and non-financial performance information is to be used in support of planning and resource allocation decisions.
Management response

Management agrees with the recommendation.

The Integrated Planning and Reporting Framework and guidance documents will be updated to reflect common practices for performance reporting as well as additional requirements as mandated by Treasury Board.

This will include:

  • expectations regarding the type of data required for the Performance Management Framework and the Departmental Operational Plan;
  • the requirement that each branch establish targets and target dates for key initiatives identified in the Performance Management Framework and the Departmental Operational Plan; and
  • updated information in the Integrated Planning and Reporting Framework regarding how risk, financial and non-financial information can be used to support planning and resource allocation decisions.

3.3 Reporting against plans

Audit criterion: Performance reporting is done in accordance with planned results and measures (both financial and non-financial).

Performance reports are accounts of results achieved against planned performance expectations as set out in the respective Report on Plans and Priorities (RPP) and DOP.

In the case of the DPR and the annual review against the DOP, the results committed to in the plans (RPP and DOP respectively) are being reported as expected.

Less clarity exists in relation to the mid-year review at the branch level. Performance results were sometimes difficult to understand, substantiate or trace back to the performance metrics in the branch operational plans. It was further noted, based on a review of a sample of key activities and outputs, that branch assessments for the mid-year review exercise sometimes differed from the consolidated mid-year review against the DOP prepared by the Chief Financial Officer Branch. In addition, adequate substantiating documentation linked to performance indicators to explain the results was not observed. As a result, it was difficult to determine whether key activities are being assessed and are contributing factors to the overall performance assessment. This could impede the clarity of reporting and the ability of management to determine where course corrections are necessary and to hold teams accountable for results.

Recommendation 2

It is recommended that the Chief Financial Officer, in collaboration with branch assistant deputy ministers, establish a formal process for the timely review of branch progress achieved against performance targets established in the departmental planning instruments.

Management response

Management agrees with the recommendation.

The Chief Financial Officer Branch, through the Executive Committee - Planning and Accountability, will establish a formal process to review key activities against performance targets reflected in departmental planning reports.

The Integrated Planning and Reporting Framework and guidance documents will be updated to include guidance on the new review process.

3.4 Information inputs

Audit criterion: The performance reporting process is supported by efficient access to relevant, substantiated and timely informational inputs.

It is expected that performance reports are prepared using readily available and reliable data. Performance reporting also depends on the use of common data definitions for ease of results aggregation and consistent data sources to measure the same expected results across different reports.

The data sources used for the preparation of the DPR were clearly identified, although this was not the case with the branch and departmental operational plans. In a number of cases, inconsistent data definitions made interpretation and aggregation of the results challenging. As well, it could not be concluded whether the same data sources were used across different reports (for example, the Dashboard, branch reviews and the DPR), because data sources are not identified in all reporting or planning instruments.

As previously noted, branch quality assurance practices over the source data vary from sophisticated quality management practices to no data verification. Gaps in data quality assurance practices may result in data integrity issues. However, through interviews with several departmental officials, it was noted that data trends rather than specific data points are the key area of interest (particularly in the Dashboard) and that, for DPR purposes, errors in specific data sets, once aggregated, will not be material. While the governance over the reports offers an important compensating control, efforts should be made to reinforce standardized expectations for data quality.

A large number of data sources are used to produce the key departmental reports. As such, the meaningfulness and traceability of some key results were tested. The metrics used for the branch operational plans were similar to those of the PMF; however, there was not always a corresponding alignment of results from one report to another. As a result, the reported results may not be meaningful, nor provide a complete depiction of the true outcomes being achieved.

Recommendation 3

It is recommended that the Chief Financial Officer, in collaboration with branch assistant deputy ministers, request that branch inputs to the Departmental Operational Plan include the sources of the information required to assess performance, so that corresponding performance reports use the same metrics.

Management response

Management agrees with the recommendation.

The Chief Financial Officer Branch will request that branch inputs to the Departmental Operational Plan include the sources of information which will be used to assess performance.

3.5 Integration and use of performance information

Audit criterion: Consolidated financial and non-financial information is used to support planning, performance reporting, resource allocation and course corrections.

It is expected that performance reports are integrated into the planning and resource allocation functions of the Department. In addition, it is expected that resource allocation decisions are made on the basis of financial and non-financial factors, so that investment decisions are made not just on the basis of financial performance information (for example, variances), but also on the basis of expected results and actual performance.

In order to analyze the integration of information, the audit examined the mid-year review process, the year-end review process, Budget Day and the banking days. The analysis of the mid-year review of the branch operational plans and the annual review of the DOP identified minimal integration of financial information. In contrast, a review of the material used for the 2013-14 Budget Day showed that the key consideration is financial information, with limited integration of performance (non-financial) or risk information. However, better integration was found in the resource re-allocation exercise, as branches provided the Chief Financial Officer Branch with financial and non-financial information, including risk information related to the achievement of departmental priorities.

The integration of financial and non-financial information in these reports is important for a complete review of performance, in support of management decision-making, particularly if course corrections or resource allocations are required.

Currently the Department does not have a single framework or guidance to describe how risk as well as financial and non-financial performance information should be integrated in performance reporting and used for decision-making (see Recommendation 1).

C. Conclusion

Performance reporting in large organizations is challenging especially as it relates to coordinating and efficiently leveraging a vast amount of performance information from a variety of disparate sources. Conveying a consistent "performance story" across multiple reports, aimed at different stakeholders, is also complex and characterized by risks related to the comparability of results and the reliability of information.

Much progress has been made in the Department's performance reporting regime. A strong "tone at the top" from senior management is contributing to an emphasis on management and monitoring practices. Each year, the Deputy Head states that managers are expected to implement their plans, track activities, address variances and report against their commitments. This creates a requirement for quality performance reporting and is working well to help reinforce a culture of accountability and dialogue on the results. It is also noted that the Department continues to strengthen its practices. In addition, efforts to streamline reporting requirements are being implemented at the same time as reporting requirements from Treasury Board of Canada Secretariat are themselves evolving.

Overall, it can be concluded that the governance, risk management and control processes in place to support departmental performance reporting are adequate and effective. Some areas of improvement have been noted which will collectively enhance the efficiency of reporting processes and the robustness of information used for results-based decision-making.

Appendix A - Lines of enquiry and criteria

Audit of Performance Reporting
Each criterion title below is followed by the corresponding audit criteria.

Line of Enquiry 1: Governance

The Department has adequate and effective governance mechanisms to ensure accountability, scrutiny and review related to its key performance reports.

1.1 Roles, responsibilities and accountabilities: Roles, responsibilities and accountabilities for performance reporting are defined and effectively communicated. Senior management is held accountable according to these expectations.
1.2 Governance regime: There is a well-defined and applied governance regime that enables the review, challenge and oversight of the Department's key internal and external performance reports.

Line of Enquiry 2: Risk management

Risk information is integrated into key performance reports to provide a holistic picture of results.

2.1 Consolidated, comprehensive reporting: Departmental and branch performance reports consolidate risk information with financial and non-financial performance information.

Line of Enquiry 3: Internal controls

The Department has adequate and effective internal controls in place to structure data collection and reports and to manage and use performance information in support of reporting.

3.1 Reporting frames: Performance reporting frames and corresponding planning and reporting instruments that exist at different levels of the Department are aligned and consistent with one another, where appropriate.
3.2 Common guidance and practice: Common departmental guidance and practice exist to outline all expectations for performance reporting, including, but not limited to, control requirements related to data capture and quality assurance.
3.3 Reporting against plans: Performance reporting is done in accordance with planned results and measures (both financial and non-financial).
3.4 Information inputs: The performance reporting process is supported by efficient access to relevant, substantiated and timely informational inputs.
3.5 Integration and use of performance information: Consolidated financial and non-financial information is used to support planning, resource allocation and course corrections.

Appendix B - Scorecard

Scorecard
Each criterion below is listed with its rating, the audit conclusion, and the related recommendation number (Rec #), where applicable.

Governance

1.1 Roles, responsibilities and accountabilities (Rating: S): Performance reporting roles, responsibilities and accountabilities of key positions are clearly articulated, governance bodies are relatively well established, and senior managers are held accountable for the results. (Rec #: -)
1.2 Governance regime (Rating: S): The governance regime in the Department is robust and appropriately focussed on the review and challenge of performance reports. There is appropriate scrutiny and review of performance reports. (Rec #: -)

Risk management

2.1 Consolidated, comprehensive reporting (Rating: NMO): Performance reports do not all contain an integrated depiction of risk as well as financial and non-financial information, which may make it difficult for senior managers to make informed decisions. (Rec #: 1)

Internal controls

3.1 Reporting frames (Rating: NMI): There is alignment of the reporting frames from the Departmental Operational Plan (DOP) to the Departmental Performance Report (DPR), which assists in rolling up results efficiently. (Rec #: -)
3.2 Common guidance and practice (Rating: NMO): Clear and common guidance on performance expectations is important to support the consistent preparation of performance reports, thereby reinforcing accountability and governance. While reasonable guidance material exists, aspects of it can be further clarified and made more comprehensive. (Rec #: 1)
3.3 Reporting against plans (Rating: NMO): For the DPR and the annual review against the DOP, the results committed to in the plans (Reports on Plans and Priorities and DOP respectively) are being reported as expected. Less clarity exists in relation to the mid-year review at the branch level. (Rec #: 2)
3.4 Information inputs (Rating: NMO): Sources of data used for the preparation of the DPR were clearly identified; this was not the case, however, for the branch and departmental operational plans. (Rec #: 3)
3.5 Integration and use of performance information (Rating: NMO): The Department does not have a single framework or guidance document describing how risk, financial and non-financial performance information should be integrated in performance reporting and used for decision-making. (Rec #: 1)

S = Satisfactory
NMI = Needs Minor Improvement
NMO = Needs Moderate Improvement
NI = Needs Improvement
U = Unsatisfactory
UKN = Unknown / Cannot Be Measured

Appendix C - Linkage between planning and reporting instruments

Linkage between planning and reporting instruments

figure

Text description

Overall description: The purpose of the appendix is to illustrate the linkages and relationships between planning instruments and departmental performance reports. The key internal and external departmental performance reports are the Deputy Minister Dashboard and the Departmental Performance Report (DPR) respectively. The performance measurement planning instrument in support of the DPR is the Performance Measurement Framework (PMF). The PMF sets out how the Department demonstrates accountability for the financial and non-financial appropriations that federal organizations receive by aligning program activities to a set of high-level outcomes.

The graph shows three levels of planning instruments (boxes on the left), which feed into the departmental performance reports (boxes on the right).

The first level planning instrument is the Program Alignment Architecture (PAA) and the Performance Measurement Framework (PMF) (1st box on the left) which encompasses:

  • Strategic outcome
  • Program activity
  • Expected results
  • Output
  • Performance indicator
  • Performance indicator details

This planning instrument informs the Departmental Performance Report (DPR) (1st box on the right) which encompasses:

  • Strategic outcome
  • Program activity
  • Expected activity/Expected results
  • Performance indicators
  • Targets
  • Actual results

The second level planning instrument is the Departmental Operational Plan (2nd level box on the left) which encompasses:

  • Key Initiatives
  • Branch and Program Activity Architecture (PAA)
  • Expected Result/Output
  • Performance Indicator
  • Performance Target

This planning instrument informs the Departmental Mid-Year Status & Progress Report (2nd level middle box) which encompasses:

  • Strategic outcome
  • Program activity
  • Performance Result by branch contribution
  • Operational Priorities
  • Branch and Program activity
  • Key Initiatives
  • Performance Assessment

The 3rd level planning instrument is the Branch Operational Plan (3rd level box on the left) which encompasses:

  • Program activity priorities and deliverables
  • Key Initiatives
  • Output
  • Performance Indicator Target

The Branch Operational Plan (3rd level box on the left) informs two elements: 1) the DOP (2nd level box on the left) and 2) the Branch Mid-Year Status and Progress Report (3rd level box in the centre) which encompasses:

  • Strategic outcome
  • Program activity
  • Performance Result by branch contribution
  • Operational Priorities
  • Branch and Program activity
  • Key Initiatives
  • Performance Assessment

The DOP (2nd level box on the left) and the Branch Operational Plan (3rd level box on the left) both inform the Branch Input – Year-End Review (2nd level box on the right), which encompasses:

  • Strategic outcome
  • Performance Indicator Targets
  • Performance assessment
  • Operational Priorities
  • Key Departmental Priorities
  • Branch and Program activity
  • Year-end performance status rating
  • Strategic outcome
  • Key Initiatives
  • Branch and Program activity
  • Expected result/output
  • Performance Indicator
  • Performance target
  • Year-end status

The 4th level reporting instrument is the Deputy Minister Dashboard (4th level middle box), an operational monitoring tool used to identify areas of concern, potential risks and operational backlogs. It encompasses:

Program Activity:

  • Workload Performance of Pre & Post Regulatory:
    • Applications, Decisions & Inspections
    • Workload Targets, Timelines & Backlogs

Internal Services:

  • Financial Position (Budget vs. Actual)
  • Internal Service Performance
  • Performance Targets & Timelines 