Audit of External Reporting on Performance: Chapter 4


3. Findings and Recommendations

Overall, the audit found that adequate governance was in place at the departmental level for both the Departmental Performance Report (DPR) and the Canadian Environmental Sustainability Indicators (CESI). The controls in place over CESI reporting were operating effectively, ensuring that the CESI met the quality attributes. However, the governance for the DPR was not sufficiently supported by branch-level management processes and controls to compile, review and document the performance data and information and to ensure they are properly approved and presented for submission to the DPR. This allowed for variability in the quality of the performance data and information presented in the DPR.

3.1 Overall Governance for External Reporting

Governance would be expected to include at least:

  • Documented clear roles and responsibilities at corporate and branch levels;
  • Clear processes for planning, coordinating and reporting (including quality assurance);
  • Clear lines of authority for approvals.

The audit examined the governance exercised over the external reporting on performance for the DPR and the CESI. Consistent with previous reports of the Office of the Auditor General of Canada and with the TB Policy on Management, Resources and Results Structures (MRRS), it was expected that Environment and Climate Change Canada (ECCC) would have proper governance in place that ensures the three elements discussed in turn below.

Coordination and control of the various performance measurement frameworks in the Department

Senior management roles, responsibilities and approval processes were in place for both DPR and CESI reporting. The Deputy Minister has delegated to the Finance Branch responsibility for updating the Performance Measurement Framework (PMF) and managing the cycle for the Report on Plans and Priorities (RPP) and the DPR according to Treasury Board Secretariat (TBS) guidelines. The Strategic Policy Branch (SPB) has the responsibility for CESI reporting. The Deputy Minister (DM) and assistant deputy ministers (ADMs) provide final approval of all DPR and CESI reporting.

Coordination and control of the various performance measurement frameworks in the Department, such as for DPR and CESI reporting, did not present any notable difficulties. Good informal cooperation among the branches existed to reduce duplication of effort. For example, many of the CESI supported by the Science and Technology Branch (STB) had been used in the preparation of the DPR reporting.

For RPP and DPR reporting, senior management roles and responsibilities have been assigned, there was a functioning system of approvals, and senior management provided overall supervision of the maintenance of the PMF and RPP/DPR reporting.

The governance over CESI reporting was functioning in a proper manner. A description of some of the related governance processes can be found in more detail in section 3.2 on CESI.

Integration of relevant data and information

To integrate relevant data and information into the DPR, the Finance Branch, through its Corporate Management Directorate (CMD), communicates to the various branches the TBS requirements and formats for updating the Program Alignment Architecture (PAA) and PMF and preparing the RPP and DPR. The branches retain the responsibility for the preparation and quality of RPP and DPR content, including performance reporting against expected results.

Each branch has established its own measures and routines for developing and reviewing the content of the DPR. Over the past few years, the CMD has provided considerable informal assistance to the branches for their PAA, PMF and DPR submissions. For example, the CMD has helped the branches with the development of their outcome definitions (expected results), the development of their performance indicators and targets, and the wording for inclusion in the DPR. However, there was no additional formal guidance on appropriate management processes and controls to compile, review and document the performance data and information and to ensure they are properly approved and presented for submission to the DPR.

As observed below in section 3.3 on performance data and information in the DPR, branch processes and controls did not always sufficiently support departmental governance for the production and quality of DPR performance information. This led to variability in the quality of performance data and information in the DPR.

A formal system of lessons learned and follow-up

The audit observed recent improvements in formalizing lessons learned. Until recently, lessons learned for DPR reporting at the CMD and in the branches were not formally recorded and followed up. Without some form of formal departmental documentation, past lessons learned may already have been lost or may not be retained in future.

During the audit, the Finance Branch, led by the CMD, initiated a formal discussion of lessons learned with selected planners from across the Department, who are also members of the Planning Collaboration Network (PCN). The CMD initiated the use of information sharing / coordination software (SharePoint) to record CMD actions for updating the 2014-2015 PMF and to document PCN activities. By the end of the audit, the PCN had met twice to discuss lessons learned. The members contributed many ideas for improving DPR external reporting, such as the breakdown of the overall process into smaller, more manageable stages to help provide clarity and improve the timeliness of branch submissions. Other suggestions included making better use of the SharePoint webpages, improving the CMD's and the Network's written and verbal guidance, and ways to improve both overall timeliness and the approval processes. The CMD documented the meetings on SharePoint. These events occurred near the end of the audit, and we observed no subsequent activities or follow-up on the recorded lessons learned.
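
To illustrate the kind of record-keeping the audit looked for, the sketch below shows a minimal lessons-learned log entry with an explicit follow-up field. The schema, field names and sample entry are illustrative assumptions, not the Department's actual SharePoint structure.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class LessonLearned:
    """One recorded lesson with enough fields to support follow-up.
    The schema is an illustrative assumption, not ECCC's actual SharePoint list."""
    description: str                     # what was learned
    source: str                          # e.g., which PCN meeting raised it
    raised_on: date                      # when it was recorded
    owner: str                           # who is accountable for acting on it
    action: Optional[str] = None         # agreed follow-up action, if any
    completed_on: Optional[date] = None  # stays empty until followed up

    @property
    def followed_up(self) -> bool:
        return self.completed_on is not None

# Hypothetical entry based on one suggestion noted during the audit:
lesson = LessonLearned(
    description="Break the DPR process into smaller stages to improve timeliness",
    source="PCN lessons-learned meeting",
    raised_on=date(2013, 11, 1),  # invented date
    owner="CMD",
)
print(lesson.followed_up)  # False until a completion date is recorded
```

A record structured this way makes the audit's concern testable: an entry with no completion date is, by definition, a lesson recorded but not yet followed up.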

3.2 Controls over the quality of Canadian Environmental Sustainability Indicators

ECCC reports its Canadian Environmental Sustainability Indicators (CESI) online. While these indicators are scientific in nature, they are also used to measure progress in terms of the goals and targets of the Federal Sustainable Development Strategy (FSDS), report to Canadians on the state of the environment and describe Canada’s progress on key environmental sustainability issues.

Canadians access CESI information for personal, academic and economic reasons, on average close to 300 times per day. Over the past two years, ECCC has increased the number of indicators reported to over 50. These indicators are important to ECCC's profile in the community and to its scientific reputation.

The indicators, supported by scientific methodology, are updated as new data become available. The internet posting of an indicator contains links to a “Data Sources and Methods” webpage, which describes how the data are collected and their limitations, and to a webpage that describes the indicator in detail. It may also contain links to related topics.

ECCC’s Strategic Policy Branch (SPB) leads the process and its Quality Manual for CESI sets out steps to be followed, provides templates and describes the required documentation procedures. In consultation with branches or other government departments, the SPB develops new indicators and updates existing ones on a cyclical basis.

During the development or cyclical update of an indicator, the SPB prepares a brief summary sheet for discussion with its partners in the internal review. The summary sheet contains a policy relevance rating, a utility rating, a methodology soundness rating and a data availability rating.

Each indicator being developed or updated must go through a rigorous technical review by the SPB together with the relevant Environment and Climate Change Canada program or other government departments. For the purpose of documenting this review, each comment or suggestion received is put into a disposition table, which accumulates all the interventions and their disposition. Once there is consensus that the science is sound and the proper review has been carried out, approval processes are followed, leading to senior management and Deputy Minister (DM) approval before reporting. The SPB uses information sharing / coordination software (SharePoint) to record and store all the documentation of work done. The files are organized by indicator, allowing retrieval for future updates and lessons learned.
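
As a rough illustration of how such a disposition table can accumulate interventions, here is a minimal sketch; the field names, statuses and sample comment are assumptions for illustration, not the SPB's actual template.

```python
from dataclasses import dataclass

@dataclass
class ReviewComment:
    reviewer: str     # e.g., program expert or other government department
    comment: str      # the intervention received during technical review
    disposition: str  # how it was resolved: "accepted", "rejected" or "deferred"
    rationale: str    # why, so the record supports future updates

# The table accumulates every intervention received for one indicator.
disposition_table: list[ReviewComment] = []

def log_comment(reviewer: str, comment: str, disposition: str, rationale: str) -> None:
    """Append one intervention and its resolution to the indicator's record."""
    disposition_table.append(ReviewComment(reviewer, comment, disposition, rationale))

log_comment("Program scientist", "Clarify the sampling period", "accepted",
            "Data Sources and Methods page updated")

# Consensus is plausible once no intervention remains unresolved.
unresolved = [c for c in disposition_table if c.disposition == "deferred"]
print(f"{len(disposition_table)} interventions recorded, {len(unresolved)} unresolved")
```

The design point is that every intervention and its resolution is retained in one place per indicator, which is what allows the review to be demonstrated later.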

The audit examined the controls over the quality of CESI to verify whether they ensured that the reported indicators met the quality attributes of robustness, accuracy, reliability and timeliness according to the audit criteria. Control procedures were examined through inquiry, document examination and the audit of a small sample of indicators posted online to test the application of the quality controls.

The Audit and Evaluation Branch (AEB) observed that a workable framework and guidance were developed; the development and update of CESI processes were properly documented; and the review and approvals were properly carried out. The controls in place were operating effectively, ensuring that the external reporting of CESI met the quality attributes.

3.3 Performance data and information in the DPR

The Department’s annual RPP and annual DPR play an important role in the governmental accountability structure within the Expenditure Management System. They set out, respectively, the plans and the performance for each departmental program, in terms of both the resources devoted and the results achieved.

The 2012-2013 DPR sets out for each program and sub-program a description; the financial resources planned and spent; the human resources planned and used; a table which shows the expected outcome results with related performance indicator(s), target(s) and actual results; and a Performance Analysis and Lessons Learned section.

Parliament and the Treasury Board rely on the RPP and DPR as accountability documents in the Expenditure Management Process. These public documents provide the Department’s performance story to Parliamentarians and Canadians for the departmental programs and sub-programs.

In order to assess the quality of the data and information reported externally in the DPR, the AEB examined a sample of 18 out of a total of 48 expected results for the 34 programs and sub-programs reported in the 2012-2013 DPR. The AEB examined each sample item for its robustness, reliability, accuracy and timeliness. A description of the methodology and of each of the information and data quality attributes is provided in Annex 1.

Branch-level management processes and controls

The audit looked for branch-specific frameworks, controls and documentation for external reporting consistent with each branch’s share of DPR reporting responsibilities under the existing governance regime. It also looked for appropriate branch-level management processes and controls to compile, review and document the performance data and information, and to ensure the data and information presented for submission to the DPR are properly approved, so as to sufficiently support the departmental governance over DPR reporting. In particular, it looked for planning processes for external reporting, established roles and responsibilities for technical and editorial review, and a hierarchy of approvals for branch submissions to the DPR.

Based on the AEB’s examination of the sample items, none of the branches examined had a formal process for ensuring that management properly plans for and reviews material for external performance reporting. The branches relied on their designated coordinators to organize the collection of the required information for submission to the DPR, according to the TBS instructions communicated by the CMD under its annual reporting cycle. Each program or sub-program manager set their own expectations for quality and performed their own technical and editorial review before giving approval.

The branches and the CMD used emails and a variety of document formats and weblinks to record their work in preparing the DPR. For the purposes of the audit, several programs and sub-programs had difficulty producing sufficient documentation to demonstrate clearly the work that had been done to acquire and integrate the necessary data and information for their DPR submissions, and how these submissions were subject to appropriate technical review and approval. The AEB noted that the Meteorological Service of Canada (MSC) was developing a detailed work instruction in the context of the branch’s Quality Management System to ensure that the appropriate approvals are obtained and the appropriate records are retained.

Although the branch ADMs approved all program material for submission to the DPR, branch-level management and documentation processes and controls overall had not ensured consistently adequate quality of the data and information reported in the DPR.

Sample Results

For the 18 sample items examined, the AEB observed that four had met all four of the quality attributes and two had met none of the attributes. The detailed results of the testing have been shared with the respective branches. The results of the sample are summarized below:

  • Met all attributes: 4 sample items
  • Met some attributes, or would have met with sufficient explanation of the stated result in the DPR: 12 sample items
  • Met none of the attributes: 2 sample items
  • Total: 18 sample items
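
The summary categories follow mechanically from the per-item attribute tests. The sketch below shows the bucketing logic using invented per-item results; the actual item-level results were shared only with the branches and are not reproduced here.

```python
from collections import Counter

# Hypothetical per-item results: for each sample item, which of the four
# quality attributes (robustness, reliability, accuracy, timeliness) were met.
# The actual item-level results were shared with the branches, not published.
sample_items = {
    "item-01": {"robust": True, "reliable": True, "accurate": True, "timely": True},
    "item-02": {"robust": False, "reliable": True, "accurate": True, "timely": True},
    "item-03": {"robust": False, "reliable": False, "accurate": False, "timely": False},
}

def categorize(attrs: dict[str, bool]) -> str:
    """Bucket one item by how many attributes it met. The report's middle
    category also includes items that would have met the attributes with
    sufficient explanation in the DPR, a judgment a simple count cannot make."""
    met = sum(attrs.values())
    if met == len(attrs):
        return "met all attributes"
    if met == 0:
        return "met none of the attributes"
    return "met some attributes"

tally = Counter(categorize(a) for a in sample_items.values())
for category, count in tally.items():
    print(f"{category}: {count}")
```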

Based on the sample examination, the audit found material (see Annex 1 for definition) variability in the robustness, reliability, accuracy and timeliness of the actual results reported in the DPR. As described below, the reasons why sample items did not meet the quality attributes varied.

Regarding the quality attribute of robustness, in two cases Environment and Climate Change Canada was still trying to find suitable, consistent and repeatable data sources. In two other cases, Environment and Climate Change Canada was depending on external input that did not materialize. Where the robustness of data or information collection methods was poor, this may also have led to failure to meet the other quality attributes.

As an example of poor robustness, the expected result in one sample item (footnote 3) was “Reduced emissions from the implementation of new environmental technologies”. The AEB selected for audit the performance indicator “Annual reduction of emission of air pollutants (criteria air contaminants) resulting from environmental technologies supported”. The indicator had five specific targets, in megatons, for the reduction of certain pollutants. The Actual Results column in the DPR reported 16 tonnes of total air contaminant reductions in 2012; it also reported that amounts for individual substances were not available. Environment and Climate Change Canada needed input from three external bodies, but only one provided data in time to be included in the DPR actual results. The figure of 16 tonnes was based on the report of that one external entity. In addition, Environment and Climate Change Canada was not able to review or verify the validity of the reported figure.

As an example of a lack of accuracy, the expected result for a selected item (footnote 4) was “Targeted sectors have the meteorological and environmental information they need to operate efficiently and safely”. The performance indicator was the combined level of satisfaction of the four main clients of the Meteorological Service of Canada (MSC), including National Defence. The actual results reported included a statement that the number of responses from National Defence was too small to be included. Without the supporting data from National Defence, the reported figure of 7.8 out of 10 may not have been accurate.
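
To make the accuracy concern concrete, the sketch below shows how a combined satisfaction score depends on which clients are included. Only the 7.8 figure comes from the DPR; the per-client scores and client labels are invented for illustration.

```python
# Hypothetical per-client satisfaction scores out of 10. The survey data were
# not published; only the combined 7.8 figure appears in the DPR, and National
# Defence's responses were too few to be included.
scores = {
    "client_a": 8.1,
    "client_b": 7.6,
    "client_c": 7.7,
    # "national_defence": excluded -- response count too small
}

combined = sum(scores.values()) / len(scores)
print(f"Combined satisfaction (3 of 4 clients): {combined:.1f} / 10")

# Sensitivity check: a fourth score well below the others would pull the
# combined figure down, which is why the missing client matters for accuracy.
with_fourth = (sum(scores.values()) + 6.0) / (len(scores) + 1)  # 6.0 is invented
print(f"Combined satisfaction if a fourth client scored 6.0: {with_fourth:.1f} / 10")
```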

Regarding reliability, the AEB found that for some sample items the actual results reported were not reliably aligned with the PAA and PMF, as expressed by the performance indicator or the expected result, because the data and information collected did not adequately represent that indicator or result. Another main reason for unreliable information was a lack of year-over-year consistency: for some sample items, the data were collected in a different way, or the PAA or the indicator had changed since the previous year, often with no explanation provided for the inconsistency. For others, results were being reported for the first time and data limitations were not explained.

As an example of data and information collected that inadequately represented the performance indicator or the expected result, the expected result for a selected item was “Compliance with pollution laws and regulations administered by ECCC”. The performance indicator was “compliance with selected regulations reported under this indicator: Dry Cleaning Regulations (initial pilot; other regulations to be added)”. The Actual Results reported in the DPR related to a dry-cleaning regulation under the Canadian Environmental Protection Act, 1999 (CEPA), the Tetrachloroethylene (Use in Dry Cleaning and Reporting Requirements) Regulations, selected for the pilot. Recognizing the unique challenges of measuring enforcement actions, the pilot followed a rigorous methodology which produced accurate results. Although the actual result reported, a 51% compliance rate, was accurate in relation to that one regulation, the pilot regulation was only one of many regulations under pollution laws administered by ECCC referred to in the DPR expected result, and this was not fully explained in the DPR. As such, the stated result was not reliable for assessing achievement against the overall indicator or expected result, which focused more broadly on pollution laws and regulations. The program intends to apply the lessons learned from the pilot and to measure more regulations in future, to present the performance picture more reliably. Overall, there were material instances of variability in the data and information reported in the DPR.
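
The reliability concern reduces to simple arithmetic: a compliance rate measured for one regulation cannot, on its own, represent compliance across all the regulations the expected result refers to. In the sketch below, only the 51% pilot rate comes from the DPR; every other count is invented for illustration.

```python
# Compliance rate = regulatees found compliant / regulatees verified.
# Only the 51% pilot rate comes from the DPR; the counts below are invented.
pilot_compliant = 51
pilot_verified = 100
pilot_rate = pilot_compliant / pilot_verified
print(f"Pilot regulation compliance rate: {pilot_rate:.0%}")

# The expected result covers all pollution regulations ECCC administers;
# one measured regulation cannot represent that population on its own.
regulations_measured = 1
regulations_in_scope = 50  # hypothetical count of regulations in scope
coverage = regulations_measured / regulations_in_scope
print(f"Share of in-scope regulations measured by the pilot: {coverage:.0%}")
```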

3.4 DPR sections on Performance Analysis and Lessons Learned

The information provided in the DPR under the Performance Analysis and Lessons Learned section sets out the departmental programs’ or sub-programs’ highlights for the year. These usually do not relate specifically to the stated expected result, but focus on achievements for a variety of initiatives and activities.

From within the previously sampled items, the audit selected six highlight statements from the 2012-2013 DPR Performance Analysis and Lessons Learned sections for examination. Because of their level of detail and their relation to important program initiatives, these statements are vital to informing the reader of the DPR about the Department’s activities.

The AEB found that the controls over the quality of information included in the DPR sections on Performance Analysis and Lessons Learned were functioning effectively and ensured the robustness, accuracy and timeliness of the information. One weakness was noted regarding reliability: for all the items selected, the performance analysis was incomplete because the Department did not present the reported achievements in terms of the planned or expected result. For example, one highlight stated:

“Carried out over 59,000 compliance promotion activities in 2012-2013, most of which promoted 9 high-priority risk instruments related to two acts: Canadian Environmental Protection Act (CEPA) 1999 (specifically, the Chemicals Management Plan and the Clean Air Regulatory Agenda), and the Fisheries Act.”

Although the AEB verified that this statement was robust, timely, accurate and reliable in all other respects, it offers the reader no indication of whether this result was what was expected, or whether it represented an improvement or a setback compared with previous years.

3.5 Recommendations

Recommendation 1

In cooperation with the responsible program branches, the Assistant Deputy Minister (ADM) of the Finance Branch should provide guidance to assist the branches in ensuring their reported performance information is of acceptable quality and is well documented. In so doing, the ADM of the Finance Branch, with the cooperation of the branches, should consider adapting some of the review procedures and tools being used for CESI reporting.

Management Response

The Finance Branch agrees with the recommendation, but notes that it regularly provides guidance and assistance to the program branches to support performance reporting.

The Finance Branch will take steps to provide detailed guidance to program branches to support the preparation of performance information. In collaboration with program branches, this will include the examination of review procedures and tools employed by CESI, as well as other leading practices within program branches.

Recommendation 2

The ADMs of the Science and Technology Branch, Environmental Stewardship Branch, Meteorological Service of Canada, Enforcement Branch and Strategic Policy Branch should establish effective methods to properly document their submissions and review of performance information for the DPR.

Management Responses

Responses of the Science and Technology Branch, Environmental Stewardship Branch, Enforcement Branch and Strategic Policy Branch

Program branches will work with the Finance Branch to build on existing systems and best practices to establish effective methods to document their performance information submissions, taking into consideration program variations and complexities.

Methodologies will be based on guidance from the Finance Branch with respect to baseline requirements for effective documentation.

Meteorological Service of Canada’s Response

The Meteorological Service of Canada (MSC) will continue to build on existing systems and best practices, such as its already established International Organization for Standardization (ISO) 9001-certified Quality Management System (QMS), to establish effective methods to properly document its performance information submissions, committing the MSC to continuous improvement of its systems and processes.

Included within the QMS processes are strong governance and clearly identified accountabilities, such as the multi-branch, MSC-led Director General Leads Committee and the semi-annual management reviews held by the QMS Steering Committee, chaired by the Assistant Deputy Minister of the MSC.
