1. Introduction

The purpose of this formative evaluation was to assess whether the obligations of the Canadian Environmental Protection Act, 1999 (CEPA 1999) are being fulfilled and whether Environment Canada has undertaken the required actions to meet the Act's intent.

A renewed CEPA 1999 received Royal Assent on September 14, 1999, and was proclaimed in force on March 31, 2000. This Act is the federal government's primary piece of environmental protection legislation. It promotes pollution prevention and the protection of the environment and human health in order to contribute to sustainable development. This Act is more than double the length of the previous Canadian Environmental Protection Act (CEPA 1988), increasing from 149 sections to 356 sections. It provides new authorities that allow the Minister of the Environment (the Minister) flexibility in achieving environmental results. It also establishes requirements, many with specified time frames, that the Minister must meet.

Section 343 of CEPA 1999 requires a review of the Act to be conducted every five years by a committee of one or both of the Houses of Parliament. Such a review is expected in 2005. In support of the upcoming Parliamentary Review, the Minister of the Environment and the Minister of Health are expected to table a formal joint submission to Parliament in advance of the committee's work. This formative evaluation complements the submission of the Ministers. It provides an evidence-based, independent evaluation of progress to date in meeting the Act's obligations and the Department's expected outcomes for CEPA 1999 and its various Parts.

This document contains the results of the "Formative Evaluation of CEPA 1999." It is organized into four main sections:

In addition, the report includes a number of appendices containing more detailed information on key departmental outputs during the evaluation period. Finally, a summary of the views of stakeholders interviewed in support of this evaluation has been included as Annex 1 to the report.

The objectives of the evaluation were to:

CEPA 1999 is a federal law jointly administered by Environment Canada and Health Canada with the primary purpose of protecting the environment and contributing to sustainable development through pollution prevention. The Act aims to integrate environmental factors into all decision-making by government and private entities.

The Act commits the Government of Canada to implementing pollution prevention as a national goal and as the priority approach to environmental protection. The Act provides the federal government with new tools to protect the environment and human health and provides a framework for protecting Canadians from pollution caused by "toxic" substances.Footnote vi  The Act ensures that the potential risks posed by chemical substances and biotechnology products are properly assessed, establishes strict deadlines for controlling certain toxic substances, and requires the virtual elimination of inherently toxic substances that are bioaccumulative, persistent, result primarily from human activity and are not naturally occurring.

Through CEPA 1999, the Government of Canada expects to demonstrate national leadership, work to minimize overlap and duplication and increase harmonization across and within Canadian jurisdictions.

Box 1 - CEPA 1999 Guiding Principles

CEPA 1999 is described as being an "enabling" Act. However, it also imposes a significant number of obligations on Environment Canada, while maintaining many of the obligations that had been established for CEPA 1988 (see Appendix I). Many of these obligations are unique among Canadian jurisdictions. For example, the Act requires the Ministers of the Environment and Health to consider the risks posed by all substances included on the Domestic Substances List and, where appropriate, mandates the Department to propose and finalize risk management measures and tools within specified timelines.

CEPA 1999 is organized into 12 major Parts:

To achieve the objectives of CEPA 1999, Environment Canada works in partnership with Health Canada, other departments, the provinces and territories, Aboriginal governments, industry and the public.

The governance of CEPA 1999 is the responsibility of a number of parties:

The Department has established accountabilities for delivering on the Ministerial obligations and intended outcomes associated with each Part of the Act. These are documented in Appendix II.

The evaluation assessed departmental progress in achieving the Department's expected outcomes for CEPA 1999 and its various Parts over the period from its entry into force on March 31, 2000, through December 31, 2004.Footnote vii  The evaluation includes all the programs and activities established under CEPA 1999, including all the programs previously under CEPA 1988 that were continued under CEPA 1999. The evaluation focused on the first 11 substantive Parts of the Act.Footnote viii 

During the evaluation period, the administration of CEPA 1999 fell under the Department's Clean Environment Business Line umbrella (one of four business lines in the Department at the time). The evaluation includes all programs and activities conducted under the Clean Environment Business Line's Air Result and Toxics Result,Footnote ix  with the exception of environmental assessment.Footnote x  Relevant to CEPA 1999, the Clean Environment Business Line had the following expected outcomes:

Obligations and activities undertaken by Health Canada are excluded from the evaluation, as that department is conducting its own evaluation. The Act's obligations with respect to the role and activities of the Governor in Council (Cabinet) are also excluded. CEPA 1999 also places a significant number of obligations on Persons (e.g., no unauthorized use or disposal of identified substances or wastes). The evaluation undertook an indirect examination of the extent to which these obligations are being met by considering the effectiveness of departmental programs and procedures, including its compliance and enforcement activities. No independent assessment of the actual compliance of Persons with the various obligations was undertaken, however.

Finally, the evaluation did not directly assess the actions and performance of other jurisdictions with equivalency agreements or agreements respecting administration with Aboriginal governments and peoples. Instead, the evaluation focused on the information-sharing and verification and assurance mechanisms instituted by the Department to ensure that the Act's obligations and objectives are satisfied.

CEPA 1999 is an enabling tool to protect human and environmental health. However, the "use" of the Act alone is not an effective measure of its achievements. In some cases, various Parts of the Act have not yet been used because there is no rationale at this point to apply the particular provisions. In other cases, different Acts, tools and/or instruments can be more effectively used to meet the intent of the CEPA 1999 provisions (e.g., the use of the Fisheries Act or Canada-wide Standards). As a result, the evaluation does not assess whether all Parts of the Act are used and/or used equally, but rather whether the obligations in the Act are being fulfilled and whether the Department is organized in a manner that will enable progress on meeting the intended objectives of the Act.

The evaluation is formative in nature. It measures progress towards achieving results, such as whether systems and procedures are in place to implement the Act and whether the Department is on track to eventually achieve the intended outcomes. It does not evaluate actual environmental outcomes. The Act has not been in place long enough to fully evaluate its impact in terms of environmental results. Actual environmental improvements are noted where evident, however.

Responsibilities for delivering on the various obligations under CEPA 1999 are distributed across the Department, as noted above. Through the 2001-02 CEPA Operational Review exercise, the Department conducted an examination of implementation progress, using an approach that corresponded to the Department's program structure. For the purposes of the upcoming Parliamentary Review, however, the decision was taken to organize the evaluation framework in a manner that corresponds directly to the individual Parts of the Act, as Members of the Parliamentary Committee may be unfamiliar with the Department's organization and likely will want to know the impact of the Act itself. The evaluation focuses on the degree to which CEPA 1999 has enabled the achievement of the expected outcomes of CEPA 1999 (as documented in Appendix IV) rather than on the use of various Parts or instruments.

This evaluation is "evidence-based." That is, its conclusions and recommendations are based on objective, quantitative and documented evidence to the fullest extent possible. The evaluation was conducted in accordance with the work plan described in the Evaluation Plan prepared by the Department's Audit and Evaluation Branch. The major project Phases included:

During Phase I, a project initiation meeting was held with the Department's Evaluation Committee to review and confirm the project's scope and objectives; clarify roles and responsibilities; and finalize the evaluation work plan. A brief examination of the available documentationFootnote xi  was conducted to gain a better understanding of the range of documentation available to support the evaluation and to identify any shortcomings. A series of evaluation instruments were then developed, including:

The completed evidence collection templates, summary evidence templates, interview notes and other working notes were submitted to the Audit and Evaluation Directorate upon completion of the evaluation.

In Phase II, the evaluation instruments were applied to a review of the documentation made available to the evaluation team. Gaps in the evidence base were documented, and interviews were scheduled with the Department's Accountable Leads (see Appendix II). The Leads were provided with a summary of the gaps in the documentation for their areas of accountability in advance of the interview. The initial analysis was then updated to incorporate any additional documentation or information made available through the interview process.

Forty-five stakeholders were contacted through a parallel process and invited to provide input to the evaluation. Of those contacted, 35 accepted the invitation and agreed to be interviewed. (Appendix VIII includes a list of the individuals and organizations interviewed.) As stakeholder views provided limited hard "evidence" in support of the evaluation, these views have been summarized and presented separately as Annex 1 to this report.

In Phase III, the evidence within each area of relevance was analyzed, and preliminary findings were developed. The preliminary findings provided an assessment for each of the evaluation criteria within each Part and sub-Part of the Act. The preliminary findings were presented to the Evaluation Committee.

Presentations on the preliminary findings that were relevant to their individual areas of accountability were then made to the Department's Accountable Leads. The Leads were asked to identify errors and omissions and were requested to provide additional sources of evidence in instances where the preliminary findings were considered to be in error. Additional evidence received was analyzed accordingly, and the preliminary findings were updated.

Draft evaluation findings were then prepared. These draft findings consisted of three presentation decks:

In Phase IV, the Evaluation Committee was briefed on the draft findings of the evaluation. Subsequent debriefs were conducted with:

The individuals briefed were able to provide comment and feedback on the draft findings. Additional evidence received was analyzed, and the draft findings were updated.

In Phase V, the draft evaluation findings were documented as a Draft Report, which was submitted to the Evaluation Committee, Accountable Leads and Regional Directors General. Reviewers were asked to provide written comments. The Draft Report was adjusted, where appropriate, and a Final Evaluation Report was prepared and submitted.

There are a number of limitations associated with the methodology used in the evaluation. These include the following:
