3. Evaluation design

3.1. Purpose and scope

The evaluation was conducted to meet the requirements of the Financial Administration Act and the 2009 Treasury Board of Canada Policy on Evaluation, which require that all ongoing grants and contributions programs and direct program spending be evaluated at least once every five years.

The evaluation initially addressed the five-year time frame from 2010–2011 to 2014–2015, but was updated with more recent information from 2015–2016 and 2016–2017. The scope of the evaluation covers ECCC activities related to water quality monitoring, research activities and laboratory support, including activities supported through G&Cs. The following activities were excluded from the scope of the evaluation:

  • Activities related to the research framework for climate change adaptation, because this work is being evaluated within the context of the Evaluation of the Clean Air Agenda Adaptation Theme;
  • ECCC’s hosting and support of UNEP GEMS, because this activity has ended and there was no requirement identified by management for a “lessons learned” assessment of this work; and
  • Activities that were transferred out of sub-program 1.2.1 in 2015–2016 and that are the focus of other specific evaluations (e.g., the Great Lakes Program, the Lake Winnipeg Program and the St. Lawrence Program).

Additionally, activities related to oil sands monitoring were given limited focus in the evaluation, as these activities were recently reviewed by the Commissioner of the Environment and Sustainable Development (CESD)Footnote 9 and the Expert Panel on Assessing the Scientific Integrity of the Canada-Alberta Joint Oil Sands Monitoring.Footnote 10

3.2. Evaluation approach and methodology

The following methodologies were used to collect evidence for the evaluation.

Document review. Relevant documentation on the management, delivery and results of core activities related to water quality and aquatic ecosystems health was reviewed and key points related to the evaluation questions were identified and summarized. This included foundational documents, previous evaluations and audits of program components, partner agreements, policy documents, strategies, work plans, performance reports and other strategic documents.

Bibliometric analysis. An analysis of the Program’s publications related to water quality and aquatic ecosystems health was conducted to assess productivity and alignment with the core activities covered by the Program. Given that scientists within the WSTD are active members of the wider scientific community, collaborations and linkages at the national and international levels were also assessed as an indicator of impact. The bibliometric analysis covered unique publications from 2010 to 2014 and analyzed the raw data by year, number of authors, number of unique journals, diversity of institutions and external collaborations.
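
The tallies described above can be derived directly from raw publication records. The following minimal Python sketch shows one way such counts could be computed; the field names and sample records are hypothetical and are not drawn from the Program’s actual dataset.

    import pandas as pd

    # Hypothetical publication records; in practice these would be exported
    # from a bibliographic database. Field names are illustrative only.
    pubs = pd.DataFrame([
        {"year": 2011, "journal": "Environmental Monitoring and Assessment",
         "n_authors": 4, "institutions": ["ECCC", "University of Alberta"]},
        {"year": 2011, "journal": "Water Research",
         "n_authors": 6, "institutions": ["ECCC", "USGS"]},
        {"year": 2013, "journal": "Environmental Monitoring and Assessment",
         "n_authors": 3, "institutions": ["ECCC"]},
    ])

    # Productivity: publications and average number of authors per year.
    by_year = pubs.groupby("year").agg(
        publications=("journal", "size"),
        mean_authors=("n_authors", "mean"),
    )

    # Breadth: unique journals and the diversity of contributing institutions.
    unique_journals = pubs["journal"].nunique()
    all_institutions = set().union(*pubs["institutions"])

    # External collaboration: publications with at least one co-author
    # institution other than ECCC.
    external = pubs["institutions"].apply(
        lambda inst: any(i != "ECCC" for i in inst)).sum()

    print(by_year)
    print(unique_journals, len(all_institutions), external)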

G&C project file review.Footnote 11 A sample of 17 files from the 40 G&C projects that were funded between 2010 and 2014 was reviewed. The sample included: single-year and multi-year agreements; a range of research categories; a variety of Canadian universities and research institute recipients across all regions; and a range of funding amounts. This data collection method addressed the evaluation questions related to expected outcomes, appropriateness of design, efficiency, and performance information.

Key informant interviews. A total of 41 interviews were conducted with program representatives, provincial and territorial partners, and other external stakeholders to obtain a cross-section of views. The distribution of interviews by respondent category is shown in Table 2.

Table 2: Key informant interviews conducted by category
Stakeholder category | Interviews conducted
ECCC program managers and scientists in national and regional offices | 17
Program partners representing provincial and territorial agreements, and international and domestic water boards | 14
Key external data users of ECCC water quality monitoring data | 7
G&C project funding recipients | 3
Total | 41

Case studies. Two case studies were conducted to provide additional insight for the evaluation:

  • Pilot project implementing the risk-based approach (RBA) tool (power analysis) for the PPWB’s water quality data. This case study examined the 2011–2012 pilot project, which assessed the ability of current sampling strategies in the watershed to detect long-term trends and to contribute to optimized monitoring strategies (a simplified sketch of this type of power analysis follows the case study descriptions).
  • CABIN’s “network of networks” approach for inter-agency collaboration and data sharing. This case study examined the use of CABIN for collaboration and data sharing among stakeholders.

The method used in the case studies included targeted interviews with five ECCC representatives and three external stakeholders, and a review of relevant documentation.
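
As an illustration of the kind of analysis carried out in the first case study, the sketch below estimates, by simulation, the statistical power of a sampling design to detect a long-term trend, using Kendall’s tau as a simple stand-in for a Mann-Kendall trend test. It is a minimal, hypothetical example: the slope, variability, sampling frequencies and record length are invented for illustration and do not reflect the PPWB’s data or the actual RBA tool.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    def trend_detection_power(n_years, samples_per_year, slope, sd,
                              alpha=0.05, n_sim=2000):
        """Estimate the probability that a monotonic trend of a given size
        is detected (p < alpha) for a given sampling frequency, using
        Kendall's tau on simulated concentration records."""
        t = np.linspace(0.0, n_years, n_years * samples_per_year,
                        endpoint=False)
        detected = 0
        for _ in range(n_sim):
            # Simulated concentrations: linear trend plus random noise.
            y = slope * t + rng.normal(0.0, sd, size=t.size)
            detected += stats.kendalltau(t, y).pvalue < alpha
        return detected / n_sim

    # Compare quarterly and monthly sampling for detecting a trend of
    # 0.02 mg/L per year against a residual standard deviation of
    # 0.15 mg/L over a 10-year record (all values are illustrative).
    for freq in (4, 12):
        power = trend_detection_power(n_years=10, samples_per_year=freq,
                                      slope=0.02, sd=0.15)
        print(f"{freq} samples/year: estimated power = {power:.2f}")

In a pilot of this kind, such power estimates could be used to judge whether the current sampling frequency is sufficient to detect trends of management interest, or whether it could be reduced without losing that ability.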

3.3. Challenges and limitations

Challenges that arose during the evaluation, as well as the related limitations and strategies used to mitigate their impact, are outlined below:

  • The Program went through a significant transition during the evaluation time frame, which posed a challenge for assessing its performance over time.Footnote 12 To address this challenge, interviewers were trained to seek clarity regarding the time frame applicable to interviewees’ comments. Additionally, findings are presented in a way that acknowledges the changes that took place and focuses on the most recent Program activities and results.
  • The Program is highly interconnected and plays a supporting role for many other programs, making it difficult to attribute impacts solely to the Program. The evaluators recognized the interdependencies among the various programs and, where possible, clearly delineated findings and relevant links. Emphasis was placed on examining the Program’s contribution to the achievement of the intended outcomes, rather than attributing that achievement to the Program alone.
