3. Evaluation design

3.1 Purpose and scope

The evaluation was identified in ECCC's 2014 Departmental Risk-Based Audit and Evaluation Plan. It was conducted in fiscal years 2014-15 and 2015-16 to meet commitments to evaluate the initiative and to respond to the Treasury Board Policy on Evaluation requirement that all direct program spending be evaluated at least once every five years.

Data were collected and analyzed for all 10 program elements. However, this was an evaluation of the overall CAA Adaptation Theme rather than of the individual program elements. As such, the findings presented reflect the overall results for the Theme, and the degree of detail provided at the individual program level is limited.

Data collection for nine of the elements was conducted by R.A. Malatest and Associates Ltd between October 2014 and March 2015. TC’s Evaluation and Advisory Services conducted a separate, independent evaluation of its NTAI in fiscal year 2014-15, the results of which were integrated into this evaluation report.

The evaluation covers the period from 2011-12 to mid-2014-15, which is approximately two-thirds of the funding period, although relevant activity in the remainder of 2014-15 was also considered. The evaluation examined issues of relevance and performance (including effectiveness, efficiency and economy).

3.2 Evaluation approach and methodology

The following data collection methodologies were used to address the evaluation issues and questions. Further details on the evaluation methodologies can be found in Appendix D.

Document review

Documents from the program elements, as well as the Adaptation Theme overall, were reviewed. Thematic analysis was used to summarize the information from the documents under each evaluation question.

Data review

This review was primarily based on the performance data reported for indicators identified in the Adaptation Theme Performance Measurement Framework (PMF) for nine of the program elements as of November 2014. Although the evaluation assessed program performance relative to the outcomes listed in the revised logic model, the PMF nonetheless contained performance measurement information that was relevant to these revised outcomes.

Key informant interviews

Key informant interviews were conducted with program representatives and individuals external to the program. Since data collection for the NTAI was conducted separately as part of TC's evaluation of that program, the interviews covered nine of the 10 program elements. A total of 90 interviews were conducted with 89 key informants (one individual provided feedback on two program elements). Thirty-five interviewees (roughly two in five) were senior management and program managers directly involved in delivering the program elements, while the remaining 54 were stakeholders not directly involved in program delivery, including federal representatives; provincial, territorial and municipal organizations; Indigenous peoples; academics; and industry.

Online survey

An online survey of external stakeholders was conducted between January 27, 2015 and February 24, 2015 to collect feedback on the relevance of the initiative and the achievement of outcomes. External stakeholders from eight of the 10 program elements were invited to respond: the ECCC CCPSP was excluded because its small number of stakeholders (fewer than 20) was not sufficient to generate reliable survey results, and the NTAI because TC had already completed its own evaluation of that program. A total of 148 of the 638 individuals sampled responded, a 23% response rate. Respondents were invited to answer questions for up to two of the program elements with which they were most familiar.[Footnote 1]

Case studies

Case studies were conducted for four Adaptation Theme initiatives:

  • SakKijanginnatuk Nunalik: Building Sustainable Communities in the Face of Changing Climate, with a focus on the Municipal Water Sub-Project (from INAC’s CARPAN)
  • Coastal Infrastructure Vulnerability Index (from DFO’s ACCASP)
  • Forest Change Initiative (from the NRCan Adaptation Program)
  • Climate Change, Adaptation and Acute Gastrointestinal Illness[Footnote 2] in the Canadian North (from PHAC’s PPHSACC)

Data were collected for the case studies through document and file reviews and interviews with key stakeholders.[Footnote 3] The intent of the case studies was not to generalize results to the program element overall, but rather to identify program insights, such as achievements, external factors, best practices and lessons learned.

3.3 Challenges and limitations

Challenges experienced during the conduct of the evaluation, and the related limitations and strategies used to mitigate their impact, are outlined below.

  1. Intended outcomes and performance measurement strategies were not fully identified for half of the program elements, and the quality of existing performance measurement strategies varied considerably (see section 4.2.4 for a more detailed discussion). Given the limited Theme-related performance data, the evaluation relied heavily on evidence from key informant interviews. To compensate, evidence from interviews was triangulated with that from other methodologies, such as the survey and, to the extent possible, the document review. Case studies were also used to gather more detailed information on selected projects from individual program elements and to identify insights that may be useful to other program elements.
  2. A horizontal evaluation of a relatively heterogeneous array of activities with different areas of focus (that is, health, environment and infrastructure) poses challenges for aggregating data to demonstrate the collective contribution of program element activities to Thematic outcomes. The diversity of Theme activities also complicated measurement (requiring customization of data collection instruments to reflect the 10 program elements) and the development of Theme-level findings. To address this challenge, the analysis considered the unique aspects of individual program elements and assessed how each element related to the overall Theme's relevance and performance. Further, survey and interview guides included standard questions to gather the same feedback for different program elements, to facilitate aggregation. While the evaluation considered each of the 10 program elements, the focus was on an assessment of the overall Adaptation Theme.
