Evaluation of the Community Ecosystem Partnerships Program: Chapter 3


3.0 Evaluation Design

3.1 Purpose and Scope

The evaluation of the Community Ecosystem Partnerships (CEP) program (sub-sub-program 1.3.4.5 in EC's Program Alignment Architecture) was identified in the 2013 Departmental Risk-Based Audit and Evaluation Plan, which was approved by the Deputy Minister in the spring of 2013. The evaluation was conducted between September 2013 and March 2014 to meet the evaluation coverage requirements of the Financial Administration Act (for grants and contributions, or G&Cs) and the Treasury Board Policy on Evaluation (for direct program spending), which require that all ongoing G&Cs and direct program spending, respectively, be evaluated once every five years.

The purpose of this evaluation was to assess the relevance and performance (including effectiveness, efficiency and economy) of the CEP program, which represents 0.29% of the department's 2013-2014 direct program spending (DPS), including G&Cs. The evaluation covered the five-year period from 2008-2009 to 2012-2013 and focused on two regional programs within the CEP program: the AEI and the Okanagan-Similkameen PE. The evaluation did not examine all G&Cs in the West and North Region, only those related to the Okanagan-Similkameen PE (approximately 28% of all G&Cs disbursed by the CEP program in the region). Because ecosystem activities in the Prairie and Northern Region ceased in 2010-2011, they were not examined in this evaluation.


3.2 Evaluation Approach and Methodology

The methodological approach and level of effort for this evaluation were determined using a risk-based approach. The following data collection methodologies were developed to adequately address the evaluation issues and questions. The evidence gathered was then used to develop overall findings and conclusions.Footnote 11

Document Review

A document review was conducted as part of the evaluation. Key documents were gathered, listed in an inventory, and then assessed in terms of their contribution to each of the evaluation questions and corresponding indicators. Information was captured for analysis using a document review template. Key documents reviewed for this evaluation included Government of Canada and departmental publications (e.g., the Speech from the Throne, Departmental Performance Reports), internal CEP documents on program priorities, processes and governance structure, strategic plans for the AEI and Okanagan-Similkameen programs, and overviews of individual funded projects.

Administrative File Review

The review of contribution files covered items such as contribution agreements, project activity reports, financial files (e.g., requests for payment, cash flow statements, recipients' accounting of expenditures), and progress and final reports. The review of project-level files and data was intended to fill gaps in the program documents, files and data. The file review covered a sample of 24 of the 101 contribution agreements under the AEI and all 8 contribution agreements under the Okanagan-Similkameen initiative. The review assessed the extent to which projects funded under these CEP initiatives achieved their expected outcomes and whether their activities were undertaken in an efficient and economical manner.

Key Informant Interviews

Key informant interviews were conducted with 33 respondents to gather detailed information related to all evaluation questions and issues. Interviews were conducted by telephone using a semi-structured interview guide tailored to each respondent group. The interviewees broke down as follows: program staff and management (n=13); internal Environment Canada partners (in Atlantic and Quebec and West and North) (n=5); federal partners (n=1); provincial partners (n=4); project funding recipients (e.g., non-governmental organizations) (n=9); and unfunded proponents (n=2).Footnote 12

All relevant stakeholder perspectives were considered. The evaluation methodology provided a balanced mix of views on program performance, as over 60% of interviewees were not directly accountable for the program's delivery and almost 50% were external to the Department.


3.3 Limitations

Challenges experienced during the evaluation, the related limitations and the strategies used to mitigate their impact are outlined below:
