3. Evaluation design

3.1. Scope

The purpose of this evaluation was to assess the relevance and performance of the six Umbrella Terms and Conditions in facilitating the achievement of environmental outcomes over the five years from 2010–2011 to 2014–2015. An evaluation of the UTCs was part of Environment Canada’s 2013 Risk-based Audit and Evaluation Plan (Footnote 6) and was a specific TBS request. The evaluation primarily examined the relevance and performance of projects funded under the UTCs, and these findings were rolled up to draw conclusions about the UTCs overall. In addition, the evaluation incorporated an assessment of the UTC approach as an enabling mechanism for Gs&Cs funding.

This evaluation supplements evaluations of the numerous individual Gs&Cs programs that use the six umbrella authorities, in order to respond to the requirements of the Financial Administration Act to evaluate all ongoing programs of Gs&Cs at least once every five years. Individual Gs&Cs programs are evaluated on an ongoing basis as part of evaluations of the associated PAA sub-programs and sub-sub-programs to which they are aligned. Evaluations of individual UTC Gs&Cs programs are listed in Annex D.

A key issue considered in the evaluation design is that the UTCs do not constitute a typical single “program.” The UTCs are purposefully designed as an enabling mechanism (Footnote 7) for funding that supports a very broad range of outcomes associated with different individual programs across the department. It was therefore difficult to link the large number of funded projects and associated programs to a shared set of measurable outcomes and to report on these outcomes in a coherent fashion. To address this issue, evidence of progress toward positive environmental outcomes was aggregated across the UTCs, and the contribution of funded projects to the UTCs’ expected results (see Annex C) was assessed.

3.2. Evaluation Approach and Methodology

The research activities that were undertaken are briefly described in this section, and further details on the methodological approach are available in Annex E.

Document Review: Documents related to the relevance, design and efficiency of the UTCs were reviewed, including the Terms and Conditions for each umbrella authority, a 2013 internal audit of Gs&Cs and a 2009 evaluation of ECCC’s former class Gs&Cs.

Administrative Data and File Review: The Gs&Cs database maintained by ECCC’s Corporate Services and Finance Branch was reviewed to provide descriptive information on projects funded under the UTCs, including expenditures by program and branch (see Annex B). The evaluation also examined a sample of 85 project files to assess the extent to which projects funded under the UTCs met intended project goals within expected timelines, leveraged resources from partners, and contributed to the UTC expected results. The file review sample was selected from a total of 635 projects completed between 2010–2011 and 2014–2015.

Key Informant Interviews: To gather opinions and observations related to all of the evaluation questions, a total of 17 key informant interviews were conducted with ECCC program managers with experience using one or more of the UTCs (n=15) and with representatives of the Corporate Services and Finance Branch involved in coordinating the management of the umbrella authorities (n=2).

Recipient Survey: An online survey of funding recipients was conducted to assess views on ECCC’s delivery of Gs&Cs under the UTCs and the contribution of funded projects to environmental outcomes, both short-term project outcomes and UTC expected results. The survey focused on completed projects funded under the six UTCs from 2010–2011 to 2013–2014. Of 294 funding recipients invited to participate in the survey, responses were received from 57 recipients for a response rate of 19%.

3.3. Challenges and Limitations

Challenges encountered in the conduct of the evaluation, as well as associated limitations and mitigation strategies, are outlined below.

  • As the UTCs support a broad range of program outcomes that span the department’s entire PAA, it was difficult to organize the numerous contribution agreements into a limited set of clear environmental outcomes against which program performance could be assessed. To mitigate this challenge, evaluators designed the interviews and survey to capture standard ratings of progress toward intended short-term project outcomes, without reference to specific environmental outcomes. In this way, it was possible to aggregate evidence across all six umbrellas.
  • There are limited performance data on UTC intended outcomes or expected results. Although performance information for PAA programs and sub-programs is collected as part of the department’s Performance Measurement Framework, there are no performance data focused specifically on the UTCs’ expected results (and associated performance indicators) and no departmental management information system that consolidates project-level performance data. Moreover, there is limited information on short-term project outcomes in project final reports, as these reports focus primarily on the degree of completion of funded project activities and outputs as specified in the contribution agreement. To address this challenge (as noted above), standard ratings of the degree of achievement of intended short-term project outcomes were obtained in the key informant interviews and recipient survey, and standard ratings to estimate projects’ contribution to the applicable UTC expected results were obtained in the project file review and recipient survey.