Evaluation of Canada’s meteorological warning services for the Arctic Ocean: chapter 5


3.0 Evaluation Design

3.1    Purpose and Scope

The purpose of this evaluation is to assess the relevance and performance (including effectiveness, efficiency and economy) of the METAREA Initiative, thereby fulfilling a commitment to evaluate the Initiative and meeting the requirement of the Treasury Board (TB) Policy on Evaluation that all direct program spending be evaluated every five years. While evaluations typically explore issues related to ongoing program management, the METAREA Initiative is a time-bound initiative tied inextricably to the start-up, or initial implementation, of the METAREAs. As such, issues of implementation and ongoing delivery are treated interchangeably in the context of the current evaluation.

The effectiveness of the program is evaluated in relation to the Initiative’s primary intended outcomes, as described in Section 2.6 (above) and Annex 1. Since associated outcomes are not included in the program’s performance measurement strategy, they are not assessed in the current evaluation.

The evaluation covers the four-year time frame from 2010-11 to 2013-14 and addresses only the actions taken under the METAREA Initiative, as a related evaluation of the 2.2.2 sub-program, Meteorological and Ice Services in Support of Marine Navigation, was completed in 2011. That evaluation was larger in scope but did not contain evidence related to the METAREAs, as the Initiative had only recently been funded. The current evaluation builds on that 2011 evaluation.

3.2    Evaluation Approach and Methodology

Data collection for the evaluation occurred between November 2013 and February 2014. The following data collection methods were employed, and evidence from these methods was triangulated to develop findings and conclusions (see Footnote 4).

Document and Literature Review
A document and literature review was conducted as part of the evaluation. Key documents were gathered and listed in an inventory, and each was then assessed, using a document review template, for its contribution to each evaluation question and corresponding indicator. Documents included descriptive program information (such as the Project Charter), departmental and Government of Canada publications related to policy and priorities, and other internal strategic and operational planning documents. The literature review focused largely on grey literature pertaining to marine and economic activity in the North. This data collection method addressed evaluation questions related to the relevance of the METAREA Initiative. A list of the documents reviewed is included in Annex 2.

Program Administrative and Performance Data
The review of program data included administrative and performance data maintained on the METAREA Initiative ECOLLAB website or provided by program representatives. The Initiative’s annual performance measurement reports (2011 and 2012) were important for understanding the extent to which project milestones had been achieved. Other administrative data sources included progress reports and financial records, including allocated and expended resources for each component by fiscal year. This data collection method addressed evaluation questions related to effectiveness and program efficiency.

Key Informant Interviews
Key informant interviews were conducted in person and by telephone with a total of 23 internal and external stakeholders. Given the early stage of implementation of the Initiative and the difficulty of contacting knowledgeable users of METAREA services, gathering feedback from external stakeholders was challenging and resulted in a heavier-than-usual reliance on the views of program personnel in assessing the evaluation issues. This issue is addressed in more detail under Challenges and Limitations (Section 3.3). The following provides a breakdown of the interviewees:

For each category of interviewee, a customized, open-ended interview guide was developed that reflected the respondent’s knowledge and the nature of their expected contribution to the evaluation. Interviewees received a copy of the guide in advance of the interview, allowing them time to prepare. Notes from the interviews were entered into a matrix organized by interview question and evaluation issue, which enabled the information to be sorted by interviewee category and evaluation issue for analysis. The findings from the key informant interviews addressed all evaluation questions but were particularly important for the performance issues.

3.3    Challenges and Limitations

Challenges experienced in conducting the evaluation, as well as the related limitations and the strategies used to mitigate their impact, are outlined below.

  1. Obtaining input from end-users

    The methods identified for the evaluation did not include a broad-based canvassing of end-users of METAREA information products and services because: the program did not yet have any means of capturing data on real or potential end-users; developing a contact list to support a survey of users was beyond the scope of the project; and previous efforts to conduct public opinion research with users had very limited success. While these challenges were addressed to some extent by gathering the views of a limited number of users through the key informant interviews, scope limitations meant that not all user groups were represented. In particular, no direct feedback was collected from representatives of economic sectors or from Northern residents, two ancillary beneficiary groups for the Initiative. As a result, limited evidence is available to assess the program’s achievement of the intended “associated outcomes” involving these two groups of potential beneficiaries.


  2. Addressing achievement of intermediate and final outcomes

    Due to the absence of reliable data from end-users, it is difficult to assess the achievement of certain intermediate and final outcomes of the Initiative. It should also be noted that, at the time of the evaluation, full service coverage in the METAREAs had not yet been achieved, as the Initiative is scheduled to be completed on March 31, 2015. Administrative data and qualitative interview information were used, to the extent possible, to address these issues.
