Evaluation of the Weather Observations Forecasts and Warnings Program

Final
Audit and Evaluation Branch
January 2016


Report Clearance Steps

Planning phase completed - December 2014
Report sent for management response - September 2015
Management response received - October 2015
Report completed - October 2015
Report approved by the Deputy Minister (DM) - January 2016

Acronyms used in the report

ASTD
Atmospheric Science and Technology Directorate
BCP
Business Continuity Planning
CESI
Canadian Environmental Sustainability Indicators
CSB
Corporate Services Branch
DPS
Direct Program Spending
GEO
Group on Earth Observations
GEOSS
Global Earth Observation System of Systems
IMIT
Information Management / Information Technology
ISO
International Organization for Standardization
MEPIC
Meteorological and Environmental Predictions Innovation Centre
MSC
Meteorological Services of Canada
NIRT
National Inquiry Response Team
NOAA
National Oceanic and Atmospheric Administration
PSPC
Pacific Storm Prediction Centre
PWPMC
Public Weather Program Management Committee
QMS
Quality Management System
SSC
Shared Services Canada
STB
Science and Technology Branch
UNEP
United Nations Environment Programme
UNESCO
United Nations Educational, Scientific and Cultural Organization
WMO
World Meteorological Organization

Acknowledgments

The Evaluation Project Team would like to thank those individuals who contributed to this project, particularly members of the Evaluation Committee as well as all interviewees who provided insights and comments crucial to this evaluation.

The Evaluation Project Team was led by Robert Tkaczyk, under the direction of the Environment and Climate Change Canada Evaluation Director, William Blois, and included Lindsay Comeau and Katheryne O’Connor.

Version Control

Date: January 19, 2016

File Name: Weather Observations Forecasts and Warnings - FINAL - EN.doc


Executive Summary

Context

The Evaluation of the Weather Observations, Forecasts and Warnings Program addresses the issues of the program's relevance and performance (including effectiveness, efficiency and economy) from April 1, 2011 to March 31, 2015. The program represents 17.25% ($131.9M) of the department's 2014 direct program spending (DPS), including grants and contributions (G&Cs).

The Weather Observations, Forecasts and Warnings Program provides weather warnings, forecasts and information 24 hours per day, 365 days per year. Annually, the program collects millions of atmospheric and water observations and produces approximately 1.5 million weather forecasts and warnings to help Canadians anticipate meteorological events and protect themselves and their property. The program is delivered by the Meteorological Services of Canada (MSC) through collaborations with Environment and Climate Change Canada's (ECCC) Science and Technology Branch (STB) and Corporate Services Branch (CSB) involving data, science, and information distribution in Canada and internationally. These activities depend on the telecommunication systems and supercomputing capacity managed by Shared Services Canada (SSC). Key collaborators include the media, all levels of government and academia in Canada, other national meteorological services, research and space agencies, and the United Nations World Meteorological Organization (WMO).

Findings and Conclusions

Relevance

There is an ongoing need for the provision of weather and environmental services to better manage the significant human and economic consequences stemming from severe weather events. The program is clearly aligned to government priorities related to the economy, public safety and the environment, and is consistent with federal roles and responsibilities under the Department of the Environment Act, the Canadian Weather Modification Act and the Emergency Management Act, while also supporting Canada's international commitments under the Convention of the World Meteorological Organization.

Efficiency and Economy

The design of the program is appropriate, and the program continues its renewal agenda intended to improve weather services. The evaluation found the governance structure to be clear, appropriate and effective. The recent transition of IT resources to SSC has led to some issues with delays and the quality of products and services, though mitigation strategies are now in place to address them. Evidence also indicates that the program is undertaking activities and delivering products in an efficient manner. However, challenges were identified related to reporting burden (e.g., ISO certification reporting requirements) and delays in procurement.

Performance information is collected, reported and used for decision-making. Although the program has a logic model and collects performance data, performance indicators and targets are either not identified for, or not aligned to, most intended outcomes. In addition, the program's intended outcomes may not be fully reflected by the current logic model, as they focus on satisfaction, awareness and use of weather and environmental services rather than on the impacts these services are expected to achieve.

Effectiveness

Data collected as part of the evaluation suggests the program is on track to achieve most of its outcomes. Key external factors influencing the achievement of outcomes were generally thought to concern evolving client expectations (e.g., social media), government-wide restructuring (e.g., SSC), and federal government policy related to communication.

Recommendations

The following recommendation is based on the findings and conclusions of the evaluation. The recommendation is directed to the Assistant Deputy Minister (ADM) of the MSC, as the senior departmental official responsible for the management of the Weather Observations, Forecasts and Warnings Program.

Recommendation #1: Develop and implement a performance measurement strategy for the Weather Observations, Forecasts and Warnings Program, including an updated logic model, the identification of performance indicators and targets for each intended outcome, and a performance reporting strategy.

Management Response

The ADM of the MSC agrees with the recommendation and has developed a management response that appropriately addresses it.

The full management response can be found in Section 6 of the report.

1.0 Introduction

This report presents the results of the Evaluation of the Weather Observations, Forecasts and Warnings Program which was conducted by Environment and Climate Change Canada’s (ECCC) Evaluation Division, Audit and Evaluation Branch, in fiscal years 2014 and 2015. The evaluation was identified in the 2014 Departmental Risk-Based Audit and Evaluation Plan, which was approved by the Deputy Minister. The evaluation was conducted to meet a Treasury Board requirement related to the renewal of funding of the Government of Canada’s weather and environmental monitoring and supercomputing infrastructure, as well as to meet the evaluation coverage requirements of the Financial Administration Act and the Treasury Board Policy on Evaluation, which require respectively that an evaluation of all grants and contributions (G&Cs) and direct program spending (DPS) be conducted at least once every five years. The program represents 17.25% ($131.9M) of the department's 2014 DPS, including G&Cs.

2.0 Background or Context

2.1 Program Profile

The Weather Observations, Forecasts and Warnings Program provides weather warnings, forecasts and information 24 hours per day, 365 days per year, with lead times ranging from minutes to weeks. Annually, the program collects millions of atmospheric and water observations and produces approximately 1.5 million weather forecasts and warnings. Information is disseminated primarily to enable Canadians to anticipate dangerous meteorological events in order to protect themselves and their property. Accurate and timely forecasts also provide various additional benefits to the Canadian economy in areas such as agriculture, transportation, recreation and tourism. The program is delivered by the Meteorological Services of Canada (MSC) through collaborations with ECCC's Science and Technology Branch (STB) and Corporate Services Branch (CSB) involving data, science, and information distribution in Canada and internationally. These activities depend on the telecommunication systems and supercomputing capacity managed by Shared Services Canada. Key collaborators include the media, all levels of government and academia in Canada, other national meteorological services, research and space agencies, and the United Nations World Meteorological Organization (WMO).

2.2 Activities

The activities in this program area represent the MSC's core business, namely, the provision of the public weather program,Footnote1 including:

This is the largest single program in ECCC, accounting for three-quarters of the MSC budget and involving all MSC directorates, as well as STB and CSB. Beneficiaries of ECCC's public weather prediction services include the general public, other levels of government, and private sector organizations.

The MSC organizational structure includes six consolidated regional storm prediction centres,Footnote2 each of which is supported by four key functions of the weather program: the Prediction Services Directorate, the Canadian Centre for Meteorological and Environmental Prediction, the Monitoring and Data Services Directorate, and the Policy, Planning and Partnerships Directorate.Footnote3 The roles and responsibilities of these four MSC directorates in the delivery of this program are as follows:

There are also a number of critical elements of the program conducted outside of MSC that are required to support the public weather program:

Other key program partners include international organizations, such as the United Nations WMOFootnote4 and the Group on Earth ObservationsFootnote5 (GEO). Participation in these organizations is intended to enable Canada to benefit from global earth observations, science information and technologies developed by other countries (e.g., the United States, China, Australia).

Signature Projects

Beginning in 2010, new and reallocated funds have focused on the development of eight Signature Projects to revitalize and modernize the MSC, fiveFootnote6 of which fall within the program being evaluated:

2.3 Governance and Management

The Weather Observations, Forecasts and Warnings Program contributes to the department's strategic outcome: "Canadians are equipped to make informed decisions on changing weather, water and climate conditions." Accountability for the program rests with the Assistant Deputy Minister (ADM), MSC; however, accountability for the delivery of the various activities rests with the four Directors General (DGs), each responsible for their respective directorate (described under Section 2.2) within MSC.

Key program governance bodies include:

Other supporting and advisory committees include:

2.4 Resource Allocation

Table 1 below provides a summary of budget information by type of expenditure, while Table 2 below provides a summary of budget information by branch for the period from 2011-2012 to 2014-2015.

Budget 2011 provided $78.7M to upgrade the parts of the monitoring network most critically in need of renewal and that support the weather prediction process.Footnote7 Building on this, Budget 2013 provided additional funding ($248M) beginning in 2013-14 (until 2022-23) to continue modernizing the monitoring infrastructure, including replacing the oldest radars, and to modernize weather and forecasting services (primarily through the Signature Projects). These amounts are included in the budget figures presented in the tables below.

Table 1: Weather Observations, Forecasts and Warnings Budget 2011-12 to 2014-15

Budget
All Branches 2011-12 2012-13 2013-14 2014-15
Salary $69,875,670 $64,890,161 $72,468,818 $75,089,233
O&M $52,227,731 $35,936,125 $33,283,254 $35,398,151
Capital $11,663,808 $13,590,091 $19,854,130 $19,523,337
G&CsFootnote8 $3,356,955 $15,967,570Footnote9 $3,684,879 $4,191,491
VNR Salary $405,165 $485,980 $426,163 $336,874
VNR O&M $1,436,259 $3,892,358 $3,899,373 $4,101,349
Grand Total $138,965,588 $134,762,285 $133,616,617 $138,640,435

Source: ECCC’s financial system.

Table 2: Distribution of Weather Observations, Forecasts and Warnings Budget by Branch - 2011-12 to 2014-15

Budget
Branch 2011-12 2012-13 2013-14 2014-15
Meteorological Services of Canada $87,256,267 $109,555,254 $105,115,184 $112,187,066
Corporate Services Branch $36,067,501Footnote10 $13,330,976 $13,982,554 $13,989,751
Science and Technology Branch $15,641,820 $11,876,055 $14,518,878 $12,463,618
Grand Total $138,965,588 $134,762,285 $133,616,617 $138,640,435

Source: ECCC’s financial system.

2.5 Expected Results

The logic model for the Weather Observations, Forecasts and Warnings Program is presented in Annex 2. The expected outcomes used to assess the performance of the program are presented below:

Immediate outcomes

Intermediate outcome

Final outcome

3.0 Evaluation Design

3.1 Evaluation Scope

In accordance with the 2009 Treasury Board Policy on Evaluation, the evaluation addresses the issues of relevance and performance (including effectiveness, efficiency and economy) of the Weather Observations, Forecasts and Warnings Program.

The evaluation covered the period from April 1, 2011 to March 31, 2015 and included all components of the program.

3.2 Evaluation Approach and Methodology

The methodological approach and level of effort for this evaluation were determined using a risk-based approach. The following data collection methodologies were developed to adequately address the evaluation issues and questions. Evidence gathered was then used to develop overall findings and conclusions.

Document Review

The document review involved reviewing key secondary materials to capture documented evidence related to the core evaluation issues. Key documents were gathered, listed in an inventory and then each document was assessed in terms of its contribution to each of the evaluation questions and corresponding indicators. A document review template was used to organize the review and capture the information. Documents included: descriptive program information on priorities, governance structure and processes; Government of Canada and Departmental publications (e.g., Speech from the Throne, Departmental Performance Reports); and other internal financial, performance and administrative documents.

This data collection method addressed all evaluation questions.Footnote12

Key Informant Interviews

In total, 38 key informant interviews were conducted, either in person or by telephone, using a semi-structured interview guide tailored to the specific respondent group, to gather detailed information related to the evaluation questions and issues. All relevant stakeholder perspectives were considered, including through interviews with: program staff and management (n=15); internal ECCC partners and stakeholders (n=3); federal partners and stakeholders (n=4); provincial/territorial stakeholders (n=8); industry/media stakeholders (n=4); and international stakeholders (n=4).

The evaluation methodology provided a balance of internal and external perspectives on the program’s performance, as approximately 60% of interviewees were not directly accountable for the program's delivery and approximately 53% were external to the Department.
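
To illustrate how these proportions follow from the respondent-group counts above, a minimal sketch (figures taken from this section; the internal/external grouping is as described above):

    # Minimal sketch: reproduces the interview-balance percentages reported
    # above from the respondent-group counts (n=38 in total).
    interviews = {
        "program staff and management": 15,  # accountable for program delivery
        "internal ECCC partners and stakeholders": 3,
        "federal partners and stakeholders": 4,
        "provincial/territorial stakeholders": 8,
        "industry/media stakeholders": 4,
        "international stakeholders": 4,
    }

    total = sum(interviews.values())  # 38
    not_accountable = total - interviews["program staff and management"]
    external = total - (interviews["program staff and management"]
                        + interviews["internal ECCC partners and stakeholders"])

    print(f"Not directly accountable: {not_accountable / total:.1%}")  # 60.5%
    print(f"External to the department: {external / total:.1%}")       # 52.6%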

This data collection method addressed evaluation questions 1, 4, 5, 6, 7 and 8.

Literature Review

A review of grey and published literature was conducted, focusing on the most relevant texts, including documents about comparable programs in similar jurisdictions and other relevant studies (e.g., related to the ongoing need for the program). Approximately 10 to 15 key sources (articles) were reviewed. Three additional key informant interviews were conducted to fill gaps and clarify aspects of the public weather programs in the respective countries.

A review template was used to gather and record the information. The information was then organized in a matrix by evaluation issue, such that the evidence from the various sources could be easily compiled for the analysis stage.

This data collection method addressed evaluation questions 4 and 6.

G&C File Review

A review of a sample of 28 of the program's 52 G&C project files was completed to examine project activities, outputs, outcomes and lessons learned in more depth. All relevant project documentation within each file was reviewed, including the funding application, contribution agreement, progress reports, and final reports. To be broadly representative of G&C activity during the time period, the sample was selected to reflect a range of project values, fiscal years, types of activity (e.g., international initiative/membership, research, training), and types of funding recipient (e.g., academic institution, international organization). A data capture template was created to ensure the file review was structured and captured all information from the project files pertinent to the evaluation.

This data collection method addressed evaluation questions 4, 6, 7 and 8.
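
The report does not describe the exact selection mechanics, but the stratified logic described above can be illustrated with a short sketch (hypothetical records and field names; not the evaluators' actual procedure):

    # Illustrative sketch only: one way to draw a broadly representative sample
    # of 28 G&C files from 52 by stratifying on attributes like those described
    # above. The inventory records and field names below are hypothetical.
    import random
    from collections import defaultdict

    def stratified_sample(files, keys, n):
        """Group files by the given attributes, then allocate the sample across
        strata roughly proportionally, with at least one file per stratum."""
        strata = defaultdict(list)
        for f in files:
            strata[tuple(f[k] for k in keys)].append(f)
        sample = []
        for group in strata.values():
            share = max(1, round(n * len(group) / len(files)))
            sample.extend(random.sample(group, min(share, len(group))))
        return sample[:n]

    # Hypothetical inventory: 52 files tagged with the stratification attributes.
    inventory = [
        {"id": i,
         "activity": random.choice(["international", "research", "training"]),
         "recipient": random.choice(["academic", "international org"])}
        for i in range(52)
    ]

    sample = stratified_sample(inventory, keys=("activity", "recipient"), n=28)
    print(f"{len(sample)} of {len(inventory)} files selected for review")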

Case Studies

Case studies, as an evaluation methodology, combine various sources of information to provide illustrations and examples of selected activities or components of a program. Two case studies were undertaken as part of the evaluation. The first examined the Pacific Storm Prediction Centre (PSPC), located in Vancouver, British Columbia, with the objective of obtaining a regional perspective on emerging weather issues, as well as regional challenges related to providing weather and environmental services. The second examined the emerging issue of impact-based forecasting,Footnote13 with the objective of gaining a better understanding of the advantages and challenges of this new approach, as well as the activities underway within MSC that focus on impact-based forecasting. Data collection for each case study involved a review of relevant documents and key informant interviews (n=3-4 for each case study). Case study interviews were in addition to the key informant interviews described above.

This data collection method addressed evaluation questions 4, 6 and 8.

3.3 Limitations

No significant methodological limitations were identified as part of the evaluation. The evaluation was able to gather all required documentation and interview all key stakeholders, as part of the document review, file review, literature review, key informant interviews and case studies. Although a survey of users was not undertaken, the evaluation was able to rely on public opinion research (POR) conducted on behalf of the program by independent public opinion and market research firms.

4.0 Findings

This section presents the findings of this evaluation by evaluation issue (relevance and performance) and by the related evaluation questions. For each evaluation question, a rating is provided based on a judgment of the evaluation findings. The rating statements and their significance are outlined below in Table 3. A summary of ratings for the evaluation issues and questions is presented in Annex 1.

Table 3: Definitions of Standard Rating Statements
Statement Definition
Acceptable The program has demonstrated that it has met the expectations with respect to the issue area.
Opportunity for Improvement The program has demonstrated that it has made adequate progress to meet the expectations with respect to the issue area, but continued improvement can still be made.
Attention Required The program has not demonstrated that it has made adequate progress to meet the expectations with respect to the issue area and attention is needed on a priority basis.
Not applicable There is no expectation that the program would have addressed the evaluation issue.
Unable to assess Insufficient evidence is available to support a rating.

4.1 Relevance

Continued Need for Program
Evaluation Issue: Relevance
1. Is there a continued need to provide weather observations, forecasts and warnings?
Rating: Acceptable

There is a continued environmental and societal need for the provision of weather observations, forecasts and warnings. Severe weather, which is noted to be increasing, has significant human and economic consequences; information about anticipated weather is therefore important for planning, decision-making and mitigating negative effects.

Alignment with Federal Government Priorities
Evaluation Issue: Relevance
2. Is the Weather Observations, Forecasts and Warnings Program aligned with federal government priorities?
Rating: Acceptable

Evidence indicates that the program is aligned with federal government priorities related to the economy, public safety, and the environment.

Consistency with Federal Roles and Responsibilities
Evaluation Issue: Relevance
3. Is the Weather Observations, Forecasts and Warnings Program consistent with federal/departmental roles and responsibilities?
Rating: Acceptable

The program is consistent with federal/departmental roles and responsibilities related to the provision of weather and environmental services as outlined in relevant legislation. The program also supports Canada’s commitments under the Convention of the World Meteorological Organization.

4.2 Performance - Economy and Efficiency

Program Design
Evaluation Issue: Performance - Efficiency & Economy
4. Is the design of the Weather Observations, Forecasts and Warnings Program appropriate for achieving its intended outcomes?
Rating: Acceptable

The design of the program is appropriate, and the program continues its renewal agenda intended to improve weather services. Evidence also indicates that activities are underway to support impact-based forecasting.

Program Governance
Evaluation Issue: Performance - Efficiency & Economy
5. To what extent is the governance structure clear, appropriate and effective for achieving expected results?
Rating: Acceptable

Overall, the program’s governance structure is clear, appropriate and effective. Recent restructuring has led to some issues related to IT resources, though mitigation strategies are now in place to address them.

Program Efficiency and Economy
Evaluation Issue: Performance - Efficiency & Economy
6. Is the Weather Observations, Forecasts and Warnings Program undertaking specific activities and delivering products at the lowest possible cost? How could the efficiency of the program's activities be improved? Are there alternative, more economical ways of delivering program outputs?
Rating: Acceptable

International comparisons with other weather services suggest that the program is undertaking activities and delivering products in an efficient manner. The low variance (2-3%) between budgeted amounts and actual expenditures suggests that the program is being managed well. Challenges identified by key informants related to reporting burden and delays in procurement.

Table 4: Weather Observations, Forecasts and Warnings Program Budget and Actual Expenditures for the Period 2011-12 to 2014-15 (in $000s)

MSC
              2011-12               2012-13               2013-14               2014-15
              Budget      Actuals   Budget      Actuals   Budget      Actuals   Budget      Actuals
Salary        $49,680     $51,245   $52,642     $52,996   $59,242     $59,630   $60,882     $61,407
O&M           $25,734     $29,231   $30,689     $28,824   $26,740     $28,404   $28,930     $29,632
Capital       $7,987      $7,217    $9,319      $6,174    $14,873     $10,352   $17,267     $12,956
G&Cs          $2,977      $2,954    $15,638     $15,585   $3,312      $3,313    $3,893      $3,891
VNR Salary    $119        $115      $339        $268      $171        $189      $115        $113
VNR O&M       $758        $811      $927        $871      $777        $723      $1,101      $1,064
Total         $87,256     $91,571   $109,555    $104,717  $105,115    $102,611  $112,187    $109,064
Variance      -$4,315 (-4.9%)       $4,838 (4.4%)         $2,504 (2.4%)         $3,123 (2.3%)

Source: ECCC’s financial system.

STB
              2011-12               2012-13               2013-14               2014-15
              Budget      Actuals   Budget      Actuals   Budget      Actuals   Budget      Actuals
Salary        $12,190     $11,677   $8,368      $8,157    $9,102      $8,553    $8,925      $8,949
O&M           $2,099      $2,070    $1,962      $1,588    $2,320      $2,232    $1,905      $2,086
Capital       $508        $525      $867        $1,235    $2,055      $1,830    $748        $735
G&Cs          $380        $380      $329        $329      $373        $373      $298        $298
VNR Salary    $286        $286      $147        $147      $255        $251      $222        $222
VNR O&M       $178        $168      $204        $178      $414        $390      $366        $368
Total         $15,642     $15,106   $11,876     $11,634   $14,519     $13,629   $12,464     $12,646
Variance      $536 (3.4%)           $242 (2.0%)           $890 (6.1%)           -$194 (-1.6%)

Source: ECCC’s financial system.

CSBFootnote34
              2011-12               2012-13               2013-14               2014-15
              Budget      Actuals   Budget      Actuals   Budget      Actuals   Budget      Actuals
Salary        $8,005      $7,969    $3,880      $4,478    $4,125      $4,219    $5,283      $4,081
O&M           $24,394     $24,486   $3,285      $3,872    $4,224      $4,286    $4,564      $4,265
Capital       $3,169      $2,361    $3,404      $3,764    $2,926      $2,189    $1,508      $1,491
G&Cs          -           -         -           -         -           -         -           -
VNR Salary    -           -         -           -         -           -         -           -
VNR O&M       $500        $490      $2,762      $2,766    $2,708      $2,792    $2,635      $2,494
Total         $36,068     $35,307   $13,331     $14,880   $13,983     $13,486   $13,990     $12,332
Variance      $761 (2.1%)           -$1,549 (-11.2%)      $496 (3.5%)           $1,658 (8.8%)

Source: ECCC’s financial system.

Totals
              2011-12               2012-13               2013-14               2014-15
              Budget      Actuals   Budget      Actuals   Budget      Actuals   Budget      Actuals
Totals        $138,966    $141,984  $134,762    $131,230  $133,617    $129,726  $138,640    $134,053
Variance      -$3,018 (-2.1%)       $3,532 (2.6%)         $3,891 (2.9%)         $4,587 (3.3%)

Source: ECCC’s financial system.

Performance Measurement
Evaluation Issue: Performance - Efficiency & Economy
7. Are performance data being collected and reported? If so, is this information being used to inform senior management/decision-makers?
Rating: Opportunity for Improvement

Although the program has a logic model and collects performance data, performance indicators and targets are either not identified for, or not aligned to, most intended outcomes. In addition, the program's intended outcomes may not be fully reflected by the current logic model, as they focus on satisfaction, awareness and use of weather and environmental services rather than on the impacts the use of these services is expected to achieve. Key informants indicated that performance information is being used for decision-making.

4.3 Performance - Effectiveness

Achievement of Intended Outcomes
Evaluation Issue: Performance - Effectiveness
8. To what extent have intended outcomes been achieved as a result of the Weather Observations, Forecasts and Warnings Program?
Rating: Acceptable

Overall, the evidence indicates that the program is on track to achieve most of its outcomes. Key external factors influencing the achievement of outcomes relate to evolving client expectations (e.g., social media), government-wide restructuring (e.g., SSC), and federal government policy related to communication.

(i) Immediate outcome 1: Acceptable
Users are satisfied with the accuracy, timeliness, geographic precision, reliability and access of EC’s public weather and environmental services

Weather information is being used and users are generally satisfied with the accuracy, timeliness, geographic precision, reliability and access of ECCC’s public weather and environmental services. Dissemination of information through social media is progressing.

(ii) Immediate outcome 2: Acceptable
Continued access to meteorological data from other countries

Through financial support and active membership in international fora (e.g., the WMO and GEO), the program is collaborating with international partners and organizations to ensure free and open access to meteorological research and data in order to improve the accuracy of weather forecasts and increase program effectiveness.

(iii) Immediate outcome 3: Acceptable
Meteorological research addresses issues affecting Canadian interests and priorities

The program is pursuing meteorological research that is aligned to program priorities and user needs, thereby increasing the utility of weather-related products and services.

(iv) Intermediate outcome 1: Unable to assess
Understanding Environment Canada’s weather and environmental products

Methodological and other limitations mean that there is insufficient information to assess the extent to which target audiences understand weather and environmental products. Some qualitative evidence suggests there may be less understanding related to more highly specialized weather products and services.

Although evidence presented in the discussion of immediate outcome 1 clearly demonstrates that users access and use weather and environmental services, limitations of the current evaluation methodology and resources have meant that direct evidence of users' understanding of ECCC's weather and environmental services was not available for the current evaluation.

(v) Final outcome 1: Acceptable
Use of Environment Canada’s weather and environmental services [to make sound decisions which contribute to mitigating negative environmental, economic, social and health impacts]

The evaluation found evidence that ECCC's weather and environmental services are being used to support decision-making related to public safety/emergency and economic planning in order to mitigate social and economic harm due to weather conditions. It is too early to conclude on the contribution these services have made to mitigating negative social and economic effects.

External Factors

5.0 Conclusions

Relevance:
Overall, the activities undertaken as part of the Weather Observations, Forecasts and Warnings Program continue to be relevant, as there is an ongoing need for the provision of weather and environmental services, given that severe weather events have significant human and economic consequences. The program is clearly aligned to government priorities related to the economy, public safety and the environment. The program is also consistent with federal roles and responsibilities under the Department of the Environment Act, the Canadian Weather Modification Act and the Emergency Management Act, and supports Canada's international commitments under the Convention of the World Meteorological Organization.

Efficiency and Economy:
The design of the program was found to be appropriate, as the program continues its renewal agenda aimed at improving weather services, including the reorganization of MSC's structure, the Signature Projects, the implementation of MSC's 2015 Service Strategy, the 2013-16 MSC People Plan, and new investments for overall capital infrastructure renewal and modernization. Evidence suggests that the governance structure is clear, appropriate and effective. Recent restructuring has led to some issues related to IT resources, though mitigation strategies are now in place to address them. Performance information is being collected, reported and used for decision-making. Although the program has a logic model and collects performance data, performance indicators and targets are either not identified for, or not aligned to, most intended outcomes.

The evaluation concludes that the program is being efficiently delivered, given that Canada's weather and environmental services compare favourably to international benchmarks in terms of cost, the low variance between budgeted and actual expenditures, the high level of automation of the program's observing network, and the program's continued collaboration with international partners. Challenges to the delivery of the program related to reporting burden and delays in procurement.

Effectiveness:
Evidence indicates that progress is being made toward the achievement of all the program's intended immediate outcomes. Users are satisfied with ECCC's weather and environmental services; ECCC has continued access to meteorological data from other countries and international organizations; and meteorological research addresses issues affecting Canadian interests and priorities. The evaluation was unable to directly measure users' understanding of ECCC's weather and environmental services, although there is clear evidence that these services are used with some frequency, which is indicative of users understanding ECCC's weather and environmental products. With respect to the final outcome, ECCC's weather and environmental services are being used to support decision-making related to public safety/emergency and economic planning, although it is too early to conclude on the contribution these services have made to mitigating social and economic harm.

6.0 Recommendations and Management Response

The following recommendation is based on the findings and conclusions of the evaluation. The recommendation is directed to the ADM of the MSC, as the senior departmental official responsible for the management of the Weather Observations, Forecasts and Warnings Program.

Recommendation #1: Develop and implement a performance measurement strategy for the Weather Observations, Forecasts and Warnings Program, including an updated logic model, the identification of performance indicators and targets for each intended outcome, and a performance reporting strategy.

Although the program has a logic model and collects performance data, performance indicators and targets are either not identified for, or not aligned to, most intended outcomes.

Furthermore, although the logic model is appropriately aligned to departmental strategic outcomes, a stronger performance statement could be made if the program were to update the logic model's long-term outcomes to focus more on program objectives as stated in the departmental PAA: namely, to enable Canadians to anticipate dangerous meteorological events in order to protect themselves and their property. As the program proceeds with its renewal agenda and moves toward an impact-based approach, there will be a need to ensure a focus on the actual impacts of weather and environmental services beyond their use.

Management Response

Statement of Agreement/Disagreement with the Recommendation
The ADM of the MSC agrees with the recommendation.

Management Action
The Weather and Environmental Services program has been a measurement-centric program since the 1980s, when forecast verification began. The first ISO certification, in 2007, built on that tradition of performance measurement and continuous improvement. Consequently, the program collects and reports on numerous performance indicators at various levels, ranging from key performance indicators that report on the efficiency of processes to broader indicators that assess progress toward wide-ranging outcomes (reported through the DPR, FSDS and CESI, among others). Further, the establishment of detailed work processes, quality objectives and key performance indicators allows the program to measure productivity, ensure quality services and inform management decisions. As part of the QMS process, certified to the ISO 9001:2008 standard, regular annual reviews of processes and key performance metrics are required; the culture of reviewing performance is therefore ingrained in the organization.

It is important to note that the program has a mandate to produce information that equips Canadians to make decisions about their safety. It is beyond the scope of the program to ensure the safety of Canadians; however, the program works with emergency management organizations and Public Safety to tailor products that increase understanding of risks and address client needs.

Via a step-wise approach, we will first review the logic model to determine whether outcome statements should be adjusted to better reflect the impacts of the program. Subsequently, the corresponding performance measurement strategy, including indicators and the performance reporting approach, will be reviewed and updated accordingly. The analysis will take into consideration the constraints of conducting public opinion research, measuring longer-term outcomes and establishing attribution.

Timeline
Timeline Deliverable(s) Responsible Party
October 2016 Review and update of program logic model (as required) DG Leads
May 2017 Update Performance Measurement Strategy accordingly DG Leads

Annex 1 Summary of FindingsFootnote42

Relevance
Evaluation Question - Rating
1. Continued need for the program - Acceptable
2. Aligned to federal government priorities - Acceptable
3. Program consistent with federal roles and responsibilities - Acceptable


Performance
Evaluation Question - Rating
4. Program design appropriate for achieving expected program results - Acceptable
5. Governance structure is clear, appropriate and effective for achieving expected results - Acceptable
6. Program undertaking activities and delivering products at the lowest possible cost and achieving its intended outcomes in the most economical manner - Acceptable
7. Performance data is being collected and reported, and is being used to inform senior management/decision-makers - Opportunity for Improvement
8. Achievement of intended outcomes - Acceptable


Annex 2 Program Logic Model

Long Description of Annex 2

Annex 2 presents the logic model for the Weather Observations, Forecasts and Warnings Program. The logic model is a graphical depiction of how the activities and outputs associated with the program relate to its immediate, intermediate and final outcomes. Program activities lead to outputs, which lead to immediate outcomes, then to intermediate outcomes and finally to the final outcome.

Key Activities:

Management processes:

  • Business Policy Services and Stakeholder Relations

Core processes:

  • Service (client relations)
  • Research and Development Projects
  • Monitoring
  • Prediction
  • Service (delivery)
  • Mission-Critical IT Service Delivery

Support processes:

  • IMIT service (support)
  • Other enabling processes

Outputs:

  • Meteorological information and risk communication:
    • Forecasts
    • Watches
    • Warnings
    • Special Statements
    • Advice
    • Dialogic risk communication
    • Early notification
    • Model outputs
  • Weather data and model outputs
  • Scientific Publications

Target Audience

  • General public
  • Other government departments
  • Media
  • Emergency measures organizations
  • Weather sensitive economic sectors
  • Meteorological consulting industry
  • International/foreign meteorological organizations and agencies
  • Meteorological research community

Immediate outcomes

  • Users are satisfied with the accuracy, timeliness, geographic precision, reliability and access of EC’s public weather and environmental services
  • Continued access to meteorological data from other countries
  • Meteorological research addresses issues affecting Canadian interests and priorities

Intermediate outcome

  • Understanding EC’s weather and environmental products

Final outcome

  • Use of EC’s weather and environmental services [to make sound decisions which contribute to mitigating negative environmental, economic, social and health impacts]

Footnotes
