Appendix 3: Evaluation of the Active and Safe Injury Prevention Initiative 2011–2012 to 2012–2013 – Evaluation description

Evaluation Scope

The scope of the evaluation included activities conducted by recipient NGOs through Active and Safe funding. Other injury prevention work undertaken by NGOs beyond PHAC-funded projects was excluded from the evaluation.

The timeframe of this evaluation was from 2011, when the Initiative started, to 2013, when it was completed.

Evaluation Issues

The evaluation focused on the five core issues outlined in the Treasury Board of Canada's Policy on Evaluation (2009). These are noted in the table below.

Table 5: Core Evaluation Issues and Questions
Relevance

Issue #1: Continued Need for Program
Assessment of the extent to which the program continues to address a demonstrable need and is responsive to the needs of Canadians
  • What were the health/societal needs contributing to the need for this Initiative?
  • What are the current needs for injury prevention in Canada?

Issue #2: Alignment with Government Priorities
Assessment of the linkages between program objectives and (i) federal government priorities and (ii) departmental strategic outcomes
  • What are the Government of Canada and PHAC priorities related to injury prevention?
  • Do the activities align with these priorities?

Issue #3: Alignment with Federal Roles and Responsibilities
Assessment of the roles and responsibilities of the federal government in delivering the program
  • What are the federal and PHAC roles in injury prevention?
  • Are there other F/P/T programs or initiatives that complement or duplicate the role of PHAC relative to the Active and Safe Initiative?

Performance (effectiveness, economy and efficiency)

Issue #4: Achievement of Expected Outcomes (Effectiveness)
Assessment of progress toward expected outcomes (including immediate, intermediate and ultimate outcomes) with reference to performance targets and program reach, and program design, including the linkage and contribution of outputs to outcomes
  • Immediate Outcome 1: Targeted multi-sectoral stakeholders have opportunities to collaborate in pan-Canadian initiatives aimed at preventing sport- and recreation-related injuries
  • Immediate Outcome 2: Targeted multi-sectoral stakeholders have access to evidence-based knowledge needed to increase awareness of sport and recreation injury prevention
  • Immediate Outcome 3: Target audiences have access to public awareness and education products aimed at preventing sport- and recreation-related injuries
  Outcomes expected beyond two years of the Initiative:
  • Intermediate Outcome 1: Targeted multi-sectoral stakeholders are equipped to design and develop injury prevention policies, programs, and practices that promote safety and prevent and reduce injuries
  • Intermediate Outcome 2: Target audiences understand how to prevent injury and take actions to increase their safety
  • Intermediate Outcome 3: Target audiences have the capacity to promote safety and prevent sport-related injuries

Issue #5: Demonstration of Economy and Efficiency
Assessment of resource utilization in relation to the production of outputs and progress toward expected outcomes
  • Has the Active and Safe Initiative been delivered in the most efficient manner?
  • Has the Active and Safe Initiative been delivered in the most economical manner?
  • Is there appropriate performance measurement in place? If so, is the information being used to inform senior management decision-makers?

Data Collection and Analysis Methods

Evaluators collected and analyzed data from multiple sources to inform the evaluation. The following data collection methods were used: literature review, document review, case study, and key informant interviews.

Literature Review

The literature review examined academic literature, as well as grey literature from reliable sources, and focused on documenting the need for the program.

Inclusion criteria were used to assess sources and to ensure that the most appropriate, credible, reliable and relevant information was used to address specific evaluation questions. Health Canada and Public Health Agency of Canada library services were used for the literature search. The inclusion criteria specified the key words "unintentional injuries", a focus on child/adolescent/youth/young adult populations, literature focused on Canada, and literature published between 2009 and 2013. A total of 37 articles were located in the academic literature. A web search, using the same criteria, was conducted to supplement the literature review.

Document Review

The document review covered government and program documents (including websites), from which evaluators extracted the data required for the appropriate indicators as per the data collection matrix. The evaluation team coordinated with the program to obtain documents, which included:

  • Government documents such as requests for funding, ministerial briefing materials, Government of Canada budgets, and government websites. These documents provided background and information on program relevance.
  • Documents produced by the program, including progress reports and work plans, project reports (PERTs), financial information, policies and plans. These documents provided information pertaining to program delivery, performance data, outcome achievement, and efficiency and economy. An environmental scan of the provincial/territorial programs in place was also provided by the program.

Key Informant Interviews

Key informant interviews were conducted to gather in-depth information, including individual perspectives, explanations, examples and factual information, to address many of the evaluation questions related to relevance and performance. The interview guides were based on the indicators in the data collection matrix and included a mix of open-ended and closed-ended questions. Interviews were conducted in the official language of the respondent's choice.

Thirty (30) interviews were conducted for this evaluation with the following groups of key informants:

  • Public Health Agency of Canada management (n=3)
  • Public Health Agency of Canada program staff (n=2)
  • NGOs/stakeholders (n=25)

Evaluators conducted interviews using approaches that were appropriate and feasible for the evaluation budget and timeframe, including in-person interviews, telephone interviews, and two group interviews. On-site, face-to-face interviews were conducted with NGOs in the Ottawa area, where possible.

Case Study

During November and December 2013, one case study was conducted to examine the design and delivery of the Public Health Agency's Multi-sectoral Partnerships to Promote Healthy Living and Prevent Chronic Disease and to highlight best practices from that program. The program was selected as a case study because of its early success in leveraging funds from the private sector to promote health and prevent disease.

In terms of data collection, key informants from the Public Health Agency's Centre for Chronic Disease Prevention were asked to provide insights, and documents and internal files were reviewed.

Assessment of Economy and Efficiency

To assess economy and efficiency, the evaluation:

  • Conducted a resource allocation/utilization review at the program level through an assessment of available financial data (e.g., spending levels, trends, issues, etc.);
  • Conducted key informant interviews regarding how resources were used, the effectiveness of program management, and the efficiency of business processes;
  • Obtained and reviewed qualitative data from key stakeholders regarding the efficiency and economy of the program (e.g., views on approaches to improve efficiencies, cost savings, leveraging, quality of outputs for resource spent, etc.); and
  • Examined the sufficiency and use of performance information to enhance economy and efficiency.

Data Analysis

Data were analyzed by triangulating information gathered from the different sources and methods listed above; NVivo software was used for the qualitative analysis. The data collected were analyzed using the following methods:

  • Systematic compilation, review and summarization of data to illustrate key findings;
  • Statistical analysis of quantitative data from databases;
  • Thematic analysis of qualitative data; and
  • Comparative analysis of data from disparate sources to validate summary findings.

Summary analyses of the data collected were recorded by evaluation question, using an Evaluation Directorate template and referencing the sources/methods used to collect the data.

