Section 2: Evaluation of the International Health Grants Program (2008–09 to 2012–13) – Evaluation methodology

2. Evaluation methodology

2.1 Purpose and scope of the evaluation

The goal of the evaluation was to provide the senior management of the Health Portfolio with a neutral and evidence-based assessment of the relevance and performance of the Program. The evaluation was also intended to inform the renewal of the Program, which is currently being finalized, and the future direction of the Program in the newly combined Office of International Affairs for the Health Portfolio.

The following questions guided the collection of data, and all were addressed by the evaluation. Key findings for each question were then synthesized into broader conclusions.

Relevance

  • Continued need for the program:
    • Is there a continued need for a project-based component?
    • Is there a continued need for Canada to be a member of PAHO and IARC?
  • Alignment with Government of Canada priorities:
    • Do the memberships in PAHO and IARC align with Government priorities?
    • Do the project component activities align with the Health Portfolio’s strategic priorities/outcomes?
  • Alignment with Federal roles and responsibilities:
    • Are the program’s activities and objectives aligned with the portfolio’s jurisdictional, mandated and/or legislated role (2008-2013)?

Performance

  • Achievement of expected outcomes:
    • To what extent does the Program inform Government of Canada policies related to global health issues in order to improve the health of Canadians?
    • To what extent do the projects funded by the Program contribute to the achievement of Program objectives?
    • What are the lessons learned that would assist in determining the future direction of the Program?
  • Demonstration of Efficiency and Economy
    • Are there efficiencies to be gained in the management of the Program?
    • Are the two components of the Program the best approach to meet the Program’s objectives?
    • Is there performance measurement in place? How is the program using performance measurement to improve the design, delivery and reporting?

The last evaluation was completed in 2007-2008 (Endnote 24) and a mid-point review of the Program was completed in July 2010 (Endnote 25). Neither the evaluation nor the review assessed the PAHO membership.

The current evaluation covered activities related to the membership in PAHO, as well as activities related to the project component, including commitments to international organizations.

The scope of this evaluation did not include the World Health Organization — Framework Convention on Tobacco Control, as it was previously evaluated in the Horizontal Evaluation of the Federal Tobacco Control Strategy (Endnote 26) in 2012. Also, as the PAHO/World Health Organization Collaborating Centres were not funded through the Program, they have not been assessed through this evaluation.

2.2 Evaluation design

An objective-based evaluation approach was used to assess progress toward the expected outcomes, identify any unintended consequences and capture lessons learned.

The Treasury Board’s Policy on Evaluation (2009) guided the identification of the evaluation design and data collection methods so that the evaluation would meet the objectives and requirements of the Policy.

The evaluation was designed and conducted by the Public Health Agency of Canada/Health Canada Evaluation Directorate, a group internal to the Public Health Agency but not involved in the program areas responsible for international activities. The conduct of the evaluation was guided by the Public Health Agency of Canada’s Evaluation Committee.

2.3 Data collection

2.3.1 Evaluation framework

An evaluation scoping and planning exercise took place from November 2012 to February 2013. This included:

  • a review of Health Canada plans, strategies and program-specific documents and reports
  • a preliminary review of the published literature
  • a review of past and current assessments
  • a review of the logic model
  • semi-structured key informant interviews.

The scoping informed the development of an evaluation framework, the identification of evaluation methods and the development of the interview guides for the data collection phase of the evaluation.

2.3.2 Lines of evidence

This evaluation incorporated multiple lines of evidence and a combination of qualitative and quantitative measures. Lines of evidence included a review of literature and relevant documents, as well as key informant interviews. Evaluators conducted the interviews using semi-structured interview guides that followed the evaluation questions. Results were triangulated to ensure a balanced analysis of the relevance and performance of the Program.

In 2006 (Endnote 27) and again in 2009 (Endnote 28), the Department of Foreign Affairs and International Trade led and coordinated a horizontal review of all international memberships to identify the highest and lowest priority/value memberships. The effectiveness of Canada's engagement with PAHO and with IARC was reviewed in both instances with the close collaboration of Health Canada. The current evaluation drew on the findings of these 2006 and 2009 reviews.

Other key sources of evidence included Health Canada’s mid-point review (2010) and summative evaluation (2008) of the Program. Additional evidence was collected through interviews, recent documents and the literature review.

The lines of evidence for the evaluation are further described in Table 3.

Table 3: Lines of evidence
Literature review

Evaluators conducted a brief and focused review of the literature to explore how other countries assess their relations with international organizations and how they measure their engagement to achieve the maximum benefits in support of their domestic priorities. An internet search revealed that several countries are now reviewing their contributions to multilateral organizations, mainly to assess the validity of using these organizations to deliver their Official Development Assistance programs.
Administrative file and document review

In total, the evaluators reviewed 250 documents, not including websites. The types of documents reviewed included:
  • program authorities, acts, protocols and agreements
  • performance measurement information (where available)
  • strategic plans
  • external evaluation reports (conducted by others, including the Canadian International Development Agency, the Department of Foreign Affairs and International Trade and also PAHO)
  • foundational documents, briefing notes and decks
  • financial data
  • international documents on multilateral organizations
  • position papers
  • presentations, including to the Minister and Executive Committee.
Key informant interviews

From January to early March 2013, 16 interviews were conducted. A list of key informants was compiled with the help of the program area. Additional contacts were identified during interviews for more specific program elements.

Interviews were held with key Health Canada and Public Health Agency senior management and program staff, as well as key external stakeholders in the Department of Foreign Affairs and International Trade, the Canadian International Development Agency, the Organisation of American States Permanent Mission, PAHO/World Health Organization Collaborating Centres and the Canadian Institutes of Health Research. Individuals with involvement or experience in international affairs and in the Grants and Contributions associated with the Program, whether within Health Canada, within the Public Health Agency or externally, were targeted for the interviews. Following a presentation to the Evaluation Committee, and given its interest in examining the engagement with PAHO further, additional interviews, both internal and external to the Public Health Agency, were conducted.

The interviews addressed evaluation issues related to relevance and performance, including roles and responsibilities; the benefits for Canada to participate in international organizations; the activities related to Canada’s engagement; and their views of Canada’s engagement/performance in these organizations. The interviews also identified areas for improvement and existing gaps (challenges) and suggestions on how to adjust the engagement as a result of the merger of the two international affairs offices of Health Canada and the Public Health Agency of Canada.

Case studies

Two projects and one PAHO/World Health Organization Collaborating Centre were selected for the case studies due to their success. These were not intended to be a representative sample, but rather to illustrate best practices and how a small investment could have significant returns for Canada.

The case studies examined:

  • context: background information, climate, concerns, issues
  • strategies: approaches taken, agencies and actors involved
  • challenges: concerns that emerged, various perspectives
  • outcomes: accomplishments, changes, lessons learned.

The following were selected for the case studies:

  • Pan American/World Health Organization Collaborating Centre on Noncommunicable Disease Policy
  • Prenatal manganese exposure from mancozeb use: expanding knowledge from a birth cohort study on pesticides and children’s development
  • Establishment of collaborative relationships between the Province of Tierra del Fuego, Argentina and Nunavut, Canada.

2.4 Challenges and limitations

This section describes the challenges encountered during the evaluation, and the limitations in the design and methods that could affect the validity and reliability of the evaluation findings, conclusions and recommendations. The evaluation team identified mitigation strategies to ensure that the findings can be used with confidence to guide program planning and decision-making. Challenges, limitations and mitigation strategies are described in Table 4.

Table 4: Challenges, limitations and mitigation strategies
Limitation: Reliance on secondary sources of information for the multilateral organizations

When using data from secondary sources, there is a risk that the information provided is not comprehensive or balanced.

For example, the reviews of the memberships conducted by the Department of Foreign Affairs and International Trade in 2006 and 2009 did not fully cover the Evaluation Policy core issues on Relevance and Performance.

The Summative Evaluation (2008) and the Mid-Point Review (2010) did not cover the memberships in the multilateral organizations, although the Mid-Point Review acknowledged that 95 per cent of the total program budget was used to cover international memberships (Endnote 29).

Mitigation strategies:

  • Information from secondary sources was confirmed through other data sources, and additional information was collected to address the gaps in the secondary sources and to ensure that the evaluation addressed all core issues under relevance and performance.
  • The evaluation team confirmed that the mid-point review and summative evaluation had been reviewed and approved by Health Canada’s centralized evaluation unit.
  • The majority of the Program budget supports memberships in multilateral organizations. As such, the evaluation focused on the relevance and performance of the memberships. For the project component, the evaluation drew mainly from the 2010 Mid-Point Review to assess relevance and performance.
Limitation: Limited consultation with other federal departments engaged in international work, or directly with international partners, due to tight timelines, the identification of the evaluation as low risk in the Departmental Evaluation Plan, and the availability of secondary data

Reliance on the perceptions of internal key informants and on secondary data sources, which may not be systematic and free of bias.
Mitigation strategies:

  • Triangulation of interviews with other lines of evidence:
    • Extensive document and file review, supported by internal interviews, to highlight the most relevant information and perspectives
    • Secondary information on international health grants was obtained through other evaluations, audits and reviews.
Limitation: Limited systematic performance measurement or financial data

The absence of consistent performance measures made comparison of activities or products over time difficult. The lack of regular collection of performance data and assessment of progress toward program outcomes limited the ability to examine performance and efficiencies within the program area. Also, the existing logic model for the Program only applied to the project component. There was no logic model for the PAHO component.

Overall, this made it difficult to define what the Government of Canada wanted to achieve through its membership in PAHO.

Mitigation strategies:

  • Performance information collected through the departmental performance report and through interviews was used, where available.
  • Reliance on secondary performance data, interviews and triangulation of information from the various lines of evidence allowed for validation of findings.
  • The efficiency of the membership was assessed by examining how activities were conducted and how strategies were put in place to achieve goals and objectives.