Federal Initiative to Address HIV/AIDS in Canada Implementation Evaluation Report


II. Methodology

A. Approach

This evaluation study was concerned with the following:

  • continued need for a federal role in addressing HIV/AIDS in Canada;
  • status of program implementation;
  • governance and performance management of the Federal Initiative; and
  • evidence of success.

The Federal Initiative is a broad-based approach to HIV/AIDS, involving four departments and agencies that undertake a wide range of activities to increase HIV/AIDS prevention and improve access to diagnosis, care, treatment, and support. The intended outcome of this study is to provide a map for better aligning this complex array of activities, so that greater efficiency and effectiveness lead to the intended program results.

Evaluation Strategy

This evaluation examined how the overarching initiative interacted with its individual program activities. The approach was designed to provide the principal users of the findings with information to improve the management and delivery of the Federal Initiative; it also helped focus the requirements for the summative evaluation. Certain events that occurred during the 2007-2008 evaluation exercise assisted the analysis and guided discussions and recommendations. Self-assessment tools were completed by each program and are included in Annex II - Tools. Evaluation data were cross-referenced against corporate and program reports, as well as against evidence from direct observation. The results are summarized in tables annexed to this report (Annexes I-1 to I-7 – Supporting Analytical Tables).

Relevance

To confirm program relevance, a comprehensive literature review was undertaken, including the most recent epidemiological and surveillance reports. Two public opinion research studies gauging Canadians’ awareness of HIV/AIDS provided additional evidence of relevance. Internal documents were reviewed to determine whether the activities undertaken by the programs remained appropriate to the federal role. The most pertinent documents are referenced in the bibliography.

Design and Delivery

The following four principal lines of enquiry were used to answer three questions related to design and delivery:

  1. Review of Corporate Documentation – The most relevant were the various Federal Initiative program documents; Departmental Performance Reports (DPRs), including the horizontal annexes; and annual reports (CSHA 2004, Federal Initiative 2005 and draft 2006). The following component studies of PHAC Gs&Cs were reviewed for evidence of the shift from the CSHA to Federal Initiative priorities:
    • Report on National HIV/AIDS Programs – Grants and Contributions Review of Four National Funding Programs (June 2005) and the Requests for Proposals subsequently posted on the PHAC website;
    • AIDS Community Action Program (ACAP) – Grants and Contributions Allocation project (San Patten and Associates, March 2006) and subsequent ACAP regional funding allocations;
    • Final Report of the AIDS Community Action Program (ACAP) Program Review (The Alder Group, September 2006) and the management response;
    • Review of the Non-Reserve First Nations, Métis and Inuit Communities – HIV/AIDS Project Fund, Final Report (January 2006) and the Request for Proposals posted on the PHAC website in response to this review; and
    • the ACAP Evaluation Framework (ACAP 2007).
  2. Literature Review – A review of the literature on approaches to increasing HIV/AIDS program effectiveness in program delivery, best practices, governance, and risk management was undertaken, along with a review of summative evaluation frameworks and the evaluative capacity of organizations.
  3. Key Informant Interviews
    • Ten informants from the RCs were chosen for their knowledge of corporate history and their responsibilities regarding managing and coordinating the Federal Initiative. They provided information and guidance and reviewed drafts for accuracy in 2008.
    • Interviews with representatives from the RCs were conducted to gather data regarding responsibilities and contributions to Federal Initiative outcomes. These were undertaken by external consultants for two separate exercises in 2006. Twenty-one informants were interviewed in total (12 for the Performance Measurement Framework project (Annex II-4), and 9 for the Annual Report (Annex II-2)). Templates are provided in Annexes II-1 to II-4 – Tools.
  4. Information Gathered from Federal Initiative Programs (using standard templates and tools)
    • A self-assessment template was developed to identify issues related to program design and implementation and test the alignment of the RCs with the Federal Initiative logic model (Annex II-1). This template was completed by all RCs (July to October 2008).
    • A performance measurement template (Annex II-3) was developed to collect performance data (summer 2007).

Implementation of New Activities

To determine whether programs were implemented as planned, an analysis of the following questions was carried out:

  • How were new program activities integrated into the five Federal Initiative areas of action? (Annex I-2)
  • Were new program activities introduced and funded as planned? (Annexes I-2 and I-3)
  • Were milestones reached as planned? (Annex I-3)

To assess the degree of implementation of planned activities, the following sources were analyzed: financial documents, DPRs (2005-2007), Federal Initiative annual reports, key informant interview reports, and program templates. Annex I-3 – Implementation Status of Federal Initiative New Investments – summarizes the information collected on newly funded activities and their degree of implementation. The following measures were used:

  1. Full Implementation – Satisfactory evidence collected on new activities; key outputs or special projects are completed or fully operating, and outcomes are well documented.
  2. Advanced Implementation – Plans are developed; activities are fully described; performance data and some key outputs are available; outcomes are less well described.
  3. Implementation in Progress – Plans are developed and activities are generally described; performance data is available; key outputs are absent.
  4. Not Implemented – No evidence of activities or plans.
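Read as a decision rule, these four levels amount to a simple classification over the presence or absence of evidence. The minimal sketch below illustrates that logic only; it is not a tool used in the evaluation, and the evidence fields and exact conditions shown are assumptions made for demonstration.

    # Illustrative sketch only: the evaluation assigned these levels by
    # analyst judgement, not software. All field names are assumptions.
    from dataclasses import dataclass

    @dataclass
    class Evidence:
        plans_developed: bool
        activities_described: bool
        performance_data: bool
        key_outputs: bool
        outcomes_documented: bool

    def implementation_status(e: Evidence) -> str:
        """Map collected evidence to the four-level measure above."""
        if e.key_outputs and e.outcomes_documented:
            return "Full Implementation"
        if (e.plans_developed and e.activities_described
                and e.performance_data and e.key_outputs):
            return "Advanced Implementation"
        if e.plans_developed and e.performance_data and not e.key_outputs:
            return "Implementation in Progress"
        if not (e.plans_developed or e.activities_described):
            return "Not Implemented"
        return "Indeterminate"  # evidence does not fit one level cleanly

    # Example: plans and performance data exist, but no key outputs yet.
    status = implementation_status(Evidence(True, True, True, False, False))
    print(status)  # Implementation in Progress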

Governance and Performance

The following indicators were used to determine whether the horizontal initiative met governance expectations for the overall system:

  • proper authorization of programs and activities;
  • clear objectives and performance expectations for the overall initiative, including component programs;
  • good understanding of objectives by program managers and staff;
  • cooperative, coordinated and transparent management and planning processes;
  • agreed-upon and clearly defined accountabilities of the lead and supporting organizations;
  • regular compilation and reporting of performance information – spanning activities, reach, key outputs, and strategic outcomes;
  • performance information facilitating planning and management across all program elements, or interdependent groupings of programs; and
  • integration of performance reporting with the planning and performance reporting cycles of participating departments and agencies.

Information was obtained through interviews, direct observation, documents from RCC and AWG meetings (minutes, terms of reference), and corporate documents (DPRs, Federal Initiative annual reports).

Measurement Strategies for Success

The approach described above under “Design and Delivery” also applied to this section. The information obtained and the resulting analysis can be found in Annexes I-1 to I-7 – Supporting Analytical Tables.

Partner Validation

  • Interim reports were shared with the Federal Initiative AWG to validate the accuracy of data and gather information on the next steps for Federal Initiative programs.
  • An Evaluation Steering Committee was created to bring together senior evaluators from HC, CIHR, CSC, and PHAC to assess accuracy, feasibility, and utility.

Data was analyzed and findings summarized in technical reports used as working papers for the preparation of the overall evaluation report. Analytical exhibits and charts included in this study are derived from the technical reports. Technical reports are located in the files of the Accountability and Evaluation Section, HIV/AIDS Policy, Coordination and Programs Division (PHAC).

B. Limitations

Funding: Implications of Ramping-up and Reductions – The ramping-up of additional funding over five years, combined with the allocation of the first year’s new funding ($5 million) to front-line work and activities supporting the target populations, placed a burden on existing staff to meet the objectives of sustaining and strengthening ongoing activities. Resources for new staff were added in year two (2005-2006). Delays in staffing these new positions deferred the start of new activities, including the full implementation of the performance measurement and management system, with consequences elaborated in the discussion of the performance strategy. Reductions to Federal Initiative funding in 2007-2008 also affected positions supporting evaluation and reporting, including this study. For example, PHAC’s planned pan-regional HIV/AIDS Accountability Coordinator position (ACAP) was not staffed, and the work was shared among existing regional evaluators.

Inconsistent Financial Tracking – Federal Initiative funds were allocated by areas of action. Program expenditures were reported by Salaries, Operations and Management (O&M), Gs&Cs and areas of action. However, expenditures were not always reported against the area of action through which funding was received. Corporate budget reductions were not uniformly identified as a separate cost. This lack of reporting consistency limited the ability to make connections between investments and the elements of the Federal Initiative logic model.

Absence of Benchmarks – Another barrier encountered was the lack of benchmarks by which to measure program progress. Without benchmarks or identified best practices, it is extremely challenging to assess program improvements and the degree of implementation.
