Species at risk program frameworks: chapter 6
- 6.1 Overall Approach
- 6.2 Evaluation Issues
- 6.3 Evaluation Methodologies
- 6.4 Evaluation Timing
- 6.5 Evaluation Costs
Evaluation studies provide information, beyond ongoing monitoring, about key aspects of Program operations and outcomes, as well as about the Program's continued relevance and possible alternatives.
Since the implementation of SARA and the management of the SAR Program are undertaken by EC, DFO and PC, the core departments will work co-operatively on SAR Program evaluations. The Audit and Evaluation Branch of EC will chair the management of the evaluations in close consultation with its counterparts at DFO and PC.
6.1 Overall Approach

The overall approach to monitoring and evaluating any program is one of staged expectations, learning and adjustment. It will be guided by the Program results logic (logic models) and performance measurement strategies presented in sections 3 and 5.
The stages of the evaluation study are:
- data collection (multiple lines of inquiry for each evaluation issue area);
- analysis and development of findings;
- meetings / consultations with core departments to review / verify findings;
- development of conclusions and recommendations; and
- reporting.
The Evaluation Report will summarize findings by
- evaluation issue;
- element of the SAR protection cycle (summary of findings related to design, delivery and success);
- core department (summary of findings related to design, delivery and success); and
- key support mechanisms (effectiveness of inter-departmental management, F/P/T governance, stakeholder consultations, and public education / outreach).
6.2 Evaluation Issues

In developing the evaluation issues, the following factors were considered:
- the Treasury Board Secretariat evaluation issue categories of relevance, success and cost-effectiveness, and design and delivery, as well as the relevant Expenditure Review Committee (ERC) questions;
- the SAR results logic (logic models) and performance measurement strategy;
- the findings of the Formative Evaluation; and
- the significant role of Program partners, Aboriginal people and stakeholders in delivering SAR results.
The evaluation questions fall under four broad issue categories:
- Rationale: This issue examines whether there is a continued need for the SAR Program. At issue is whether the SAR Program is needed to support the conservation and recovery of species at risk and the extent to which the Program facilitates the broad conservation agenda.
- Design and Delivery: This issue and its component questions focus on how the Program is implemented. The Program may be relevant, but flawed in its implementation. The questions under this theme address the effectiveness of the current design, governance structures and delivery approach.
- Success / Program Impact: Questions under this theme assess the extent to which the expected outcomes (as identified in the logic model) have been achieved. The indicators and performance measurement strategy form the core components of this assessment.
- Cost-effectiveness / Alternatives: The questions under this theme explore the cost of delivery with a view to identifying more efficient or effective approaches where possible. Without a simple outcome indicator or other programs to serve as a benchmark, this question will be answered qualitatively based on the findings and conclusions to the questions of program design, delivery and success.
Table 14 presents the evaluation issues and questions, together with the data sources and data collection methods. For each question, multiple lines of evidence are suggested to increase the reliability and validity of the evaluation information.
| Evaluation Theme and Questions | Data Source | Data Collection / Analysis Methods |
|---|---|---|
| Program Rationale (SAR Program) | | |
| Is there a legitimate and necessary role for the government in the protection and recovery of species at risk? Is the current role of the federal government appropriate, or are there areas that are candidates for re-alignment with the provinces, territories or others? What activities could be transferred to the private or voluntary sectors, or to other levels of government? | Literature; Program managers; experts; Program partners; Aboriginal people; Program stakeholders | Literature review; interviews; expert panel |
| Program Design and Delivery (for each department, SAR element) | | |
| Are the roles and responsibilities of COSEWIC, the core departments and P/Ts in the implementation of the Accord and SARA clear? Is there an appropriate accountability framework in place? Do the governance structures and mechanisms support interdepartmental and interjurisdictional cooperation and consistency in the application of SARA? To what extent do decision-making and planning processes allow for strategic ranking of species and prioritization of activities? To what extent are multi-species and ecosystem-based analyses used? Are consultations with partners, Aboriginal people and stakeholders effective? Are there ongoing opportunities for partners, Aboriginal people and stakeholders to provide input? Do SAR Program policies support consistent program delivery and implementation of SARA across the core departments? Is there an effective performance monitoring system in place to support program management and demonstrate results? Is the overall capacity (resources) commensurate with the program design, delivery and results expectations? To what extent have Program risks been effectively managed? Is the risk management strategy adequate? Have new risk areas emerged? | Program managers; Program partners; Aboriginal people; inter-departmental Action Plan Status Report; Program measurement strategy / indicators | Program documentation review; interviews; surveys; case studies; workshop |
| Program Success / Impacts (for each core department, by SAR element) | | |
| To what extent has the SAR Program generated the expected outputs? (See Table 13 for a list of outputs and indicators.) | Performance measurement strategy (Table 13); Program managers | Performance indicator assessment; program documentation review; interviews; surveys |
| To what extent have the expected outcomes been achieved? (See Table 13 for a list of expected outcomes and indicators.) | Performance measurement strategy (Table 13); Program managers; experts; Program partners; Aboriginal people; Program stakeholders | Performance indicator assessment; interviews; surveys; case studies; workshop |
| To what extent are federal and provincial governments collaborating in support of the Accord and the Act? Are there effective mechanisms in place to coordinate delivery (e.g., bilateral agreements)? Have the mandatory requirements of SARA been met? Is the intent of the Act being met? What are the barriers to success? | Program managers; experts; Program partners; Aboriginal people | Interviews; surveys; case studies; workshop |
| Cost-Effectiveness (SAR Program) | | |
| Are there better ways of achieving the results, including alternatives for delivery? Could efficiency be improved? To what extent do SARA and the SAR Program complement or duplicate other federal legislation and/or provincial or territorial legislation and programs? What is the value of a multi-species or ecosystem-based approach as compared with a single-species approach? Is the overall SAR Program affordable? If not, what programs or activities would be abandoned? | Evaluation findings, including program leverage; Program financial data; core department SAR Program managers; P/T SAR managers | Synthesis of evaluation findings; analyses of delivery costs; program documentation review; interviews; surveys; workshop |
6.3 Evaluation Methodologies

Program evaluation methodologies that could be considered include:
- Review of program documentation, administrative data and literature – Program documents, such as background documents on the programs, procedures, project proposals, project files and other documents, will be useful in helping the evaluators familiarize themselves with the Program and its evolution, as well as in addressing specific issues. Administrative data include memoranda of understanding, minutes of meetings, and contribution agreements.
- Interviews – In-depth interviews (telephone and/or in person) will be required with a wide range of partners, Aboriginal people and stakeholders, including Program management and staff, experts, P/T partners, land claim agreement authorities, Aboriginal groups, industry, project participants and other stakeholders.
- Consultations with Program partners and Aboriginal people – Because of the importance of Program partners and Aboriginal people to the success of the SAR Program, in-depth consultations (including workshops and regional visits) will be required.
- Surveys – Surveys of Program partners, Aboriginal people and stakeholders can be useful for capturing the views of these large groups.
- Expert Panel meeting – An expert panel can be useful for bringing together various viewpoints on the overall rationale and success of the Program. Experts can include those with scientific expertise as well as program implementation experience. Participants should represent the range of partners, Aboriginal people and stakeholders involved in the SAR Program.
- Workshop – A workshop with representatives from the core departments (HQ and regions) can be used to help clarify findings and contribute to the overall analysis and conclusions.
- Case studies – Case studies are a useful method for assessing some of the project impacts / results in greater depth. Involving the review of project documents and interviews with program managers, project partners, Aboriginal people and other stakeholders, they provide qualitative information on results and lessons learned. Case studies can be used to address horizontal issues (e.g., public education and outreach, governance structures), issues that emerge during the evaluation, and the actions taken for specific species or ecosystems.
6.4 Evaluation Timing

SARA legislation must be reviewed by Parliament after it has been in force for five years. The five-year review of SARA is expected to begin in mid-2008.
Several factors determine the timing of the outcome evaluation: the Formative Evaluation of the SAR Program, completed in 2006 during the early stages of Program implementation, made several recommendations; the SAR Program RMAF and RBAF will be in place in 2007-2008; and the Parliamentary Review of SARA will be initiated in mid-2008. The outcome evaluation of the SAR Program will therefore be undertaken in 2010-2011 (five years after the Formative Evaluation, three years after the establishment of the RMAF and RBAF, and possibly one year after the Parliamentary Review, depending on when that review is completed). This will provide adequate time for the SAR Program to make the required adjustments in response to the recommendations of the Formative Evaluation and, possibly, the Parliamentary Review, and to gather at least three years' worth of information on the RMAF and RBAF performance indicators, which is essential for a meaningful and valuable evaluation. It will also allow the SAR Program to return to government in 2011-2012 on the need for additional funding (the five-year funding received from the government ends in 2011-2012) with the support of the outcome evaluation report.
An evaluation plan for the outcome evaluation will be developed by the core departments in 2009-2010, prior to the evaluation itself. The Audit and Evaluation Branch of EC will chair the management of the evaluation plan in close consultation with its counterparts at DFO and PC.
6.5 Evaluation Costs

An estimated budget of $250,000 will be required to develop the evaluation plan and undertake the outcome evaluation. The funds for this work will be distributed in proportion to the level of funds available to each organization from the federal Species at Risk Program.
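To illustrate the proportional cost-sharing rule, the minimal sketch below applies it to the $250,000 evaluation budget using hypothetical departmental funding figures; the actual SAR Program allocations to EC, DFO and PC are not stated in this framework, so the dollar amounts shown are placeholders only.

```python
# Hypothetical illustration of the proportional cost-sharing rule described above.
# The departmental SAR Program funding levels are placeholders, not actual figures.
EVALUATION_BUDGET = 250_000  # estimated cost of the evaluation plan and outcome evaluation

sar_program_funds = {  # assumed SAR Program funding by organization (illustrative only)
    "EC": 60_000_000,
    "DFO": 25_000_000,
    "PC": 15_000_000,
}

total_funds = sum(sar_program_funds.values())

# Each organization contributes a share of the evaluation budget proportional to
# its share of federal SAR Program funding.
shares = {
    org: EVALUATION_BUDGET * funds / total_funds
    for org, funds in sar_program_funds.items()
}

for org, share in shares.items():
    print(f"{org}: ${share:,.0f}")
# With these placeholder figures: EC $150,000; DFO $62,500; PC $37,500.
```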