Evaluation of the Reviews and Interventions Pilot Project

2. Evaluation Methodology

2.1 Evaluation Questions and Scope

The evaluation of the Pilot was conducted in accordance with the requirements of the Treasury Board Directive on the Evaluation Function (2009) and examined issues of relevance and performance. As the Pilot was recently established, the evaluation assessed: whether the Pilot was implemented as planned; the efficiency of implementation; the added value of CIC's capacity to intervene on issues related to credibility and program integrity; the rationale for separating the responsibility for conducting reviews and interventions between CIC and the CBSA; and the contribution of the Pilot to the expected outcomes of the reforms made to the in-Canada asylum system. An assessment of the CBSA's reviews and interventions function was not within the scope of this evaluation.

The evaluation covered a one-year period, from the coming into force of the reforms in December 2012 through to the end of December 2013, although data to June 2014 were used where available. The evaluation questions, organized by core issue, are presented in Table 2.1 (see the technical appendices for the full set of evaluation questions, indicators, and methodologies). A separate logic model was not developed for the R&I Pilot; however, the Pilot is included as an activity in the logic model for the in-Canada asylum system, which may be found in Appendix A.

Table 2.1: Evaluation Questions: Evaluation of the Reviews and Interventions Pilot Project

Core Issues: Performance

Achievement of Expected Outcomes
  1.1 Was the Ministerial Reviews and Interventions Pilot implemented as planned?
  2.1 To what extent have ministerial reviews and interventions contributed to the integrity of the refugee system thus far?

Demonstration of Efficiency and Economy
  3.1 Was CIC efficient in implementing the Ministerial Reviews and Interventions Pilot?

Core Issues: Relevance

Continued Need for the Program
  4.1 Is there a need to conduct ministerial reviews and interventions for credibility and program integrity (as part of refugee status determination)?

Alignment with Government Priorities
  4.2 Are ministerial reviews and interventions consistent with departmental and government-wide priorities?

Alignment with Federal Roles and Responsibilities
  4.3 Are ministerial reviews and interventions the role and responsibility of the federal government (including CIC vs. CBSA role)?

2.2 Data Collection Methods

Data collection for this evaluation took place between March and July 2014 and included multiple lines of evidence, described below, to help ensure the strength of the information and data collected.

2.2.1 Document Review

Relevant program documents (see the Technical Appendices for the full list) were reviewed to gather background and context on the R&I Pilot and to assess its relevance and performance. The documents reviewed included:

  • government documents, such as Speeches from the Throne, Budget Speeches, and Reports on Plans and Priorities;
  • documents related to the implementation and operation of the Pilot (e.g., budgets, training material, standard operating procedures, processing manuals, and operational dashboards);
  • written RPD decisions; and
  • studies and analyses produced by MACAD (e.g., Metrics of Success Reports).

2.2.2 Interviews

A total of 32 interviews (in-person and telephone) were conducted with three key stakeholder groups (Table 2.2),Footnote 11 including:

  • CIC senior management and program officers involved with the R&I Pilot, both at Headquarters and in the Regions;
  • representatives of the CBSA; and
  • representatives of the IRB.Footnote 12

Table 2.2: Summary of Interviews Completed

Interview Group: Number of Interviews
  CIC representatives: 15
  CBSA representatives: 11
  IRB representatives: 6

Due to the small number of interviews in each interview group, a summary approach to the analysis was used to develop key themes (see the Technical Appendices for the interview guides).

2.2.3 Focus Groups

Five focus groups were conducted with SIOs and RIAs, including three in-person focus groups in the Toronto R&I office (17 participants); one in-person focus group in the Montreal R&I office (three participants); and one telephone focus group with the Vancouver R&I office (three participants). The results of the focus groups were analyzed to determine key themes related to challenges in implementing the Pilot, mechanisms for coordination and communication, and results (see the Technical Appendices for the moderator guide).

2.2.4 Program Data Analysis

Available performance and financial data were used to provide information on the relevance and performance of the Pilot project. Data from the Refugee Claimant Continuum (RCC)Footnote 13 and the National Case Management System (NCMS) were analyzed to determine the number, type, and outcomes of the reviews and interventions conducted, which supported the analysis of the Pilot's contribution to the integrity of the refugee system. Financial data from CIC's Refugee Reform Project Management Office and Financial Management Branch were used to compare planned and actual expenditures for the Pilot.

2.3 Limitations and Considerations

The limitations of the evaluation, as well as the mitigation strategies employed to address them, are described below:

  • Timing of the evaluation. At the time of the evaluation, the Pilot project had been in operation for only one year. Caution should therefore be exercised when reading the results, as they may reflect the early stage of program implementation and transition issues related to the development of new business processes, the hiring of new staff, and the development of new tools. Where early stages of implementation may account for the findings, this is indicated in the report.
  • Access to IRB decision-makers. Given that the IRB is an independent, arm's-length tribunal, it was not possible to obtain the views of IRB decision-makers, which could have supported the analysis of the utility and value added of the reviews and interventions. Instead, the evaluation team reviewed a small sample of IRB decisions, which provided a proxy source of evidence on the utility of the interventions. The evaluation also interviewed representatives of the RPD and Refugee Appeal Division (RAD) Registries to obtain views on the impact of the Pilot on the IRB's registry processes.
  • Data inconsistencies. Data obtained from the RCC differed from data extracted from other administrative information sources (e.g., the NCMS extract, Metrics of Success reports, and Operations Performance Management Branch (OPMB) dashboards). In part, this was the result of different business processes being applied to the different data sources; the differences also reflect the time period for which the data were captured, updated, or cleaned. For example, data used in Metrics of Success reports represent a snapshot extracted at a particular time, whereas data gathered from the dashboards reflect the latest information in NCMS, which may have been improved and/or cleaned over time. These limitations were addressed by using up-to-date data extracted directly from NCMS and validated by OPMB. In addition, other lines of evidence were used alongside the data to support evaluation findings and increase confidence in the results.
