ARCHIVED – Formative Evaluation of the Pre–Removal Risk Assessment Program

2.0 Evaluation Issues and Methodology

This section of the report presents a summary of the evaluation questions and methodology, including its limitations. The evaluation issues and questions are presented in the table in Section 2.1.

2.1 Evaluation Issues and Questions

This study was based on an evaluation matrix developed in November 2006. During the planning phase in January 2007, the evaluation methodologies were confirmed and data availability was assessed. See Appendix A for the logic model and Appendix B for the evaluation matrix.

The evaluation questions were organized under the issue areas of relevance; design and delivery (consistency across regions, decision-making and program integrity); program results; and cost-effectiveness. These issue areas were assessed through 16 evaluation questions, provided in the table below.

Table 1. Evaluation Issues and Questions
Relevance
  • Is there a continued need for the PRRA program?
  • Is the PRRA Program consistent with CIC and GoC priorities?
Design and Delivery: Consistency Across Regions
  • To what extent is the application of PRRA policies and procedures consistent across regional offices?
  • To what extent is the planning and program delivery coordinated between CIC NHQ and Regional PRRA offices?
  • Are PRRA officers equipped with the skills, knowledge and tools needed to efficiently process cases?
Design and Delivery: Decision-Making
  • Are PRRA assessments and decisions sound and fair?
  • Are decisions made in a timely fashion?
  • What has been the impact of the ‘single-decision-maker’ model (in terms of workload, processing)?
Design and Delivery: Program Integrity
  • Is there continued program integrity?
  • Are potential program improvements identified and provided to inform policy and program development?
Program Results
  • Has PRRA been successful in granting protection to those in need?
  • To what extent does the program support Canada’s international obligations and adherence to the principle of non-refoulement?
  • Is the planning and program delivery coordinated between CIC and CBSA (at the national and regional levels)?
  • What has been the impact of PRRA on the removals process?
  • Have there been any unexpected outcomes?
Cost-Effectiveness
  • Is the operation of PRRA cost-effective?

2.2 Methodology

The evaluation was designed to use multiple lines of evidence and qualitative and quantitative research methods. A description of each of these methods follows.

Document Review

Documents reviewed for this evaluation are listed in Appendix C. These included, but were not limited to:

  • background and contextual documents such as process maps, Standing Committee notes, and the Thompson report[note 9];
  • program-specific documents such as the Evaluation Matrix, best practices reports, QA tools and PRRA conference reports; and
  • operational documents, which included processing manuals, PRRA decision templates, training materials and presentations.

Interviews

Forty interviews were conducted with PRRA program stakeholders, by telephone or in person. Interview guides were provided to all interviewees in advance of the interviews. The results of the interviews were summarized and analyzed by performance indicator, as per the evaluation framework. The breakdown of interviews is shown in Table 2 (see Appendix D for a detailed list of interviewees and Appendix E for the interview guides used).

Table 2. Number of Interviews, by Stakeholder Group

Stakeholder Group Number of Interviews
CIC Headquarters 7
CIC PRRA Offices 14
Canada Border Services Agency (Regions and NHQ) 8
Department of Justice (Regions and NHQ) 6
Non-governmental Organizations 5
Total 40

Focus Groups with PRRA Officers

A total of six focus groups were held with PRRA officers, one in each of the PRRA offices. A total of 37 participants attended the sessions, representing approximately half of all PRRA officers (see Table 3). With the assistance of PRRA coordinators, participants with differing levels of experience and varying backgrounds were selected in order to ensure a comprehensive discussion. See Appendix F for the focus group materials.

Table 3. Summary of Focus Group Participants

PRRA Office Number of Participants
BCY 8
PNT 4
Mississauga[note 10] 8
Niagara Falls 7
Quebec 9
Atlantic[note 11] 1
Total 37

In addition to gathering information on issues related to program design, delivery and success, the focus groups discussed each region's process for assessing PRRA applications. The results of those discussions were used to develop office-specific process maps illustrating the similarities and differences in the processing of applications across the six offices. These process maps are included in the PRRA case studies (see description below).

Quality Assurance (QA) Exercise

To examine the evaluation questions regarding the soundness and fairness of PRRA decisions and the consistency with which policies and procedures are applied across regions, a QA exercise was conducted. A template was developed that built on previously designed QA tools and mirrored the structure of the PRRA decision template used by officers. It was streamlined based on regional input and pre-tested in one office (see Appendix G for the QA tool).

Each PRRA office was asked to designate one or more coordinators to complete the exercise, with a goal of five completed templates per office[note 12]. Coordinators were given guidelines for drawing a random sample of completed cases: they ran a report listing concluded cases (omitting any that did not involve a written decision) and sorted the data by ascending FOSS ID number, thereby mixing case types and decision makers for a more random sample.
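As an illustration only, the selection logic described above can be expressed as a short script. This is a minimal sketch: the record fields (foss_id, written_decision, case_type) and the assumption that the first cases after sorting are the ones assessed are hypothetical stand-ins for the actual NCMS report.

```python
# Minimal sketch of the QA sampling step described above. Field names
# and the "take the first N after sorting" rule are assumptions for
# illustration; they do not reproduce the actual NCMS report layout.

def select_qa_sample(concluded_cases, sample_size=5):
    """Keep concluded cases with a written decision, sort them by
    ascending FOSS ID (mixing case types and decision makers), and
    return the first sample_size records for QA review."""
    eligible = [c for c in concluded_cases if c.get("written_decision")]
    eligible.sort(key=lambda c: c["foss_id"])
    return eligible[:sample_size]

# Example: four concluded cases, one without a written decision.
cases = [
    {"foss_id": 4102, "written_decision": True,  "case_type": "refugee claimant"},
    {"foss_id": 1207, "written_decision": True,  "case_type": "non-claimant"},
    {"foss_id": 2958, "written_decision": False, "case_type": "refugee claimant"},
    {"foss_id": 3311, "written_decision": True,  "case_type": "non-claimant"},
]
print(select_qa_sample(cases, sample_size=2))  # FOSS IDs 1207 and 3311
```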

The results were discussed with the coordinators at the PRRA coordinators conference (May 24–25, 2007) to determine whether there were limitations with the QA tool and to ensure that the results were interpreted correctly. The PRRA coordinators also discussed implementing an ongoing QA process for PRRA, including potential benefits, challenges, and options for QA implementation.

Data Analysis

Complete data, including annual and regional breakdowns, was available for almost every indicator in the evaluation matrix. However, this data is not gathered on an ongoing basis, and assembling it required substantial effort on the part of the Refugees Branch.

PRRA Program Data

The primary source for PRRA case data was the National Case Management System (NCMS), a web-based electronic case-tracking system delivered over CIC’s intranet using a centralized database. NCMS is used by CIC and CBSA personnel in the field and at NHQ, and delivers critical case-tracking support to enforcement personnel across many areas, including removals, refugee processing, detentions, hearings and war crimes. CIC PRRA officers enter case information into this system. NCMS has pre-programmed ‘canned’ reports that can be used to generate tombstone data on PRRA cases, such as the volume of applications, the number of decisions and inventory levels. Obtaining more specific data for many of the indicators (such as country of origin or the ratio of refugee claimants to non-claimants) required extensive manipulation of large amounts of raw data, which was acquired through CIC’s Data Warehouse Services.
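To make the nature of that manipulation concrete, the sketch below tallies raw case records into counts by year and country of origin. The record fields are invented for illustration and do not reflect the actual Data Warehouse schema.

```python
# Minimal sketch of turning raw PRRA case extracts into indicator-level
# counts (here, applications by year and country of origin). Field names
# are invented for illustration, not the actual Data Warehouse schema.
from collections import Counter

def tally_by_year_and_country(raw_cases):
    """Count applications per (year, country_of_origin) pair."""
    return Counter((c["year"], c["country_of_origin"]) for c in raw_cases)

# Example extract with three raw case records.
raw_cases = [
    {"year": 2005, "country_of_origin": "Mexico"},
    {"year": 2005, "country_of_origin": "Mexico"},
    {"year": 2006, "country_of_origin": "Colombia"},
]
for (year, country), count in sorted(tally_by_year_and_country(raw_cases).items()):
    print(year, country, count)
```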

Legal Data

Information regarding PRRA legal challenges was provided by DOJ using its iCase tracking system. This system monitors the number and outcomes of legal challenges to PRRA decisions in Federal Court.

Additional Data Sources

The PRRA offices provided region-specific resource information and statistical data on case volumes and inventories, while the Finance Branch provided budgeted and actual program resource information.

Case Studies

To provide a ‘snapshot’ of PRRA in each office, the evaluation undertook six case studies. The case studies examined key program statistics, the PRRA process, lessons learned and key issues. Each study included a process map illustrating the case assessment process. The case studies are presented under separate cover (Formative Evaluation of the Pre-Removal Risk Assessment Program: Regional Case Studies).

Overall Analysis

To complete the overall analysis of evaluation findings, a set of evidence matrices was developed. Findings for each indicator were entered into a matrix according to the data source used. The individual matrices for each data source were then rolled into a summary of all of the findings and cross-referenced against the evaluation questions.
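The roll-up can be pictured as a simple merge of per-source matrices into a question-level summary. The sketch below is illustrative only; the indicator codes, question labels and findings are all invented.

```python
# Minimal sketch of the evidence-matrix roll-up: findings are recorded
# per indicator for each data source, then merged and cross-referenced
# against the evaluation questions. All identifiers are invented.
from collections import defaultdict

# One matrix per data source: indicator -> finding.
matrices = {
    "interviews":   {"IND-3.1": "Policies applied consistently in most offices."},
    "qa_exercise":  {"IND-3.1": "Decision templates used in all sampled cases."},
    "program_data": {"IND-5.2": "Median processing time stable across years."},
}

# Evaluation framework: question -> indicators that inform it.
framework = {
    "Q3 Consistency across regions": ["IND-3.1"],
    "Q7 Timeliness of decisions":    ["IND-5.2"],
}

# Roll the individual matrices into a summary keyed by question.
summary = defaultdict(list)
for source, matrix in matrices.items():
    for indicator, finding in matrix.items():
        for question, indicators in framework.items():
            if indicator in indicators:
                summary[question].append((source, indicator, finding))

for question, evidence in summary.items():
    print(question)
    for source, indicator, finding in evidence:
        print(f"  [{source}] {indicator}: {finding}")
```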

2.3 Limitations of the Methodology

There were some limitations to the methodology, particularly with respect to the quality assurance exercise.

QA Exercise

For reasons of practicality, and after consultations with coordinators and program representatives, it was decided that PRRA coordinators would assess case decisions from their own offices rather than transferring cases among regions. The administrative burden and logistical difficulty of creating a ‘double-blind’ sample, by removing all case and office identifiers and transporting the files among offices, presented too great a challenge within the timeframe of the evaluation. While restricting the QA assessment to each coordinator’s own region reduced the randomness of the sample to a degree, steps were taken to generate as random a sample as possible (see the QA exercise section above for details).

A total of 32 QA templates were completed in five PRRA offices.[note 13] This sample was acknowledged not to be of a representative size; rather, the results were used as only one indicator of the consistency and quality of decision-making, to corroborate other evidence and support the qualitative findings in this area.

____________

9. The Thompson report was a brief review of the PRRA program, completed in 2003.

10. This included two participants from the CIC Toronto–St. Clair office.

11. With only one PRRA officer for the region, the focus group discussion guide was used in an interview setting.

12. No cases from the Atlantic region were reviewed because there is no coordinator position in that region.

13. BCY (10), PNT (5), Mississauga (6), Niagara Falls (5), Quebec (6). As there is no PRRA coordinator in the Halifax office, the Atlantic PRRA officer’s cases were excluded from the QA exercise; the Vancouver office agreed to assess five additional cases.
