Evaluation of Ministerial Instructions (Implementation)

2. Methodology

Data collection for this evaluation took place between March and June 2011. The evaluation covers the period from the coming into force of the 2008 Budget (June 18, 2008) to the end of the data collection phase. Because the scope of the evaluation goes beyond MI1, this period also includes the time after MI2 came into effect (June 26, 2010).

Several lines of enquiry, including both quantitative and qualitative lines of evidence, were used for the evaluation. Although the number and type of methods used for each question varied, all questions were investigated using two or more methods. Multiple lines of evidence allow each question to be examined from several perspectives and give greater confidence in the reliability and validity of the findings when those lines of evidence converge.

2.1 Document review

The purposes of the document review were to enable the evaluators to learn about MI and its context, and to collect pertinent program information. Key documents reviewed included: statistical reports; 2008 federal budget papers; briefing documents; operational procedures for the FSW Program; operational bulletins; the November 2009 Auditor General Report; and more general literature dealing with immigration policy respecting skilled workers. Appendix D provides a list of the documents reviewed.

2.2 Administrative data review

When CIC implemented MI1 and established the CIO, it also initiated comprehensive data-collection and monitoring systems that provided extensive administrative data on the processing of MI applications. The analysis of this data was a key source of information for the evaluation. Data related to MI1, MI2 and pre-MI FSW applicants were extracted from databases provided by CIC’s Research Datamart Portal, from C-50 monthly reports, and from FSW-CIO production summaries.

2.3 Key informant interviews

Interviews with key stakeholders were conducted in order to assess program implementation and operation; to explore interviewees’ perceptions of the success of MI in achieving its immediate and long-term objectives; to examine communications; and to gather suggestions for improving the program. Key informants were identified and interview guides were designed to govern the interviews. A list of interviewees is provided in Appendix E; interview guides and protocols are in Appendix F. A total of 31 people were interviewed, including CIC officials from national headquarters (NHQ), immigration program managers in the missions, and provincial government immigration officials. Table 2-1 provides a breakdown of the interviewees in each category. Interviews lasted between 45 minutes and two hours.

Table 2-1: Summary of interviewees

Interview Group                             Number of Interviewees
CIC Immigration Program Managers (IPMs)     4
Provincial representatives (Footnote 11)    9

2.4 Site visit to the CIO

To learn how MI applications are processed and to ask questions relating to the first set of MI, the evaluation team visited the Centralized Intake Office in Sydney, Nova Scotia. The visit commenced with a tour of the CIO to observe the processing sequence for MI1 and MI2. Next, CIO managers were interviewed in a group session lasting five hours, which explored the CIO staff's perspective on the evaluation issues in depth. Finally, a file review was conducted of a random sample of 90 MI1 applications received by the CIO, comparing their actual progress to the intended design of MI1.

2.5 Survey of visa offices

A survey of Canadian visa offices abroad (CVOA) was carried out in June 2011. Because the number of CVOAs that dealt with MI1 was small, all of them were surveyed; this census approach eliminated sampling error and the need for statistical tests on data obtained from the survey. The questionnaire was devised based on the requirements of the evaluation framework and a copy can be found in Appendix F.

The survey was pre-tested with the visa offices in London and Buffalo. The immigration program managers were asked to fill in the survey and to answer a short list of questions pertaining to the questionnaire. Their feedback was used to revise the questionnaire.

To maximize the response rate, CIC International Region emailed the survey to all CVOAs (except London and Buffalo). Missions were asked for one consolidated response (one survey) per office. The response rate, after a reminder, was 77% (37 of 48).

2.6 Assessment of financial data

Because the evaluation framework included a question on the cost-effectiveness of the CIO, the evaluation team worked with representatives from Finance, Central Processing Region and International Region to compile and analyze financial data. Specifically, the goal was to use data from CIC's Cost Management Model (CMM)Footnote 12 to determine the cost of processing one FSW application received pre-MI and to compare it to the cost of processing one received under MI1.

2.7 Limitations of the methodology

The evaluation contains a balance of qualitative and quantitative lines of evidence and allows for the triangulation of research findings. However, there are two methodological concerns that should be noted.

The first stems from the fact that the primary focus of the study was the implementation of MI1, but data collection took place after MI2 had been in place for almost a year. This introduced the potential for some confusion between the two on the part of key informants. While the evaluation team was very careful to distinguish between the two sets of instructions in interviews, they did not have the same opportunity in the survey, which was administered on-line, with no opportunity to probe or clarify responses. In those cases where other contextual evidence in the survey responses suggested some confusion between MI1 and MI2, the evaluators have not included the response in the analysis.

Another limitation of the methodology had to do with the available financial data. As will be discussed further in section 3.3.4 Economy and Efficiency, CIC maintains financial data on the processing of applications, but it is not sufficiently detailed to allow for a comparison of the pre- and post-MI costs. In order to address this problem, the evaluation expanded its assessment of the CIO to determine whether it is achieving its other objectives, all of which would contribute to the efficiency of the overall process.
