Internal Audit of Immigration Pilot Programs

Internal Audit & Accountability Branch

March 29, 2022

Table of contents

1. Background


  1. The Immigration and Refugee Protection Act (IRPA) of 2002 contains provisions under Section 14.1 that allow the Minister of Immigration, Refugees and Citizenship Canada (IRCC) to issue Ministerial Instructions (MIs), establishing a class of permanent residents as part of the economic class, in support of the Government of Canada’s economic immigration goals. In order to address specific economic needs of various populations, regions, and sectors, MIs have been used to create immigration pilot programs that help establish new economic immigration classes, and test new and innovative approaches to immigration.
  2. As per Section 14.1, these pilot programs are meant to be temporary, with a five-year time limit starting on the day on which the instruction first takes effect, and allow up to 2,750 applications to be processed annually per economic class created. They are assessed for their effectiveness and success before a decision is made about whether the pilot should be transitioned into a permanent program.
  3. Additionally, Section 25.2 of the IRPA provides the Minister of IRCC the authority to establish public policies, based on public policy considerations, that can also be used to create a program or initiative.
  4. There is no universal definition of pilot programs within IRCC. For the purposes of this audit, pilot programs are defined as time-limited programs or initiatives established through MIs under the authority of Section 14.1 of the Act, and used by the Department to test innovative approaches and/or address specific economic needs and/or Government priorities.
  5. The Audit of Immigration Pilot Programs was included in the Department’s 2020-2022 Risk-Based Audit Plan, which was recommended by the Departmental Audit Committee at the June 2020 meeting and subsequently approved by the Deputy Minister.

Operating Environment

  1. The first pilot program established through MIs was launched by IRCC in April 2013. As of November 2021, the Department has initiated a total of 11 pilot programs established through MIs, of which one has been transferred to a permanent program, three are being transferred to permanent programs, three have been replaced or re-designed, and four remain active.
  2. The design, development, implementation, and evaluation of pilot programs at IRCC is a collaborative effort between several branches. Once a pilot program has gone through the senior management committee approval process, the Immigration Branch (IB) often plays a lead role in the policy design, analysis, and development for the program. The Immigration Program Guidance (IPG) Branch supports IB in determining roles and responsibilities related to delivery and implementation, including communicating with the various processing networks and external partners.
  3. IPG then plays a lead role in operational product development, including determining data and information technology (IT) requirements, developing Program Delivery Instructions (PDIs), and website tools. IPG also works with the Communications Branch and IB to develop media releases and is responsible for providing overall functional guidance.
  4. Once a pilot program has been launched, relevant networks (e.g. Centralized Network, Domestic Network, and International Network) will be involved in application intake and processing. IPG will provide regular updates to senior management on pilot program performance, supported by reports developed by the Operations Planning and Performance Branch (OPPB).
  5. The Integrity Risk Management (IRM) Branch will be involved in providing functional direction and strategic support to the Operations Sector for the purpose of enabling consistent and coherent integrity risk management on pilot programs. In certain cases, IRM is also supported by reports developed by OPPB. This includes analyzing the efficacy of existing risk management approaches and controls, and identifying potential gaps and opportunities in pilot programs in order to reduce potentially exploitable program vulnerabilities. Operational and administrative changes may be made to a pilot program during its life based on feedback received from key stakeholders.
  6. Finally, the Research and Evaluation (R&E) Branch may conduct an evaluation of a pilot program partway through its implementation phase, as scheduled in the annual Departmental Evaluation Plan. The purpose of these evaluations is to assess early pilot program outcomes and determine if the pilot program is on track to meet its objectives. The findings and recommendations of the evaluations support the decision-making process at the end of a pilot program’s lifecycle, when the Department must decide whether to terminate the pilot program or mainstream it into a permanent program.
  7. Depending on the needs and objectives of specific pilot programs, other branches within IRCC may be involved at various points during a pilot’s lifecycle.

2. Audit objective, scope and methodology

Audit Objective and Scope

  1. The audit objective was to determine whether effective controls are in place for the development, delivery, and oversight of immigration pilot programs and to identify lessons learned and improvements for the development and management of future pilot programs.
  2. The audit examined structures, controls, and processes to direct, develop, manage, monitor, and assess immigration pilot programs. It also assessed the impact of pilot programs on the Department’s effectiveness in the overall delivery of its immigration and refugee programs. Finally, the audit examined if the Department effectively leverages pilot programs as a mechanism to meet its policy objectives and departmental priorities.
  3. The following pilot programs (some of which include more than one stream) were selected for detailed examination:
    • Start-Up Visa Program (2013-2018)
    • Atlantic Immigration Pilot (2017-2021)
    • Caregivers Pilots (the Home Child Care Provider and Home Support Worker pilots) (2019-2024)
  4. The selected pilot programs were chosen based on consultations with implicated IRCC sectors (i.e. Operations and Strategic and Program Policy) and the following considerations: MIs pilot type (i.e. regional economic pilot program or sectoral economic pilot program); funding type (i.e. funded or unfunded); response type (i.e. to test an innovative approach, or to address specific needs raised by the Canadian population); and pilot status (i.e. underway, converted to a permanent program, or discontinued).
  5. The Start-Up Visa (SUV) Pilot was the first pilot program established through MIs, launched by IRCC in April 2013. The SUV Pilot has since been transitioned into a permanent program. The Atlantic Immigration Pilot (AIP) was in the process of being transitioned into a permanent program during the audit. The AIP closed on December 31, 2021, along with its pilot bilateral agreements. However, the MIs for the pilot do not expire until March 5, 2022, which grants IRCC the authority to continue accepting pilot applications until that date. IRCC will start accepting permanent resident (PR) applications under the permanent AIP as of March 6, 2022. As for the Caregivers Pilots, the Caring for Children and Caring for People with High Medical Needs pilots were established in 2014. These were replaced with two new caregiver pilots in June 2019 – the Home Child Care Provider and Home Support Worker pilots.


Methodology

  1. The following procedures were performed as part of the audit:
    • Conducted 27 interviews with key personnel to solicit their views on the governance, risk management, and monitoring activities in place; program efficiencies; and lessons learned for the pilot programs selected for detailed examination. Stakeholders interviewed included representatives from IRCC senior management, program managers and program employees from 13 branches, as well as external stakeholders from the Atlantic Canada Opportunities Agency and the four Atlantic provinces (NL, NB, NS and PEI);
    • Reviewed key supporting documentation such as presentations to senior management and governance committees, Performance Information Profiles (PIPs), evaluations, PDIs, internal Standard Operating Procedures (SOPs), among others, to identify and confirm the existence of controls for development, management and oversight of the pilot programs;
    • Examined governance, risk management, and monitoring activities in support of pilot programs; and
    • Carried out analysis of the information and evidence collected.
  2. The audit included the following criteria to evaluate the extent to which the audit objective was met:
    • An effective governance and oversight structure has been established to provide ongoing support at a departmental level in development and management of pilot programs; and roles, responsibilities and accountabilities of key players have been defined and communicated.
    • Risk management processes have been in place to ensure the identification, mitigation and monitoring of risks associated with pilot programs, including: the risk of ineffectiveness in delivering immigration and refugee programs (i.e. negative impact on productivity, lack of clarity for clients regarding available pathways, etc.); financial and human resources management risks; fraud; and integrity.
    • Processes have been developed and implemented to monitor, assess and report on the effectiveness and efficiency of pilot programs and their compliance with regulatory requirements, as well as to identify improvements for the management of pilot programs.
  3. The audit did not specifically report on the pilot programs selected for detailed examination, but rather used the results of analysis to identify overall trends in the development and management of pilot programs.
  4. The audit team did not perform in-depth testing and analysis of applications received for specific pilot programs to determine whether they complied with program terms and conditions.
  5. This audit was carried out from June 2021 to August 2021. The audit observations, conclusions and recommendations are based on the work performed.

Statement of Conformance

  1. This audit was planned and conducted in conformance with the Institute of Internal Auditors’ International Professional Practices Framework, as supported by the results of a quality assurance and improvement program.

3. Audit findings and recommendations

Departmental Pilot Guiding Principles and Program Governance and Oversight

Definition of Pilot Programs

  1. Overall, pilot programs included time-limited economic immigration classes that were created outside of the Department’s regulatory immigration programs, but operated within the regulations, and were used to test innovative approaches, and/or address specific economic needs raised by the Canadian population and/or Government priorities. However, there was no universal definition and/or common understanding of pilot programs within the Department in terms of what they are used to achieve and how they are developed.
  2. All MIs, including those through which pilot programs are created, are published in the Canada Gazette and on IRCC’s website, specifying the pilot program criteria along with the launch date. There was no consolidated internal list that tracked all key pilot program-related information and could be used for formal tracking, monitoring and reporting on pilot status, noting whether pilots were funded or unfunded, what stage they were in, key risks and considerations, and performance results.

Guiding Principles for the Use of Pilot Programs

  1. Well defined guiding principles for pilot programs help support better design, management and evaluation of these programs; support decision-making; ensure pilot programs are being leveraged to meet policy objectives and departmental priorities; and enable the Department to add value to clients, employees, and partners.
  2. Currently, pilot programs are leveraged to address emergent needs and to test new and innovative immigration approaches. For example, the AIP was developed to respond to acute demographic and labour market challenges in a specific region, including slow economic growth, an aging workforce, and difficulty attracting and retaining immigrants, while testing an employer-driven approach. The Caregivers Pilots were developed to provide a clearer and more direct pathway to permanent residence for in-home caregivers, while continuing to provide Canadian families with a range of caregiving options. The key feature tested in the Caregivers Pilots was the two-step application process to work in Canada and qualify for permanent residence. In Step 1, the pre-assessment for permanent residence, individuals submit applications for permanent residence and work or study permit(s) for themselves and their family. In Step 2, the final decision for permanent residence, individuals inform IRCC once they have accumulated the two years of required work experience, and IRCC then makes a decision on their permanent residence application.
  3. Overall, pilot programs were viewed as a useful means to explore, test, determine viability, and deploy new innovative approaches, processes and concepts to support the Department in the achievement of its mandate, policy objectives and priorities.
  4. Some interviewees expressed concern regarding how pilot programs were being used, noting that pilots were occasionally leveraged as a faster mechanism to respond to an emerging need or changing government priorities rather than to test new and innovative immigration approaches.
  5. It was noted that the Department did not have formally defined guiding principles to direct the design, management and evaluation of pilot programs and to facilitate decision-making around them. As such, the Department may not have established a sufficiently strong benefits realization management regime to contribute to improving program delivery and achieving organizational, horizontal and Government of Canada objectives and priorities.

Governance and Oversight

  1. An effective governance and oversight structure is essential to support decision-making at the appropriate levels and by the appropriate parties for pilot programs. It was found that existing IRCC governance structures, such as the Policy Committee and the Executive Committee, were leveraged as the main mechanisms to provide executive and senior management level governance and oversight for individual pilot programs at the design phase. Pilot programs go through relevant senior management committees for approval, which provides an appropriate level of rigour. In addition, the approach to pilot programs is confirmed through a Memo to the Minister.
  2. From the design phase and throughout the implementation phase of individual pilots, it was found that the level of senior management oversight varied for the economic pilot programs selected for this audit. For example, in the case of the AIP, monthly dashboards were shared with senior management during implementation, providing an overview of the pilot’s performance. These dashboards included information on the number of employers participating in the pilot, the number of approved PR applications, and the number of admissions to the Atlantic region. Additionally, monthly dashboards on integrity risk trends were provided for the AIP. In the case of the Caregivers Pilots, presentations were provided to EXCOM, MinDM, the Anti-Fraud Working Group, and the Policy Committee at the end of the life cycle of the pilot programs in 2018 to assess next steps, including considerations to transition the pilots into a new permanent program. For the SUV, there was a lack of available documentation of reporting to senior management.
  3. Given that pilot programs are unique in design, objective, and scope, tailoring governance and oversight structures to the needs of each pilot is often necessary. However, while governance and oversight are established at the program level, there is an opportunity to strengthen existing oversight structures by bringing a holistic perspective to the management of the portfolio of IRCC pilot programs. This would enable effective implementation of the departmental pilot program guiding principles, allow for holistic oversight of pilot program performance, promote coordinated management of pilot programs, reduce common risks, and ensure that lessons learned are applied and that potential process efficiencies are leveraged in the implementation of existing programs or new pilot programs.

Perceived Overlap Between Core Programs and Pilot Programs

  1. From a policy perspective, pilot programs are distinct, innovative solutions and pathways proposed to reach outcomes. However, a concern was raised that in some cases, the development of pilots was influenced by external factors (i.e. stakeholders, political influences) rather than being put forward by federal and provincial/territorial partners. While this could affect implementation timeframes, the Department worked closely with partners to clearly identify the policy objectives and design a pilot that also met federal/provincial/territorial needs. While changes mid-stream can be important for continuous improvement, there needs to be a formal approach to manage the changes and expectations to ensure they are properly integrated with the intent of the pilot and that their implementation is resourced.
  2. From an operational perspective, feedback from program managers, staff and external stakeholders noted a perceived overlap between pilot programs and core programs and pathways (Footnote 1) with regard to eligibility criteria and application requirements, which creates market confusion among clients. This extends not only to programs within the Department, but also to programs delivered at the provincial level. For example, there are approximately 80 streams across the nine provinces and two territories that participate in the Provincial Nominee Program (PNP). For new applicants, it can be overwhelming to navigate and understand the differences between the various programs (both permanent and pilot) and pathways. Some individuals who apply under the pilots may have been eligible to apply under existing programs. For example, the PNP and the pilot SUV program had some perceived overlap. The PNP was designed to enable provinces and territories (PTs) to create streams to nominate immigrants who meet local labour market and economic needs and who intend to reside in their province or territory. The “business streams” established by PTs, to target businesspeople/entrepreneurs to operate a business and establish themselves in their nominating province or territory, vary between jurisdictions and include a wide range of business types. The SUV was intended to attract cutting-edge entrepreneurs working in technology and innovation. The perceived overlap came by way of PTs having designed streams targeting entrepreneurs in the technology sector, which was similar to the objective of the SUV. Ultimately, SUV applicants included some traditional business entrepreneurs similar to those who fall under the PNP.
  3. Another example of perceived overlap relates to the PNP and the AIP, for which the ultimate outcome and target groups of the two programs may seem duplicative; however, the approach to achieving this outcome is unique. The AIP and the PNP both aim to attract and retain newcomers who will fill local labour market needs in the Atlantic region. The employer-driven model, with an enhanced settlement focus, utilized by the AIP was developed in response to the low retention rates observed in Atlantic Canada’s PNP. Both programs have proven crucial to Atlantic Canada’s ability to attract and retain newcomers who can support its economy.
  4. The Atlantic provinces have proven very adept at engaging stakeholders and directing them to the program best positioned to fill their unique needs, thereby reducing market confusion and the number of applications received. Additionally, AIP employers have access to IRCC’s Dedicated Service Channel (DSC), which provides support in navigating the immigration system. While the DSC does not direct an employer to a specific program, its agents are able to explain the differences in IRCC’s suite of immigration programs if there is confusion. Furthermore, the IRCC Communications Branch uses designated webpages for each pilot and program so that the public can be directed to specific pages where they can find the right information. The Branch also leverages its Come to Canada tool to help potential applicants filter through programs, as well as its Smart Messaging System to ensure that information is concise and that messaging reaches the intended audience.

Financial Costs and Impacts of Pilot Programs

  1. Pilot programs are perceived to have higher upfront operational costs compared to existing permanent programs within the Department. Since pilot programs are often developed to address emerging needs and are time-limited, there is a shorter timeframe available for the design stage. Significant resources are often required to meet shorter deadlines, implement the pilot programs quickly, and adjust to new developments during the implementation stage. Additionally, there is often a requirement for increased coordination between all key stakeholders, including external partners such as communities, PTs, and businesses, to ensure pilot programs are a success.
  2. Although it is generally understood that pilot programs may require more financial and human resources during the design stage, there are several pilot programs that have been developed without any additional resources allocated to meet demands. For example, the AIP, Caregivers, and Agri-Food Pilots were unfunded, and the Rural and Northern Immigration Pilot was partly funded. There was no true costing of pilot programs, and unfunded pilot programs relied on existing departmental capacity to deliver on priorities, increasing the strain on staff.
  3. Additionally, there was no analysis of the impact of pilot programs on the ability of the Department to effectively deliver their core programs or to develop effective mitigation measures to minimize the impact. Training for pilot program-specific operational tasks and activities was inconsistent, making it more difficult for employees to effectively manage their overall workload and limiting the Department’s ability to identify training efficiencies. It was noted that pilot program-specific training is often developed and delivered locally by staff with processing experience. While the training may serve its purpose for new operational staff, there is not necessarily a quality assurance review of the training program, which can affect the relevance and sustainability of that knowledge within the Department.
  4. Without clearly defined guiding principles for managing pilot programs, it may be more difficult to develop, manage and evaluate pilot programs as well as make informed, data-driven decisions regarding their potential transition into permanent programs for the Department. Available financial and human resources may not be prioritized effectively or efficiently and there may be a duplication of effort or additional gaps in the implementation of pilot program operations. Finally, it may be more difficult to identify emerging issues and obstacles in a timely manner or consider lessons learned for new pilot programs or existing permanent programs.
  5. Recommendation 1: The Senior Assistant Deputy Minister (ADM), Strategic and Program Policy in conjunction with the ADM, Operations should formally define a set of guiding principles for pilot programs to help better design, implement and evaluate these programs within the Department as well as determine and put in place mechanisms and controls to enable the implementation of the departmental pilot program guiding principles.

    Management Response: Management agrees with this recommendation. In response, the Immigration Branch will work with the Operations Sector on establishing and integrating guiding principles to help manage pilot programs effectively and standardize their development and implementation.

Accountabilities, Roles, and Responsibilities

  1. Well-defined and documented accountabilities, roles, and responsibilities aligned with the Benefits Framework are essential for ensuring that stakeholders fulfill their responsibilities and that pilot programs achieve expected outcomes and realize anticipated benefits.
  2. Documented guidelines, processes, and procedures provide an authoritative reference to stakeholders and contribute to effective and consistent development, delivery, and monitoring of pilot programs as well as support policymaking regarding their mainstreaming. They are essential in ensuring that stakeholders have a common understanding of objectives, requirements, outputs, and benefits.
  3. Overall, the roles and responsibilities of key parties involved in the development, delivery, monitoring, and mainstreaming of pilot programs within the Department were generally understood. However, it was found that there was limited supporting documentation and no formal framework outlining pilot program-related accountabilities, roles, and responsibilities. General roles and responsibilities have not been defined holistically for all types of pilot programs and all functions (e.g. Benefits Realization).
  4. While IB generally led the development and design phase of pilot programs, the Branch assigned teams for specific pilot programs (i.e. for the AIP, Rural and Northern Immigration Pilot, and Economic Mobility Pathways Pilot) to lead pilot program design and implementation (including tracking and measuring program outcomes, making program changes, working with partners and stakeholders, and briefing management). However, there was no dedicated team or centre of expertise within the Branch to provide formal direction and/or guidance and to incorporate lessons learned from past successes and challenges into the design and implementation of new pilot programs, which affected employee workload.
  5. To enhance the discharge of its roles and responsibilities for the delivery and performance monitoring of economic pilot programs, IPG, which has responsibility over these types of programs, set up a dedicated Innovation on Economic Pilots and Pathways (IEPP) Unit in February 2020, formerly known as the Economic Pilots Task Force. The IEPP is responsible for the Agri-Food Pilot, the Rural and Northern Immigration Pilot, the Economic Mobility Pathway Project, and the Municipal Nominee Program.
  6. It was found that there were no formal departmental processes, procedures, or frameworks to guide the design and implementation of new pilot programs. Management and personnel relied on variable access to, and understanding of, institutional and individual knowledge and experience in lieu of formal procedures and processes. In the absence of departmental guidance, the IEPP Unit took the initiative to develop a PowerPoint deck, “The Design, Delivery and Evaluation of Pilots (Full Cycle)”, which describes the typical lifecycle of pilot programs. This document noted key steps in the pilot program process and the roles and responsibilities of key players (i.e. IB, IPG, and R&E). The document was approved internally by the DG, IPG in February 2021, but has yet to be shared with other stakeholders or approved and operationalized at the departmental level. If key steps in the development and implementation of pilot programs are not identified and applied, there is an increased risk that errors or omissions will occur in the process, that lessons learned will not be applied, and that potential process efficiencies will not be leveraged, ultimately affecting the success of the pilot programs and increasing costs to the Department.
  7. At the pilot program level, compliance procedures and program instructions were provided through PDIs and SOPs to those responsible for the operational implementation of the pilot programs. Based on interviews, they were clear and well understood and were updated regularly. Guidance was also provided to external partners, including PTs for the AIP as well as the National Angel Capital Organization and Canada’s Venture Capital and Private Equity Association for the SUV, in the discharge of their responsibilities.
  8. There were also clearly documented processes, procedures, and tools available on IRCC’s intranet that outlined how to process both permanent and temporary residence applications for the pilot programs. This included processes and procedures to assess compliance of applications with the programs’ terms and conditions and to ensure program integrity.

Risk Management

  1. Risk management facilitates decision-making that is informed by an understanding of risks. It allows for a proactive response to threats and opportunities alike. Sound risk management is about supporting strategic decision-making that contributes to the achievement of an organization’s overall objectives.
  2. A formal comprehensive approach to risk management will allow for the identification, assessment, and mitigation of pilot program risks in a systematic manner and support decision making and resource allocation. A fraud risk management process that includes reporting and monitoring on potential fraud risks also helps inform and support control activities needed to detect, prevent, and mitigate potential fraud risks specific to pilot programs.
  3. It was found that documented processes and procedures were in place to manage fraud and program integrity risk for the programs selected for audit, and were tailored to the specific needs of these pilot programs. For example, IB developed a roles and responsibilities document with IRCC’s Operations Sector to outline how each branch would be overseeing the integrity of the AIP program. Since June 2019, IB has been working with colleagues in the Operations Sector to establish key performance indicators (KPIs) related to program integrity and to collect data to get better insight into the integrity of the pilot. IB closely monitors program integrity cases raised by the provinces and IRCC operational colleagues to assess the extent to which the design of the pilot has inadvertently enabled misuse. In collaboration with IPG, IRM and the networks, IB’s role is to identify potential policy changes to strengthen program integrity, as well as apply lessons learned to other immigration programs.
  4. Additionally, IRM was proactive in working with pilot programs on the assessment of fraud and integrity-related risks, including the development of a standardized process to support this work. For example, IRM worked closely with the AIP during its transition to a permanent program to ensure risk mitigation measures were put in place. IRM chairs an Anti-Fraud Working Group (AFWG), in which provinces can raise, and have raised, program integrity issues involving the PNP. The AFWG reports up to the Economic Operations Working Group. It was found that both the AIP and the SUV had documented program integrity measures/frameworks in place.
  5. In some cases, instances of suspected or confirmed fraud do not pertain to IRCC clients. Since these individuals do not have any associated files in the GCMS system, they cannot be flagged. IRM noted that there is, nonetheless, a process in place for filing and disseminating these instances of suspected or confirmed fraud, thereby permitting IRCC officers to substantiate the claims and possibly link tips received to produce a more complete picture of the suspected or confirmed fraud and its prevalence in the immigration process.
  6. There were also communication channels in place (e.g. mailboxes, the DSC, the e-submission process for the provinces to encourage consistent and detailed submissions, working groups, program integrity training) for both internal and external key stakeholders to raise any fraud or program integrity related issues. Overall, these were effective and accessible as needed. However, with regard to the AIP, provincial representatives noted the need to improve the management and monitoring of the general IRCC mailbox, as they occasionally do not receive responses to their inquiries or lack sufficient information to adequately support decision making. It was noted that while provinces often provide IRCC with trends they are observing, such as potential instances of fraud or misrepresentation, there is no closed feedback loop. IRM advised that the e-submission process, which it recently implemented, aims to remedy this issue. This process allows provinces to submit their concerns in a standard manner and receive timely feedback.
  7. It should be noted that, often, the risks identified were not related to abuse or fraud that could affect the programs, but were instead attributable to operational risks (e.g. processing times, bottlenecks).
  8. It was found that there was no overarching guidance or direction to pilot programs on the requirements to establish a formal comprehensive risk management framework as part of the design or implementation stages of the pilot programs. Pilot programs were not developing risk management strategies to identify, assess and mitigate risks associated with programs in a systematic manner. Additionally, there was no formal requirement to develop human resource or financial plans, including the assessment and mitigation of human resource, financial, and operational risks, for pilot programs. Without formal guidance or requirements in place, it can be more difficult to accurately assess the operational impacts pilot programs have on IRCC’s core business. For example, interviewees noted that existing IRCC staff are often tasked to support pilot program implementation in addition to their existing responsibilities. Additionally, there was concern that unfunded pilot programs were diverting existing resources from the Department’s core business activities. Interviewees believed this resulted in increased employee workload and additional pressures on departmental operations.
  9. Generally, operational implementation representatives felt that they had adequate communication channels in place to identify and mitigate risks as well as to have questions around processes and procedures addressed. External stakeholders noted some opportunities for improvement to ensure ongoing support from IRCC, including providing clearer guidelines for de-designation and changes to an individual’s employment after endorsement as well as simplifying or reducing the amount of reporting required.
  10. Without a formal comprehensive approach to risk management, informed decision making and adequate resource allocation to support pilot program development and implementation would become more difficult.

Monitoring and Reporting

  1. A structured and integrated approach to managing investments relies on a systematic selection, prioritization, monitoring, and evaluation process. Aligning monitoring and reporting with the requirements of the Departmental Results Framework and the Benefits Framework will ensure that programs align with departmental objectives and support business needs while minimizing risk, maximizing anticipated project outcomes, and supporting the realization of benefits.
  2. Lessons learned are the documented information that reflects both the positive and negative experiences of a program. Documenting and acting on lessons learned will demonstrate the Department’s commitment to continuous improvement in management of pilot programs.
  3. It was found that R&E played an integral role in leading the assessment of the performance of pilot programs. For pilots included in the annual Departmental Evaluation Plan, the departmental evaluation function assessed and reported on the early performance, outcomes, and efficiency of individual pilot programs. Evaluation of pilot programs enables the Department to support early evidence-based decision-making regarding the direction of a pilot program and its potential transition into a permanent program. It also provides valuable insight and recommendations for strategic program changes based on lessons learned through delivery.
  4. With pilot programs lasting three to five years, there were challenges associated with pilots’ short timeframes. Normally, evaluation takes place three years into the pilot implementation for five-year pilots. The Department was limited in its ability to fully assess longer-term outcomes such as economic establishment and retention, as that information is not fully available until later in the pilot lifecycle. As such, certain decisions need to be made (such as whether the pilot should be terminated or mainstreamed) before the long-term results of a regional economic pilot can be known.
  5. Performance measurement strategies and frameworks, including PIPs, logic models, and data collection strategies, were developed for the pilot programs examined. The pilot programs relied on OPPB and the Chief Data Officer (CDO) Branch to support both internal and external performance reporting. However, it was noted that there was a heavier reliance on program evaluations rather than regular monitoring and reporting exercises for the SUV and Caregivers Pilots.
  6. Identifying data requirements during the design stage of a pilot is integral to ensuring that performance can be effectively measured. It was noted that engagement with CDO was inconsistent across the specific pilot programs examined and involved certain challenges. For example, CDO was consulted in the early stages of development of the AIP, which allowed the Branch to set up the necessary data collection mechanisms and reporting interfaces to ensure that relevant performance data was collected to support the measurement of pilot program performance. However, complexities in the program design of the TR to PR category of the Caregivers pilots meant that, although CDO was engaged early in the process, there have been ongoing reporting challenges.
  7. Overall, it was found that as the Department gained experience with the development and management of pilot programs, the ongoing monitoring, assessment and reporting practices have improved. It was also found that limited performance mechanisms were in place to assess the return on investment of pilot programs or other objectives such as enhanced partnerships and collaboration. For example, there was a need to improve data collection and enhance performance management for the SUV. The SUV performance measurement strategy was updated in response to a recommendation resulting from the Evaluation of the SUV pilot. Despite this, there are remaining challenges with the collection of data required to measure the success of the SUV, some of which are outside of the program’s area of control. The SUV program has indicators that are monitored as part of their federal economic PIPs, which are currently under review, however, the SUV has been working with existing indicators and data. Additionally, IRCC relies on private sector partners to help deliver the SUV program. IRCC communicates with industry associations to receive information on how the program is being implemented from their perspective and what they are hearing from the designated entities, but the designated entities are not required to report back. In transitioning the SUV pilot to a permanent program, the lessons learned from the pilot program indicated that the entire process and relationships with the organizations should be mapped. To date, those processes have not been mapped out.
  8. Recommendations and lessons learned specific to the AIP and SUV pilot programs were developed during the evaluations of these programs. A number of best practices identified for the AIP on how to design and implement a pilot program informed future pilots. For example, when designing the Rural and Northern Immigration Pilot, early lessons learned from the AIP were considered. There was, however, no formal process or tracking of lessons learned from all pilot programs in a holistic manner, including lessons learned related to the innovative practices of pilot programs or best practices in the effective design and implementation process of pilot programs.
  9. In the absence of a defined lessons learned process, the Department increases the risk of running into similar obstacles or missing opportunities to implement good processes and practices for new pilot programs and existing permanent programs.
  10. With respect to monitoring and reporting on the compliance with Section 14.1 of the IRPA, processes were in place to monitor and report on compliance with regards to the number of applications received and approved. All pilot programs selected for audit were in compliance with these requirements. Additionally, the pilots were based on the attainment of a defined economic goal, the classes of applicants being targeted were economic in nature, and the pilots did not or would not exceed a five-year lifespan.

4. Conclusion

  1. Overall, the audit found that there were no clearly defined guiding principles to better guide the development, oversight, management, evaluation, and transition of pilot programs.
  2. Due to the uniqueness of the objectives of each pilot program, pilots were managed independently, and existing departmental structures and processes were leveraged to support their development, implementation, and transition or termination. Although there were adequate fraud and program integrity risk management processes in place, there was no overarching guidance to create a formal risk management framework for pilot programs. The Department heavily relied upon the evaluation function to assess and report on the early performance results of pilot programs to support early decision-making regarding changes to a pilot program or the transition to a permanent program. However, there were limitations in assessing economic establishment and retention as the respective data is not available until years after the pilots have ended. Monitoring and reporting did not include Benefits Realization to inform further investment decisions.
  3. As demands on IRCC continue to grow, and there is more potential to use pilot programs, clearly defined guiding principles for the management and evaluation of pilot programs will support the holistic management and oversight of the Department’s pilot program portfolio and the broader achievement of departmental objectives. This will also provide direction and guidance for the development and implementation of individual pilot programs, with consideration of leading practices and lessons learned.

Management has accepted the audit findings and developed an action plan to address the recommendations.