Anonymized Recruitment Pilot Project ─ Final report
Please note that the Name-blind Recruitment Pilot Project is now being referred to as the Anonymized Recruitment Pilot Project. This name better reflects the methodology used for the pilot, which focussed on anonymizing applications by concealing names as well as other significant personal information.
Table of contents
- 1. Introduction
- 2. International context
- 3. Methodology
- 4. Analysis and findings
- 5. Operational challenges during NBR Pilot Project
- 6. Conclusion
- Bibliography
- Annex A – Safeguards for merit-based appointments
- Annex B - List of the 17 participating organizations and selected external processes
- Annex C - Operational process
- Annex D – Example of NBR and Traditional Applications from Public Service Recruitment System (PSRS)
- Annex E - Multivariate Regression Model Theory
- Annex F - P-Values from the Multivariate Regression Model
- Glossary
1. Introduction
The Public Service Commission (PSC), in collaboration with the Office of the Chief Human Resources Officer (OCHRO) at the Treasury Board of Canada Secretariat, has undertaken a Name-Blind Recruitment Pilot Project (NBR Pilot Project).
The PSC’s mandate is to promote and safeguard merit-based appointments and, in collaboration with other stakeholders, to protect the non-partisan nature of the public service. Its mandate also includes promoting a representative public service. As such, the PSC actively investigates new ways to improve recruitment processes and, over the years, has implemented a variety of tools to support bias-free recruitment (see Annex A for additional information).
The objective of the NBR Pilot Project was to determine whether concealing, from job applications, personal information that could lead to the identification of a candidate’s origin (the NBR assessment method) had an impact on the screening decisions made by reviewers, compared with the Traditional assessment method, where all personal information was presented.
2. International context
In recent years, a number of jurisdictions have, with mixed results, undertaken studies and initiatives to explore how blinding applications can influence hiring decisions. For example, in a European study (France, Germany, the Netherlands and Sweden), Krause et al. (2012) observed statistically significant differences between anonymized and traditional methods of screening candidates. They noted, however, that the direction of the difference (beneficial or detrimental) was inconsistent: anonymization showed beneficial effects in some processes and lowered the probability of being invited to an interview in others. The authors noted that it is possible that the benefits of anonymization depend on whether discrimination is present in the hiring process. They also noted that anonymization may negatively affect sub-groups of candidates by preventing the implementation of existing positive measures aimed at promoting greater diversity.
In a study undertaken by Oreopoulos (2011), randomly created résumés were sent by email in response to online job postings in Toronto and Montreal. Résumés were designed to represent typical Canadian immigrants from China, India, Pakistan and Greece, in addition to non-immigrants with and without ethnic-sounding names. Results showed that Canadian-born candidates with English-sounding names were more likely to be invited to an interview than Canadian immigrants with ethnic-sounding names.
In a separate international study of anonymous job applications of new Ph.D. economists, Krause et al. (2011) observed that women had a lower probability of being invited to an interview when their identity was concealed during the selection process. The authors suggested that anonymization of candidate information prevented the use of positive measures aimed at improving the representation of women.
A study conducted by the Australian Public Service observed results similar to those of Krause et al. (2011): de-identifying applications at the short-listing stage did not appear to assist in promoting diversity. In fact, when all candidates’ information was made available, reviewers discriminated in favour of female and visible minority candidates.
Results of these studies suggest that the benefits of NBR may be partly dependent on the organizational context, including whether discrimination is present in the hiring process and whether the organization currently has, and makes use of, policies aimed at improving diversity.
In October 2015, the UK Civil Service implemented name-blind recruitment. The objective was to help reduce the unconscious bias of recruiters and to ensure greater diversity among recruited candidates, including in their socio-economic backgrounds. Unfortunately, no systematic review of the impact of name-blind recruitment has been undertaken so far.
The paucity of published studies and research on name-blind recruitment, coupled with their mixed results, provides the impetus to explore what effect NBR could have in the Federal Public Service and which subgroups could potentially benefit from such an approach.
3. Methodology
Sample selection
Twenty-seven (27) external processes, launched between April and October 2017 across 17 participating organizations, were included in the NBR Pilot Project (see Annex B). The resulting sample comprises 2,226 candidates, including 685 members of visible minorities (footnote 1) (30.8%). Overall, 54 independent reviewers (2 reviewers per process) participated in the pilot, resulting in 4,452 independent screening decisions.
Given the small number of candidates who self-declared as members of Indigenous Peoples (73 candidates or 3%) or as persons with a disability (102 candidates or 5%), the analysis is restricted to visible minorities.
Since live external recruitment processes were used for the pilot project, processes were considered for inclusion as they were launched. A random sampling of processes would have required recruitment plans from participating organizations. As these were not available for all organizations, a non-random quota sampling methodology was employed to ensure a sufficiently broad representation of various occupational categories (footnote 2).
Distribution of Applications across Groups
External recruitment processes (footnote 3) were used to gather data and were conducted in real time, in collaboration with the 17 participating organizations. Twenty-seven processes were selected and advertised via the Public Service Recruitment System (PSRS), the platform where federal jobs are advertised either externally to all Canadians or internally to public servants only.
For each process, applications were randomly distributed into 4 groups (see Table 1), each representing 25% of the applications:
- Group 1: Applications assigned to this group were assessed by two reviewers under the Traditional assessment method;
- Group 2: Applications assigned to this group were assessed under the Traditional assessment method by Reviewer A and under the NBR assessment method by Reviewer B;
- Group 3: Applications assigned to this group were assessed under the NBR assessment method by Reviewer A and under the Traditional assessment method by Reviewer B;
- Group 4: Applications assigned to this group were assessed by both reviewers under the NBR assessment method.
Reviewers were asked to assess all applications independently and were instructed not to consult each other during their assessment (footnote 4). An illustrative sketch of this random assignment follows Table 1.
Table 1: Distribution of applications across groups and assessment methods

|  | Group 1 (25% of applications) | Group 2 (25% of applications) | Group 3 (25% of applications) | Group 4 (25% of applications) |
| --- | --- | --- | --- | --- |
| Reviewer A | Traditional | Traditional | NBR | NBR |
| Reviewer B | Traditional | NBR | Traditional | NBR |
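For illustration only, the following is a minimal sketch (not the tooling used in the pilot) of how applications could be dealt at random into the four groups described in Table 1. The application IDs and random seed are hypothetical.

```python
import random

# Reviewer A / Reviewer B assessment methods for each group (see Table 1).
GROUPS = {
    1: ("Traditional", "Traditional"),
    2: ("Traditional", "NBR"),
    3: ("NBR", "Traditional"),
    4: ("NBR", "NBR"),
}

def assign_groups(application_ids, seed=42):
    """Shuffle the applications and deal them into groups 1-4 in equal shares."""
    ids = list(application_ids)
    random.Random(seed).shuffle(ids)
    assignment = {}
    for i, app_id in enumerate(ids):
        group = (i % 4) + 1
        assignment[app_id] = {
            "group": group,
            "reviewer_a": GROUPS[group][0],
            "reviewer_b": GROUPS[group][1],
        }
    return assignment

# Hypothetical application IDs
print(assign_groups([f"C{n:06d}" for n in range(8)]))
```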
Redacting personal information
The anonymization process was divided into five phases (see Annex C).
- For each process, participating organizations sent applications to the PSC.
- The PSC then randomly distributed selected applications across groups as described above.
- Applications requiring anonymization were assigned to trained anonymizers for redaction.
- Redacted applications were subsequently quality controlled via a second trained anonymizer.
- Applications were returned to the organizations for assessment by reviewers.
For the NBR assessment method, the following information was redacted from job applications:
- last name, first name, initials and any other references to the candidate’s name
- citizenship and country of origin
- mailing address(es) and telephone number(s)
- educational institutions
- any references to organizations, businesses and establishments where general training and professional experience were acquired
- languages spoken and written
- any references to geographical locations, other than those related to a professional association
- any references that may indicate the candidate is a member of an employment equity group, other than the female gender (footnote 5)
- any references to religion
- any references to publications (university or other)
Although care was taken to redact personal information, in some instances this was not possible because the information to be redacted was central to the assessment of an essential qualification. Conversely, additional information may have been redacted to prevent the identification of a candidate’s origin.
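For illustration only, the sketch below is a schematic of the redaction rules listed above. In the pilot, redaction was performed manually by trained anonymizers, with a second anonymizer providing quality control; the field names and mask used here are hypothetical.

```python
# Schematic illustration of masking the categories of fields listed above.
# Field names and the mask string are hypothetical, not the pilot's format.
MASK = "xxxxxxxxxxxxx"

FIELDS_TO_REDACT = {
    "last_name", "first_name", "citizenship", "address", "telephone", "email",
    "educational_institution", "employer", "languages", "publications",
}

def redact_application(application: dict) -> dict:
    """Return a copy of the application with identifying fields masked."""
    return {
        field: (MASK if field in FIELDS_TO_REDACT else value)
        for field, value in application.items()
    }

sample = {
    "first_name": "Jane",
    "last_name": "Doe",
    "employer": "Money Bank",
    "experience": "Financial Clerk, 2013 to 2017",
}
print(redact_application(sample))
```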
4. Analysis and findings
As mentioned in the section above, the sample consisted of 2,226 applications from 27 different external recruitment processes. This section presents the distribution of the candidates according to some of their characteristics.
Sample characteristics
The following tables present a breakdown of the sample along key characteristics. As per Table 2, 30.8% of the NBR Pilot Project candidates are members of visible minorities. Although the pilot project included only external recruitment processes, 47.3% of candidates indicated either having experience in, or currently being employed by, the Federal Public Service.
Table 2: Sample characteristics

| Variable | Value | Percentage of Candidates |
| --- | --- | --- |
| Visible minorities | Yes | 30.8% |
| Visible minorities | No | 69.2% |
| Experience in Federal Public Service | Yes | 47.3% |
| Experience in Federal Public Service | No | 52.7% |
As indicated in the methodology section, the sampling strategy sought to include processes from all occupational categories, with the exception of the Executive category.
Table 3 shows how the distribution of candidates varied across occupational categories, with the Technical category having the fewest applications and the Administrative Support and Operational category (footnote 6) having the most.
Across occupational categories, representation rates of visible minorities varied between 22.8% in the Technical Category and 39.5% in the Scientific and Professional Category (footnote 7). In addition, the representation of visible minorities also varied from one process to the next, ranging from 3% to 60%.
Table 3: Distribution of processes and candidates by occupational category

|  | Scientific and Professional | Administrative and Foreign Service | Technical | Administrative Support and Operational | Total |
| --- | --- | --- | --- | --- | --- |
| Number of processes | 6 | 7 | 5 | 9 | 27 |
| Number of candidates | 600 | 594 | 307 | 725 | 2,226 |
| Number of visible minorities | 237 (39.5%) | 201 (33.8%) | 70 (22.8%) | 177 (24.4%) | 685 (30.8%) |
Analysis of Assessment Decisions
A number of factors other than visible minority status and assessment method (NBR versus Traditional) could influence screen-in rates. For example, when combining all candidates, the overall screen-in rate was 46%. However, screen-in rates differed by process, ranging from 11% to 93%. Factors such as occupational category and experience in the Federal Public Service could also independently influence candidate screen-in rates.
A multivariate analysis was undertaken to assess the effect of various factors on the candidates’ screen-in rates (See Annex E). Given the dichotomous nature of screening decisions (in, out), a logit model was developed to determine what variables influenced screen-in rates.
As a result, the model included the following variables:
- assessment methods (NBR and Traditional);
- visible minority status;
- occupational category;
- experience in the Federal Public Service;
- number of essential qualifications;
- order in which applications are reviewed.
Controlling for these factors using a logit model permitted the isolation of the impact of NBR on screen-in rates. Depending on the analysis, the interaction term of the model was adjusted and the statistical significance was established at p ≤ .05 (See Annex F).
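For illustration only, a minimal sketch of how such a logit model could be specified with Python's statsmodels package is shown below. The data file, column names and the exact interaction structure are hypothetical; the report does not publish its dataset or model code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per screening decision, with columns such as
# screened_in (0/1), method ("Traditional"/"NBR"), vis_min (0/1),
# occ_category, fps_experience (0/1), n_essential_quals and review_order.
df = pd.read_csv("screening_decisions.csv")  # hypothetical file name

# Logit specification mirroring the variables listed above; the interaction
# term shown (method x visible minority status) is one plausible choice.
model = smf.logit(
    "screened_in ~ C(method) * C(vis_min) + C(occ_category)"
    " + C(fps_experience) + n_essential_quals + review_order",
    data=df,
)
result = model.fit()
print(result.summary())            # coefficients and p-values
print(result.predict(df).mean())   # average predicted screen-in rate
```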
Results (footnote 8)
The following section presents screen-in rates generated by the logit model.
As can be observed in Table 4, results show that the NBR assessment method significantly reduces the screen-in rate of candidates.
Table 4: Screen-in rates by assessment method

| Assessment method | Screen-in rate |
| --- | --- |
| Traditional | 48.3% |
| Name-Blind Recruitment | 43.3%† |
† The difference between the assessment methods is statistically significant
Assessment method and visible minority status
Table 5 compares the screen-in rates of visible minorities and all other candidates under the Traditional and NBR assessment methods. For visible minorities, the assessment method had no significant effect on screening decisions. However, for all other candidates, a significant reduction in screen-in rates was observed with the NBR assessment method. There were no significant differences between visible minorities and all other candidates when candidates were compared under the same method.
Table 5: Screen-in rates by assessment method and visible minority status

|  | Visible Minorities | All Other Candidates |
| --- | --- | --- |
| Traditional | 47.3% | 48.7% |
| Name-Blind Recruitment | 46.0% | 42.0%† |
† The difference between the assessment methods is statistically significant
Assessment method and occupational categories
Within occupational categories, only the Administrative Support Category showed a significant (negative) effect of NBR on the screen-in rate of candidates. Although no other significant differences were observed, the consistent trend towards lower screen-in rates across all other occupational categories is suggestive of a potential generalized reduction in screen-in rates when the NBR assessment method was applied (see Table 6).
Table 6: Screen-in rates by assessment method and occupational category

|  | Scientific and Professional | Administrative and Foreign Service | Technical | Administrative Support | Operational |
| --- | --- | --- | --- | --- | --- |
| Traditional | 47.7% | 37.6% | 62.6% | 50.0% | 46.3% |
| Name-Blind Recruitment | 42.3% | 36.5% | 55.3% | 42.9%† | 36.0% |
† The difference between the assessment methods is statistically significant
Assessment method and experience in the Federal Public Service
Table 7 compares the screen-in rates of candidates with Federal Public Service experience to those of candidates without such experience. This is an important factor, since candidates with Federal Public Service experience may have gained inside knowledge and experience that help them when applying for public service positions.
As Table 7 demonstrates, having experience in the Federal Public Service had a positive effect on a candidate’s screen-in rate in both assessment methods. The NBR assessment method significantly reduced screen-in rates of candidates whether they had Federal Public Service experience or not.
Table 7: Screen-in rates by assessment method and Federal Public Service experience

|  | Experience in Federal Public Service | No Experience in Federal Public Service |
| --- | --- | --- |
| Traditional | 54.8%* | 43.2% |
| Name-Blind Recruitment | 48.7%†* | 39.1%† |
† The difference between the assessment methods is statistically significant
* The difference within the same screening methods is statistically significant
Visible minority status and occupational categories
Table 8 presents the screen-in rates of candidates according to their visible minority status across occupational categories. As Table 8 illustrates, there were no significant differences in the screen-in rates between visible minorities and all other candidates across occupational categories (footnote 9).
Table 8: Screen-in rates by visible minority status and occupational category

|  | Scientific and Professional | Administrative and Foreign Service | Technical | Administrative Support | Operational |
| --- | --- | --- | --- | --- | --- |
| Visible Minorities | 47.9% | 38.2% | 61.5% | 50.3% | N.A. |
| All Other Candidates | 45.8% | 38.7% | 60.0% | 47.0% | N.A. |
No differences by visible minority status in this table are statistically significant
Visible minority status and Federal Public Service experience
Given the main effect of having experience in the Federal Public Service, an analysis was undertaken to assess for potential interaction between this variable and visible minority status. As can be seen in Table 9, the effect of having experience in the Federal Public Service is essentially the same for both visible minorities and all other candidates.
Table 9: Screen-in rates by visible minority status and Federal Public Service experience

|  | Experience in Federal Public Service | No Experience in Federal Public Service |
| --- | --- | --- |
| Visible Minorities | 52.3%* | 41.6% |
| All Other Candidates | 50.9%* | 40.3% |
* The difference between candidates with and without Federal Public Service experience is statistically significant for both visible minorities and all other candidates
5. Operational challenges during NBR Pilot Project
However carefully designed, the NBR Pilot Project was subject to certain limitations:
- Reviewers were aware they were participating in the NBR Pilot Project; this awareness could have potentially affected their assessment.
- The process of anonymizing a candidate’s information proved to be very labour-intensive. Depending on the complexity of the application and the amount of information provided by the candidate, anonymizing each application took between 15 and 20 minutes. Given the high number of applicants to external recruitment processes in the Federal Public Service, full implementation of name-blind recruitment would require that new methods and technologies be explored to reduce the operational burden and mitigate any negative impact on the overall time to staff a position. For more information regarding the operational process, see Annex C.
- The name-blind method itself can introduce some limitations. Even though each redacted application was subject to quality control, some identifying information may have been inadvertently left unconcealed, possibly revealing information about the candidate.
- It was not always possible to redact personal information that could identify a candidate’s origin without incurring the risk of removing tangible skill-related information that might have affected the screening decisions of reviewers.
- Processes with fewer than 50 applications were not included in the pilot project. This decision was made given the workload associated with reaching the total sample size of applications required for the project. Processes with a total number of candidates ranging from 50 to 100 were sought.
6. Conclusion
The objective of the NBR Pilot Project was to determine whether concealing, from job applications, personal information that could lead to the identification of a candidate’s origin (the NBR assessment method) had an impact on the screening decisions made by reviewers, compared with the Traditional assessment method, where all personal information was presented.
In summary, results of the pilot project indicate that the NBR assessment method decreases candidate screen-in rates in external recruitment processes when compared to the Traditional assessment method, where all information is available for review. When the effect of NBR is compared across visible minority status, results indicate that, although there is no net benefit or disadvantage with the NBR assessment method for visible minorities, NBR significantly reduces the rate of being screened in for all other candidates.
When the impact of NBR was assessed across occupational categories, a significant reduction was observed only in screen-in rates for the Administrative Support category. However, consistently lower (yet not significant) screen-in rates were noted in the NBR method for all other occupational categories. This is suggestive of a potential generalized reduction in screen-in rates when NBR is applied. Moreover, no significant difference was noticed in the overall screen-in rate of visible minorities across occupational categories.
Results also showed that candidates with Federal Public Service experience had higher screen-in rates than candidates without such experience, and that this effect existed for both visible minorities and all other candidates across assessment methods (Traditional or NBR). Although this effect was not in scope for this pilot project, the fact that 47% of applicants to an external process had Federal Public Service experience, and that this experience significantly improved their odds of being screened in, should be explored further in light of current Federal Public Service renewal efforts.
It is important to note that the pilot project relied on volunteer organizations and a non-random selection of external recruitment processes. Such limitations are common in name-blind recruitment studies and need to be taken into account when discussing the implications of the results. As such, although the pilot project provided valuable insight into the potential impact of using the NBR assessment method in the Federal Public Service, these results cannot be generalized to the whole of the public service or used to explore public policy options for system-wide implementation.
Given the above, it is essential that the pilot project be considered as one source of information, to be combined with other sources, in order to provide a broader understanding of the impact of NBR in the Federal Public Service. Possible next steps include examining how audits or studies can be leveraged to improve the understanding of any potential bias during the selection of candidates.
One key advantage of the audit methodology is that it mitigates the potential effect of hiring managers being aware of their participation in the pilot project on screening decisions. An audit would not be subject to this limitation, as it would assess the drop-off rates of employment equity groups for appointments that had already occurred, for which hiring managers made their decisions unaware that those decisions would later be reviewed.
Should such further work identify specific circumstances in which NBR could prove beneficial, solutions would need to be explored to alleviate any additional operational burden associated with the process of anonymizing job applications. The PSC is currently looking at modernizing the Government of Canada’s recruitment platform and exploring how technology could incorporate name-blind principles in its design, should the need arise.
Bibliography
Aslund, Olof, and Oskar Nordström Skans. 2007. Do anonymous job application procedures level the playing field? Working paper, Uppsala: Institute for Labour Market Policy Evaluation.
Banerjee, Rupa, Jeffrey G. Reitz, and Phil Oreopoulos. 2017. Do large employers treat racial minorities more fairly? A new analysis of Canadian field experiment data. Research report, Toronto: University of Toronto.
Behaghel, Luc, Bruno Crépon, and Thomas Le Barbanchon. 2014. Unintended effects of anonymous résumés. Bonn: IZA Discussion Paper. Accessed November 2017. http://ftp.iza.org/dp8517.pdf.
Behaghel, Luc, Bruno Crépon, and Thomas Le Barbanchon. 2011. "Evaluation of the impact of anonymous CVs." Paris.
Bog, Martin, and Erik Kranendonk. 2011. "Labour market discrimination of minorities? Yes, but not in job offers." Munich Personal RePEc Archive.
Eid, Paul, Meisson Azzaria, and Marion Quért. 2012. Mesurer la discrimination à l'embauche subie par les minorités racisées : Résultats d'un ''Testing" mené dans le grand Montréal. Montréal: Commission des droits de la personne et des droits de la jeunesse Québec.
Hiscox, Michael J., Tara Oliver, Michael Ridgway, Arcos-Holzinger, Alastair Warren, and Andrea Willis. 2017. Going blind to see more clearly: unconscious bias in Australian Public Service (APS) shortlisting process. Behavioural Economics Team of the Australian Government (BETA).
Joseph, J. 2016. What companies use blind/anonymous resumes and what benefits have they reported? Accessed November 2017. http://digitalcommons.ilr.cornell.edu/student/102.
Krause, Annabelle, Ulf Rinne, and Klaus F. Zimmermann. 2011. A little less discrimination? Anonymous job applications of fresh Ph.D. economists. Discussion paper, Bonn: Institute for the Study of Labor.
Krause, Annabelle, Ulf Rinne, and Klaus F. Zimmermann. 2012. "Anonymous job applications in Europe." IZA Journal of European Labor Studies.
Mauer, R. 2016. Blind hiring may be missing the point. Accessed November 2017. www.shrm.org/resourcesandtools/hr-topics/talent-acquisition/pages/blind-hiring-practices.aspx.
Oreopoulos, Philip. 2011. "Why do skilled immigrants struggle in the labor market? A field experiment with thirteen thousand resumes." American Economic Journal: Economic Policy 3: 148-171.
Rinne, U. 2014. Anonymous job applications and hiring discrimination. IZA World of Labor. Accessed November 2017. doi:10.15185/izawol.48.
Annex A – Safeguards for merit-based appointments
The PSC supports merit-based, bias-free appointments through a variety of safeguards, including:

- informing hiring managers of their staffing responsibilities: prior to exercising appointment authorities, managers must sign an attestation form which includes a requirement to ensure “the assessment is conducted in good faith, free from bias and personal favouritism, and in a manner that is supportive of an individual’s right to accommodation”;
- a recruitment system with features that allow automation of the screening process to reduce subjectivity:
  - screening questions related to qualifications or conditions of employment;
  - machine-scored tests, such as Unsupervised Internet Tests (UITs), that can also be used to screen in candidates;
  - random selection of candidates for referral;
- standardized assessments that are machine scored;
- promotion of bias-free assessment via universal test design;
- involving employment equity groups in developing and piloting assessment methods;
- the option to restrict recruitment to members of employment equity groups to improve their representation;
- offering in-depth training to assessors to prevent bias and promoting best practices in assessment;
- guidance and web-based resources on fair assessment practices and other tools to promote bias-free assessment of candidates;
- monitoring the performance of PSC standardized tests to ensure they do not pose an unfair barrier to any of the designated employment equity groups.
Annex B - List of the 17 participating organizations and selected external processes
Organization | Acronym |
---|---|
Agriculture and Agri-Food Canada | AAFC |
Employment and Social Development Canada | ESDC |
Environment and Climate Change Canada | ECCC |
Fisheries and Oceans Canada | DFO |
Global Affairs Canada | GAC |
Immigration, Refugees and Citizenship Canada | IRCC |
Indigenous and Northern Affairs Canada | INAC |
Infrastructure Canada | INFC |
National Defence | DND |
Natural Resources Canada | NRCan |
Parole Board of Canada | PBC |
Public Service Commission | PSC |
Public Services and Procurement Canada | PSPC |
Royal Canadian Mounted Police | RCMP |
Statistics Canada | StatCan |
Transport Canada | TC |
Treasury Board of Canada Secretariat | TBS |
Scientific and Professional

Classification | Number of candidates |
---|---|
BI-2 | 85 |
PC-2 | 100 |
EN-ING 4 | 53 |
EC-3 | 100 |
EC-5 | 135 |
EC-5 | 127 |
Administrative and Foreign Service

Classification | Number of candidates |
---|---|
AS-1 | 102 |
AS-1 | 59 |
AS-2 | 100 |
AS-2 | 69 |
AS-3 | 100 |
AS-6 | 64 |
PE-2 | 100 |
Technical

Classification | Number of candidates |
---|---|
EG-5 | 57 |
EG-4 | 104 |
EG-4 | 64 |
G-5 | 33 |
TI-7 | 49 |
Administrative Support

Classification | Number of candidates |
---|---|
CR-4 | 100 |
CR-4 | 100 |
CR-4 | 82 |
CR-4 | 100 |
CR-4 | 55 |
CR-5 | 50 |
CR-5 | 84 |
CR-5 | 67 |
Operational

Classification | Number of candidates |
---|---|
GL-ELE-3 | 87 |
Annex C - Operational process
The following outlines the major steps and activities associated with the Name-Blind Recruitment Pilot Project:
| Step | Main activities |
| --- | --- |
| 1 | Representatives of participating organizations submit their staffing plans for the targeted period |
| 2 | The Name-Blind Recruitment Pilot Project (the Project) team contacts human resources advisors for details regarding the processes that will be part of the pilot, including posting dates |
| 3 | As the closing date for each process draws near, the Project team organizes an information session with human resources advisors and assistants of the organization in question to explain the Project |
| 4 | When each process has closed, human resources sends the Project team copies of the PDFs containing the applications selected for the Project. If the number of candidates is less than 100, all applications will be used; if the number is greater than 100, a random sample of 100 applications will be used |
| 5 | Once the applications have been received by the Project team, the anonymization process can begin |
| 6 | Once the applications for each process are anonymized and verified, the applications are printed and a package is prepared. Each package contains the following information: |
| 7 | The package is then sent to the reviewers, who will screen the applications for the process under review: |
| 8 | Once the reviewers have been briefed, they can independently begin screening the applications for the process under review; the initial results of the screening exercise are then sent to the human resources advisors |
| 9 | The human resources advisor records the initial decisions made by each manager assigned to the process being screened in an Excel file sent by the Public Service Commission. In cases where the screening decisions differ, the reviewers meet to discuss the applications in question to achieve a consensus. At this stage, reviewers have been provided with the “traditional” applications that contain all identifiable information. When the final decision is made, reviewers transmit the results to the human resources advisor, who updates the Excel file that is then returned to the Public Service Commission for analysis |
Annex D – Example of NBR and Traditional Applications from Public Service Recruitment System (PSRS)
Partial Example of a Traditional Application
(For illustration purposes the example has been shortened to two pages.)
Jane Doe (C001994)
Personal Information
PSRS no: C001994
Last Name: Doe
First Name: Jane
Date available: 2017-12-06
Citizenship: Canadian Citizen

Permanent home address

Address: 10 Nowhere St. Ottawa Ontario Canada A1A 1A1
Email:
Telephone: 123-456-7890
Languages
Working Ability: French: Advanced; English: Advanced
First Official Language: English
Written Exam: English
Correspondence: English
Interview: English
Screening Questions
Questions: Have you successfully completed two years of secondary school? Or Do you have an acceptable combination of education, training and/or experience? Or Do you have a satisfactory score on the Public Service Commission test?

Candidate Answer: Yes

Complementary Question: If you answer Yes to one of the three questions, please identify when and where you completed the two years of secondary school. If you have not completed the two years of secondary school, please describe how your education, training and/or experience could be considered as an acceptable combination to two years of secondary school? If you have a satisfactory score on the Public Service Commission test, please provide when and where this was acquired.

Complementary Answer: I obtained my secondary school diploma in 2007 from Eastern Ontario High School located in Ottawa, ON. Following the completion of my secondary school education, I achieved a diploma in Business Administration in Eastern Ontario University Ottawa, ON, attending from 2007 to 2010.
Essential Qualifications
Question: Do you have experience in the use of Microsoft Office, specifically Microsoft Word and Microsoft Outlook?

Candidate Answer: Yes

Complementary Question: *If you answer Yes, using concrete examples, describe where, when and how you acquired this experience.

Complementary Answer: I have developed a proficiency with Microsoft Word through home, business, and academic use for various projects since 2002. It is a program I used on a daily basis in my role as a Customer Service Representative with AirSupply Company to compose correspondence to clients. I continue to use it on a daily basis now in my role of Financial Advisor with Money Bank. My position of Financial Advisor with Money Bank requires that I use Microsoft Outlook on a daily basis to ensure that I am always reachable for any queries or requests that co-workers may have for me. I use this program for daily correspondence, agenda management and for scheduling future activities and tasks that require my attention.
Résumé
Jane Doe
10 Nowhere St. ON A1A1A1
jane.doe@email.com
123-456-7890
Linguistic profile: English, French and Spanish
Education
- High School diploma Eastern Ontario High School 2007
- Bachelor of Business Administration Eastern Ontario University 2010
Experience
Financial Clerk – Money Bank – 2013 to 2017
- Verification of financial data to maintain accurate client records and monitor for any discrepancies.
- Ensure all documents are properly signed and distributed for accurate processing.
- Monitor and follow up with financial transactions and client accounts.
- Communicate effectively with all clients in response to questions or concerns.
- Answer and direct incoming calls and e-mails in a prompt, professional and courteous manner.
- Data entry and maintenance of updated documents and reports.
Customer Service Representative – AirSupply Company (Private Sector) – 2008 to 2013
- Experience in receiving, screening and re-routing telephone calls including contacting visitors, responding directly to routine inquiries.
- Experience in interpreting and applying administrative policies and procedures.
- Experience in producing quotes for Little Angels Children’s Hospital (LACH) and other hospitals for medical supplies as well as producing quotes and receipts for clients and insurance companies.
- Experience in placing orders for customers based on their purchase history.
- Experience in maintaining Branch’s BF (Bring Forward) and filing system.
Community and extracurricular activities
Eastern Ontario Church Summer Day Camp
- Volunteer at the church Sunday program working with children of different ages in the supervision of indoor and outdoor activities.
- Ensured an engaging, safe and clean environment at all times.
- Provided leadership and mentoring to camp counselors-in-training.
Achievements and interests
- Dean’s honor list (2010) Eastern Ontario University
- A average for 3 years at Eastern Ontario University
- Earned over 100 hours of community service
- Assisted in the creation of a fundraiser for a local animal shelter
*References available upon request
Partial Example of a Name-Blind Application
xxxxxxxxxxxxxxxxx (C001994)
Personal Information
PSRS no: C001994
Last Name: xxxxxxxxxxxxx
First Name: xxxxxxxxxxxxx
Date available: 2017-12-06
Citizenship: xxxxxxxxxxxxx

Permanent home address

Address: xxxxxxxxxxxxx
Email: xxxxxxxxxxxxx
Telephone: xxxxxxxxxxxxx
Languages
Working Ability: xxxxxxxxxxxxx
First Official Language: xxxxxxxxxxxxx
Written Exam: xxxxxxxxxxxxx
Correspondence: xxxxxxxxxxxxx
Interview: xxxxxxxxxxxxx
Screening Questions
Questions: Have you successfully completed two years of secondary school? Or Do you have an acceptable combination of education, training and/or experience? Or Do you have a satisfactory score on the Public Service Commission test?

Candidate Answer: Yes

Complementary Question: If you answer Yes to one of the three questions, please identify when and where you completed the two years of secondary school. If you have not completed the two years of secondary school, please describe how your education, training and/or experience could be considered as an acceptable combination to two years of secondary school? If you have a satisfactory score on the Public Service Commission test, please provide when and where this was acquired.

Complementary Answer: I obtained my secondary school diploma in 2007 from xxxxxxxxxxxxx located in xxxxxxxxxxxxx. Following the completion of my secondary school education, I achieved a diploma in Business Administration in xxxxxxxxxxxxx, attending from 2007 to 2010.
Essential Qualifications
Question: Do you have experience in the use of Microsoft Office, specifically Microsoft Word and Microsoft Outlook?

Candidate Answer: Yes

Complementary Question: *If you answer Yes, using concrete examples, describe where, when and how you acquired this experience.

Complementary Answer: I have developed a proficiency with Microsoft Word through home, business, and academic use for various projects since 2002. It is a program I used on a daily basis in my role as a Customer Service Representative with xxxxxxxxxxxxx Company to compose correspondence to clients. I continue to use it on a daily basis now in my role of Financial Advisor with xxxxxxxxxxx Bank. My position of Financial Advisor with xxxxxxxxxxxxx Bank requires that I use Microsoft Outlook on a daily basis to ensure that I am always reachable for any queries or requests that co-workers may have for me. I use this program for daily correspondence, agenda management and for scheduling future activities and tasks that require my attention.
Résumé
xxxxxxxxxxxxx
xxxxxxxxxxxxx
xxxxxxxxxxxxx
xxxxxxxxxxxxx
Linguistic profile: xxxxxxxxxxxxx
Education
- High School diploma xxxxxxxxxxxxx 2007
- Bachelor of Business Administration xxxxxxxxxxxxx 2010
Experience
Financial Clerk – xxxxxxxxxxxxx Bank – 2013 to 2017
- Verification of financial data to maintain accurate client records and monitor for any discrepancies.
- Ensure all documents are properly signed and distributed for accurate processing.
- Monitor and follow up with financial transactions and client accounts.
- Communicate effectively with all clients in response to questions or concerns.
- Answer and direct incoming calls and e-mails in a prompt, professional and courteous manner.
- Data entry and maintenance of updated documents and reports.
Customer Service Representative – xxxxxxxxxxxxx Company (Private Sector) – 2008 to 2013
- Experience in receiving, screening and re-routing telephone calls including contacting visitors, responding directly to routine inquiries.
- Experience in interpreting and applying administrative policies and procedures.
- Experience in producing quotes for xxxxxxxxxxxxx Hospital (xxxx) and other hospitals for medical supplies as well as producing quotes and receipts for clients and insurance companies.
- Experience in placing orders for customers based on their purchase history.
- Experience in maintaining Branch’s BF (Bring Forward) and filing system.
Community and extracurricular activities
xxxxxxxxxxxxx Summer Day Camp
- Volunteer at the xxxxxxx Sunday program working with children of different ages in the supervision of indoor and outdoor activities.
- Ensured an engaging, safe and clean environment at all times.
- Provided leadership and mentoring to camp counselors-in-training.
Achievements and interests
- Dean’s honor list (2010) xxxxxxxxxxxxxxxxxxxxxxx
- A average for 3 years at xxxxxxxxxxxxxxxxxxxxxxxx
- Earned over 100 hours of community service
- Assisted in the creation of a fundraiser for a local animal shelter
*References available upon request
Annex E - Multivariate Regression Model Theory
Multivariate Regression Model
Given the dichotomous nature of the screening decision (in, out), a logit model is used to control for the impact of auxiliary factors when measuring the screen-in rates of applications and, therefore, to assess the impact of the name-blind assessment method. The screen-in rates are expressed by the following equation:

$$\Pr(Y=1 \mid X) = \frac{e^{\alpha + \beta_1 X + \beta_2 WZ + \varepsilon_i}}{1 + e^{\alpha + \beta_1 X + \beta_2 WZ + \varepsilon_i}}$$

where Pr(Y = 1 | X) represents the screen-in rate, X represents the interaction between the observed variables, WZ represents other factors controlled by the model, α is the constant, β1 and β2 are the regression coefficients representing the rate of change of the dependent variable as a function of changes in the independent variables, and εi is the random error component.
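As a purely numerical illustration of the logit link above, the following sketch computes a predicted screen-in probability from made-up coefficient values; none of the values shown correspond to the pilot’s estimates, and the error term is omitted for prediction.

```python
import math

def screen_in_probability(alpha, beta1, x, beta2, wz):
    """Logistic transform: Pr(Y=1|X) = exp(eta) / (1 + exp(eta)),
    where eta = alpha + beta1*x + beta2*wz (error term omitted)."""
    eta = alpha + beta1 * x + beta2 * wz
    return math.exp(eta) / (1.0 + math.exp(eta))

# Made-up values: a negative beta1 on an NBR indicator lowers the predicted rate.
print(round(screen_in_probability(-0.1, -0.2, 1, 0.4, 0.5), 3))  # NBR (x = 1)
print(round(screen_in_probability(-0.1, -0.2, 0, 0.4, 0.5), 3))  # Traditional (x = 0)
```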
Annex F - P-Values from the Multivariate Regression Model
Value 1 | Value 2 | P-Value |
---|---|---|
Candidates assessed under traditional method | Candidates assessed under NBR method | 0.0015* |
Value 1 | Value 2 | P-Value |
---|---|---|
Non-Members of Visible Minorities assessed under traditional method | Members of Visible Minorities assessed under traditional method | 0.531 |
Non-Members of Visible Minorities assessed under traditional method | Non-Members of Visible Minorities assessed under NBR method | 0.0003* |
Members of Visible Minorities assessed under traditional method | Members of Visible Minorities assessed under NBR method | 0.6223 |
Non-Members of Visible Minorities assessed under NBR method | Members of Visible Minorities assessed under NBR method | 0.0662 |
Value 1 | Value 2 | P-Value |
---|---|---|
Scientific and Professional assessed under traditional method | Scientific and Professional assessed under NBR method | 0.0591 |
Administrative and Foreign Service assessed under traditional method | Administrative and Foreign Service assessed under NBR method | 0.6847 |
Technical assessed under traditional method | Technical assessed under NBR method | 0.0743 |
Administrative Support assessed under traditional method | Administrative Support assessed under NBR method | 0.01* |
Operational assessed under traditional method | Operational assessed under NBR method | 0.1689 |
Value 1 | Value 2 | P-Value |
---|---|---|
Experience in Federal Public Service assessed under traditional method | Experience in Federal Public Service assessed under NBR method | 0.006* |
Experience in Federal Public Service assessed under NBR method | No Experience in Federal Public Service assessed under NBR method | <.0001* |
No Experience in Federal Public Service assessed under traditional method | No Experience in Federal Public Service assessed under NBR method | 0.0465* |
No Experience in Federal Public Service assessed under traditional method | Experience in Federal Public Service assessed under traditional method | <.0001* |
Value 1 | Value 2 | P-Value |
---|---|---|
Scientific and Professional and Members of Visible Minorities | Scientific and Professional and non-Members of Visible Minorities | 0.4722 |
Administrative and Foreign Service and Members of Visible Minorities | Administrative and Foreign Service and non-Members of Visible Minorities | 0.8409 |
Technical assessed and Members of Visible Minorities | Technical and non-Members of Visible Minorities | 0.7309 |
Administrative Support and Members of Visible Minorities | Administrative Support and non-Members of Visible Minorities | 0.2683 |
Operational assessed and Members of Visible Minorities | Operational and non-Members of Visible Minorities | N/A |
Value 1 | Value 2 | P-Value |
---|---|---|
Non-Members of Visible Minorities and Experience in Federal Public Service | Non-Members of Visible Minorities and no Experience in Federal Public Service | <.0001* |
Non-Members of Visible Minorities and Experience in Federal Public Service | Members of Visible Minorities and Experience in Federal Public Service | 0.5234 |
Members of Visible Minorities and no Experience in Federal Public Service | Non-Members of Visible Minorities and no Experience in Federal Public Service | 0.5606 |
Members of Visible Minorities and no Experience in Federal Public Service | Members of Visible Minorities and Experience in Federal Public Service | <.0001* |
*statistically significant at the 0.05 level
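For illustration only, the sketch below shows one way pairwise p-values of this kind could be obtained: a Wald-type test on a contrast from a fitted logit model in statsmodels. The data are synthetic and the variable names hypothetical; this is not the PSC’s analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic screening decisions with a small negative NBR effect built in.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "method": rng.choice(["Traditional", "NBR"], size=n),
    "vis_min": rng.choice([0, 1], size=n),
})
eta = -0.1 - 0.2 * (df["method"] == "NBR")
df["screened_in"] = rng.binomial(1, 1 / (1 + np.exp(-eta)))

result = smf.logit("screened_in ~ C(method) * C(vis_min)", data=df).fit(disp=0)

# Wald-type contrast: Traditional vs NBR for the reference group (vis_min = 0).
contrast = result.t_test("C(method)[T.Traditional] = 0")
print(contrast.pvalue)
```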
Glossary
Aboriginal peoples
As defined in the Employment Equity Act, persons who are Indians, Inuit or Métis.
Inclusive workplace
An inclusive workplace is fair, equitable, supportive, welcoming and respectful. It recognizes, values and leverages differences in identities, abilities, backgrounds, cultures, skills, experiences and perspectives that support and reinforce Canada’s evolving human rights framework.
Public Service Commission (PSC)
The PSC is a federal institution that is part of the Public Services and Procurement Canada portfolio and reports independently to Parliament on its mandate.
Variable
A characteristic that may assume more than one set of values to which a numerical measure can be assigned.