Audit of human resource data integrity, 2011

Internal Audit Report

378-1-254

May 31, 2011

Table of Contents

EXECUTIVE SUMMARY

Background

The Audit of Human Resource Data Integrity has been conducted as part of Correctional Service Canada’s (CSC) Internal Audit Branch (IAB) 2009-2012 revised Risk-based Audit Plan. CSC identified the area as one requiring examination due to concerns expressed by senior staff relating to the quality of data used to support human resource (HR) planning, decision-making and monitoring of results.

The objectives of this audit were:

  1. To provide reasonable assurance that a management framework is in place to support HRMS data integrity.
  2. To provide reasonable assurance that HRMS generated reports support HR planning and management requirements in a timely and accurate manner.

To achieve these objectives, the audit team reviewed key documentation and policies. In addition, the team visited each region, conducted interviews with national, regional and institutional staff, and carried out detailed audit tests.

Conclusion

The results of the audit indicate that a management framework to support HR data integrity is in place. The framework is newly created and while we did find some elements of the framework that were inconsistent and incomplete, we did not find any issues within the scope of the audit that could be directly attributed to these inconsistencies.

More specifically, we found that:

However, we found that HRMS generated reports do not fully support HR planning and management requirements in a timely and accurate manner.

Areas for improvement are:

Recommendations have been made in the report to address these areas for improvement. Management has reviewed and agrees with the findings contained in this report and a Management Action Plan has subsequently been developed to address the recommendations (see Annex F).

STATEMENT OF ASSURANCE

This audit engagement was conducted with an audit level of assurance.

In my professional judgment as Acting Chief Audit Executive, sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of the opinion provided and contained in this report. The findings and conclusions are based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria that were shared with management. The findings are applicable only to the area examined.

Date:

_
Sylvie Soucy, CIA
A/Chief Audit Executive

1.0 INTRODUCTION

Background

According to the Treasury Board Office of Chief Human Resources Officer, human resources integrated planning is an important building block in continuously improving and building the human capacity of the Public Service to deliver services to Canadians (Results for Canadians). Integrated, rigorous planning can mitigate risks associated with aging workforces, tight labour markets, technological change, and so on. It can help identify optimal strategies and activities for such important human resource management components as recruitment, retention, learning, development, employee engagement, promotion, succession, employment equity and official languages.1

In addition, the federal government’s Management Accountability Framework requires departments to conduct an HR planning process that “incorporates the future needs, effective recruitment and retention, succession planning, learning and diversity.”2

According to the TB Human Resource Guide on Integrated Planning, “planning is information driven and planning processes and decisions are based on factual and timely information on current and future needs”3. This is especially true for CSC, where there are pressing human resource needs. Over the next decade, for every two people who are retiring, there will be fewer than one person available to take the position.4 In addition, many CSC work sites are outside major urban centres, so in some parts of the country the location of our work sites creates recruitment challenges, especially for staff from the Employment Equity groups. There are also challenges in recruiting staff members able to provide services in both official languages.5 Finally, with additional staffing requirements created as a result of the Truth in Sentencing Act, accurate, effective and timely HR planning is more critical than ever for CSC.

CSC is one of the largest federal government employers, currently employing almost 16,500 employees.6 It uses and relies on the federal government’s approved system, the Human Resource Management System (HRMS), to collect, store, and report on employee data, upon which it bases its HR planning decisions. (CSC migrated to HRMS version 8.9 as of February 17, 2011.)

The 2007-08 to 2009-10 CSC Strategic Plan for Human Resource Management, recognized that “the current infrastructure utilized by human resources does not permit the timely analysis of data, the identification of data quality issues or the provision of adequate support for all levels of management so that sound, timely decisions can be made. Due to these problems, senior management, in particular, is being forced to make decisions based on data that should be more accurate.”7

Further, the 2009-2012 Strategic Plan for Human Resource Management stated that “There are data quality issues in certain portions of the HRMS. This is apparent with the inability to produce accurate, simple reports. Data is a key element to all HR decision-making and planning processes. CSC is therefore currently converting to GC HRMS version 8.9. This provides an opportunity to enhance data quality procedures and monitoring.”8

Based on the recognized importance of strong data to support a good HR planning process, and in anticipation of impending resource requirements at CSC, the quality of the data supporting HR planning has become an area of increased significance to CSC management.

2.0 AUDIT OBJECTIVES AND SCOPE

2.1 Audit Objectives

The audit objectives were:

  1. To provide reasonable assurance that a management framework is in place to support HRMS data integrity.
  2. To provide reasonable assurance that HRMS generated reports support HR planning and management requirements in a timely and accurate manner.

Specific criteria related to each of the objectives are included in Annex A.

2.2 Audit Scope

The objective of this audit was to provide reasonable assurance that the integrity of data used to support HR planning, management and reporting is adequate and effective.

For the purposes of this audit, data integrity is described as having four elements: completeness, precision, authenticity and validity.

The scope of this audit was national and included an examination of controls related to the manner in which some key HR-related information is recorded, what information is recorded, the quality checks in place to verify integrity and whether the controls in place are sufficient to support the HR strategic plan. As well, the audit looked for the existence and consistency of data definitions, documentation and consistent application of business rules for management reporting.

HRMS contains a large database, and with it comes the potential for a very large number of reports that could be created for HR planning purposes. In the interest of providing an audit pertinent to the issues identified by senior staff and the CSC Audit Committee, the audit team elected to limit its examination to the newly created Human Resources Management Dashboard (the Dashboard). In this way, the audit focused on elements deemed key by the organization to the HR planning function and met the audit objectives (see Annex E for an example of the Dashboard).

The Dashboard was developed by the HR Strategic Planning, Reporting and Systems Directorate, hereafter referred to as the Directorate, to respond to senior management requirements related to HR planning and management. The Directorate has indicated that its intention is for the Dashboard to become senior management’s key source of information for HR planning requirements. It contains information pertaining to employment profile, leave and vacation, official languages, grievances, and succession planning.

Specifically, the Dashboard categories of data are:

  1. Employees by Type of employment
  2. First Official Language
  3. Linguistic status
  4. Average Age of Employees
  5. Employees by Occupational Group
  6. Grievances
  7. Average Years of Service of Employees
  8. Employment equity representation rates
  9. Leave
    1. Employees on Leave of Absence by Category
    2. Employees with Negative Sick Leave balance
    3. Employees with Vacation balance over 30 days

The audit examined the first seven areas listed above. The Employment Equity representation rates were not included because they are entered into HRMS by employees themselves, on an optional basis. We did not examine the three categories of leave because they were examined in earlier audits; however, results from those audits are included in this report.

The Dashboard’s underlying data was examined for accuracy and completeness as the usefulness of the Dashboard is entirely reliant on the quality of its supporting data.

In addition to an examination of data, interviews were conducted with various staff at National Headquarters and in the five regions to provide a comparison with the information gathered from HRMS and gain an understanding of the perceived usefulness of this new tool.

The audit examined the systems and procedures in place from April 1, 2009 to August 31, 2010.

Outside the scope

This audit did not include analysis of HRMS data not discussed above. It is not an audit of the HRMS conversion process to version 8.9 that was underway at the time of the report. Also, the audit did not include a review of the security features of HRMS, as this was addressed in the Audit of Logical Access Controls, approved in May 2008 by the Audit Committee.

Leave data was also excluded from the audit examination process because two internal audits on leave were completed on this subject in 2009 and 2010 as was an audit in 2008 by the Office of the Auditor General.

3.0 AUDIT APPROACH AND METHODOLOGY

3.1 Audit Evidence

Audit evidence was gathered through a number of techniques as follows:

Interviews: We conducted 48 interviews (see Annex C for a list of interviewees) with the key business process owners at National Headquarters (NHQ), Regional Headquarters (RHQ) and in institutions.

Review of Documentation: Relevant documentation such as process documentation, procedure manuals, training material, work descriptions and project charters was reviewed.

Testing of data: The testing of data focused on the accuracy of the HRMS data included in the Dashboards selected from the 15 sites. The reports were dated August 31, 2010 (see Annex B for the list of Dashboards selected). A total of 428 file reviews were performed across the country, comparing the information in HRMS to employee compensation files. For the purposes of this audit, a random sample was selected for each region using a 95% confidence level and a 10% confidence interval. We also conducted a reconciliation of 860 grievances between HRMS and the national grievance reports.
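The report does not state the formula behind the 95% confidence level / 10% confidence interval sampling; as a minimal, hedged sketch, a per-region sample size is commonly derived with Cochran's formula plus a finite population correction, assuming a conservative proportion of 0.5. The regional population figures below are placeholders for illustration, not the audit's actual site counts.

    import math

    def sample_size(population: int, z: float = 1.96, margin: float = 0.10, p: float = 0.5) -> int:
        # z=1.96 corresponds to a 95% confidence level; margin=0.10 is the
        # +/-10% confidence interval; p=0.5 is the conservative assumption.
        n0 = (z ** 2) * p * (1 - p) / (margin ** 2)          # infinite-population size (about 96)
        return math.ceil(n0 / (1 + (n0 - 1) / population))   # finite population correction

    # Hypothetical regional populations -- for illustration only.
    for region, employees in {"Region A": 2500, "Region B": 900, "Region C": 300}.items():
        print(region, sample_size(employees))

Under these assumptions, each location would need roughly 70 to 95 files, which is consistent in order of magnitude with the 428 files reviewed across the sites selected.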

Walkthroughs: We conducted walkthroughs of selected HRMS processes, examining controls and following the data from the point of entry to its inclusion in reports, for the two HRMS modules that applied to our audit: Administer Workforce and Develop Workforce. These two modules were chosen because they contain the Dashboard data examined in the data analysis.

3.2 Limitations

The report should be read with the following considerations in mind:

4.0 AUDIT FINDINGS AND RECOMMENDATIONS

4.1 Management Framework

We assessed the extent to which an appropriate management framework is in place to support HRMS data integrity. This included a review of the governance structure, policies and procedures, resource levels, roles and responsibilities, training and monitoring activities.

4.1.1 Management Framework Expectations

We expected to find an accountability structure in place which defined, documented and communicated roles and responsibilities to all relevant CSC staff involved in the process. We also expected to find that CSC procedures are consistent with Treasury Board (TB) and its Chief Information Officer Branch (CIOB) policies and that training is available and provided to all staff involved in the support of HRMS data integrity. In addition, we expected that resource requirements for data management are formally identified and addressed and that a process has been established to track, monitor and report on HRMS data quality & accuracy.

4.1.2 Findings and Explanations

Overall, we found a management framework in place to support HRMS data integrity. This structure is relatively new, having been implemented over the past 12 months at CSC. We did find some shortcomings within the current framework related to a lack of resources and monitoring which could lead to problems with consistency and quality control over the long run.

Accountability structures were defined, documented and communicated to all relevant parties.

Over the previous year, the Directorate has undergone a significant restructuring process. It has implemented a new reporting structure, which is represented in a revised organization chart. The accountability structure has been communicated through the Infonet and through informal communication with staff. It is noted that since the time of the audit examination, the organizational chart has been amended (December 2010).

CSC procedures were consistent with Treasury Board and CIOB policies.

CSC follows relevant Treasury Board policies, legislation and regulations on information management. More specifically, the TB Policy Framework for Information Technology and the TB Policy on Information Management require the establishment of a management framework to support HR data integrity. The audit team reviewed and compared CSC work plans against the TB policies and found evidence that CSC meets the policy requirements. For example, governance and accountability are clearly delineated and shared with CSC staff.

Resource requirements for data management were formally identified. However, within the organizational structure of the Directorate, there were several vacant positions in the Data Analysis and Reporting Group at the time of the report.

The CSC 2009-2012 Strategic HR Plan formally identified a need to improve data quality within HRMS. Accordingly, the Directorate was restructured to address this need. While resources have been identified, positions remain vacant and, particularly germane to this audit, positions in the Data Analysis and Reporting Group are not staffed, meaning that there is no one to assume the associated responsibilities. A continued staff shortage would have consequences for the Directorate’s ability to complete its mandate over the long run.

Inquiry at the time of the writing of this report indicates that work is underway by the Directorate to staff these positions.

Roles and responsibilities of key CSC staff were defined, communicated and documented.

TB Guide to the Review of Management of Government Information Holdings indicates that accountability and responsibility for the management of information must be assigned across the organization to senior management, information-based function specialists, line managers and staff. We found that responsibilities related to data integrity are defined and documented via job descriptions for key staff (those within the Directorate, Compensation and Staffing Advisors and Regional Business Analysts) involved in the process. Job descriptions for key positions clearly outline a responsibility to contribute to the data integrity of HRMS. In addition, tasks are communicated informally on commencement in the position.

The audit was performed during the year prior to the implementation of the HRMS 8.9 version. As a result, some defined responsibilities related to HR data integrity were not performed consistently across the regions due to the Directorate’s decision to focus resources on the tasks related to the migration to HRMS 8.9.

Accountability structures supporting the role of RBAs are ambiguous. A review of reporting structures found that HRMS Regional Business Analysts (RBAs) report to the Regional Administrator–Human Resources (RA-HR) and not to the Directorate. However, the requirement to train, track, monitor and report is driven by a Directorate initiative. RBAs have multiple tasks to complete within their area of responsibility, and the Directorate requirements are just one part of their overall mandate. There may be conflicting or competing requirements, and this potential conflict may have an impact on consistent service delivery in the future. A Service Level Agreement or Memorandum of Understanding between the parties may be worthy of consideration to clarify expectations.

Training was available and provided to all staff involved in the support of HRMS data integrity.

The Directorate developed a user guide to train the Regional Business Analysts who, in turn, use this user guide to train Compensation and Staffing Advisors, although the timing, content and delivery of training vary somewhat between regions.

HRMS data quality and accuracy was tracked, monitored and reported primarily at the regional levels. Content and timing of the monitoring varied between regions. The Directorate’s draft plan for national monitoring was not implemented at the time of this report.

Monitoring of the data is conducted at both the national and regional levels. At the regional level, the monitoring is conducted based on each region’s schedule and requirements.

Tools to conduct monitoring of HR data on a national basis have been developed by the Directorate, but the standardized process to utilize those tools had not been implemented at the time of the audit. Since a national monitoring system is not yet in place, any system-wide problems that might arise cannot be identified in a timely manner, errors cannot be resolved, and measures cannot be put in place to ensure that errors are not repeated.

Conclusion:

Overall, a management framework to support HR data integrity is in place. The framework has been in place for a short time and there are some inconsistencies and gaps evident which could potentially lead to some difficulties in the future. Based on those circumstances, we have identified some areas for improvement:

A consequence of a non-standardized approach to monitoring and training for a standardized system, such as HRMS, is that the information may be only as reliable as its least-controlled point.

To address the issues identified above, we propose the following recommendations:

Recommendation 1 9 **

The Assistant Commissioner, Human Resource Management should establish a standardized process to:

  1. train users, and
  2. track, monitor, and report HRMS data quality & accuracy at the national, regional, and institutional levels that includes measures to correct individual & systemic issues.

Recommendation 2 9 **

The Assistant Commissioner, Human Resource Management should:

  • Consider the creation of a Service Level Agreement or Memorandum of Understanding between the Directorate and Regional Deputy Commissioners (RDCs) to define expectations, responsibilities and timelines when sharing regional business analysts.

4.2 Data Integrity

This audit’s second objective was aimed at assessing the extent to which the HRMS generated reports support HR planning and management requirements in a timely and accurate manner. This included an examination of the data integrity (as defined in section 2.2 Audit Scope), human and systemic controls, and data quality (the error rate) and report usefulness.

4.2.1 Data Quality Expectations

We expected to find that HR reports only include data which has been verified for quality assurance and that controls are in place to verify the reports’ accuracy before they are released. In addition, we expected to find that the data examined under the audit was accurate and complete. Finally, we expected to see that the different reports produced by the Directorate address the requirements of the users for HR planning purposes.

4.2.2 Findings and Explanations

The controls and quality assurance process for reports was not consistent.

Quality assurance (QA) of reports is conducted at both the national and regional levels, by the Data Analysis & Reporting Group and the Regional Business Analysts (RBAs) respectively, for their own reports. Over and above the Dashboard, a variety of reports are provided both nationally and regionally. The RBAs run reports developed by the Directorate, but on their own schedule depending on priorities and regional needs. When a report is produced, the originator uses individual judgment to determine whether the results appear to be reasonable. As a result, there is no consistency in QA practices between the regions.

In addition, the HRMS controls do not consider the appropriateness of the information being inputted into the fields. There are specific fields that must be completed in order to submit a transaction into HRMS; however, the system (including the new 8.9 version) is currently not built to control the logical nature of the information inputted into those fields.

HRMS data used to create the HR Dashboard was relatively accurate and complete with a few exceptions.

Based on interviews, 61% of the interviewees reported they do not have confidence in the data within HRMS. Overall, they expressed unease with the timeliness of the data input and some areas of HRMS were of specific concern such as leave balances and vacation management. The timeliness of data entry in HRMS was mentioned as an area of concern by 34% of the interviewees.

To assess the quality of the data used to create the Dashboard, we chose two methods to compare the data found in HRMS and one to verify the data quality from its area of input into HRMS. First, we compared information in HRMS to the paper-based employee compensation files in the regions and at headquarters. Second, we performed reconciliations between the Salary Management System (SMS) and the HRMS for each responsibility centre selected (See Annex B for the list of sites selected per region). Third, we performed walkthroughs of two modules of HRMS where the data sits to look for internal controls at an entry level and to confirm the source of data.

1. HRMS and Compensation File Review

The audit team performed a total of 428 file reviews. We selected seven of the HRMS fields used to create the Dashboard information from HRMS and compared the information against documentation found in the employees’ paper-based compensation files. Information in HRMS was compared to external sources on compensation files, such as letters of offer, Public Works and Government Services Canada forms, employee pay cards, birth certificates, etc.

The time lag between when an action occurred and when the related information was actually placed on file created a challenge for the auditors, limiting the effectiveness of the comparison, particularly when the action was very recent.

The following table presents an overall assessment of the accuracy of information for seven of the nine categories found in the Dashboard:

File Review Statistics

Audited areas 10                        Accurate   Not Accurate   Missing information on file   Missing information in HRMS
Occupational Group                      90.89%     7.01%          1.40%                         0.70%
Employment Type                         99.53%     0.23%          0.24%                         0.00%
Status                                  99.07%     0.70%          0.23%                         0.00%
First Official Language                 97.20%     0.70%          2.10%                         0.00%
Age                                     98.60%     0.23%          0.70%                         0.47%
Years of Service 11                     75.70%     6.08%          18.22%                        0.00%
Appointment Official Language Status    80.37%     0.70%          7.94%                         10.99%

Additional information on a regional basis is found in Annex D.

With the exception of Occupational Group, Years of Service and Appointment Official Language Status, information was found to be fairly accurate. For these three areas, we would suggest that some improvements could be made as they affect planning, and eventually staffing.
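As a quick arithmetic cross-check of the table above, the four outcome columns for each audited area should sum to approximately 100%; the short sketch below simply re-adds the values transcribed from the table.

    # Values transcribed from the File Review Statistics table:
    # (Accurate, Not Accurate, Missing on file, Missing in HRMS)
    rows = {
        "Occupational Group": (90.89, 7.01, 1.40, 0.70),
        "Employment Type": (99.53, 0.23, 0.24, 0.00),
        "Status": (99.07, 0.70, 0.23, 0.00),
        "First Official Language": (97.20, 0.70, 2.10, 0.00),
        "Age": (98.60, 0.23, 0.70, 0.47),
        "Years of Service": (75.70, 6.08, 18.22, 0.00),
        "Appointment Official Language Status": (80.37, 0.70, 7.94, 10.99),
    }

    for area, pcts in rows.items():
        total = sum(pcts)
        # Each row should sum to 100% (allowing a small rounding tolerance).
        print(f"{area}: {total:.2f}% {'OK' if abs(total - 100) < 0.1 else 'CHECK'}")

All seven rows sum to 100.00%.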

Grievances represent the eighth category of information found in the Dashboard.

Information on grievances was not available in the compensation files. To examine the Grievances section of the Dashboard, the audit team performed reconciliations between the Dashboard data as of August 31, 2010 (extracted from HRMS) for the sites selected and the September & December quarterly National Reports on Grievances obtained from NHQ. The objective of the reconciliations was to ensure that all grievances were recorded in the HRMS at the appropriate level.

We found 860 grievances listed in HRMS. Of those, there were 172 instances (20%) where the grievance should have been included in the Dashboard but was not, or where it should not have been listed but was. We also found 95 instances (11%) where the grievances were recorded at the wrong level.

The Employment Equity section of the Dashboard was not examined because this area is inputted directly into HRMS by employees on an entirely voluntary basis. There is no means to verify this, short of asking individual employees directly to confirm the data.

During this audit, we did not validate the completeness of the leave and vacation data since the accuracy in HRMS was reviewed with the Follow-up Audit of Recording of Employee Leave, presented in June 2010 to the Audit Committee. Some information contained in that audit is pertinent to this audit and is provided here for ease of reference:

Specifically, the 2010 leave audit found that 92% of CX (1011 of 1095) and 93% of Non-CX (1014 of 1082) transactions tested were recorded in HRMS. Of the transactions that were recorded in HRMS, 71% of CX and 94% of Non-CX transactions were accurate in terms of number of hours and type of leave. In terms of quality control, a reconciliation of employee leave was not always performed and subsequently monitored by senior management, and the frequency of the practice varied between sites. Since then, management has committed to implementing additional monitoring and processes.

2. HRMS and Salary Management System Reconciliation

The objective of the reconciliation was to ensure that all employees were recorded in the HRMS. We used SMS as a comparative tool because it is a critical report used by senior management to verify their salary numbers against budgets. Although some lag is to be expected, it would also be anticipated to be a fairly accurate report as it is updated monthly and managers are required to sign off on the report. We compared the information in the two systems as of August 31, 2010.

To do this, we took the 4,141 employee records from the Dashboards selected for examination from the 15 sites and compared them against the SMS reports for the corresponding Responsibility Centres.
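The audit does not describe the mechanics of the reconciliation; as a hedged illustration, the comparison can be thought of as a two-way set difference on an employee/responsibility-centre key. The record layouts and names below are assumptions for the sketch, not actual HRMS or SMS structures.

    # Illustrative two-way reconciliation of employee lists from two systems.
    # The (name, responsibility centre) key and sample records are hypothetical.
    def reconcile(hrms_records, sms_records):
        hrms_keys = {(r["name"], r["rc"]) for r in hrms_records}
        sms_keys = {(r["name"], r["rc"]) for r in sms_records}
        return {
            "in_sms_not_hrms": sorted(sms_keys - hrms_keys),  # e.g. late HRMS data entry
            "in_hrms_not_sms": sorted(hrms_keys - sms_keys),  # e.g. funded by another centre
        }

    hrms = [{"name": "A. Tremblay", "rc": "Site 1"}, {"name": "B. Singh", "rc": "Site 1"}]
    sms = [{"name": "B. Singh", "rc": "Site 1"}, {"name": "C. Doe", "rc": "Site 1"}]
    result = reconcile(hrms, sms)
    print(result["in_sms_not_hrms"])  # [('C. Doe', 'Site 1')]
    print(result["in_hrms_not_sms"])  # [('A. Tremblay', 'Site 1')]

Note that an employee recorded under different names in the two systems would surface once in each direction, which is consistent with the overlap described below.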

There were issues with regard to employees being recorded in one system and not in the other when we attempted to reconcile HRMS and the Salary Management System (SMS). There were a total of 594 discrepancies: 313 names were in SMS but not in HRMS, and conversely 281 were in HRMS but not in SMS. These seemingly high overall numbers are tempered by the fact that there were instances of overlap (such as the same error being counted in both systems because an employee was recorded under different names). Please refer to the chart below for more detail.

Within these discrepancies, 135 or 23% (74+58+3) cannot be explained appropriately. For the others, 35% can be explained by a difference between the functional reporting structure tracked by HRMS and the funding tracked by SMS. This situation occurred when an employee was funded by one responsibility centre, but was actually working in another area (such as mental health workers or IT staff). Also, 21% of the discrepancies were explained by matters of timeliness with the data input. In this circumstance, we were informed that either the data was late or the audit team reviewed later versions of the Dashboards and found the data updated.

Some of the causes of the total number of discrepancies were explained by the regions as follows:

Reconciliations: National Results

Discrepancies explained by:                   Number   Rate
Reporting or funding structure 12             207      34.85%
Timeliness of data input                      127      21.38%
Inconclusive information provided by HR 13    74       12.46%
No explanation provided by HR                 58       9.76%
Name change 14                                47       7.91%
Termination 15                                46       7.74%
Dashboard or data entry error                 18       3.03%
Long term sick or Leave Without Pay           14       2.36%
No record found                               3        0.51%
TOTAL                                         594      100%

3. HRMS Modules Testing

As a further means of verifying data quality, we used computer-assisted audit techniques to perform walkthroughs of two modules. The walkthroughs provided the audit team with confirmation of the source of the Dashboard data and an opportunity to search for controls.

The two modules where walkthroughs were conducted were Administer Workforce, which is the input area for:

and Develop Workforce, which is the input area for:

We found that the information inputted in those modules is appropriately transferred to the Dashboard and that some system controls are in place to ensure that all appropriate fields are filled when performing a specific action.

Overall, we found some accuracy issues in the file review process and in the SMS/HRMS reconciliation, but the magnitude and consequences of these issues are not clear due to the lack of any benchmark process. A conclusion that may be reached as a result of these audit findings is that a level of tolerance for data accuracy needs to be established. Senior management relies on this information to determine future HR requirements for CSC, and inaccurate or incomplete data could lead to inappropriate decision making for the organization. Recognizing the reality of time lags in data entry, we would suggest that senior management determine their tolerance level with regard to HRMS data accuracy and timeliness so as to be able to rely solely on the reports generated by the system with sufficient confidence. They should choose which elements are key and, given limited resources, where they believe efforts should be focused to reduce error levels.
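To illustrate the kind of tolerance-level check suggested above, the sketch below compares the observed accuracy rates from the File Review Statistics table against a single accuracy threshold. The 95% threshold is an assumed value chosen for illustration only; the actual tolerance levels would have to be set by senior management.

    # Hypothetical tolerance check: flag Dashboard fields whose observed accuracy
    # falls below a management-defined threshold. The threshold is illustrative.
    ACCURACY_TOLERANCE = 95.0  # percent -- assumed value, not a CSC standard

    observed_accuracy = {  # from the File Review Statistics table
        "Occupational Group": 90.89,
        "Employment Type": 99.53,
        "Status": 99.07,
        "First Official Language": 97.20,
        "Age": 98.60,
        "Years of Service": 75.70,
        "Appointment Official Language Status": 80.37,
    }

    for field, accuracy in observed_accuracy.items():
        if accuracy < ACCURACY_TOLERANCE:
            print(f"Below tolerance: {field} ({accuracy}% < {ACCURACY_TOLERANCE}%)")

Under this assumed threshold, the fields flagged would be the same three areas the file review identified as needing improvement: Occupational Group, Years of Service and Appointment Official Language Status.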

The Dashboard addressed many of the requirements of the report users for HR planning purposes; however, some additional information was identified and requested by users.

Interviews were conducted with various Dashboard users in order to obtain their opinion regarding their information requirements for HR planning purposes. Overall, the Dashboard users consider the information useful for their HR planning purposes; however, some additional information elements were identified as areas where they would like further information. They are:

The tool is currently accessible to the following CSC staff:

Some users suggested that the Dashboard be more widely accessible to other key players in the HR planning at the national, regional and institutional levels, on a “need to know” basis given the sensitive nature of some of the information.

Conclusion:

Overall, it would appear that the HRMS Dashboard report does not yet fully support HR planning and management requirements in a timely and accurate manner. There are completeness issues with regard to the outstanding grievances entered in HRMS. Also, there are precision issues related to some of the employees’ data entered in the HRMS.

The impact of not meeting end-users’ planning requirements is that they may resort to secondary databases to meet this need. This practice would reduce compliance on a national basis and create opportunities for errors, because the data would be used without CSC-approved controls and without national-level monitoring to ensure acceptable error levels, consistent content and data quality. There are also efficiency issues with maintaining more than one tracking system.

As a mitigating factor, the Directorate plans on increasing the level and frequency of QA upon completion of the transition to HRMS 8.9 version, which is scheduled for the end of fiscal year 2010-2011.

To address the issues identified above, we propose the following recommendations:

Recommendation 3 16 **

The Assistant Commissioner, Human Resource Management, in consultation with the rest of the senior management team should:

  • Determine the organization’s tolerance levels with regard to the HRMS data accuracy and timeliness, based on the risk associated with data reliability, and take steps to reduce errors based on the agreed upon levels.

Recommendation 4 16 **

With regard to the composition of the Dashboard, the Assistant Commissioner, Human Resource Management should:

  • Consider reviewing the composition of the data available on the Dashboard based on the needs of the users and associated costs, and
  • Reconsider the positions at CSC that have access to the Dashboard to ensure all appropriate users are included.

5.0 OVERALL CONCLUSION

A management framework to support HR data integrity is in place; however, HRMS generated reports do not necessarily or fully support HR planning and management requirements in a timely and accurate manner.

Opportunities for improvement exist in the following areas:

A consequence of these issues is that a non-standardized process with limited support may lead to inconsistent, uncontrolled or inaccurate data entry in HRMS. The information upon which CSC staff relies is only as effective as that which has been entered.

ANNEX A

Audit Objectives and Criteria

Objectives Criteria
Objective 1:
To provide reasonable assurance that a management framework is in place to support HRMS data integrity.
1.1 Governance
Accountability structures are defined, documented and communicated to all relevant parties.
1.2 Policies and procedures
CSC procedures to support HRMS data integrity are clear and consistent with CIOB and TB policies.
1.3 Resources
Resource requirements for data management are identified formally and addressed.
1.4 Roles and responsibilities
Roles and responsibilities of key CSC staff are defined, communicated and documented.
1.5 Training
Training is available and provided to all staff involved in the support of HRMS data integrity.
1.6 Monitoring
There is a process established to track, monitor and report HRMS data quality & accuracy at the national, regional, and institutional levels and to correct individual & systemic issues.
Objective 2:
To provide reasonable assurance that HRMS generated reports support HR planning and management requirements in a timely and accurate manner.
2.1 Data Integrity
HR reports only use data which has been verified for quality assurance.
2.2 Controls
Controls are in place to verify the accuracy of HR reports before they are released.
2.3 Data Quality
HRMS Data within the HR reports is accurate and complete.
2.4 Report Usefulness
Reports produced by HRMS address the requirements of the report users for HR planning purposes.

ANNEX B

Location of Site Examinations

Region Sites
National Headquarters
  • National Headquarters
Prairies
  • Regional Headquarters
  • Okimaw Ohci Healing Lodge
  • Grande Cache Institution
Pacific
  • Kwìkwèxwelhp Healing Village
  • Kent Institution
  • Mountain Institution
Atlantic
  • Regional Headquarters
  • Atlantic Institution
  • Nova Institution for Women
Ontario
  • Bath Institution
  • Joyceville Institution
  • Kingston Penitentiary
Quebec
  • Drummond Institution
  • Joliette Institution
  • Montée Saint-François Institution

ANNEX C

Listing of Interviewees

Region Interviewees
National Headquarters
  • Director, HR Business Processes, Management Systems & Reporting
  • Manager, Data Analysis & Reporting
  • Senior HR Analyst
  • Project Officer, HR Business Process Improvement
  • Manager, Business Process Improvement
  • Project Officer, HRMS
  • Team Leader, HRMS Training Team
  • HRMS Manager
Prairies
  • Regional Deputy Commissioner
  • Regional Administrator - Human Resources
  • A/Warden - Grande Cache
  • Warden – Okimaw Ohci Healing Lodge
  • A/Assistant Deputy Commissioner Corporate Services
  • HRMS Business Analyst
  • 1 Staffing advisor
  • 1 Compensation advisor
Pacific
  • Regional Deputy Commissioner
  • Regional Administrator – Human Resources
  • Warden - Kent Institution
  • Warden - Kwìkwèxwelhp Healing Village
  • Warden – Mountain Institution
  • HRMS Business Analyst
  • 1 Staffing Advisor
  • 1 Compensation Advisor
Atlantic
  • Regional Deputy Commissioner
  • Regional Administrator – Human Resources
  • A/Assistant Deputy Commissioner, Corporate Services
  • Assistant Warden, Management Services – Atlantic
  • HRMS Business Analyst
  • 1 Staffing Advisor
  • 1 Compensation Advisor
Ontario
  • Regional Deputy Commissioner
  • Assistant Deputy Commissioner Corporate Services
  • Regional Administrator – Human Resources
  • Warden – Joyceville Institution
  • Warden – Kingston Penitentiary
  • HRMS Business Analyst
  • 1 Staffing Advisor
  • 1 Compensation Advisor
Quebec
  • Assistant Deputy Commissioner Corporate Services
  • Regional Administrator – Human Resources
  • Warden – Drummond Institution
  • Assistant Warden Management Services – Drummond Institution
  • Warden – Joliette Institution
  • Warden – Montée Saint-François
  • HRMS Business Analyst
  • 1 Staffing Advisor
  • 1 Compensation Advisor

ANNEX D

File Review by Region

The following chart is a breakdown of the findings by region for the comparison between HRMS and the employees’ compensation files. The numbers indicate the percentage accuracy for each of the types of data.

File Review – by Region

Region     Occupational Group   Employment Type   Status   First Official Language   Age    Years of Service   Appointment Official Language
Prairies   87%                  99%               100%     98%                       98%    61%                66%
Pacific    92%                  99%               99%      96%                       96%    87%                75%
Ontario    94%                  100%              98%      97%                       100%   79%                84%
Atlantic   90%                  100%              100%     99%                       99%    68%                83%
Quebec     92%                  100%              99%      96%                       100%   83%                93%

From the Treasury Board Secretariat dictionary, and from CSC training material, the Audited Areas are described as follows:

Occupational Group: A series of jobs or occupations related in broad terms by the nature of the functions performed. The occupational group and sub-group (if applicable) are managed and controlled by the Classification System. The Classification System ensures the determination of the relative value of work and provides a basis for employee compensation in the Public Service.

Employment Type: The intended employment tenure for the incumbent of the position. Values include: Indeterminate, Seasonal, Determinate (less than 3 months), Determinate (greater than or equal to 3 months but less than 6 months), Casual, Determinate (6 months or more).

Status: Active, Leave without Pay, Leave of Absence, etc.

First Official Language: The first official language of a person.

Age: Age of a person. Automatically calculated once the date of birth is entered by CSC.

Years of Service: The base date used to determine the amount of pensionable service an employee has accrued under the Public Service Superannuation Act.

Appointment Official Language: The status of an appointment or a deployment to a position in accordance with the Public Service Official Languages Exclusion Approval Order (PSOLEAO), the Public Service Employment Regulations (PSER), and Treasury Board policies and guidelines. Ex: Bilingual Imperative or English Essential etc.

ANNEX E

Human resources dashboard - example

Description:

ANNEX F

Audit of HR Data Integrity
Management Action Plan (MAP)

Recommendation: Recommendation No. 1

The Assistant Commissioner, Human Resource Management should establish a standardized process to:
  1. Train users and
  2. Track, monitor, and report HRMS data quality & accuracy at the national, regional, and institutional levels that includes measures to correct individual & systemic issues.
Management Response/Position: Accepted
Action(s): What action(s) has/will be taken to address this recommendation?
Deliverable(s): Expected deliverable(s)/indicator(s) to demonstrate the completion of the action(s)
Approach: How does this approach address the recommendation?
Accountability: Who is responsible for implementing this action(s)?
Timeline for Implementation: When will action(s) be completed to fully address the recommendation?

Action: Document system procedures and put in place online content for training delivery and user reference.
Deliverable: Incorporate online tutorial content for all key HR actions into HRMS.
Approach: Ensures consistent documented business processes for use and reference across CSC.
Accountability: Director, SPRS
Timeline: March 2011

Action: Establish a national training strategy and program.
Deliverable: Put in place training plans, manuals and “train-the-trainer” sessions for all Regional HRMS Business Analysts.
Approach: Ensures trainers are properly equipped and trained in order to deliver consistent and comprehensive training to users in their region.
Accountability: Director, SPRS
Timeline: March 2011

Action: Establish a Data Quality Control Framework
Deliverable: Define data quality indicators, data ownership, roles and responsibilities, and data quality reporting for monitoring and measurement purposes.
Approach: Clearly defines accountabilities and ensures consistent and continual monitoring of data quality through the integration of the data indicators in the HR Management Dashboard.
Accountability: ACHRM; Director, SPRS; RA-HR
Timeline: March 2012

Recommendation: Recommendation No. 2

The Assistant Commissioner, Human Resource Management should:
  • Consider the creation of a Service Level Agreement or Memorandum of Understanding between the Directorate and Regional Deputy Commissioners (RDCs) to define expectations, responsibilities and timelines when sharing regional business analysts.
Management Response/Position: Accepted
Action: Revise the Regional Business Analysts’ work descriptions and establish a National Data Quality Team
Deliverable: Clarify roles and responsibilities of both the Regional Business Analysts and the NHQ Analysts as well as establish MOUs with each region
Approach: Ensure that accountabilities relative to core activities such as training development and delivery, data quality monitoring, etc. are clearly defined and understood.
Accountability: Director, SPRS; RA-HRs
Timeline: March 2012

Recommendation: Recommendation No. 3

The Assistant Commissioner, Human Resource Management, in consultation with the rest of the senior management team should:
  • Determine the organization’s tolerance levels with regard to the HRMS data accuracy and timeliness, based on the risk associated with data reliability, and take steps to reduce errors based on the agreed upon levels.
Management Response/Position: Accepted
Action: Define tolerance levels with regards to HRMS data accuracy
Deliverable: Establish measurable data quality tolerance levels for the indicators defined as part of the Data Quality Control Framework (see Recommendation 1); incorporate data quality targets as part of the Performance Management Framework
Approach: Puts in place consistent and measurable indicators which are monitored and drive corrective action when warranted.
Accountability: ACHRM and Senior Management Team
Timeline: March 2012

Recommendation: Recommendation No. 4

With regard to the composition of the Dashboard, the Assistant Commissioner, Human Resource Management should:
  • Consider reviewing the composition of the data available on the Dashboard based on the needs of the users and associated costs,
  • Reconsider the positions at CSC that have access to Dashboard to ensure all appropriate users are included.
Management Response/Position: Accepted
Action: Enhance dashboard indicators and review access rights
Deliverable: Put in place a National Working Group of users and subject matter experts in order to expand the number of indicators based on the needs of users.
Approach: Ensure continued enhancement of the tool with a continued focus on the managers’ needs and accountabilities.
Accountability: Director, SPRS
Timeline: September 2011

Deliverable: Extend usage to institutional and regional management teams as well as to other key HR staff.
Approach: Ensure all appropriate users have access to the tool.
Accountability: ACHRM; ADCCS
Timeline: March 2011

Deliverable: Create sector/functional community specific dashboards (i.e. Corporate services, Information Management Services, Health Services, etc.)
Approach: Ensure continued enhancement of the tool with a continued focus on the managers’ needs and accountabilities.
Accountability: Director, SPRS
Timeline: June 2011

1 TBS – Office of Chief Human Resources Officer, HR Integrated Planning Guide, http://www.tbs-sct.gc.ca/gui/plann01-eng.asp

2 MAF Indicators (2005)

3 TBS – Office of Chief Human Resources Officer, HR Integrated Planning Guide, http://www.tbs-sct.gc.ca/gui/plann01-eng.asp

4 Future Trends in Public Service: Meeting the needs of the 21st Century Public Service, Dr. Linda Duxbury, Professor Carleton University

5 Strategic Plan for Human Resource Management 2009-2010 to 2011-2012

6 Strategic Plan for Human Resource Management 2009-2010 to 2011-2012

7 Strategic Plan for Human Resource Management 2007-2010 p. 7

8 Strategic Plan for Human Resource Management 2009-2010 to 2011-2012, p. 21

9 Recommendations highlighted in red * require management’s immediate attention, oversight and monitoring. Recommendations in yellow ** require management’s attention, oversight and monitoring.

10 Please refer to Annex D for a description or definition of each Audited Area

11 It should be noted that the compensation files do not have a specific source to confirm service dates. The audit team searched through the files to find evidence of the date wherever available.

12 Includes employees reporting to a different responsibility centre, employees with positions funded elsewhere (i.e., paid by the health cluster) and employees seconded out.

13 68 out of the 74 (92%) discrepancies come from the same region and no reasonable explanation was provided.

14 Name change in one of the two systems.

15 Includes all terminations (retirements, resignations and end of term/casual)

16 Recommendations highlighted in red *require management’s immediate attention, oversight and monitoring. Recommendations in yellow ** require management’s attention, oversight and monitoring.
