Audit of Grants and Contributions Control Framework

From: Employment and Social Development Canada

1. Background

1.1 Context

The 2017–18 Employment and Social Development Canada (ESDC) Departmental Plan presents $1.8 billion of planned expenditures on voted Grants and Contributions (Gs and Cs) programs. Transfer payments are made to organizations and individuals as a result of their participation in Gs and Cs programs, which are mainly administered by the Program Operations Branch (POB). During 2015–16, POB was responsible for the management of $1.1 billion of Gs and Cs through approximately 75,000 grant and 25,000 contribution agreements, of which $906 million was issued by regionally delivered programs and $194 million by nationally delivered programs.

POB developed the departmental Gs and Cs Control Framework (“the framework”), dated September 2014, which covers the seven stages of the Gs and Cs lifecycle:

  1. Application: Calls for proposals should be approved before being launched, applications should be received on standardized forms, and acknowledgement letters should be sent within 21 days of receiving an application.
  2. Assessment: Eligibility of applications received must be assessed in accordance with the terms and conditions of the programs and in compliance with program objectives and government priorities.
  3. Recommendation and Approval: Applications for funding must be approved in accordance with the delegation of authorities before the agreements or amendments are signed.
  4. Agreement: Legally vetted agreements such as the standard contribution agreement and standard grant agreement must be signed by a departmental representative with proper delegation of authority.
  5. Monitoring: Financial, activity and/or results monitoring activities should be completed, if required, based on the Risk Assessment, Management and Mitigation (RAMM) score.
  6. Claims Processing/Payments: Expenditure claims submitted by recipients and payments should be reviewed and approved by an appropriate delegated authority.
  7. Close Out: Final payment should be calculated and standard close out documentation should be completed that includes an evaluation of the activity results.

1.2 Audit Objective

The objective of this audit was to provide assurance that adequate controls were in place for:

  • The assessment and approval of Gs and Cs applications
  • The processing of expenditure claims and payments
  • The monitoring of and reporting on Gs and Cs agreements in accordance with departmental policies and procedures and the Directive on Transfer Payments

1.3 Scope

A stratified random sample of 320 projects approved in 2015–16 and 2016–17 was selected from the following ten programs, presented according to the program groupings established by POB:

  • Group 1 – Transactional: Canada Summer Jobs and New Horizons for Seniors
  • Group 2A – Client–Based Programs: Opportunities Fund for Persons with Disabilities, Career Focus and Skills Link
  • Group 2B – Organizational Programs: Skills and Partnership Fund, Sectoral Initiatives Program, and Social Development Partnerships Program (Disability and Children and Families components)
  • Group 3 – Community: Aboriginal Skills and Employment Training Strategy (ASETS) and Homelessness Partnering Strategy (HPS)

The sample of projects was tested to determine if the assessment and approval of applications, the processing of expenditure claims and payments and the monitoring of and reporting on agreements were properly completed in accordance with departmental policies and procedures and the Directive on Transfer Payments.

A second random sample of 40 non-approved applications was tested to determine whether the eligibility of the applications had been properly assessed and whether sufficient rationale for non-approval was provided.

1.4 Methodology

The audit was performed during the conduct phase, which was completed between June 2017 and February 2018, using a number of methodologies, including:

  • Process observation, documentation review and analysis
  • Data analysis of the Gs and Cs projects data sets provided by POB for 2015–16 and 2016–17
  • On-site walkthroughs at National Headquarters with Income Security and Social Development Branch (ISSDB), Skills and Employment Branch (SEB), POB and Chief Financial Officer Branch (CFOB) as well as at the Regional Offices presented below:
    • Atlantic Region: Charlottetown, Dartmouth and St. John’s
    • Ontario Region: Kingston, Kitchener, Mississauga and Toronto (St-Clair and Regional Headquarters)
    • Québec Region: Drummondville, Laval, Montréal and Québec
    • Western Canada and Territories Region: Calgary, Edmonton, Vancouver, Victoria and Winnipeg
  • Interviews with management and staff from POB, CFOB, ISSDB, SEB and Service Canada Regional Offices
  • Sampling and file review testing

2. Audit Findings

ESDC Gs and Cs help support jobs, training and social development. The audit scope covered ten (10) programs targeting different groups, including Indigenous people, youth, seniors, people experiencing homelessness, people with disabilities, unemployed people and children. The 2016–17 planned funding for these 10 programs was approximately $1,040.5 million and was mainly administered through agreements with provincial and territorial governments and agencies, municipalities, associations, and not-for-profit and for-profit organizations.

The table below provides an overview of the 10 programs included in the scope of the audit.

| Programs in Audit Scope | Delivery Structure | Eligible Recipients | Funding Type | 2016–17 Funding |
|---|---|---|---|---|
| 1. Canada Summer Jobs | Regional | Individuals, municipal governments, Indigenous organizations, not-for-profit and for-profit organizations, provincial/territorial governments and agencies | Contribution | $344,354,000 |
| 2. Career Focus | National and Regional | Same as Canada Summer Jobs | Contribution | — |
| 3. Skills Link | Regional | Same as Canada Summer Jobs | Contribution | — |
| 4. Opportunities Fund for Persons with Disabilities | National and Regional | Same as Canada Summer Jobs | Contribution | $45,026,000 |
| 5. New Horizons for Seniors | Regional | For-profit and not-for-profit organizations, Indigenous organizations, coalitions, networks, committees, municipal governments, research, educational and public health institutions | Grant | $43,140,000 |
| 6. Skills and Partnership Fund | National and Regional | For-profit and not-for-profit Indigenous-controlled organizations, Indigenous-controlled unincorporated associations, Indian Act bands, Tribal Councils and Indigenous self-government entities | Contribution | $50,000,000 |
| 7. Sectoral Initiatives Program | National | For-profit and not-for-profit organizations, Indigenous organizations, municipal governments, provincial/territorial governments | Contribution | $26,854,000 |
| 8. Social Development Partnerships Program | National | Not-for-profit organizations | Contribution and Grant | $20,615,000 |
| 9. Aboriginal Skills and Employment Training Strategy | National and Regional | For-profit and not-for-profit Indigenous-controlled organizations, Indigenous-controlled unincorporated associations, Indian Act bands, Tribal Councils and Indigenous self-government entities | Contribution | $343,943,000 |
| 10. Homelessness Partnering Strategy | Regional | Individuals, for-profit and not-for-profit organizations, municipalities, Indigenous organizations, public health and educational institutions, provincial/territorial governments | Contribution | $166,538,810 |
| Total | | | | $1,040,470,810 |

2.1 The existing Grants and Contributions Control Framework is generic and not risk-based

Control Framework

The control framework is based on the seven phases of the Gs and Cs lifecycle described in Section 1.1 of this report. The audit found that the framework is generic in nature. This one-page framework outlines documentation, filing requirements and systems used in the process, along with mandatory requirements per phase. Processes and tools supporting the framework have also been developed and are described in Section 2.2 of this report.

The audit team expected that the framework would facilitate risk identification and assessment. We also expected it would support the establishment of adequate controls proportionate to the risk levels of the programs and recipients in accordance with the Treasury Board Policy on Transfer Payments. However, the audit found that the framework:

  • Is not risk-based
  • Is composed of generic requirements
  • Does not establish risk tolerances
  • Does not define who is responsible for performing the controls
  • Does not specify the frequency and timing of controls and the information required to execute the controls

Further, the framework does not demonstrate how the following three potential drivers of risk in each program are addressed:

  • Delivery models (national and regional)
  • Intake processes (calls for proposals, targeted, continuous solicited or unsolicited intake)
  • Funding methods (grants or contributions)

The Treasury Board Policy on Transfer Payments requires that cost-effective oversight, internal control, performance measurement and reporting systems be in place to support the management of transfer payments. The Policy further states that transfer payments must be managed in a manner that is sensitive to risks and to the administrative requirements placed on recipients.

The audit team expected that key controls would be designed to mitigate the risks related to programs, projects, recipients, agreements and sub-agreements. These risks have not been explicitly and adequately identified or assessed. Nonetheless, controls and monitoring activities have been put in place without a clear understanding of the risks that these activities are expected to mitigate.

The absence of explicit risk identification and assessment at the program level hinders the Department’s ability to establish adequate controls and to focus on areas of greater risk and significance. As a result, a risk-based approach is not used to manage Gs and Cs programs.

Roles and Responsibilities

Roles and responsibilities for the administration and delivery of national programs are outlined in a Memorandum of Understanding (MOU) between SEB, ISSDB and POB. Interviews with management and program staff confirmed that roles and responsibilities of policy and program delivery stakeholders are based on informal collaborations such as communication and coordination activities among these branches.

The audit found that the MOU includes generic roles and responsibilities. The audit team expected that these roles and responsibilities would be further clarified and integrated into the framework by detailing the timing, extent and nature of the involvement of the branches in each stage of the lifecycle. For example, the roles of SEB and ISSDB in the assessment process could be further clarified to optimize the selection of projects contributing to programs’ objectives.

Per the MOU, ISSDB and SEB are responsible for developing program risk assessments while POB is responsible for assessing project risks. In our opinion, the risk management roles of SEB, ISSDB, POB, the National Gs and Cs Delivery Centre (NGCDC) and the Regions throughout the project lifecycle need to be further defined within the framework.

Recommendation

  1. POB, in collaboration with SEB, ISSDB and the Regions, should establish a Gs and Cs Control Framework that includes key risk-based controls per program (including frequency, timing and information required) and define the roles and responsibilities of stakeholders performing control activities.

Management Response

Management agrees that the Control Framework should only include key risk-based controls that take into account the characteristics of individual programs while favouring a streamlined approach to program delivery.

Management is committed to ensuring that any revisions to the Control Framework will be sensitive to risks, strike an appropriate balance between control and flexibility, recognize an organization’s capacity to deliver, and establish the right combination of good management practices, streamlined administration and clear requirements for performance reporting.

Management will undertake a comprehensive review of the existing Control Framework based on these principles. Actions are expected to be completed by November 2019.

2.2 Risk identification, assessment and the Risk Assessment, Management and Mitigation tool need to be improved

Risk Identification and Assessment and Limitations of RAMM

The audit team expected that a formal risk identification and assessment would be completed for each program to support a risk-based control framework. We noted that risks were not identified for all programs included in the scope of the audit. High-level risks were identified for some programs at the time Treasury Board Submissions were developed. However, there was no evidence in the files reviewed that these high-level risks were considered throughout the implementation of control activities. Furthermore, there was no evidence that the controls established took into account unique program risks.

As per our interviews and file reviews, project risks are mainly identified through RAMM. RAMM has four risk scores, namely Organizational, Financial, Activity and Results. Each risk score is based on five risk factors out of a total of ten available standard risk factors. These risk factors do not capture all of the dissimilar risks of the programs, projects, recipients, agreements and sub-agreements. We were informed that the RAMM is currently being reviewed by POB. We were also informed that an independent assessment of the RAMM was conducted by an external accounting firm in 2016 and 2017. The independent assessment included recommendations to improve the RAMM as follows:

  • Mandatory supervisory review and approval of RAMM assessments
  • Training and guidelines for Program Officers on how to complete RAMM assessments
  • Training for Program Managers on their oversight role and providing effective challenge
  • Review, analyze and amend RAMM risk factors
  • Identify and apply weightings to specific programs or by risk factor
  • Implement a working group
  • Implement a process to regularly review RAMM assessments, results and monitors
  • Perform program risk assessments

Impact of RAMM

The audit found that RAMM does not facilitate a comprehensive risk assessment at the program, project and recipient levels. The risk factors used by RAMM are at times irrelevant, insignificant or carry unreasonable weight, as all factors contribute equally to their respective risk scores. The audit found no documented risk statements in any of the files reviewed that detail the project and recipient risks to achieving the program’s objectives. The absence of documented risk statements, combined with the lack of formal program risk identification and assessment, could result in inadequate risk assessments being used at the program and recipient levels.

RAMM is also used as a risk mitigation tool, as its scores determine the frequency of monitoring activities regardless of the project lifecycle. Since the three-point monitoring scale is not granular enough to map adequately onto the fifty (50) possible risk score points, the risk score must change significantly in order to affect the monitoring score and the monitoring frequency, as the sketch below illustrates.
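
The following minimal sketch illustrates this granularity issue. The score bands and monitoring frequencies are hypothetical illustrations only; the actual RAMM-to-monitoring mapping is not specified in this report.

```python
# Minimal sketch of the granularity issue described above. The band
# boundaries and monitoring frequencies are hypothetical; the actual
# RAMM-to-monitoring mapping is not specified in this report.

def monitoring_level(risk_score: int) -> str:
    """Collapse a 50-point risk score onto a 3-point monitoring scale."""
    assert 0 <= risk_score <= 50
    if risk_score <= 20:       # hypothetical "low" band
        return "low"           # e.g., monitor once over the agreement
    elif risk_score <= 35:     # hypothetical "medium" band
        return "medium"        # e.g., monitor annually
    else:                      # hypothetical "high" band
        return "high"          # e.g., monitor quarterly

# A 14-point change (21 -> 35) leaves monitoring frequency unchanged,
# while a 1-point change (35 -> 36) alters it: the mapping is coarse.
assert monitoring_level(21) == monitoring_level(35) == "medium"
assert monitoring_level(36) == "high"
```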

For example, ASETS and HPS agreements are characterized by high dollar values, five-year project durations and complex activities. RAMM assesses ASETS and HPS projects with high scores, resulting in higher monitoring scores and more frequent monitoring activities. Projects in other programs that could be riskier may not necessarily be properly assessed by RAMM: the unique risks related to dealing with a new organization, short delivery timeframes and the nature of the costs incurred are unlikely to score high. As a result, some organizations continue to be monitored more frequently than others due to the limitations of the tool and the calculation of risk scores.

Our interviews indicated that the frequency and scope of monitoring and quality control activities are inconsistent among individuals and Regions, leading to confusion and frustration of Gs and Cs recipients. Furthermore, monitoring plans are not customized to mitigate the risks unique to a project or recipient. Given the limitations of RAMM presented in this report, we suggest that monitoring plans be reviewed to determine to what extent they mitigate key risks. Additional details on RAMM can be found in Appendix A.

Recommendation

  2. POB, ISSDB and SEB, in collaboration with the Regions, should revamp RAMM to improve Gs and Cs risk identification, assessment and mitigation.

Management Response

Management agrees that the RAMM methodology should be reviewed to strengthen risk identification, assessment and mitigation.

Management will continue efforts currently underway to redesign the RAMM to ensure that risk assessments are more reflective of program-specific factors and the capacity of funding recipients to deliver on program objectives. This will result in monitoring, reporting and audit requirements that are proportionate to the risk level. Actions are expected to be completed by December 2020.

Processes and Tools

Many resources are available to program delivery staff, such as operational guidelines, procedures, checklists, the Centre of Expertise for Gs and Cs and the Regional Business Expertise Units. Business Expertise staff in the Regions acknowledged the limitations of using generic tools and a single process to provide guidance for multiple, dissimilar programs. The audit found that, notwithstanding the tools, support and references available, step-by-step processes per program detailing the controls to be performed within the seven stages of the lifecycle have not been well documented.

Interviews also confirmed that staff are not aware of the risks pertaining to the programs they deliver and that the risks and issues specific to the files they manage are not properly captured. For example, when program officers take over a new project, they do not have access to a central repository of issues or risks because explicit risk statements are not documented and assessed by program and by project. To find details (risks/issues) pertaining to their files, program officers can review the RAMM score, discuss the file with the agent previously in charge (if still available) and review the correspondence and monitoring reports kept on file.

The absence of documented program process workflows and having to depend on a generic framework led the Regions, in some instances, to develop complementary tools, checklists and procedures to provide additional guidance to their staff. This contributed to the duplication of efforts and inconsistencies noted in program delivery.

Information Management Practices

There are no national requirements established on how to document project files in the Common System for Grants and Contributions (CSGC). Staff are currently using three repositories to store information related to a project: a shared network drive, paper files and the CSGC. This results in duplication of efforts in documenting and filing key documents, forms and checklists and increases the risk of misplacing documents. Furthermore, there is no naming convention or established requirements for filing information and key documents in CSGC. These inconsistent practices do not facilitate efficient information retention, retrieval and disposal.

Workflow in CSGC

CSGC is not a workflow system that structures the sequence of activities, and the current Gs and Cs process does not take into account different program operational requirements. The audit found that CSGC does not have an integrated dashboard to keep senior management and staff informed of the status of their projects. An automated workflow requiring the sequential execution of steps in CSGC would, in our opinion, facilitate the routing and filing of key information within the programs’ lifecycle and ensure that proper steps and controls are performed in the system.
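
As an illustration only, the following minimal sketch shows a sequential workflow over the seven lifecycle stages listed in Section 1.1; it is a hypothetical example of the idea, not a description of CSGC’s actual design.

```python
# Minimal sketch of a sequential workflow over the seven Gs and Cs
# lifecycle stages listed in Section 1.1. This is a hypothetical
# illustration of the idea, not a description of CSGC's actual design.

STAGES = [
    "Application",
    "Assessment",
    "Recommendation and Approval",
    "Agreement",
    "Monitoring",
    "Claims Processing/Payments",
    "Close Out",
]

class ProjectWorkflow:
    """Tracks a project's position in the lifecycle and enforces order."""

    def __init__(self) -> None:
        self.stage = 0  # index of the next stage to complete

    def complete_stage(self, name: str) -> None:
        """Allow stages to be completed only in their defined sequence."""
        if self.stage >= len(STAGES):
            raise ValueError("Lifecycle already complete")
        if name != STAGES[self.stage]:
            raise ValueError(
                f"Cannot complete {name!r}: next stage is {STAGES[self.stage]!r}"
            )
        self.stage += 1

wf = ProjectWorkflow()
wf.complete_stage("Application")
wf.complete_stage("Assessment")
# wf.complete_stage("Agreement")  # would raise: approval step skipped
```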

Training and Guidance

Generic training consisting of nine modules is provided to staff. Generic Gs and Cs Phase One and Phase Two certifications and training in Financial Monitoring are also available and offered in the Regions. On-the-job training is provided to staff with the help of mentors. However, the audit team was informed through interviews that staff do not receive formal, comprehensive program-specific training.

The provision of only generic training, combined with the absence of documented step-by-step processes, may result in inconsistencies in program delivery.

Recommendation

  3. POB, in collaboration with the Regions, should provide program-specific training and step-by-step processes to all staff responsible for the delivery of Gs and Cs programs.

Management Response

Management agrees that program-specific training would be an effective means of equipping staff to apply their general program-delivery knowledge and skills to the delivery of individual programs thereby achieving greater consistency.

Management will identify programs most in need of specific training and gradually add program-specific training to its existing curriculum. Actions are expected to be completed by September 2019.

Testing of Controls

The section below indicates whether the controls, as designed in the generic framework, were operating as intended. It does not assess whether controls were appropriately designed. Design effectiveness was reviewed and reported in other sections of the report.

As part of our audit procedures, 320 approved projects and 40 non-approved applications were tested. For its file review purposes, the audit team established that a compliance threshold of 90% would indicate that controls in the framework were operating as intended. Overall, the audit found that controls were operating as intended for 2 of the 7 phases of the lifecycle (Monitoring and Claims Processing/Payments). The Assessment, Recommendation and Approval, and Agreement phases had compliance rates between 86% and 89%. The Application and Close Out phases had compliance rates of 75% and 84%, respectively.
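
As a simple illustration of this threshold test, the sketch below applies it to the phase-level results. Rates for the three phases reported only as a range (86% to 89%) are illustrative values within that range, not figures from the report.

```python
# Illustration of the 90% compliance threshold applied to phase-level
# results. The Application and Close Out rates, and the two phases at or
# above the threshold, are taken from this report; individual rates for
# the three phases reported only as a range are illustrative.

THRESHOLD = 0.90

phase_compliance = {
    "Application": 0.75,
    "Assessment": 0.86,                    # illustrative, within reported range
    "Recommendation and Approval": 0.88,   # illustrative, within reported range
    "Agreement": 0.89,                     # illustrative, within reported range
    "Monitoring": 0.90,                    # reported as meeting the threshold
    "Claims Processing/Payments": 0.90,    # reported as meeting the threshold
    "Close Out": 0.84,
}

operating_as_intended = sorted(
    phase for phase, rate in phase_compliance.items() if rate >= THRESHOLD
)
print(operating_as_intended)  # ['Claims Processing/Payments', 'Monitoring']
```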

Examples of issues identified for the five phases of the lifecycle that did not operate as intended are outlined below:

  • Only 75% of the applications tested were acknowledged within 21 calendar days of receipt.
  • The assessment process does not define when assessment checklists and grids have to be used on a national basis. Review and approval of assessment checklists and grids by a manager were not always completed as part of the assessment process.
  • 10% of the Contribution or Grant Funding Agreements were not signed by the recipient and/or by the proper departmental delegated authority.
  • Final reports submitted by recipients were not always properly signed and dated. We were informed there is no departmental policy or guidance that requires or recommends that final reports be signed.

A summary of the file review results for each phase is presented in Appendix B.

Monitoring Activities

Throughout the monitoring phase, staff perform the following types of monitors to verify that public funds are used as intended, project activities are conducted in compliance with the agreement and expected project results are achieved:

  • Claim reviews
  • Financial monitors
  • Activity monitors
  • Results monitors

These monitoring activities are completed based on the RAMM score, which determines the frequency of monitors required for each project. As stated in Section 2.2, there is a risk that the same projects could be selected more often for monitoring while other projects may not be adequately monitored. This is mainly due to RAMM’s limitations: projects with the highest scores are selected for monitoring, often within the same programs, such as ASETS and HPS. For example, risks at the sub-agreement level are not considered by RAMM and may not be adequately monitored.

National Grants and Contributions Quality Assurance Review Process

In September 2016, POB launched a National Grants and Contributions Quality Assurance Review (QAR) Process for 3 of the 10 programs included in the scope of this audit (Skills Link, Career Focus and Opportunities Fund) to verify file integrity across the lifecycle in accordance with directives, policies and procedures.

A sample of 30 projects that were subject to the QAR Process during fiscal year 2016–17 was tested. The current QAR Process reviews one project per officer every six months instead of every quarter as required by the QAR Directive. The audit found that the QAR Directive does not integrate the concept of risk-based sampling to address non-compliance with policies and procedures for high risk projects. The audit also found there are no controls in place to verify and document that files from all officers were sampled across all Regions. Regional file review plans obtained by POB for fiscal year 2016–17 were not documented in a consistent manner to verify compliance with the QAR Directive sampling requirements.

Quality Assurance and Monitoring Unit

The Quality Assurance and Monitoring Unit (QAMU), within CFOB, completes file reviews and recipient audits on a sample of projects from one Gs and Cs program every fiscal year. Projects selected for review by QAMU are based on the RAMM score. QAMU is responsible for providing reasonable assurance that the terms and conditions established in the contribution agreement have been met by the recipient by assessing that:

  • The funds disbursed to the recipient have been used for the intended purposes in compliance with the financial terms and conditions of the contribution agreement;
  • The recipient has complied with the reporting requirements and other selected terms and conditions of the contribution agreement; and
  • The recipient has an appropriate system of internal controls in place.

Results from interviews indicated that staff in the Regions are not always clear on QAMU’s role, as they do not necessarily receive the results of QAMU’s reviews. A sample of 22 Gs and Cs file reviews and recipient audits completed by QAMU during 2015–16 and 2016–17 was examined. The audit found that risk assessments, procedures to be performed and sampling methodologies were not documented in the QAMU files.

Based on the audit results, there is an opportunity to improve the coordination and integration of the monitoring and quality assurance activities performed by POB and CFOB.

Recommendation

  4. POB and CFOB should review existing monitoring and quality assurance activities to ensure they are risk-based and integrated.

Management Response

Management agrees with the recommendation and will conduct a comprehensive review of existing quality assurance activities in order to eliminate duplication and ensure an integrated approach.

The revised approach will integrate findings from existing third-party reviews and from a new risk approach piloted with select programs. Actions are expected to be completed by September 2019.

2.3 Performance monitoring needs strengthening

Program delivery performance targets

POB prepares a Performance Package Dashboard that presents the quarterly and year-to-date program delivery performance targets for the Regions and the NGCDC. These results are reviewed by POB and the Regions before being presented to the Services and Programs Integrated Executive Committee as part of the quarterly Program Delivery Performance results review. These performance targets measure service delivery performance, such as completion of project monitors within a prescribed timeframe, project close outs, length of time to acknowledge an application for funding and speed of service for payments.

The audit team reviewed the results of the Performance Package Dashboard and noted that acknowledgement of applications for funding, project monitors and project close outs were not completed by all Regions within the required timeframes. POB indicated it is currently reviewing its performance targets to include client service indicators. The audit team encourages the Department to finalize the review of its program delivery performance targets.

Performance indicators

The audit team expected to find performance indicators and targets established for each of the ten programs included in the scope of the audit. The terms and conditions for the ten programs include performance indicators to assess whether the programs are achieving their respective objectives. The audit found that not all performance indicators had targets established. In addition, we noted that performance indicators were not systematically reported on. Details of the 2015–16 Departmental Performance Report (DPR) and the 2016–17 Departmental Results Report (DRR) are presented in the table below.

| Program Name | Number of Performance Indicators | Targets Established and Reported in 2015–16 DPR | Indicator Results Reported in 2015–16 DPR | Targets Established and Reported in 2016–17 DRR | Indicator Results Reported in 2016–17 DRR |
|---|---|---|---|---|---|
| 1. Canada Summer Jobs | 4 | 1/4 | 1/4 | 1/4 | 1/4 |
| 2. Career Focus | 3 | 2/3 | 2/3 | 2/3 | 2/3 |
| 3. Skills Link | 3 | 2/3 | 2/3 | 2/3 | 2/3 |
| 4. New Horizons for Seniors Program | 4 | 2/4 | 2/4 | 0/4 | 0/4 |
| 5. Opportunities Fund | 4 | 2/4 | 2/4 | 2/4 | 2/4 |
| 6. Skills and Partnership Fund | 4 | 1/4 | 1/4 | 1/4 | 1/4 |
| 7. Sectoral Initiatives Program | 2 | 1/2 | 1/2 | 1/2 | 1/2 |
| 8. Social Development Partnerships Program | 18 | 2/18 | 2/18 | 2/18 | 2/18 |
| 9. Aboriginal Skills and Employment Training Strategy | 10 | 1/10 | 1/10 | 1/10 | 1/10 |
| 10. Homelessness Partnering Strategy | 5 | 5/5 | 2/5 | 3/5 | 0/5 |

Similar observations were brought to the Department’s attention by the Auditor General of Canada in the recent audit of Employment Training for Indigenous People. The Department indicated that a new performance measurement strategy, with strengthened outcomes, indicators and clearly defined targets, would be developed for the new Indigenous Skills and Employment Training Program.

There is an opportunity for the Department to examine its performance indicators, targets and reporting for the Gs and Cs programs to address the issues outlined above.

Recommendation

  5. SEB and ISSDB, in collaboration with POB, should establish targets and report on program performance indicators.

Management Response

Management agrees with the importance of having targets and reporting on performance indicators. Management will finalize its review of existing performance indicators, realign and adjust targets and, if needed, develop new indicators to strengthen reporting on performance. Actions are expected to be completed by March 2020.

3. Conclusion

The assessment and approval of Gs and Cs applications are adequately controlled. Some improvements are required to strengthen the consistency of the assessment process and the approval of applications. There were also adequate controls for the processing of expenditure claims and payments.

We found that the monitoring of and reporting on Gs and Cs agreements are not adequately controlled. The Gs and Cs framework is not risk-based, which contributes to control activities not being adequately designed to properly monitor and report on program and project results. The existing design of RAMM also contributes to monitoring activities not being customized to mitigate risks specific to a project or recipient. Although performance indicators have been developed, targets are not always established and reporting is inconsistent.

4. Statement of Assurance

In our professional judgement, sufficient and appropriate audit procedures were performed and evidence gathered to support the accuracy of the conclusions reached and contained in this report. The conclusions were based on observations and analyses at the time of our audit. The conclusions are applicable only for the Audit of Grants and Contributions Control Framework. The evidence was gathered in accordance with the Treasury Board Policy on Internal Audit and the International Standards for the Professional Practice of Internal Auditing.

Appendix A: Risk Assessment, Management and Mitigation

RAMM comprises four risk scores that are used to assess project risks, as outlined below:

Organizational Risk uses five factors: Activity Performance, Ability to Demonstrate Results, Financial Performance, Organizational Administration, and Stability of the Board and/or the Organization.

For example, we noted that “Stability of the Board” does not differentiate between for-profit and not-for-profit organizations. In addition, there is no clear correlation between the Stability of the Board and Organizational Risk; it might be riskier if the same board members stay too long. Furthermore, some boards impose term limits on board members, which RAMM treats as high risk; in our opinion, however, this is a sound governance practice.

Financial Risk uses five factors: Project Value, Project Duration, Financial Performance, Organizational Administration, and Stability of the Board and/or the Organization.

The audit team believes there is no clear correlation between Organizational Administration or the Stability of the Board/Organization and Financial Performance or Financial Risk. Project Duration can be considered in the Financial Risk, but again there is no validated correlation between the duration of a project and the Financial Risk. Project Value can have an impact on the Financial Risk; however, some projects of medium or low value could actually be high risk. Conversely, some high-value projects could be considered low risk, such as capital building/renovation projects in which architecture firms provide properly detailed payment certificates and sound accounting and financial administration.

Activity Risk uses five factors: Activity Performance, Project Duration, Complexity of Activities, Involvement of Partners and Public Sensitivity.

The audit team believes there is no clear correlation between the Project Duration and the Activity Risk. The involvement of partners is considered high risk in RAMM, but some projects have in-kind contributions from partners that could be very important to the project and/or carry an insignificant amount of Activity Risk. RAMM does not differentiate between positive and negative Public Sensitivity.

Results Risk uses five factors: Ability to Demonstrate Results, Project Duration, Complexity of Activities, Involvement of Partners and Public Sensitivity.

The audit team believes there is no clear correlation between the Project Duration and the Results Risk. The involvement of partners is considered high risk in RAMM, but some projects have in-kind contributions from partners that could be very important to the project and/or carry an insignificant amount of Results Risk. RAMM does not differentiate between positive and negative Public Sensitivity.
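
For reference, the following sketch represents the factor-to-score structure described in this appendix. The equal weighting mirrors the audit’s observation that all five factors contribute equally to each score; the 1-to-10 factor rating scale is a hypothetical illustration (note that it yields the fifty-point maximum score mentioned in Section 2.2).

```python
# Sketch of the RAMM structure described in this appendix: four risk
# scores, each driven by five of the ten standard risk factors, all
# contributing equally (per the audit's observation). The 1-to-10
# factor rating scale is a hypothetical illustration.

RAMM_FACTORS = {
    "Organizational": [
        "Activity Performance", "Ability to Demonstrate Results",
        "Financial Performance", "Organizational Administration",
        "Stability of the Board and/or the Organization",
    ],
    "Financial": [
        "Project Value", "Project Duration", "Financial Performance",
        "Organizational Administration",
        "Stability of the Board and/or the Organization",
    ],
    "Activity": [
        "Activity Performance", "Project Duration",
        "Complexity of Activities", "Involvement of Partners",
        "Public Sensitivity",
    ],
    "Results": [
        "Ability to Demonstrate Results", "Project Duration",
        "Complexity of Activities", "Involvement of Partners",
        "Public Sensitivity",
    ],
}

def risk_score(ratings: dict[str, int], score_name: str) -> int:
    """Sum the five equally weighted factor ratings (1-10) for one score."""
    return sum(ratings[factor] for factor in RAMM_FACTORS[score_name])

mid_ratings = {f: 5 for factors in RAMM_FACTORS.values() for f in factors}
print(risk_score(mid_ratings, "Financial"))  # 25, out of a possible 50
```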

Appendix B: Testing of the Controls

The testing results presented below indicate whether the controls were operating as intended. The controls tested were the ones designed in the generic framework.

Application

  • Paper applications were submitted in person, by postal mail or by e-mail. Electronic applications were submitted through the Interactive Fact Finding Service or through the Grants and Contributions Online Services.
  • Signed completed applications were received and assessed for the 320 approved and 40 non-approved projects tested. 75% of the applications tested were acknowledged within 21 calendar days of receipt.
  • Applications were received through Calls for Proposals, targeted, continuous solicited and unsolicited intake processes.

Assessment

  • Applications were assessed using various tools, such as program and pre-assessment checklists as well as assessment grids. Each program used two to three tools to assess the applications received, and these tools contain one or more similar questions to determine whether the applications met the eligibility requirements and/or complied with the terms and conditions of the program. Given the number of overlapping checklists and grids used in each program, the audit team identified a duplication of time and effort in the assessment process and a risk that eligibility assessments are duplicated.
  • Program checklists and pre-assessment checklists were completed for 81% of the projects. Both checklists contained similar eligibility assessment questions, contributing to the duplication of time and effort. In addition, there was no evidence that program checklists were reviewed by a supervisor or manager.
  • Assessment grids were properly completed and reviewed by a supervisor or manager for 58% of the projects that required completion of the grid. For the remainder, documentation of who prepared the grids and evidence of their review were incomplete.
  • Projects meeting pre-established criteria are presented to the Internal Review Committee (IRC) for review. The IRC recommends for approval to the Delegated Signing Authority those projects that meet the terms and conditions and have been subject to a risk assessment. An IRC Record of Decision (ROD) was completed for 90% of the projects that were subject to an IRC.
  • A RAMM that includes risk score justifications and risk mitigation strategies was prepared for 80% of the projects. However, the RAMM was not updated throughout the project’s lifecycle. Documentation of score justifications and risk mitigation strategies varied significantly between projects and agents, as RAMM documentation requirements are not defined in the Gs and Cs framework. Interviews also confirmed that RAMM documentation is subjective; documentation requirements, score justifications and risk mitigation strategies could be further clarified. In addition, there was no evidence of review by a supervisor or manager.
  • 67% of the approved projects were tabled at an IRC meeting with appropriate members, in accordance with the IRC Directive. We were informed by staff that the RAMM was presented and discussed during IRC meetings; however, our review of the RODs found no evidence that discussions of projects’ risks and mitigation strategies occurred at the IRC. As a result, the audit team could not assess, based on evidence on file, whether the IRC fulfilled its objective of providing assurance that each project has been subject to a risk assessment and has an appropriate risk management plan in place to mitigate identified risks.

Recommendation and Approval

  • Recommendation and approval of projects is obtained by completing a paper form, signed by a Director General and approved by a Regional Assistant Deputy Minister (ADM), the POB ADM or the Minister, depending on the funding or amendment amount being approved. Recommendation and approval of applications for funding was properly documented for 93% of the projects. The approval signature form was not signed by a departmental representative with proper delegated approval authority for 3% of the projects, and the signed approval signature form was not available for another 4% of the projects.
  • All Canada Summer Jobs projects had electronic approval in CSGC by a Regional ADM. Minister or ADM approval was provided for all New Horizons for Seniors projects in accordance with the financial authorities delegation charts for Gs and Cs.
  • The approval request form checklist was completed for 61% of the projects that required it.
  • Project assessments and rejection rationale were documented for 95% of the non-approved projects. Signed rejection letters were completed for 85% of the non-approved projects.

Agreement

  • A Contribution or Grant Funding Agreement was signed by the recipient and by a departmental representative with proper delegation of authority for 88% of the projects. An agreement was not signed by the recipient’s designated signing authorities for 4% of the projects, and 2% of the projects did not have a signed original agreement in CSGC or on file. The remaining 6% of the projects had agreements signed by a departmental representative who did not have the proper delegated authority within the payment limit.

Monitoring

  • Monitoring plans were completed for all projects and monitoring activities were completed in accordance with these plans for 90% of the projects.
  • Monitoring activity reports were properly completed and corrective actions were properly identified for follow-up when applicable for 85% of the projects.

Claims Processing/Payments

  • Payments of grants and of eligible expenditures claimed were completed within 15 and 28 calendar days, respectively, for 82% of the projects. The 18% of projects not adhering to this standard were in 2 of the 10 programs tested: Canada Summer Jobs and HPS.
  • Payment approval and claim approval forms were approved by the appropriate delegated authority, in accordance with the authorized payment limit, for 90% and 87% of the projects, respectively.
  • 85% of the payments had no calculation, frequency or timing errors. Issues were identified for the remaining 15% of the projects:
    • The payment claim forms were not received at the frequency (7% of the projects) or within the time period (4% of the projects) required by the terms and conditions of the agreement.
    • The date the payment claim form was received was not recorded for 1% of the projects, the payment claim form was not available for review for another 1% of the projects and the correct holdback amount was not retained for 1% of the projects.
  • Payment claim forms were properly completed for 94% of the approved projects. 98% of the payment claim forms were for eligible expenditure categories as per the terms and conditions of the signed agreement.

Close Out

  • Close out summaries were properly completed and reviewed in a timely manner for 84% of the projects.
  • Final reports submitted by recipients were not always properly signed and dated.
  • Final report requirements were not clearly defined by program. Our review of a sample of final reports showed that the contents of these reports do not always demonstrate how projects’ objectives and programs’ terms and conditions were met.

Appendix C: Audit Criteria Assessment

| Audit Criteria | Design Effectiveness Rating | Operating Effectiveness Rating |
|---|---|---|
| It is expected that the Department has appropriate management frameworks for oversight and risk assessment in planning and administering calls for proposals. | Controlled, but should be strengthened; medium-risk exposure | Controlled, but should be strengthened; medium-risk exposure |
| It is expected that the Department has adequate processes for the assessment and approval of applications in accordance with the terms and conditions of the programs. | Controlled, but should be strengthened; medium-risk exposure | Controlled, but should be strengthened; medium-risk exposure |
| It is expected that the Department has adequate processes for the assessment of the eligibility of the applications received and for the approval of funding before Gs and Cs agreements or amendments are signed. | Controlled, but should be strengthened; medium-risk exposure | Controlled, but should be strengthened; medium-risk exposure |
| It is expected that the Department has adequate controls to assess the accuracy and validity of expenditure claims submitted. | Sufficiently controlled; low-risk exposure | Sufficiently controlled; low-risk exposure |
| It is expected that the Department has adequate controls to issue payments in a timely manner to legitimate recipients for eligible expenditures and activities. | Sufficiently controlled; low-risk exposure | Sufficiently controlled; low-risk exposure |
| It is expected that the Department has adequate controls to issue payments that are approved by the appropriate delegated authority. | Sufficiently controlled; low-risk exposure | Sufficiently controlled; low-risk exposure |
| It is expected that the Department has oversight and risk assessment processes to determine the level of program/project monitoring and reporting. | Missing key controls; high-risk exposure | Missing key controls; high-risk exposure |
| It is expected that the Department has mechanisms in place to monitor and report on results, in accordance with programs’ objectives. | Missing key controls; high-risk exposure | Controlled, but should be strengthened; medium-risk exposure |
| It is expected that the Department has taken corrective actions, when necessary, as identified by monitoring activities. | Controlled, but should be strengthened; medium-risk exposure | Controlled, but should be strengthened; medium-risk exposure |

Appendix D: Glossary

ADM: Assistant Deputy Minister
ASETS: Aboriginal Skills and Employment Training Strategy
CFOB: Chief Financial Officer Branch
CSGC: Common System for Grants and Contributions
DPR: Departmental Performance Report
DRR: Departmental Results Report
ESDC: Employment and Social Development Canada
Gs and Cs: Grants and Contributions
HPS: Homelessness Partnering Strategy
IRC: Internal Review Committee
ISSDB: Income Security and Social Development Branch
MOU: Memorandum of Understanding
NGCDC: National Gs and Cs Delivery Centre
POB: Program Operations Branch
QAMU: Quality Assurance and Monitoring Unit
QAR: Quality Assurance Review
RAMM: Risk Assessment, Management and Mitigation
ROD: Record of Decision
SEB: Skills and Employment Branch