Neutral Assessment of the Evaluation Function

On this page

  1. Executive Summary
  2. Engagement background
  3. Considerations
  4. Findings
  5. Opportunities for Improvement
  6. Recommendations
  7. Appendix A - Acronyms


1. Executive Summary

1.1 Purpose

Employment and Social Development Canada’s (ESDC) Internal Audit and Enterprise Risk Management Branch engaged BDO Canada LLP as independent assessors to conduct a neutral assessment of the evaluation function pursuant to the 2016 Treasury Board (TB) Policy on Results and related Directive (2016 Policy). 

1.2 Methods

To conduct this assessment, BDO reviewed a sample of completed evaluations and relevant documentation (e.g., internal guidance documentation and planning documents); conducted interviews with members of the Performance Measurement and Evaluation Committee (PMEC), the Head of Evaluation, the Head of Performance Measurement, Assistant Deputy Ministers (ADMs), the Deputy Minister (DM), and other senior officials; and administered a survey to evaluation staff below the management level. 

1.3 Conformance with the evaluation elements of the Treasury Board 2016 Policy and Directive on Results 

The evaluation function of ESDC generally conforms (highest rating) with the evaluation requirements of the 2016 Policy. 

1.4 Fulfilment of the Head of Evaluation’s responsibilities

The Head of Evaluation is fulfilling their responsibilities according to the 2016 Policy. Evaluation reports completed since 2018 meet TB policy requirements and standards for quality. The planning, conduct, and reporting of evaluations are aligned with the requirements of the 2016 Policy. Published reports are high-quality, informative, and well-written, and examine important issues. The evaluation function has provided expert advice and has made a valued contribution to the departmental performance measurement (PM) tools and products. Tracking of approved management actions to respond to recommendations is done with rigour. Document review noted that the evaluation function has developed a detailed process to ensure management action plans (MAPs) are implemented and adhered to: once a MAP is developed, a lead for the response, timelines, and expected results of the plan are established. The status of MAPs is monitored and reported to PMEC on a regular basis.

1.5 Usefulness of the Evaluation Function

The neutral assessment noted that the evaluation function conducts its work in a neutral, professional manner and supports the continuous improvement of programs. Evaluation staff are competent, qualified, and highly regarded by members of PMEC. Evaluations are approached with a collaborative mindset to validate the accuracy of findings, conclusions, and recommendations. Evaluation reports are thorough and exhibit the function’s extensive knowledge and capabilities. The evaluation function provides expert advice regarding program performance measurement and past evaluations to program officials, and for submissions to TB and other Central Agencies. The evaluation function has made a contribution to policy and program development.

Opportunities for improvement: BDO identified potential improvements to this well-performing evaluation function. These opportunities relate to: the perceived neutrality of the evaluation function; facilitating responses to evolving departmental priorities by building more agility into evaluation plans; applying a consistent quality assurance (peer review) process; and enhancing corporate knowledge transfer processes to support the evaluation function’s adaptability.

Recommendations:

  • Reinforce the perceived neutrality of the evaluation function.
  • Facilitate proactive response to evolving departmental priorities.
  • Make consistent use of a quality assurance process.
  • Enhance corporate knowledge transfer processes.

2. Engagement background

2.1 Purpose of the Report

The 2016 Treasury Board (TB) Policy on Results requires a neutral assessment of the evaluation function every 5 years.

This is the second neutral assessment of Employment and Social Development Canada’s (ESDC) evaluation function under the 2016 Policy. The purpose of this report is to share the results from the neutral assessment of the evaluation function. 

2.2 Objective and Scope

BDO Canada LLP was engaged by the Internal Audit and Enterprise Risk Management Branch at ESDC to conduct the neutral assessment of the department’s evaluation function. The objective of this neutral assessment was to assess conformance with the 2016 Policy on Results with a focus on quality and use and to recommend improvements regarding the evaluation function, as appropriate. The assessment focused on the following:

  • Governance of the evaluation function, which addressed the responsibilities of the Deputy Head with respect to the evaluation function, the departmental Performance Measurement and Evaluation Committee (PMEC) and the Head of Evaluation
  • The evaluation organization’s professional practices, which support planning and conducting work at the function level; planning, conducting, and reporting at the project level; and managing the evaluation function
  • Monitoring and performance reporting associated with the evaluation function, to fulfill annual and ongoing reporting obligations
  • Any potential opportunities for improvement for the ESDC evaluation function 

The assessment focused on the following elements deriving from the 2016 Policy. The scope of the assessment included:

  • The approved 5-year departmental evaluation plan (DEP)
  • Whether the PMEC, the Head of Performance Measurement, and the Head of Evaluation have access to the departmental information needed to undertake evaluation activities resulting from the 2016 Policy and its related instruments
  • A sample of evaluation reports and summaries, including complete management responses and action plans, published on web platforms as prescribed by the Treasury Board Secretariat (TBS)

The scope of the assessment excluded elements already reviewed in Management Accountability Framework (MAF) assessments and those currently under review as part of the Internal Audit and Enterprise Risk Management Branch’s (IAERMB) Audit of Performance Measurement (where applicable).

2.3 Methods

Over the engagement period from December 2022 to April 2023, the neutral assessment team conducted the following activities:

  • Reviewed internal documents, evaluation reports within the scope of the neutral assessment, and ESDC and TB Policy requirements, as well as other instruments relevant to evaluation
  • Conducted interviews with PMEC members, Program Officials, the Head of Evaluation, the Head of Performance Measurement, Evaluation Officials, ADMs, the Deputy Minister (DM), and other senior officials (32 people participated in interviews)
  • Administered a survey to evaluation staff below the management level (85%, or 41 out of 48 people, responded to the survey)

The period examined for this neutral assessment was from fiscal year 2018-19 to fiscal year 2022-23. This neutral assessment was conducted in a manner consistent with TBS guidance.

3. Considerations

Changes to the Treasury Board policy suite and its current review efforts

The neutral assessment considered evaluations conducted since 2018. These were governed by a TB policy that was issued in 2016, with changes impacting evaluation functions throughout the federal government.

The 2016 Policy replaced the Policy on Evaluation (2009), the Policy on Reporting of Federal Institutions and Corporate Interests to TBS (2007), and the Policy on Management, Resources and Results Structures (2010). When the 2016 Policy was approved, a new Directive on Results replaced the Directive on the Evaluation Function and the Standard on Evaluation for the Government of Canada.

The 2016 Policy permitted more focused, timely, and strategic evaluations of high-priority initiatives, issues, and projects.

This neutral assessment has examined not only the conformance with the 2016 Policy, but has also identified potential opportunities for improvement to support the evaluation function in continuing to provide value to the Department (e.g., innovating through making use of the flexibilities afforded by the 2016 Policy).

Treasury Board Secretariat has announced that it has started a formal review of the 2016 Policy. Because this review is underway, potential changes to the 2016 Policy, and consequently any potential impacts on ESDC’s evaluation function, are unknown at this time. The areas planned to be explored in the current TB Policy review (e.g., governance and leadership, departmental planning and reporting, and flexibility regarding the issues to be addressed) are aligned with the areas of inquiry of this neutral assessment. While the 2016 Policy is under review, its requirements remain in force and govern the evaluation functions in federal departments and agencies.

Unique features of ESDC

ESDC is the fourth largest government department (Footnote 1) and operates within a unique environment due to the size of the department and its 3 portfolios (ESDC, the Labour Program, and Service Canada). ESDC has a mix of statutory and non-statutory programs in the department’s program inventory (PI), and these are delivered across Canada via a relatively complex organizational structure. The majority of ESDC’s non-statutory programs are Grants and Contributions (Gs&Cs) programs and may require substantial departmental evaluation resources to meet the evaluation requirements of the federal government as specified in the Financial Administration Act and TB policies. More specifically, Gs&Cs programs with 5-year average expenditures greater than 5 million dollars must be evaluated at least every 5 years. This is an important contextual factor for ESDC’s evaluation function to consider when developing the DEP (e.g., selecting programs of highest priority to evaluate).

4. Findings

The following section describes the key findings of the neutral assessment.

Conformance with the Evaluation Elements of the Treasury Board 2016 Policy and Directive on Results

Finding 1: The neutral assessment found that the evaluation function at ESDC generally conforms (Footnote 2) with the requirements of the 2016 Policy related to evaluation.

The neutral assessment noted that ESDC’s evaluation function generally conforms with the 2016 Policy. Except for issues requiring attention regarding the perceived neutrality of the function, there were no concerns noted by the neutral assessment team regarding the function’s conformance with the evaluation elements of the Treasury Board 2016 Policy and Directive on Results. As such, the rating of “generally conforms” is the highest rating an evaluation function can receive from our external neutral assessment.

ESDC has taken the required steps to implement the evaluation elements of the 2016 Policy. We found that the ESDC’s evaluation function generally conforms, particularly with respect to the responsibilities of the Deputy Head regarding the evaluation function; the responsibilities of PMEC regarding the evaluation function; and the responsibilities of the Head of Evaluation.

The responsibilities of the Deputy Head, PMEC, and the Head of Evaluation are recognized in ESDC documents and have been implemented in a rigorous manner. The operations of PMEC, the DEP, and published evaluations and executive summaries are aligned with the requirements of the 2016 Policy.

Evaluations are neutral, well-written, and meet TB requirements. Evaluations examine important programs that impact the lives of Canadians, and, in general, assess relevance, performance/effectiveness, and efficiency. Planning, conducting, and reporting of evaluations are aligned with the requirements of the 2016 Policy. Completed evaluations, including their associated executive summaries and MAPs, are published on ESDC’s website. Implementation of MAPs is monitored in a timely manner and reported to PMEC on a regular basis. The evaluation function is implementing some of the flexibilities authorized by the 2016 Policy. Based on interview and survey analysis, the evaluation function is viewed as a trusted advisor within the department, providing credible and insightful advice that contributes to policy and program development. Interview analysis noted that Program Officials are confident in the guidance the evaluation function provides them to adequately assess program performance. The evaluation function has developed innovative evaluation approaches, including work that supports the Government of Canada’s commitment to Gender-based Analysis Plus (GBA Plus) in both quantitative and qualitative evaluation design. Furthermore, interview analysis noted that innovative methodology and analysis are developed through research within the Strategic and Service Policy Branch to support the evaluation function in carrying out the federal government’s commitment to GBA Plus.

We noted an issue that requires attention related to the Head of Evaluation’s direct unimpeded access to the Deputy Head. While this access was confirmed in interviews, this should be made explicit in an appropriate formal document. In addition, to maintain the neutrality of the function, since the evaluation function is located within the Strategic and Service Policy Branch, having explicit special reporting arrangements in place for programs that may be evaluated within that Branch would enhance the perceived neutrality of the evaluation function.

Fulfillment of the Responsibilities of the Head of Evaluation

Finding 2: The Head of Evaluation is fulfilling their responsibilities according to the 2016 Policy.

The Head of Evaluation is fulfilling the mandated responsibilities as set out in the 2016 Policy, and in particular, as noted at Section 4.4 of the Directive on Results (Footnote 3).

The 2016 Policy is complex regarding the division of responsibilities in the area of performance measurement (PM). With regard to PM, the 2016 Policy includes descriptions of the expected roles of the DM, PMEC, the Head of PM, program officials, and the Head of Evaluation.

The neutral assessment examined the role of the evaluation function with regard to PM.

The Head of Evaluation is responsible for providing advice to PMEC and to program officials on the validity and reliability of Departmental Results Framework (DRF) indicators and on the availability, quality, validity, and reliability of the indicators and information in the Performance Information Profiles (PIPs), including their utility to support evaluations.

With regard to providing advice to assist ESDC in meeting the PM requirements of the 2016 Policy, as detailed in the Directive on Results, the responsibilities of the Head of Evaluation in the area of PM are advisory in nature. For example, expert advice from the evaluation function has supported preparation of draft performance indicators and draft PIPs for program officials. Program officials remain the owners of these PM tools. The evaluation function has provided advice to the Head of Performance Measurement, program officials, and to the Chief Data Officer regarding data quality. Interview analysis and document review noted that the evaluation function conducts evaluability assessments to establish a plan for each upcoming evaluation. These evaluability assessments include considerations such as the proposed evaluation approach, evaluation questions, and methodology. Evaluability assessments examine the availability and quality of performance data.

Although roles and responsibilities are clear, and although the 2016 Policy has directed increased attention to PM, consistent implementation of the TB policies for PM has been a longstanding challenge for many departments and agencies. Advice from the evaluation function at ESDC has been of high value to address these challenges in relation to the implementation of the 2016 Policy. During the course of planning, conducting, and reporting phases of evaluations, the evaluation function has opportunities to provide valuable advice to program officials regarding PM as it relates to the subject matter of the evaluation.

Given the above, the neutral assessment has confirmed that the evaluation function at ESDC participates in PM activities as would be expected based on the 2016 Policy. The evaluation function provides advice for the PIPs and input for the various PM products required to comply with the 2016 Policy’s PM requirements. TB Submissions and Memoranda to Cabinet (MCs) often require evaluation and PM information, and the evaluation function provides advice in this regard. This advisory work constitutes an important level of effort and represents a significant value-add: it helps ensure that the performance of ESDC’s programs, which directly impact the livelihood of Canadians, can be adequately assessed, and that potential opportunities for improvement can be identified and implemented.

Usefulness of the Evaluation Function

Finding 3: The evaluation function has delivered valued information, advice, and recommendations about important departmental programs and the risks and challenges associated with these programs.

The neutral assessment noted that the evaluation function at ESDC produces high quality evaluations and technical reports that support continuous improvement of the statutory and non-statutory programs of ESDC. Evaluation staff collaborate with the programs to ensure data is gathered and adequately triangulated to produce credible findings, recommendations, and conclusions that adequately support the efficiency and effectiveness of program delivery and management. Evaluation reports are well-written, thorough, and exhibit the function’s extensive technical knowledge and capabilities. Pursuant to the 2016 Policy, the evaluation function provides expert advice to the programs. At times, evaluations have been used to inform policymaking and to support funding decisions regarding statutory and non-statutory programs.

The neutral assessment team noted that PMEC, ADMs, the DM, the Head of Evaluation, and the Head of Performance Measurement are pleased that evaluations are planned and implemented with a collaborative mindset between the evaluation function and the program officials. Evaluation reports are produced pursuant to the 5-year evaluation plan, and each specific evaluation is planned with rigour. The planning cycle begins with a detailed, consultative process for the preparation of the annual update to the departmental evaluation plan, which is recommended by PMEC for approval by the Deputy Head. Individual evaluations are planned through the preparation of an evaluability assessment that is discussed with program officials and an evaluation working group, then with a DG-level working group, and finally presented for approval at PMEC. The DG-level working group provides advice on data collection and reviews draft reports. The evaluation function has rigorously tracked implementation of approved management actions in response to recommendations, which supports the continuous improvement of the programs.

The neutral assessment noted that evaluation staff are competent, professional, and highly regarded by Senior Management and the PMEC. Interview and survey analysis noted that the evaluation function conducts its evaluations in a professional manner through careful planning, data collection, and analysis, and through its interactions with the programs, Senior Management, and PMEC. Based on feedback from interviewees, a review of published evaluation reports, and our general knowledge of the federal evaluation community, the neutral assessment team found that ESDC’s evaluation function is seen as a leader within the federal evaluation community. The function has shown an impressive capacity to innovate to provide the highest value to the department, including making innovative uses of labour market databases and using pop-up surveys to gather data for evaluations.

The function has considered innovative report formats to help different sets of readers understand and visualize the results of evaluations. The neutral assessment noted that the summaries of evaluations, published at the same time as the full evaluation reports, were much appreciated by readers. The assessment team encourages the function to continue to experiment with infographics as an effective means of communication with some audiences.

It is noteworthy that in recognition of the high quality and innovative evaluation work conducted at ESDC, the evaluation function maintained relations with the academic community and made presentations at professional gatherings, events, and conferences, as well as at the Organization for Economic Co-operation and Development.

5. Opportunities for Improvement

The following section describes opportunities for improvement identified during the neutral assessment.

Neutrality of the Evaluation Function

Pursuant to the 2016 Policy, the evaluation function must conduct its work in a neutral manner. The neutral assessment has found that the function uses excellent collaboration processes with the programs to support the conduct of its work. While there is extensive collaboration for the preparation of the DEP and for the individual evaluations, the Head of Evaluation and the work of the evaluation function must be, and be perceived to be, neutral.

The department must continue to safeguard the neutrality of the evaluation function to ensure evaluation reports are credible and adequately assess program performance. The neutrality and the perceived neutrality of the evaluation function is a key attribute that contributes to the credibility of the evaluation function, its value and its usefulness.

It is noteworthy that the evaluation function has established mechanisms, such as in camera bilateral meetings between the Head of Evaluation and the DM and the Head of Evaluation’s presence at PMEC, to reinforce the neutrality of the evaluation function at ESDC.

The neutral assessment noted that the Head of Evaluation reports, for administrative purposes, to an ADM within a sector that is responsible for overseeing some program areas. While the neutral assessment confirmed that the work of the evaluation function is conducted in a neutral manner, the current reporting and administrative structure can be perceived as a potential conflict of interest and can affect the perceived neutrality of the evaluation function at ESDC for these specific evaluations.

It would be beneficial for the evaluation function to re-examine the organizational reporting structure for those specific evaluations that are conducted on programming that is managed within the same sector as the ADM to whom the Head of Evaluation reports administratively. The objective of the re-examination would be to assess whether there may be a challenge to the perceived neutrality of the evaluation function for these evaluations, and as appropriate, take steps to reinforce the neutrality of the evaluation function for those evaluations.

In addition, it would be beneficial for the evaluation function to consider making explicit, in an appropriate corporate document, the current informal understanding that pursuant to the 2016 Policy, the Head of Evaluation does have direct, unimpeded access to the Deputy Head, when required, for the performance of his/her responsibilities.

Why is neutrality important?

Neutrality is important to ensure evaluations are credible, and that the findings are based on evidence which has been collected in an objective, appropriate manner, by evaluators who are distinct from the management and operations of the entity subject to the evaluation. The resultant credibility of evaluations is expected to enhance the value and usefulness of evaluations for decision-making and accountability. 

Agile Evaluation Planning to Facilitate Responding to Evolving Departmental Priorities

The neutral assessment noted that the evaluation function at ESDC has operated using a 5-year plan (DEP), updated annually, pursuant to the 2016 Policy. Based on documentation review and interviews, the DEPs are sufficiently detailed and provide a plan for the organization to achieve its evaluation objectives. The DEP clearly presents the planned coverage of non-statutory and statutory programs and the significant number of mandatory evaluations that must be completed to meet legislative requirements and TB evaluation requirements. The evaluation function has an excellent track record of completing the work that is planned, in a collaborative manner, according to the timeframes set out in the plan, and according to the insights gained through the conduct of an evaluability assessment (plan) for individual evaluations. As a rule of thumb, most evaluations are expected to be completed over a 2-year period.

However, the capacity and resources of the evaluation function are finite. Considering the post-pandemic environment, there is a potential need for capacity to address significant issues that may arise where evaluation input and expertise may be of high potential value (if it can be made available at short notice). For example, the evaluation function provided this type of advice and input for issues related to policy and program development during the pandemic. Looking forward, there may be an opportunity for the DEP to provide more agility to ensure planned work can be re-prioritized based on new and emerging risks and challenges. Considering the significant workload set out in the DEP, the evaluation function could consider, at the DEP or evaluability assessment phases of evaluation projects where there is room for discretion, identifying ways that evaluations can be focused or calibrated as necessary. This would permit either quicker completion, or postponement to accommodate higher priority and shorter term evaluations or evaluative advisory projects that may present themselves during the course of a fiscal year, or that may be requested by the Deputy Head outside the DEP planning cycle. The idea is to welcome potential high-priority requests for advice, a formative evaluative project, or a rapid impact assessment, and to have thought through ways to accommodate such a request (outside of the annual DEP update cycle) in the manner least disruptive to the DEP.

In addition, for programs or program components that may experience policy or program changes during the period of an evaluation, evaluation planning should build in an opportunity for interim reporting on significant changes that are expected or that occur unexpectedly during the evaluation, in order to enhance the relevance of the evaluation work. The idea is to be prepared to provide interim reports in a timely manner based on the information available (with limitations noted as appropriate), rather than delivering an evaluation report on issues that may have reduced relevance after the change has been implemented.

Why is more agile evaluation planning important?

Evaluation plans are routinely updated annually. Implementing planned evaluations locks in significant evaluation resources for the planned duration of approved evaluations. More agile evaluation planning is important because it permits the department to respond more nimbly to departmental priorities and ensures evaluation resources can be reallocated more quickly in the event of unanticipated risks and challenges. Furthermore, more agile evaluation planning may allow timelier advice and guidance to be provided to the programs.

Consistent Quality Assurance (Peer Review) Process

Pursuant to the 2016 Policy, a quality assurance process such as peer review is required when conducting evaluations to ensure their quality. Document review noted that the evaluation function has established an internal and external peer review process. The neutral assessment noted, however, that in recent years resource or time restrictions have resulted in less frequent external peer reviews of evaluation reports. Interview and survey analysis noted that peer review of evaluation reports is inconsistently applied across evaluations (i.e., some evaluations use external or internal peer review and others do not). There is an opportunity to review the current process (both internal and external), build on its strengths, and recognize the high potential value of a more consistently applied peer review process. A rigorously applied peer review process would support the consistency of the quality of evaluation results, methodologies, and deliverables. Quality assurance such as peer review requires time and resources and may not be possible in all circumstances; however, such cases should be seen as exceptions to a standard operating procedure.

Why is quality assurance such as peer review important?

Quality assurance is important because it ensures experiences and perspectives are accurately captured throughout the course of an evaluation. Quality assurance processes, such as peer reviews, add a layer of critical thinking and analysis to ensure evaluation recommendations are logical, linked to findings, relevant, and achievable. Furthermore, quality assurance provides an opportunity to enhance the readability of evaluation reports, ensuring evaluation findings, recommendations, and conclusions are effectively communicated to key stakeholders and Canadians.

Enhanced Corporate Knowledge Transfer to Support an Adaptable Evaluation Function

The neutral assessment noted that the evaluation function at ESDC comprises highly competent, skilled staff who provide valuable advice and develop and implement complex lines of evidence. Projects at ESDC and other departments and agencies indicate that the overall federal requirement for evaluations and for improved performance measurement has led to a shortage of staff relative to requirements, and therefore a competitive staffing environment. The evaluation function has adopted several exemplary (or good) practices to attract, develop, and retain its highly qualified staff. Practices in place include: participating with other branches to hire and retain qualified persons with disabilities; using flexible staffing tools for short-term and temporary capacity needs, such as hiring students, casual employees, retired public servants, and consultants; promoting professional development; building professional networks with the Canadian Evaluation Society; and promoting health, mental health, well-being, and greening of the workplace.

To maintain capacity and the quality of the evaluation function’s work, there is an opportunity for the function to implement a formalized process to update training and development material, as needed and based on new and emerging challenges. Additionally, there is an opportunity to build a workforce that can proactively respond to ever-evolving needs of the department and the federal government, particularly as it relates to reporting on performance results. Document review noted that the evaluation function has established an abundance of resources and templates to support the professional development of staff and onboarding of new staff; however, there is an opportunity to implement a process to periodically assess the extent to which these materials are up to date. Additionally, document review of the Evaluation Human Resource Plan – DRAFT 2022-23 to 2024-25 noted that, like other evaluation functions across the government, the evaluation function is experiencing turnover amongst staff, and in particular senior staff (i.e., the EC-07 level). Periodically assessing the accuracy and currency of training and development material will more effectively support corporate knowledge transfer and ensure the quality of evaluation reports remains high in spite of staff turnover over time.

In addition to establishing a formal periodic process to update training and development material, there is an opportunity for the function to introduce job-shadowing as part of new employee onboarding processes. Document review noted that the evaluation function has established onboarding processes for new employees. One of these processes pairs new employees with a senior staff member who serves as a point of contact for questions and guidance. While this is an excellent support system for new employees, it could be enhanced by including job-shadowing. Effective job-shadowing can be achieved by ensuring, to the extent possible within budget constraints, that a balanced number of junior and senior evaluators are staffed on evaluations. Creating more opportunities for job-shadowing would allow experienced evaluators to act as an additional quality control for evaluation reports and would provide an effective and efficient on-the-job training method for junior staff. Job-shadowing may also be an effective method to ensure adequate corporate knowledge transfer within the evaluation function, as it helps to build a workforce that is more resilient and adaptable to change. This would be expected to enhance the value-add of the evaluation function, as junior staff would more quickly be equipped to support senior staff in achieving the function's intended outcomes.

Why is corporate knowledge transfer important?

Corporate knowledge transfer is important because it ensures the quality of evaluation reports is consistent across all evaluations. All organizations may experience fluctuations in staff retention at a given point in time; adequate training and development and corporate knowledge transfer therefore improve the likelihood that these fluctuations do not affect the quality of ESDC's evaluation products.

6. Recommendations

This is a well-performing evaluation function operating in general conformance with the 2016 Policy and providing recognized value-add to ESDC.

Our recommendations aim to further improve the value-add to ESDC, already high, of the products and services provided by the evaluation function.

The following recommendations build from the identified opportunities for improvement noted during this neutral assessment.

Recommendation 1: Reinforce the perceived neutrality of the evaluation function

Rationale

The neutral assessment noted that the Head of Evaluation reports, for administrative purposes, to an ADM within a sector that is responsible for overseeing some program areas. While the neutral assessment confirmed that the work of the evaluation function is conducted in a neutral manner, the current reporting and administrative structure can be perceived as a potential conflict of interest and can affect the perceived neutrality of the evaluation function at ESDC for these specific evaluations.

The neutral assessment team recommends that the evaluation function re-examine the organizational reporting structure for those specific evaluations that are conducted on programming that is managed within the same sector as the ADM to whom the Head of Evaluation reports administratively. The objective of the re-examination would be to assess whether there may be a challenge to the perceived neutrality of the evaluation function for these evaluations, and as appropriate, take steps to reinforce the neutrality of the evaluation function for those evaluations.

In addition, the neutral assessment recommends that the evaluation function consider making explicit, in an appropriate corporate document, the current informal understanding that pursuant to the 2016 Policy, the Head of Evaluation does have direct, unimpeded access to the Deputy Head, when required, for the performance of his/her responsibilities. The neutral assessment recognizes the advantages to the department to situate the evaluation function within the Strategic and Service Policy Branch, and in particular, the potential contribution of the evaluation function to policy and program development.

The neutral assessment applauds the collaborative environment, consultation processes, and governance structures that are in place for the evaluation function at ESDC.

Recommendation 2: Facilitate proactive response to evolving departmental priorities

Rationale

The neutral assessment noted that the evaluation function at ESDC has operated using a 5-year plan (DEP), updated annually. The evaluation function has an excellent track record of completing the work, according to the timeframes set out in the plan, and according to the insights gained through the evaluability assessments.

However, the capacity and resources of the evaluation function are finite. There is a potential need for capacity to address significant issues that may arise where evaluation input and expertise could be of high value, provided it can be made available at short notice. It is recommended that the evaluation function consider, at the DEP and evaluability assessment phases, identifying ways that evaluations can be focused or calibrated to permit quicker completion, or whether all or parts of an evaluation can be postponed, in order to accommodate shorter-term evaluation or evaluative projects that may arise during the course of a fiscal year, or that may be requested by the Deputy Head outside the DEP planning cycle.

Some programs may experience policy or program changes during the time required to conduct an evaluation. Evaluation planning should build in an opportunity for interim reporting on issues relevant to significant changes that may be under consideration or expected during the evaluation in order to enhance relevance and report the information that is available in a timely manner.

Recommendation 3: Make consistent use of a quality assurance process

Rationale

A quality assurance process such as peer review is required by the 2016 Policy (Footnote 4). The current peer review process within the evaluation function is well established and has worked well. However, in recent years it has not been used consistently across evaluations. Where possible, it is recommended that the evaluation function ensure a peer review process is consistently applied to enhance the value-add of the function and build on its strengths. Cost-effective and agile peer review processes, applied consistently and strategically to assess the quality of evaluation reports, can help ensure that all evaluation products continue to meet the high standards of the department. The evaluation function may wish to acknowledge that there may be circumstances where peer review is not possible due to budget or timeline constraints; however, these should be recognized as exceptions to the standard operating procedure.

Recommendation 4: Enhance corporate knowledge transfer processes

Rationale

Maintaining capacity in the current competitive staffing environment is challenging. In addition to its current strategies to attract, develop, and retain highly qualified staff, we recommend additional measures to maintain corporate knowledge. Periodically assessing the accuracy and currency of training and development material will more effectively support corporate knowledge transfer and help ensure the quality of evaluation reports remains high despite staff turnover. The neutral assessment noted that even a large department such as ESDC may be limited by its budget in providing extensive training and development tools, as evaluators are required to have several different skillsets, from oral and written communication to the design and application of quantitative and qualitative data analysis. As such, in addition to the current onboarding processes for new employees, there is an opportunity for ESDC's evaluation function to examine other possibilities to enhance its corporate knowledge transfer processes, such as job-shadowing opportunities for junior staff to facilitate knowledge transfer from senior staff during evaluation engagements.

Appendix A - Acronyms

2016 Policy
Treasury Board Policy on Results, Directive on the Evaluation Function, including the Mandatory Procedures for Evaluation and the Standard on Evaluation
ADM
Assistant Deputy Minister
DG
Director General 
DEP
Departmental Evaluation Plan
DM
Deputy Minister
EAC
Evaluation Advisory Committee
FAA
Financial Administration Act
Gs&Cs
Grants and Contributions
GBA Plus
Gender-based Analysis Plus
MC
Memorandum to Cabinet
MAP
Management Action Plan
PI
Program Inventory
PIP
Performance Information Profile
PM
Performance Measurement
PMEC
Performance Measurement and Evaluation Committee
TB
Treasury Board
TBS
Treasury Board Secretariat