Outcome Management: Lessons Learned and Best Practices

Executive Summary

This report documents the lessons learned by twelve Government of Canada initiatives that used Outcome Management or cost-benefit analysis techniques to demonstrate their value.  It also documents a series of related best practices for effective Outcome Management from other jurisdictions that are at various stages of introducing and adopting Outcome Management (or an equivalent).  It includes recommended next steps for consideration should the Government of Canada decide to pursue the implementation of the Outcome Management approach as an improved means of realizing expected outcomes.

Outcome Management is the set of activities for planning, managing, and realizing the desired outcomes from initiatives.  An outcome is the consequence of an intentional change imposed on people, processes, and technology.  Outcome Management builds on management fundamentals such as cost-benefit analysis, but is designed to address their limitations by providing a process for precise identification and qualification of both hard and soft benefits.  Outcome Management complements and extends the cost-benefit analysis and results in a more robust and comprehensive understanding of the outcomes expected by the initiative.

Several Outcome Management lessons learned were identified through the interview process.  These lessons highlight strengths and weaknesses of using an Outcome Management approach and provide direction for future development of the Outcome Management method and practice.  In summary, these lessons provide qualitative substantiation that the Outcome Management process complements and improves on traditional processes.  The Outcome Management process encourages a focus on the initiative's holistic outcomes, including soft benefits, not just the outputs.  The process itself is a strong team-building exercise; it should be conducted early in the initiative lifecycle, involve all stakeholders, and be revisited on a regular basis at major milestones.  While the tools and methods can be improved, Outcome Management should be integrated with existing methods, frameworks, and management regimes, and must be supported by champions, education, and communication.  Other concerns and challenges expressed by respondents include systemic disincentives for identifying and claiming results in the current management environment.

A set of four immediate next steps is recommended, relating to demonstrating the value of Outcome Management and integrating the approach into existing frameworks.  As well, should the Government decide to formalize the Outcome Management approach, twelve additional required actions have been included to maximize the probability of success.

The key conclusion of the report is that Outcome Management has significant value in the planning stages of an initiative, as demonstrated by the lessons learned detailed in this report.  However, there needs to be follow-up in the latter stages of those initiatives in the measuring and monitoring and benefits harvesting stages to demonstrate the full value of Outcome Management.

1. Introduction and Background

1.1 Introduction and Report Purpose

The Government On-Line (GOL) Initiative is the Government of Canada's (GoC) undertaking to make Canada the most connected country in the world by making the 130 most commonly used services available on the Internet.  GOL has three goals: greater accessibility, more responsiveness, and secure interaction.  Over the past several months, several GOL projects undertaken by various federal government departments or agencies have used the Outcome Management approach being explored by the Treasury Board Secretariat (TBS) Chief Information Officer Branch (CIOB).  Outcome Management is a methodology that CIOB believes could be useful across the GoC, given the renewed emphasis on outcomes and performance measurement.

The TBS Service Management Board (SMB) is an ADM-level committee and is part of the GOL governance structure.  On , SMB mandated the GOL Initiative to:

“Work in collaboration with departments that have done cost-benefit analysis for their projects and/or services and depending on availability of resources and time constraints, either perform some case studies OR apply a methodology developed by other countries (e.g. UK, Australia, EU) to identify best practices to help future service transformation initiatives assessing cost effectiveness;

“Work in collaboration with the TBS, promote the Outcome Management Methodology endorsed by the TBS through holding of information sessions and development of tools (such as self evaluation, check list, etc.) and lessons learned on outcome realizations for use by departments and agencies with future projects/initiatives.”

The GOL Initiative will conclude on , having completed its terms of reference.  As part of the GOL Initiative's close-out activities, and in conjunction with the emerging TBS Outcome Management Practice, it was agreed to identify and document lessons learned about the Outcome Management approach.  In addition, cost-benefit analysis practices were considered.  The findings and best practices listed in this report will be referenced in the GOL close-out report.

The report will also be provided to TBS CIOB as a legacy for the GoC Service Transformation Agenda, and in guiding the Outcome Management activities going forward.  A companion document, the Outcome Management Guide and Tools, is intended to provide a series of checklists and a base of helpful tools and templates to implement the Outcome Management approach and practice.

1.2 Brief Overview of Outcome Management

Outcome Management in the Government of Canada currently has two components that, while related, are different. From the perspective of a GoC Program, Outcome Management is the set of activities designed to monitor, and adjust as required, the way in which the Program, and its associated Services, Processes and Activities, contribute to meeting the needs of Canadians. From the perspective of a GoC Project or Initiative, Outcome Management is the set of activities designed to manage and oversee the change in a way that ensures it contributes to improving the capability and/or capacity of a Program to meet the needs of Canadians. The difference is that one manages meeting the needs of Canadians, while the other manages the development of capabilities that support meeting those needs.

This report deals with Outcome Management for Projects where a GoC project (or a collection of projects) consists of the set of activities for planning, managing, and realizing the desired outcomes from a change. In other words, it is focused on ensuring that a project contributes to improving the capability of the GoC to deliver Programs and Services that meet the needs of Canadians. To date, the methods and tools of the TBS Outcome Management Practice have focused on supporting these activities. In this context, an Outcome has been defined as “something that follows as a result or consequence” of an action. In other words, an Outcome is the consequence of an intentional change imposed on people, processes, and technology. Outcome Management of Projects is about bringing the same focus and discipline to aligning a project with the achievement of outcomes or results as the domain of Project Management brings to delivering a capability or system on time and on budget. In fact, Project Management on its own provides only the deliverables of the project – it does not provide the outcomes themselves, or the “big picture” as to why the initiative is being undertaken.

A Logic Model (also known as an outcomes map or strategy map) is a visual model that shows how a project (or a set of projects) or all activities within a project will drive the attainment of outcomes. In other words, it shows how each output of an activity contributes to an immediate outcome, how these immediate outcomes contribute to an intermediate outcome, and how these intermediate outcomes contribute to a final outcome. An Immediate Outcome is the first level effect of an Output from a Project or a Project Activity. An Intermediate outcome is a capability delivered by a project or a business impact resulting from a group of projects within the initiative – the benefits and changes resulting from the outputs. A Final Outcome is the end result expected from an initiative – the final or long-term consequences.
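
To make the cause-and-effect chain concrete, the following is a minimal sketch of a logic model expressed as data. It is illustrative only: the output and outcomes named in it are invented assumptions, not drawn from any of the projects discussed in this report.

```python
# A hypothetical logic model chain: output -> immediate -> intermediate -> final.
# All names below are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Outcome:
    description: str
    level: str                                  # "immediate", "intermediate", or "final"
    contributes_to: list = field(default_factory=list)

final = Outcome("Faster, more reliable service to citizens", "final")
intermediate = Outcome("Enquiries resolved on first contact", "intermediate",
                       contributes_to=[final])
immediate = Outcome("Agents have a single view of client records", "immediate",
                    contributes_to=[intermediate])
output = "Integrated client-record system deployed"   # the project deliverable

# Reading the chain reproduces the logic model: each output feeds an immediate
# outcome, which feeds an intermediate outcome, which feeds a final outcome.
print(" -> ".join([output, immediate.description,
                   intermediate.description, final.description]))
```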

The Outcome Management Process is divided into five stages:

  • Stage 0: Launch Outcome Management
  • Stage 1: Develop Outcome Realization Model
  • Stage 2: Develop Outcome Realization Plan
  • Stage 3: Monitor Delivery of Outcomes
  • Stage 4: Realize and Optimize Outcomes

Outcomes can be intermediate or can occur at the end of an initiative.  To achieve the end outcomes, it is helpful to identify and track intermediate outcomes that can be used as milestones along the way.  Tracking intermediate outcomes, or leading indicators, allows for issues to be detected earlier in the initiative lifecycle, rather than having to wait until the end to determine if the initiative was a success or failure.  It also provides the information needed to make adjustments along the way, find another solution to the problem, or even decide not to undertake the initiative at all.  Use of a performance management framework helps identify those initiatives and activities that contribute to the organization's desired outcomes.

Outcome Management also provides support for complex horizontal or business transformation initiatives that require multiple stakeholders, by breaking down silos and crossing traditional organizational boundaries. A governance framework for Outcome Management supports a clear understanding of how outcomes will be realized, the quantification of outcomes, and the assignment of accountability.  Active accountability is necessary in the outcome realization process, and consists of establishing the right accountability, gaining buy-in, constructive performance management, and reinforcing desired behaviour.Footnote 1

Appendix C provides a summary across the five countries that have most adopted the Outcome Management approach to date.  It includes the legislative, regulatory or policy framework that Outcome Management or equivalent falls under, the methodology used, and the central agency charged with introducing and championing Outcome Management in their country.  It indicates how Outcome Management is emerging as a significant matter in the public sector around the world.

1.3 Outcome Management and Cost-benefit Analysis

Cost-benefit analysis is a long-standing best practice of effective management in both the public and private sectors.  It focuses on identifying both a comprehensive definition of the costs (direct up-front costs, direct on-going costs, and indirect/hidden costs) of the initiative as well as the benefits (tangible and intangible benefits or outcomes).  This analysis allows management to make decisions based on a thorough understanding of the costs and benefits. Cost-benefit analysis is considered to be a subset of Outcome Management.
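
As a simple illustration of the arithmetic behind a cost-benefit analysis, the sketch below discounts a stream of costs and tangible benefits to a common point in time and compares them. The figures and the 5% discount rate are assumptions invented for illustration; note that the intangible benefits discussed below do not appear in this calculation.

```python
# A minimal cost-benefit sketch with invented figures (not from any GoC project).
discount_rate = 0.05

# Annual cash flows for years 0..3: up-front and on-going costs vs. tangible benefits.
costs    = [1_000_000, 250_000, 250_000, 250_000]
benefits = [0,         400_000, 700_000, 900_000]

def present_value(cash_flows, rate):
    """Discount a series of annual cash flows back to year 0."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

pv_costs = present_value(costs, discount_rate)
pv_benefits = present_value(benefits, discount_rate)

print(f"Net present value:  {pv_benefits - pv_costs:,.0f}")
print(f"Benefit-cost ratio: {pv_benefits / pv_costs:.2f}")
```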

While cost-benefit analysis is a well-entrenched best practice, it has notable limitations.  Most significant is that, because of its emphasis on financial costs and benefits, the analysis results in a weak or non-existent definition of “soft” or non-financial benefits.  This limitation has particular significance in the public sector where soft benefits (such as protection of citizens) are often significant drivers of major initiatives.

Outcome Management builds on the management fundamentals of cost-benefit analysis, but is designed to address these limitations by providing a process for strong identification and qualification of both hard and soft benefits.  Outcome Management complements and extends the cost-benefit analysis and results in a more robust and comprehensive understanding of the outcomes expected by the initiative, and should be done at the beginning of the initiative and throughout its lifecycle.

1.4 Assignment Methodology

To begin, research was conducted regarding best practices in Outcome Management or equivalent from other public sector organizations around the world.  In addition to methodologies regarding Outcome Management, existing GoC TBS guidelines and frameworks were included.  Appendix D contains an annotated bibliography of these references.  Appendix E contains a glossary of definitions relating to Outcome Management.

Interviews were conducted with members from 12 projects that used either the Outcome Management methodology or a cost-benefit analysis approach.  The interviews were conducted between and .  These 12 interviews covered the eight projects that have used Outcome Management (as determined by the TBS CIOB) and a sample of four that used cost-benefit analysis (as determined by the Working Group), as a cross-section of projects currently underway.  The list of these projects is as follows, and Appendix A contains the list of interviewees for each project.

The eight projects using Outcome Management are:

  • Shared Travel Service Initiative (STSI) – Public Works and Government Services Canada (PWGSC)
  • e-Information (e-Payroll) – Canada Revenue Agency (CRA)
  • Various projects – Inter-Agency Committee on Geomatics (IACG)
  • National Land and Water Information Service (NLWIS) – Agriculture and Agri-Food Canada (AAFC)
  • GeoBase – Canadian Council on Geomatics (CCOG)
  • Real Time Identification Project (RTID) – Royal Canadian Mounted Police (RCMP)
  • Government of Canada Marketplace (GoCM) and Acquisitions Branch Portfolio Management projects – PWGSC
  • e-Contact – PWGSC

The sample of four projects using cost-benefit analysis included:

  • Netfile – CRA
  • Spectrum Management – Industry Canada (IC)
  • eCRM (Virtual Trade Commissioner) – International Trade Canada (ITCan)
  • eRecruitment – Public Service Resourcing Modernization Project - Public Service Commission of Canada (PSC)

A summary of the answers to the interview questions can be found in Appendix B.

The interim results were analyzed and then discussed and reviewed with the GOL Outcome Management Steering Committee on .  Following completion of the documentation of the best practices and lessons learned from the interviews, the first two drafts were reviewed with the Outcome Management Working Group during the period of to .  A third draft was created and then reviewed with the Steering Committee during the month of .  After incorporating the feedback from these groups, the report was finalized during the month of .

2. Key Findings

2.1 Lessons Learned from Outcome Management in the GoC

Ten key Outcome Management lessons learned were identified through the results of the interview process.  They are summarized below along with illustrative examples from the projects.  These lessons highlight strengths and weaknesses of using an Outcome Management approach for the first two stages of the Outcome Management process (since none were far enough along to progress to Stage 3 or Stage 4), and provide direction for future development of the Outcome Management method and practice.

Lesson 1: Use Outcome Management to focus on the initiative's outcomes and results, not just the outputs.  Align these outcomes with departmental and government priorities.

Outcome Management was successfully used by projects and initiatives to challenge current business assumptions and focus the team on the outcomes (and value created) in addition to the outputs or deliverables created.  This structured review, unlike the cost-benefit analysis approach, asks why activities are being done in addition to what the activities are producing; it not only provides clarity around the vision and value but also highlights areas where improvements can be made.  Outcome Management highlights an initiative's alignment with government and departmental priorities and articulates this early in the project's lifecycle.

NRCan's GeoBase initiative used Outcome Management as an agent of change to understand and articulate the issues and relevance of GeoBase (a repository of geospatial data).  Using the Outcome Management process, they were able to identify how GeoBase supports key government priorities, such as emergency preparedness, to provide value (in addition to data).

Using Outcome Management, PWGSC's Acquisitions Branch Portfolio Management Project identified two or three IT systems (outputs) that could be retired – one of which the business users identified as having little business value even though it was still consuming resources.

Lesson 2: Outcome Management is a strong team building exercise.  But more than that, it can be used to articulate both the business and information technology outcomes of the initiative.

By following the Outcome Management process, the project team and stakeholders must clearly and precisely define the value of their project or initiative – both in business and information technology terms.  This process can be used to unite the team around a common understanding, allowing the project to start with everyone on the same page, and it promotes effective communication through precise articulation of outcomes using common language.  It can be used to gain agreement on many issues and challenges, especially for horizontal initiatives that often include varied perspectives, and to close the gap in viewpoint between business and technology staff.  Outcome Management also highlights the business results of a project or initiative – the raison d'être for making an investment, which is all too often overlooked (especially in technology-based projects).

The RCMP's Real-Time Identification initiative was initially seen as an IT project.  The Outcome Management process engaged the business participants, broadening the perspective and demonstrating that the project provides measurable business value.

PWGSC's Shared Travel Services Initiative used the process and the deliverables to orient new team members.  Now the team has not only a common understanding of the objectives and outcomes, but a common language that everyone can use.

Lesson 3: Engage all stakeholders in the process to leverage the results of the Outcome Management process – especially if the initiative crosses organizational or jurisdictional boundaries.

Having all the stakeholders participate consistently (with no substitutes) in the process, along with strong executive support, is critical for ensuring good results.  A broad representation of the right stakeholders – including external stakeholders – is the key to success.  This is especially important when the initiative is a horizontal program or when stakeholders cross jurisdictional boundaries.

Furthermore, participating in the process was viewed as one of the most valuable results of Outcome Management.  The facilitated workshops are a key ingredient of Outcome Management, and a strong facilitator is essential for ensuring balanced input.  The report produced from the process could, however, be made more usable.

CRA's e-Information project used Outcome Management to bring three different departments – each with its own vision – together to create a common understanding.  The process showed the linkages and interrelationships between stakeholders.  The process was not seen as arduous – it took only three half-day sessions and they were well facilitated.

Lesson 4: Outcome Management provides increased flexibility in defining intangible or “soft” benefits.

Initiatives are often faced with the challenge of defining and communicating the important soft or intangible benefits associated with the project.  Outcome Management provides increased flexibility around the definition of benefits or results and can be effective when used to describe and measure intangible benefits such as public good.  Unlike the cost-benefit analysis approach, it allows the linkage between project activities and these “softer” outcomes to be made clear using the cause-and-effect logic model.  Outcome Management's value case concept also extends the traditional business case to better express soft benefits and stakeholder impacts in addition to approaches centred only on costs.

The RCMP's Real-Time Identification project used the value case to articulate the intangible benefits of the initiative more clearly and effectively.

Lesson 5: Outcome Management should be conducted early in the initiative lifecycle with as much detail as possible.  It should also be done at major gates during the execution of the initiative.

Starting early in the process is a good practice – for example, before preliminary project approval (PPA) if it applies.  The Outcome Register (an Outcome Management tool) is most useful when it specifies performance measures and accountabilities – the more complete the Register, the better.  The Outcome Register should also be reviewed and updated at key points during the project lifecycle – for example at effective project approval (EPA) and at specified review points – to ensure that it remains relevant and the initiative remains aligned.
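
To illustrate what “complete” might mean in practice, the sketch below shows one hypothetical Outcome Register entry with a measure, baseline, target, owner, and review gates. The field names and values are assumptions for illustration, not the official TBS register template.

```python
# A hypothetical Outcome Register entry; field names and values are illustrative only.
outcome_register = [
    {
        "outcome": "Reduced average processing time for applications",
        "level": "intermediate",
        "performance_measure": "Average days from receipt to decision",
        "baseline": 45,            # measured before the initiative (days)
        "target": 20,              # value expected once the outcome is realized
        "accountable_owner": "Director, Program Operations",
        "review_gates": ["PPA", "EPA", "Post-implementation review"],
    },
]

def on_track(entry, latest_value):
    """Return True when the latest measurement has reached the target (lower is better here)."""
    return latest_value <= entry["target"]

# Revisiting the register at each gate: 30 days is an improvement on the 45-day
# baseline but has not yet reached the 20-day target, so this prints False.
print(on_track(outcome_register[0], latest_value=30))
```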

The Inter-Agency Committee on Geomatics, led by NRCan and representing 10 different departments with different objectives, used Outcome Management to precisely define the outcomes of the working group.  This precision forced the team to challenge assumptions and clarify common goals.

Lesson 6: Integrate Outcome Management with existing methods, frameworks, and management tools.  Use it to strengthen or replace other deliverables.

Outcome Management needs to be aligned and integrated with existing federal government processes and best practices such as the project approval process, project management framework and other deliverables such as business cases, project plans, and scorecards.  When it is integrated, the benefits grow.  The value case extends the concepts of the business case for a more comprehensive description of the initiative.

PWGSC's Shared Travel Services Initiative engaged internal audit in the process, strengthening the linkages between the project results and the model that will be used to audit the project.

The RCMP's Real-Time Identification project incorporated the measurable outcomes into their corporate performance measurement program.

Lesson 7: Successful Outcome Management requires champions, education, and communication.  Tools and methods can be improved.

While the concepts of Outcome Management are becoming more common in business management culture in general and in the GoC specifically, a successful implementation requires executive support (including Treasury Board), awareness building, promotion of the processes, and communication with participants.  Also, better outputs from Outcome Management – user-friendly tools and senior management communication tools – would improve the effectiveness of the approach.

NRCan's GeoBase initiative used the results of the Outcome Management process and developed one-page case studies – rather than charts and diagrams – to demonstrate the value of the initiative to senior management.  These one-page case studies which describe success stories were well received.

Lesson 8: Outcome Management still needs to overcome systemic challenges in the government context.

Outcomes – including tangible workforce adjustments and cost savings – are difficult to realize in the government.  Initiatives often realize these outcomes over time and often the anticipated cost savings and workforce savings are reallocated to other areas as part of the natural process of managing operational priorities.  In other words, because outcomes are generally realized slowly rather than as a “big-bang”, they often are not attributed to the initiative.  In addition, in a complex environment it is difficult to isolate the effect of a single initiative on a set of outcomes.

There is often a challenge, and even apprehension, for participants in specifying and later harvesting outcomes, because of the impression that any savings may be taken away from operating units.  At least one respondent remarked that some of these benefits were explicitly excluded from the business case for exactly the reason that they might be “scooped up” before they could actually be harvested.  This is an understandable action in an era of taxpayers demanding efficiencies, but it needs to be balanced against what will be identified voluntarily.  Section 3 further discusses similar challenges and opportunities.

Lesson 9: Cost-benefit analysis was useful for documenting an initiative's costs and areas for cost avoidance, as well as for conducting options analysis.

Included in the interviews were four projects that used cost-benefit analysis rather than the Outcome Management approach.  The cost-benefit analysis resulted mainly in a thorough identification of costs, and in some cases identified a baseline of current costs in an activity-based costing framework.  In addition, the analysis was able to identify some areas of cost avoidance as a justification for the proposed investment.  Finally, the technique demonstrated that it can be used to conduct an options analysis that can guide a management decision to select the most effective option from amongst those available and considered.

Lesson 10: Cost-benefit analysis identifies an initiative's direct benefits; Outcome Management also provides a clearer definition of soft benefits (refer to Lesson 4).

Cost-benefit analysis resulted in a thorough definition of direct benefits.  However, when these interview results were contrasted with the sample of projects that used Outcome Management, the approach did not result in as broad a definition of outcomes, including indirect ones such as program outcomes, good governance outcomes, or Results to Canadians.  One of the projects used the cost-benefit analysis very effectively to drive business change.

Canada Customs and Revenue Agency's Netfile project used cost-benefit analysis to identify areas for business improvement and, because of the transactional nature of the domain, was able to build on this analysis and move towards activity-based costing in seven regions.

2.2 Best Practices in Outcome Management from Other Countries

This section presents best practices in Outcome Management from countries outside the Canadian context, including Organisation for Economic Co-operation and Development (OECD) member nations that have implemented their own form of Outcome Management.

Complement Performance Monitoring with Evaluations

To ensure optimal decision-making, performance monitoring and evaluations should be viewed as complementary elements of an overall Outcome Management approach.  Performance monitoring by itself can alert managers to problems that arise with regard to performance, but will not typically present solutions as the data alone may not be sufficient to solve most problems.  Evaluations determine the reasons behind the performance, such as cause-effect relationships, and make recommendations on how to improve.  Such activities are often overlooked in favour of simply presenting performance monitoring data; however, organizations need to integrate both monitoring and evaluation in an outcome management system to produce better decisions and results.Footnote 2

Adopt a Results-Oriented Set of Balanced Measures

Balanced measures are part of a strategic management system for achieving long-term performance goals.  Specifying balanced measures involves taking into consideration all stakeholders, including management, employees, partners and the public at all stages when conducting performance evaluations.  It also means going beyond standard financial measures in a traditional business case, to include both quantitative and qualitative measures that touch on “softer” issues such as morale, public health, efficiency, social and environmental aspects.  Using balanced measures offers a way for an organization to track the various factors that make up successful performance and outcomes.  The balanced measures approach sets an organization's focus across employee, client, and business perspectives.  Prevalent in the private sector, this approach has made its way to assessing government initiatives and has been implemented in departments and agencies of the United States.Footnote 3  Another measure often used for achieving a balance of quantitative and qualitative outcomes is the Triple Bottom Line (TBL) approach, which encompasses financial, environmental, and social results.  The Government of South Australia has begun implementing TBL reporting across departmental agencies, and notes that the benefits of using TBL include enhanced reputation, benchmarked performance, improved risk management, and improved communication with stakeholders.Footnote 4  The Organisation for Economic Co-operation and Development (OECD) E-Government Project released a report in 2005 outlining the costs and benefits in e-government across jurisdictions, and found that most public agencies operate with multiple “bottom lines” and that many are beginning to use a balanced measures approach in realizing outcomes.Footnote 5  Appendix C summarizes the Outcome Management frameworks across the five countries that have most adopted this approach.
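
As a small illustration, a balanced set of measures can be sketched as indicators grouped by perspective so that financial measures do not crowd out the others. The perspectives and measures below are assumptions for illustration, not a prescribed GoC, balanced scorecard, or TBL template.

```python
# A hypothetical balanced set of measures, grouped by perspective (illustrative only).
balanced_measures = {
    "financial":     ["Cost per transaction", "Operating costs avoided"],
    "client":        ["Client satisfaction score", "Average wait time"],
    "employee":      ["Staff engagement index", "Training hours completed"],
    "environmental": ["Paper consumption", "Energy use per office"],
    "social":        ["Service accessibility in both official languages"],
}

# A quick check that no single perspective dominates the scorecard.
for perspective, measures in balanced_measures.items():
    print(f"{perspective:>13}: {len(measures)} measure(s)")
```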

Align Measures with Accountability and Decision-Making Authority

In developing performance measures, organizations should ensure that these measures are aligned with accountability and decision-making authority.  Measures should directly relate to assigned roles and responsibilities, and individuals should only be held accountable for the areas in which they have influence.  Managers should lead by example and cascade accountability across the organization by creating an outcome-based organization and encouraging the sponsorship of measures at all levels.  Staff should be kept informed at all stages of the process. The public should also be kept informed through the Internet and traditional reports.  After successful initiatives, staff should be rewarded on a team basis.Footnote 6  The complexity of accountability and authority rises quickly when horizontal or vertical initiatives are undertaken, involving multiple jurisdictions that may be complementary or opposing in their mandates or strategic objectives.  Organizational members must know and understand their responsibilities and what they contribute to the group's goals.  Outcome owners and managers should have the ability to identify their own expected results and methods for data collection to ensure better reporting, decision-making, and outcomes.Footnote 7  Further to this, Benko and McFarlan in Connecting the Dots propose an alignment of project portfolios with objectives to enhance both public trust and return on investments.Footnote 8

Give Managers Autonomy

Managers given the responsibility of accounting for initiatives should at the same time be given the decision-making authority and ability to shift resources from low-performing activities and projects to ones that are performing at a higher level.  This idea of managing a portfolio of initiatives as opposed to a project is fundamental to obtaining the maximum set of results.  Unless invested with the authority, managers would be unable to directly improve performance results and would become disengaged in the process.Footnote 9  Maizlish and Handler in IT Portfolio Management present an approach to simplifying the process of achieving a rationalized and aligned project portfolio.Footnote 10  In addition, Lebow and Spitzer in Accountability propose a “freedom-based” approach to responsibility and workplace design that advocates individual freedom and personal accountability.Footnote 11

Use Measures that Provide Insight, not just Data

In selecting performance measures, there can often be a tendency to select ones for the simple reason that they are easily measured and produce ample amounts of data.  Instead of being based upon ease of data collection, measures should be related to organizational and strategic goals, and provide relevant and timely information for decision-makers to assess the progress of an initiative.  The measures should indicate the efficiency of the process, the results in comparison with the initiative's intended goals, and the effectiveness of particular activities in terms of contributions to overall program objectives.  Well-selected measures can describe the direction and accomplishments made during the course of a program, as well as serve as a guide for future improvements in serving stakeholders.Footnote 12  The Auditor General of Canada's Implementing Results-Based Management: Lessons from the Literature presents findings of best practices from across Canada and other jurisdictions for developing performance measurement systems.Footnote 13

Adaptation Instead of Adoption

Best practices from one organization are not necessarily suited to other organizations undertaking similar initiatives.  However, practices can be adapted to fit a group's needs and culture.  For example, a U.S. performance management study found that departments implementing the Balanced Scorecard™ approach as developed by Kaplan and NortonFootnote 14 had adapted it into a set of measures uniquely suited to the structure, culture, and goals of the organization.Footnote 15

Conduct Pilot Projects

Pilot projects are useful for testing new outcome-based management systems.  The test period allows for problems in the Outcome Management approach to be discovered early on and worked out before the full program is launched.  To be completely effective, pilot projects should imitate exactly how the final program will operate and must last long enough to clearly indicate the potential of the system.  This requires that resources be allocated and that a representative sample of group members participate in the trial run.Footnote 16

Manage Cultural Change

In addition to ensuring that new systems and structures are in place when managing change in an organization, it is also important to manage the changes in the organizational culture, which includes the norms, values, and behaviours of members in the organization.  This also includes formal and informal rewards and recognition mechanisms as a means of fostering the new desired behaviour, and consequences for those who do not adopt it.  Outcome Management requires a renewed focus on learning and integrating lessons into decision-making, on results rather than processes, and on transparent performance reporting.  In formulating and adopting a change management strategy, changes brought about by a new system based on results could be supported through offering coaching, staff training, help desks, technical assistance, or knowledge bases.  Managing cultural change allows for new organizational values and procedures to be institutionalized, and for performance to adapt to new standards.Footnote 17

3. Challenges and Opportunities

This section provides a summary of the major challenges and opportunities that lie ahead in order to continue with Outcome Management as a viable and sustainable initiative for the GoC.  They are broken down into issues around the Outcome Management approach itself, and issues relating to central agencies.

3.1 Challenges and Opportunities Related to Outcome Management

Outcome Management was not always done early enough in the project lifecycle.

Several respondents expressed the view that the earlier in the lifecycle Outcome Management was introduced, the greater the value it offered.  Some had done the Outcome Management work either after the business case or after project approval, which they felt was too late in the process.  The opportunity is to use the Outcome Management techniques as early as possible in the project concept stage, then refine the deliverables throughout all subsequent stages in the project lifecycle, keeping them synchronized and integrated all the way through.

Outcome Management is not a one-time exercise, as it has been applied so far.  Outcomes must be reviewed and monitored throughout the project lifecycle at prescribed points and linked to the Performance Measurement Framework.

It was a challenge to get some departments to see the need to follow up on the attainment of the outcomes, whether doing so would represent the “good news” of attaining the intended outcomes or the “bad news” of not attaining them.  Stages 1 and 2 of the Outcome Management Process specify the outcomes to be measured, the baselines and target values, and how they are to be measured.  Stages 3 and 4 specify the necessary requirements for the ongoing monitoring of the outcomes and the link to the performance measurement framework and, ultimately, to the departmental performance report (DPR).  As one indication, there was typically no budget or resource allocated in the project plan for the ongoing performance measurement and monitoring process.  The use of Outcome Management throughout all phases of the project lifecycle at prescribed checkpoints, not just the planning phase, represents an opportunity to ensure that the principles of results-based management are followed all the way through to attainment of the intended outcomes.

Outcome Management so far has only been applied to Stages 1 and 2 of the Outcome Management process (Modeling and Planning), and not Stages 3 and 4 (Monitoring and Realizing outcomes).

All of the projects consulted for this study were in the early stages of the Outcome Management process.  None of them were far enough down the road to enter Stage 3 or 4, where the actual attainment or realization of the outcomes would be in progress.  This latter part of the Outcome Management process has therefore not been validated.  There is an opportunity to follow these projects as they go through Stages 3 and 4 and demonstrate the full value of their initiatives and of Outcome Management.

Outcome Management so far has focused primarily on a single initiative at a time.

All the people interviewed as part of this study were focused on a particular project (with the one exception of the IACG portfolio), which is where Outcome Management has been applied to date.  There is an opportunity to apply Outcome Management more broadly at the program or portfolio level first, to establish a set of common objectives in concert with other projects, prior to looking at a specific project.  Similarly, there is an opportunity to look at horizontal initiatives or alternatives to avoid duplication or sub-optimization across initiatives.  In a couple of cases, the project team was aware of the interoperability or overlap between their initiative and another, but looking at the enterprise-wide “big picture” was beyond their scope and/or authority.  The Outcome Management techniques are not restricted by organizational boundaries, and can be used to provide this enterprise-wide view of outcomes.

Concerns were expressed about adopting a proprietary toolset for Outcome Management.

There were concerns expressed that some of the tools used were supposedly proprietary and not openly available to all.  Use of proprietary tools means not only potential extra costs for licensing but, more importantly, potential incompatibilities with standard departmental desktops and likely restrictions on sharing the approach across GoC Outcome Management practitioners.  There was a stated preference for an “open” approach to any tools and techniques employed in the Outcome Management process, and for the intellectual property rights to the method residing with the GoC for unrestricted sharing and access.  The material would be subject to the standard Crown intellectual property rights and copyright policy.

Soft skills are as important as the hard skills for Outcome Management practitioners.

There is an obvious need for Outcome Management practitioner training on logic model development and other specific techniques in order to develop and sustain the practice.  However, many respondents, in commenting on how well the workshops went and how valuable they were, also recognized the need for developing the softer skill sets such as facilitation that are equally important for Outcome Management practitioners to have.  The opportunity is to ensure that both sets of skills are found in the community of Outcome Management practitioners, which may require some specific skill development programs.

Complex logic models were difficult to read for people unfamiliar with the approach.

A common message heard was that the Logic Models that were built were very helpful and readable by the people who participated in the workshops.  However, for those people who did not participate, or for senior management levels that would want to see the product of the workshops in a simplified manner, there is a need to find an “executive summary” way to represent complex logic models when showing them to people who have not had the chance to see these models previously.  There is an opportunity for some development of model simplification techniques to meet this need.

3.2 Challenges and Opportunities Related to Central Agencies

Outcome Management needs to be integrated with other frameworks and tools.

A clear message from respondents was that integration with existing Treasury Board Secretariat documents, frameworks, and policies is essential.  This integration includes components such as the Preliminary Project Approval (PPA), Effective Project Approval (EPA), Business Case, Results-based Management Accountability Framework (RMAF), and Program Activity Architecture (PAA).  Otherwise the Outcome Management approach will be perceived as an extra burden, instead of something that will assist the project or department or replace an existing requirement.  The greater opportunity is in fact to consolidate some of the Treasury Board deliverables – for example, having the Value Case grow to become the TB Submission format, instead of being in addition to the Business Case.  It should also be integrated with the Enhanced Management Framework project management processes.

It can be difficult to overcome resistance in departments to adopt another TBS construct.

The warning by some respondents was that there would be a perception by some client departments that Outcome Management will represent the overhead of another “box to check” to satisfy Treasury Board Secretariat requirements, as opposed to a valuable exercise that they would benefit from.  There is an opportunity for an awareness campaign to promote the use of Outcome Management for a department’s own benefit, not just to satisfy a central agency.

There is a low level of awareness and active support by TBS Program Analysts.

There is currently a low level of awareness of the Outcome Management approach by TBS Program Analysts.  Their awareness has been increasing on a one-by-one basis, as their projects have gone through the process.  Once TBS adopts and integrates Outcome Management into the Enhanced Management Framework and takes a cohesive approach, it will be able to proactively get the message out.  Through a targeted awareness program and training, there is an opportunity for Program Analysts to become aware of, and active supporters of, Outcome Management and how it fits into the integrated picture throughout the lifecycle.

The main sponsor of Outcome Management is currently in TBS CIOB – it will have to gain a wider base of support.

In its early stages, it is not uncommon for any initiative to have one main sponsor or champion.  However, the challenge is for Outcome Management to substantially increase its adoption (or take-up) rate and build a base of support in a majority of (and eventually all) departments.  Building this broader base of support is what will be required for Outcome Management to be sustainable over the long term.  The question was also raised as to whether the CIO Branch is the appropriate owner, or whether Outcome Management could better fit with another area’s responsibilities, for example with the Departmental Comptroller.  A targeted awareness program, complemented with available coaching and mentoring resources at a possible Centre of Excellence (the Outcome Management Practice), will play a major role in making this happen.

Accountability for results / outcomes is often problematic, especially in horizontal or vertical initiatives.

Although awareness and progress has been made in the area of accountability for results (as opposed to solely for expenditures), there is still a significant challenge in moving this topic forward.  Some departments have their management accords containing accountability for results starting at the Deputy Minister level and cascading down to the other senior management levels.  The whole of government is grappling with this issue and the Auditor General has raised this question many times.

There is an opportunity for awareness and education across the public service, perhaps to become part of the curriculum of project and program management at the Canada School for Public Service.

There is an additional level of difficulty to establish accountabilities for outcomes when initiatives spread across organizational boundaries.  The situation is exacerbated when there are initiatives that are either horizontal (cross-departmental) or vertical (across levels of government) or a combination of both.  Several of the initiatives had this exact challenge to deal with, and were looking for a consistent government-wide governance approach to dealing with these accountabilities for outcomes.

Systemic problems and disincentives exist in identifying and claiming results.

There are many forces at play when any organization considers identifying outcomes or results that can be achieved.  Frequently there are disincentives for doing so, such as the resulting reduction in the size of one’s organization, or lowering the funding reference levels.  These and other factors make senior managers reluctant to claim results.  There is an opportunity for Treasury Board Secretariat, in concert with the Privy Council Office, to set and apply the rules to find ways that Deputy Ministers and their organizations will embrace results that serve the greater good of government and Canadians, without penalizing their organizations, their pay-at-risk, or their careers.

Departments and central agencies have allowed inconsistencies to evolve in business case and project approval practices.  At least one of the projects interviewed put in only the minimum level of detail and justification in order to “go through the motions” of the business case.  Varying degrees of detail in defining outcomes need to be replaced by a consistent level of detail, with consequences such as non-approval of projects that do not meet these requirements.  As a specific example, including Outcome Management questions in the Preliminary Project Approval process will require departments to begin applying the process up front and allow the submission review board to challenge projects that are not directly tied to government outcomes.

Although culturally difficult to introduce, the opportunity also exists to encourage the communication and sharing of failures to enhance the use and ongoing knowledge contained in Lessons Learned.  This will allow others to avoid making the same mistakes, and increase their chances of success.

4. Recommended Next Steps for Outcome Management

This section provides recommendations to consider, should the GoC decide to pursue the implementation of the Outcome Management approach as a means of realizing expected outcomes.  There are foundational requirements that should first be considered and defined before the formal introduction of the Outcome Management method to the GoC departments and agencies.  These include the readiness, timeliness, governance, and roles and responsibilities of the organizations leading Outcome Management.

The recommendations are presented in two parts: the first is what can be undertaken in the short term given the current situation and resource levels.  The second is what should be considered if the direction and resources for the formal introduction of Outcome Management are approved.  This would progress towards institutionalizing the Outcome Management practice, so that it starts to become part of the culture of GoC departments and agencies.  Outcome Management activities would be incorporated into common processes across all levels of the various organizations.

Note that the next steps are not provided in a precise sequence, as many of the steps can be undertaken in parallel.

4.1 Next Steps – Short Term

The following lists the next steps that are recommended to be undertaken in the short term:

  1. Continue to explore the benefits of the Outcome Management method by selecting projects that have experienced Stages 1 and 2 of the Outcome Management process and encouraging them to continue into Stages 3 and 4, Monitoring and Realizing Outcomes.  The overall Outcome Management process should be validated all the way through to the end, in order to identify any additional lessons learned and best practices in these latter stages.
  2. As the best practices and lessons learned emerge from the previous point, undertake to better understand the outcomes of Outcome Management, to demonstrate the value that it brings.  All stakeholders consulted as part of this assignment felt intuitively that there is significant value in conducting an Outcome Management exercise.  The challenge will be to measurably demonstrate that doing Outcome Management results in an increased likelihood of the success of the project or other identified positive outcomes.
  3. Examine the existing links and the opportunities to incorporate the Outcome Management process with new and existing TBS policies, frameworks, and directives, possibly integrating Outcome Management into an overall TBS framework.  It would identify the linkages to constructs such as the Business Transformation Enablement Program (BTEP), Government Strategic Reference Model (GSRM), Enterprise Architecture models, MRRS policy, Treasury Board submissions and the Enhanced Management Framework.  This would mean including Outcome Management in the Treasury Board Secretariat’s current effort to map the various TBS frameworks, tools, and techniques under the Management Accountability Framework (MAF) to identify gaps, overlaps, and complementary aspects.
  4. In parallel to the integration mentioned above, continue the work underway to articulate the strategic outcomes for departments and the whole of government.  These strategic outcomes serve as the frame of reference for all Outcome Management exercises, to allow them to link their specific outcomes to these strategic outcomes.  This would allow the Outcome Management process to be applied at any level – project, program, department or even whole of government, and ensure that all outcomes are aligned with the broader set of outcomes that Canadians expect of their government.

4.2 Next Steps – Longer Term - Formalizing the Outcome Management Process

At this stage, the following next steps are suggested to be undertaken should the GoC decide to formalize Outcome Management.  These suggestions assume that TBS will be the leader of the Outcome Management practice within the GoC, and that the direction and the necessary resources are in place.  These next steps include the following:

  1. Revisit the Outcome Management Practice Implementation Strategy study conducted in August 2005 and create an operational plan that sets out which specific activities from the identified roadmap that the TBS Outcome Management Practice should proceed with and in what priority sequence.
  2. Determine the disposition of the draft Guide to Realizing Outcomes from Government of Canada Programs of March 31, 2004 and the new Outcome Management Guide and Tools that is the companion document to this report.  Ideally, this new guide should be given a final edit, translated, and published on the Internet in both HTML and PDF formats.
  3. Create a Centre of Excellence in TBS for the Outcome Management practice as a repository for outcomes thinking, advice to departments, lessons learned and best practices, as well as a resource centre for facilitation of workshops, coaching and mentoring to build expertise within the GoC and the skills of practitioners in the field.
  4. Establish a standard set of deliverables, methods, and tools that any Outcome Management work package should consist of.  This would be derived from the new Outcome Management Guide and Tools, mentioned in a previous step.  This will ensure consistency and quality of the process, and form the basis for a training program to build the base of Outcome Management practitioners in the GoC.
  5. Ensure that all Outcome Management tools, methods, and deliverables are open and not proprietary, and that the intellectual property belongs to the Crown in order to facilitate ease of reuse and sharing across departments without restriction.  Verify the GoC’s guidelines on intellectual property to ensure proper protection and value for the Crown.
  6. Create a communication and awareness strategy and plan to promote the benefits of Outcome Management to the various target audiences (including TBS Program Sector analysts) through formal and informal communication channels.  Carry out the activities in this communication plan, and then follow up to see that the uptake of Outcome Management is following the expected adoption curve.
  7. Work with the Canada School for the Public Service to create awareness and develop skills in the public service through management training courses.  In the longer term, this could involve examining the post-secondary education curriculum to incorporate Outcome Management principles for the preparation of the next generation of public servants.
  8. Look at incorporating the Outcome Management approach and deliverables into the responsibilities of each department as part of the Management Accountability Framework (MAF) – analogous to how departmental comptrollership and the recruiting functions are being distributed to the individual departments.
  9. Consider strengthening the requirement for regular progress reports and final disposition on projects that is focused on the attainment of the anticipated outcomes, not just on the expenditure levels.  This would require a performance measurement framework to be put in place, along with gates to re-examine and re-evaluate the project throughout the project lifecycle.
  10. Set up a government-wide standing offer to simplify the procurement of the services of experienced Outcome Management practitioners for departments.  This will remove one obstacle to departments seeking to access experienced help through a pre-qualification process resulting in the standing offer.  This is also consistent with the recent stated preference from PWGSC Acquisitions Branch to have departments obtain goods and services from standing offers wherever possible.
  11. Consider engaging the provinces and municipalities, some of whom have already adopted some form of Outcome Management, in a community of Outcome Management practitioners to further share ideas, best practices, and expertise.  This would parallel the collaboration currently happening with the Business Transformation Enablement Program (BTEP) and the Government Strategic Reference Model (GSRM).  This community might manifest itself in a web site, annual conference or other information sharing vehicles.  It would likely become a liaison point to international endeavours in Outcome Management that are growing, particularly in the U.S., the European Community, and Australia.
  12. Keep abreast of the emerging developments in accountability for results in the public sector, the increase in the transparency of government operations and the public accounting of attaining outcomes.  This will have to be incorporated into the evolution of the Outcome Management approach and method, and adapted to reflect the emerging needs.

5. Conclusions

The consensus of all parties consulted during the study was that Outcome Management provides significant value in focusing their thinking on the outcomes as opposed to the deliverables, that it engages a broad set of stakeholders in the process, and that it should be conducted as early on in the initiative lifecycle as possible.

Outcome Management provided more value than traditional cost-benefit analysis through its techniques, facilitation, models, and ability to incorporate softer benefits on an equal footing with the hard benefits such as financial savings, which was especially helpful on the more complex initiatives.

However, demonstrating the overall value of Outcome Management has been difficult with this set of eight pilot projects, as they are all still in the early stages of their lifecycle.  Intuitively, all participants agreed that Outcome Management is very valuable, but at this point the proof is elusive – it is anecdotal, with no demonstrated results in the Canadian federal government as yet.

As indicated by the research and best practices, other western countries are in various stages of introduction and adoption of Outcome Management (or equivalent), implying that these jurisdictions are seeing sufficient positive results to continue investing in the approach.

The key to the successful adoption of the approach in Canada is the demonstration of that value, which can best be done by following a sample of the eight projects that were consulted as part of this assignment.  At the time of writing, Agriculture and Agri-Food Canada's National Land and Water Information Service (NLWIS) had embarked on Stage 2 of the Outcome Management process, with some of the other projects expressing interest in continuing further as well.

The key conclusion, therefore, is that Outcome Management has significant value in the planning stages of an initiative, as demonstrated by the lessons learned detailed in this report.  However, there needs to be follow-up in the later measuring, monitoring, and benefits harvesting stages of those initiatives to demonstrate the full value of Outcome Management.  Once this is done, buy-in from departments should follow, as they see that initiatives using the Outcome Management approach are more successful in attaining their intended outcomes.

Appendix A: List of Projects and People Interviewed

Shared Travel Service Initiative (STSI) – Public Works and Government Services Canada (PWGSC)

  • Shawn Brennan
  • Michael Corbett
  • Jamie Nolan

e-Information (e-Payroll) –
Canada Revenue Agency (CRA)

  • Erin Ryan
  • Martha Coates

Various projects in the geomatics portfolio – Inter-Agency Committee on Geomatics (IACG)

  • Jeff Labonte
  • Adrian Camfield

National Land and Water Information Service (NLWIS) –
Agriculture and Agri-Food Canada (AAFC)

  • Bob Parkinson

GeoBase –
Canadian Council on Geomatics (CCOG)

  • Jean Cooper
  • Bob Johnson
  • Laurent Tardif

Real Time Identification Project (RTID) – Royal Canadian Mounted Police (RCMP)

  • Lloyd Bunbury
  • Lyse Langevin

Government of Canada Marketplace (GoCM) and Acquisitions Branch Portfolio Management projects –
Public Works and Government Services Canada (PWGSC) – Acquisitions Branch

  • Stephe Cooper

Spectrum Management –
(Cost-Benefit analysis)
Industry Canada (IC)

  • Michel Scott
  • Line Perron

Netfile
(Cost-Benefit analysis)
Canada Revenue Agency (CRA)

  • John Cheeseman
  • Dave DiMillo

e-Contact –
Public Works and Government Services Canada (PWGSC)

  • Dennis Duhaime

eCRM (Virtual Trade Commissioner) –
(Cost-Benefit analysis)
International Trade Canada (ITCan)

  • William Pound
  • Robert Ledermann
  • Robert Chassé

eRecruitment – Public Service Resourcing Modernization Project
(Cost-Benefit analysis)
Public Service Commission of Canada (PSC)

  • Arnie Simpson

Appendix B: Summary of Answers to Interview Questions

  1. Why did you undertake an Outcome Management approach with your initiative?
    • Suggested by TBS CIOB
    • Funded by TBS
    • Major Crown projects are expected to deliver outcomes and clarify expected results
    • Needed to focus on the outcomes of the project
    • Clarify the vision of the project
    • Horizontal initiative needed to establish outcomes and accountabilities
    • Prioritize initiatives to respond to fiscal pressures and ERC requirements
    • One department had the approach in their organizational culture
  2. Which of the following Outcome Management components have you accomplished to date:
    Component | STSI | e-Info | IACG | NLWIS | GeoBase | RTID | GoCM | e-Contact
    Built a Logic Model (e.g. Results Chain or Outcomes Map) | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes
    Defined and assessed Risks (particularly risks to Outcomes) | Yes | Yes | Yes | Yes | Yes | Yes | Yes | Yes
    Identified Outcome owners and accountabilities | Yes, limited | Yes, limited | Yes, limited | Yes, limited | Yes | Yes | Yes | Yes, limited
    Built an Outcomes Register including Outcomes baseline and target values, data sources and timeframes | Yes | Yes, limited | Yes, limited | Yes, limited | Yes | Yes | Yes, limited | Yes, limited
    Linked the Outcomes to a performance measurement framework or the DPR / RPP | Yes, limited | No | No | No | Yes, limited | Yes | No | No
    Assessed the relative value of an initiative and put it on a chart such as a Value Plot or Value Graph | No | No | No | No | No | No | No (PfM: Yes) | No
  3. What specific Costing Analysis method did you use for project and full life-cycle costs?
    • Traditional costing rollup
    • Followed business case or PPA submission methodology
    • One respondent issued an RFI and received cost estimates from vendors
    • Used Enhanced Management Framework for business case and cost estimates
  4. What went well in the work you have been involved in?  What could have been done better?  What opportunities for improvement do you see?
    • What went well:
      • Team-building
      • Clarity around outcomes and vision
      • Workshop format with good facilitation and participation; the process of interviews leading to a straw man and then validation worked well
      • Broadening of perspective to include both business and IT outcomes
      • Engaged a good cross-section of stakeholders
      • Strengthened linkages to the audit model
      • Strengthened linkages to the Performance Measurement Framework
      • Precision of language around outcomes
      • Allowed additional flexibility around definition of outcomes beyond tangible benefits
      • Logic model crossed boundaries; horizontal across GoC and beyond GoC
      • Value case was extremely helpful
      • Re-think roles and core business; raison d'être
    • Could have been done better:
      • Pick the right people from the start; all stakeholders; no substitutes
      • Get executive buy-in; need champion support
      • Timeliness (start early in the process)
      • Better awareness and promotion of the process
      • Balance of input
      • In Outcomes Register, stakeholders expected more detail in outcomes measures
      • No follow-through after the Outcome Management exercise, so the report went on the shelf; the events for revisiting it were not identified
      • Not a stand-alone piece; needs to be integrated with project plans, PAA, DPR, RPP, etc.
    • Opportunities for improvement:
      • Do more workshops to fully complete the Outcomes Register
      • Recast final report to be more usable
      • Develop a tool for senior management communication; simplify the logic model results
      • Build user-friendly tools and templates
      • Change of culture; education of the outcome management language / methodology / practice
      • Strong leadership from the centre
      • There needs to be follow-through of the performance measures
      • Clear accountability for results must be defined
      • Difficult to harvest outcomes, especially in terms of workforce adjustments; there are disincentives for generating financial savings (they are taken away due to the inflexibility of the system)
      • Should have an escape clause or off-ramp option to anticipate failure and stop the project
  5. What is your initiative's or organization's readiness to fully implement the formal Outcome Management process?
    • Most are not ready until Outcome Management can be aligned and integrated with existing policy / procedures / best practices / frameworks
    • Needs more visible Treasury Board support
    • Lack of funding for what is perceived as an additional gate
    • More departments are realizing they need to get up to speed with the international movement towards Outcome Management
    • The management community is recognizing Outcome Management as a best practice
    • Skill sets and outcomes thinking have to be raised
    • Complementary skill sets (facilitation) need to be raised
  6. What business use have you made of the Outcome Management process and deliverables?  For example, did it help you secure or increase senior management support?
    • Team communication and understanding, especially of participants in the workshops
    • New employee orientation
    • Helped to re-prioritize and redefine the portfolio of programs offered (validate that you are doing the right things)
    • Retired low-value applications
    • Prepared the project team to articulate the project and its business value to senior executives and TBS analysts
    • Reused some of the material in other documentation
    • Feeds into the Performance Measurement Framework
  7. What aspects of Outcome Management have provided the most value to your initiative / organization?  What aspects provided the least value?
    • Most value:
      • Engaged stakeholders (interviews and workshops)
      • Clarified vision and outcomes, especially for horizontal initiatives
      • Provided technique to articulate soft benefits, and also a means to measure them
      • Precision of the language around outcomes
      • Discipline of the approach
      • The value is in the process
      • Visual representation very helpful
      • Value cases helpful
      • Provided a broader perspective than the traditional project management approach
    • Least value:
      • The report (benefits case)
      • The complexity of the logic model when shown to senior management
      • The Outcomes Register when not taken to completion
  8. What are the major barriers that you foresee to fully implement Outcome Management?  What are the opportunities in doing so?
    • Mandating or standardizing the process would limit the flexibility that is seen as a strength of the process and diminish enthusiasm
    • Lack of integration or ability to integrate with existing policy / procedures / best practices / frameworks
    • Residual overhead; would add more burden/cost to the project
    • If horizontal initiatives cannot be accommodated
    • The differences in organizational culture
    • The skill sets are not transferable to departments
    • Organizational capacity to do Outcome Management, especially given the climate of new policies, restrictions, constraints, etc.
    • Lack of clear ministerial accountability; the effect of the second Gomery Commission report
    • Need openness to discuss failures
  9. What lessons have been learned about using the practice of Outcome Management?  What would you tell another initiative about Outcome Management?  What would you suggest or recommend to them?
    • Do it early in the process (pre-PPA), redo or revisit at major events (such as EPA)
    • Get all stakeholders to the table to build consensus
    • Start with the strategic outcomes first, then work back to the outputs and initiatives
    • Define the outcomes first and then the performance measures
    • Outcomes should be business-driven; outcome owners have to drive the process, not IT
    • Use a value case, not a business case to incorporate the public good of government
    • Communication to all players is crucial
    • Executive buy-in is crucial for success
    • Build expertise in house through knowledge transfers, education, and experience
    • Use precise language in Outcome Management
    • Keep Outcome Management as an interactive, consultative process, not an individual paper process
    • Hold an introductory session before the main interviews and workshops (marketing and awareness plan)
    • Must be integrated with the Performance Measurement Framework
    • The logic model shows the complexity of cause and effect relationships much better than tables or text

Appendix C: Outcome Management Internationally

The following table summarizes the Outcome Management frameworks across the five countries that have most fully adopted the Outcome Management approach.

Country | Legislation / Regulation / Policy | Current Methodology / Framework | Central Agency

Australia | n/a | Output Based Management (OBM); Outcome Based Management Guidelines; Outcomes and Output Framework | Department of Treasury and Finance

Canada | Management, Resources and Results Structure (MRRS) Policy | Results-based Management; Outcome Management; Results-based Management and Accountability Framework (RMAF); Management Accountability Framework (MAF); Enhanced Framework for the Management of Information Technology Projects; Risk-Based Audit Framework (RBAF); Results for Canadians | Treasury Board of Canada Secretariat (TBS)

France | Loi organique relative aux lois de finances (2001) | The Performance-Based Approach (methodological guide) | Ministry of the Economy, Finance and Industry

United Kingdom | n/a | OGC Gateway Process; Centres of Excellence | Office of Government Commerce (OGC)

United States | Government Performance and Results Act (1993); Information Technology Management Reform Act (1996) | Federal Enterprise Architecture; Executive Branch Management Scorecard (results.gov) | Office of Management and Budget (OMB); Government Accountability Office (GAO)

Appendix D: Annotated Bibliography

* indicates a Key reference

Agence pour le Développement de l'Administration Électronique, Guide méthodologique MAREVA : Analyse de la valeur des projets d'ADELE, Ministère du Budget et de la réforme de l'État, 2005, http://www.adae.gouv.fr/IMG/pdf/050405_MAREVA_GuideMethodologique_vf.pdf

This document covers the e-government program ADELE from France, which sets out the government's online strategy for the period of 2004-2007.  To evaluate gains and savings for each e-government project, the French government developed the evaluation methodology called MAREVA, which enables a precise evaluation of financial gains of e-government services for the State and the public sector, as well as of gains and benefits for end users.

Australian Government, Demand and Value Assessment Methodology, Information Management Office, 2004.

The methodology presented in this document represents the culmination of over a year's effort designing and refining a standardized system to forecast and articulate demand and value in any proposed e-government service.

Benko, C. and F.W. McFarlan, Connecting the Dots: Aligning Projects with Objectives in Unpredictable Times, Boston, MA: Harvard Business School Press, 2003.

In this book the authors argue that an organization's project portfolio is its single most important asset for delivering on strategic and operational objectives.  To harness portfolios into an efficient, coherent whole, the authors propose a goal of alignment: better matching a company's portfolios with its objectives, since better aligned companies achieve greater investment returns.

* Binnendijk, A., Results Based Management in the Development Co-operation Agencies: A Review of Experience, DAC Working Party on Aid Evaluation, 2000, http://www.oecd.org/dataoecd/secure/14/29/31950852.pdf

This paper is based on a document review of the experiences and practices of selected OECD Member development co-operation agencies with establishing performance or results based management systems.  Covered in the review are the experiences of seven donor agencies establishing and implementing their results based management systems, comparing similarities and contrasting differences in approach.

Bryson, J. et al., Visible Thinking: Unlocking causal mapping for practical business results, Chichester, West Sussex: John Wiley & Sons Ltd, 2004.

This book shows how to create causal maps for individuals and groups to capture the power and broad applicability of causal mapping in a business, as well as a personal, context.  The process is illustrated through a series of real cases – from tackling personal problems to strategy-change issues in business, public, and not-for-profit organizations.  The cases are used to present a comprehensive set of process guidelines designed to help readers create their own action-oriented causal maps.

Canadian International Development Agency, RBM Handbook on Developing Results Chains: The Basics of RBM as Applied to 100 Project Examples, Results-Based Management Division, 2000, http://www.acdi-cida.gc.ca/INET/IMAGES.NSF/vLUImages/Performancereview6/$file/Full_report.pdf

This handbook aims to provide the basic concepts behind Results-Based Management (RBM) supported with 100 examples to better articulate what projects, programs or organizations hope to achieve.  The purpose of the guide is to promote a better understanding of the key concepts of RBM illuminated with examples, graphics, and tools that help readers build their confidence with regard to RBM.

Canadian Transportation Agency, Performance Measurement Framework, 2004, http://www.cta-otc.gc.ca/about-nous/excellence/performance/performance-eng.pdf

This framework's purpose is to provide a consistent approach for systematically collecting, analyzing, utilizing, and reporting on the performance of the Canadian Transportation Agency's programs and activities.  This document presents an overview of the framework, as well as performance measurement principles, the program management process, and key steps for measuring performance.

Government of Canada, Canada's Performance: Annual Report to Parliament, Treasury Board of Canada Secretariat, 2004.

The purpose of this report is to provide parliamentarians and Canadians with a whole of government perspective from which to view the plans, results, and resources reported by individual federal departments and agencies in their spring planning and fall performance reports.  The report is divided into six themes: (1) Canada's Place in the World; (2) Canada's Economy; (3) Society, Culture and Democracy; (4) Aboriginal Peoples; (5) The Health of Canadians; and (6) The Canadian Environment.

Government of Canada Privy Council Office, Regulatory Affairs & Orders in Council Secretariat: Glossary, 2001, http://www.tbs-sct.gc.ca/ri-qr/index-eng.asp

This glossary provides unofficial definitions of Privy Council Office (PCO) terms in order to facilitate users' understanding of PCO documents and information.

Government of South Australia, Triple Bottom Line, Department for Environment and Heritage, 2005, http://www.environment.sa.gov.au/sustainability/triple_bottom_line.html

This webpage provides an overview of Triple Bottom Line (TBL) reporting and its benefits, as well as measures taken by the Government of South Australia and departmental agencies in implementing the practice.

Industry, Science and Technology Canada, The Competitive Enterprise: An Executive's Guide to Investing in Advanced Manufacturing and Processing Technology, 1991.

This guide focuses on Advanced Manufacturing and Processing Technology (AMT), a key means by which Canadian manufacturers and processors can improve their performance.  It provides an inexpensive, comprehensive, step-by-step method to assist in identifying, evaluating, and implementing AMT.

Kaplan, R. and D. Norton, The Balanced Scorecard: Translating Strategy into Action, Boston, MA: Harvard Business School Press, 1996.

The Balanced Scorecard is a management system designed to channel abilities, energy, and knowledge toward achieving long-term strategic goals.  Encompassing current and future performance, Kaplan and Norton's method can be used in four categories to meet organizational objectives: financial performance, customer knowledge, internal business processes, and learning and growth.

Kaplan, R. and D. Norton, Strategy Maps: Converting Intangible Assets into Tangible Outcomes, Boston, MA: Harvard Business School Press, 2004.

This book is a blueprint for describing, measuring, and aligning intangible assets for superior performance, providing the missing link between strategy formulation and implementation.  Based on their work with more than 300 organizations spanning over a dozen years, the authors have created a new tool, strategy maps, that allow organizations to clarify and communicate their strategies to all employees, identify the key internal processes that drive strategic success, align investments in people and organizational capital for the greatest impact, and expose gaps in the strategies and take early corrective action.

Lebow, R. and R. Spitzer, Accountability: Freedom and Responsibility without Control, San Francisco, CA: Berrett-Koehler Publishers, Inc., 2002.

This book looks at the past fifty years of control-based initiatives that underachieved and over-promised.  It challenges conventional beliefs about the true value that TQM, MBO, incentive programs, personal improvement plans, quotas, systems thinking, performance reviews, and job descriptions have contributed to the organizational landscape.  It provides practical guidelines for transforming control-based operations into freedom-based work environments where managers take on the new role of Wise Counsels and employees design and fully own their jobs.

Maizlish, B. and R. Handler, IT Portfolio Management Step-By-Step: Unlocking the Business Value of Technology, Hoboken, NJ: John Wiley & Sons, Inc., 2005.

This book presents an approach to simplifying the process of achieving a rationalized and business-aligned IT portfolio.  It includes extensive coverage of the five-level IT portfolio maturity model, policies, principles, organizational roles, the IT life cycle, detailed stages of building the IT portfolio, advanced IT portfolio management topics, and the impact of future trends and technologies on IT portfolio management.

Millard, J., Best eEurope Practices: Final Project Report, European Communities, 2003, http://www.beepknowledgesystem.org/File/Beep_Final_Report.pdf

Beep (Best eEurope Practices) is a technical tool for identifying, capturing, evaluating, and learning from good practice in any field of interest.  This report provides an overview of the Beep objectives, methodology, and results.  Special focus is given to the main issues and policy and research recommendations arising out of an in-depth analysis of many of the over 270 cases in the Beep knowledge system.

* Office of the Auditor General of Canada, Implementing Results-Based Management: Lessons from the Literature, 2000, http://www.oag-bvg.gc.ca/internet/English/meth_gde_e_10225.html 

A follow-up to a review prepared by the Office of the Auditor General in 1996, this report is a concise synthesis of lessons learned from implementing results-based management in a variety of Canadian and international jurisdictions.  The first review, summarized in Annex A, focused on implementation, while this update also includes lessons learned on more operational issues such as development of indicators, data collection, analysis, monitoring, and reporting.

Organisation for Economic Co-operation and Development, Evaluation and Aid Effectiveness: Glossary of Key Terms in Evaluation and Results Based Management, DAC Working Party on Aid Evaluation, Development Assistance Committee, 2002, http://www.oecd.org/dataoecd/29/21/2754804.pdf

This document contains a glossary of terms relating to quality assurance, stakeholders, logical framework, results-based management, evaluation tools, and types of evaluations.

Plantz, M. et al., Outcome Measurement: Showing Results in the Nonprofit Sector, United Way of America, 1997, http://national.unitedway.org/outcomes/resources/What/ndpaper.cfm

This article describes the activities of non-profit agencies in relation to outcome management initiatives, discussing 30 lessons learned and seven key challenges to be overcome.  It also summarizes the history of performance measurement in the non-profit health and human services sector and defines the key concepts of outcome measurement.

Schacter, M., Results-based Management at the Water Cooler: Perspectives from the working level on RBM, Mark Schacter Consulting, 2004.

Based on opinion data gathered from approximately 100 public servants at a series of results-based management (RBM) workshops, this paper provides a window into perceptions of working-level officials about the implementation of RBM in Canada's public service.  Analysis of the data suggests that public-sector staff have six types of concerns about RBM implementation, two of which are predominant: high-level leadership for RBM and technical capacity to implement RBM.

Slevin, D. et al., Critical success factor analysis for information systems performance measurement and enhancement: A case study in the university environment, Elsevier Science Publishers, 1991.

Described in this study is the application of the Critical Success Factor (CSF) process as a performance measurement and enhancement device.  Measurement issues are discussed, along with the establishment of performance standards.  Case examples of actual behaviour change (performance improvement) due to the use of CSFs are given.  Finally, guidelines for the use of the CSF procedure in other Information Systems (IS) organizational contexts are discussed and prescriptive recommendations are presented.

Treasury Board of Canada Secretariat, Business Transformation Enablement Program – Strategic Design & Planning Methodology, 2004, www.tbs-sct.gc.ca/btep-pto/index_e.asp

This document describes the first release of the BTEP Design and Planning Methodology, which is the overall process methodology for business transformation.  It is intended to be used by business transformation teams responsible for producing transformation project deliverables.  The goal of BTEP is to enable coherent business design across the government with a formal, standards-based approach that will guide and expedite business transformation to meet the government's high-level business objectives.

Treasury Board of Canada Secretariat, Changing Management Culture: Models and Strategies to Make It Happen, 2003, www.tbs-sct.gc.ca/cmo_mfc/Toolkit2/GCC/cmc-eng.pdf

This guide presents a step-by-step approach to managing change, one that deputy ministers, heads of agencies, and their executive teams can follow when undertaking management reforms.  For illustration purposes, the guide focuses on Modern Comptrollership, but it is generic in nature and its approach can be applied to any effort to change management culture.

Treasury Board of Canada Secretariat, Companion Guide: The Development of Results-based Management and Accountability Frameworks for Horizontal Initiatives, 2002, www.tbs-sct.gc.ca/eval/tools_outils/comp-guide-eng.pdf

This guide was developed to complement the Guide for the Development of Results-based Management and Accountability Frameworks and provide federal managers with practical advice on how to develop effective RMAFs for horizontal initiatives.  It addresses the challenges of building an effective team that will draft the RMAF, covers the five main components of an RMAF, and provides a list of additional lessons learned and reference documents.

* Treasury Board of Canada Secretariat, An Enhanced Framework for the Management of Information Technology Projects, Project Management Office, Financial and Information Management Branch, 1996, www.tbs-sct.gc.ca/emf-cag/abu-ans/itp-pti/itp-pti00-eng.asp

This paper describes a proposed enhanced framework for the management of information technology projects in the federal government.  This enhanced framework is designed to ensure that government information technology projects fully meet the needs of the business functions they are intended to support, deliver all expected benefits, and are completed within their approved time, cost, and functionality.

* Treasury Board of Canada Secretariat, An Enhanced Framework for the Management of Information Technology Projects: Creating and Using a Business Case for Information Technology Projects, Project Management Office, Chief Information Officer Branch, 1998, www.tbs-sct.gc.ca/emf-cag/business-rentabilisation/business-rentabilisation-eng.asp

This document was designed to ensure that federal government IT projects fully meet the needs of the business functions they are intended to support and deliver all expected benefits, and are completed on time and within budget.  Moreover, it identifies the need for a business case analysis before a government IT investment can be approved.  The guide can be used as a planning tool for users to mark and monitor the factors that are crucial to implementing IT successfully.

* Treasury Board of Canada Secretariat, An Enhanced Framework for the Management of Information Technology Projects Part II – Solutions: Putting the Principles to Work, Chief Information Officer Branch, 1998, www.tbs-sct.gc.ca/emf-cag/abu-ans/ppw-slp/ppw-slp00-eng.asp

This document is a companion to Part I, which was approved and published in May 1996.  The purpose of the document is to facilitate implementation of the Enhanced Framework within federal government departments by providing an overview of the Enhanced Framework, identifying where and how to begin the process of implementation, outlining solutions to assist departments in applying the Framework, describing the roles and responsibilities of the key departmental players in project delivery, and providing guidance on how to get started.

Treasury Board of Canada Secretariat, Enterprise Value Management Outcome Management Practice Implementation Strategy, CIO Branch, Alignment and Stewardship, 2005.

This document outlines the recommended strategy for the establishment of an Outcome Management Practice within the Alignment and Stewardship Division of the CIO Branch, Treasury Board Secretariat.  The key objective of the Outcome Management Practice is to help the Alignment and Stewardship Division further evolve in its role and to support improved decision-making to ensure that the best value for the government enterprise is paramount in all choices made for Canada's portfolio of initiatives.

* Treasury Board of Canada Secretariat, Guide to Realizing Outcomes from Government of Canada Programs (Draft), 2004.

This guide presents the outcomes realization process, which is described as the set of activities for planning, managing, and realizing desired outcomes from initiatives.  Through value management, the guide provides a framework involving tools and techniques to proactively plan, manage, and monitor the realization of the outcomes of a change initiative.

Treasury Board of Canada Secretariat, Integrated Measurement Framework: Concept Paper (Draft), Chief Information Officer Branch, 2005.

This document presents ideas towards the development of an Integrated Measurement Framework (IMF) for the Chief Information Officer Branch (CIOB).  It presents the IMF vision, initial strategies, the initial IMF design, and the next steps in the IMF development.

Treasury Board of Canada Secretariat, Management Accountability Framework, President of the Treasury Board, 2003, www.tbs-sct.gc.ca/maf-crg/index-eng.asp

This document was developed to provide deputy heads and all public service managers with a list of management expectations that reflect the different elements of current management responsibilities.  It is intended to translate the vision of modern public service management into a set of management expectations.  The Framework focuses on management results rather than required capabilities, provides a basis of engagement with departments, and suggests ways for departments both to move forward and to measure progress.

Treasury Board of Canada Secretariat, Outcomes Management Realization: Service Canada Policy – Outcomes Sought, prepared for Service Canada, 2005.

This overview consists of an outline of the outcomes sought by Service Canada policy, as well as a summary of the four stages involved in outcomes management realization.

Treasury Board of Canada Secretariat, Results-Based Management and Accountability Framework of the Modern Comptrollership Initiative, 2003, www.tbs-sct.gc.ca/cmo_mfc/resources2/RMAF/RMAF.pdf

This document contains a profile of the Modern Comptrollership Initiative (MCI), guidance for ongoing performance measurement, in addition to evaluation and reporting strategies.  This Results-Based Management and Accountability Framework (RMAF) was developed as a tool for managers in departments, agencies, and at the Treasury Board Secretariat (TBS) to help in measuring and reporting on results being achieved through the MCI.

Treasury Board of Canada Secretariat, Results-Based Management in Canada: Country Report Prepared for the OECD Outcome-Focused Management Project, Planning, Performance and Reporting Sector, Comptrollership Branch, 2000, http://www.ppx.ca/NewsArchives/PDF/Result_Based_Management.pdf

The objective of this report is to describe how outcome goals are defined and used, and how progress towards them is measured in the Government of Canada.  The report contains sections relating to how departments and agencies integrate results-based management in policy formulation and implementation, concrete examples to illustrate aspects of results-based management, a comparison of the working terminology of the OECD with that of Canada, and questions on results-based management for the annual OECD Survey of Budgeting Development.

* U.S. Government, Balancing Measures: Best Practices in Performance Management, National Partnership for Reinventing Government, 1999, http://govinfo.library.unt.edu/npr/library/papers/bkgrd/balmeasure.html

This report represents an extensive undertaking to survey and interview agencies and companies for practices that contribute to improving service as well as business results.  The findings show that the process followed has not been exactly the same in every instance.  Balancing business results with customer, stakeholder, and employee information generally produces marked improvement in performance, service, and overall satisfaction.  The study partners report gains in efficiency, data tied to strategic goals and measurement systems, and improved relationships with employees and customers.

U.S. Government, Rating the Performance of Federal Programs, Office of Management and Budget, 2004, http://www.gpoaccess.gov/usbudget/fy04/pdf/budget/performance.pdf

This document gives background information on the Program Assessment Rating Tool (PART) of the U.S. federal government.  The PART is a systematic method of assessing the performance of program activities across the U.S. government.  As a diagnostic tool, the main objective of the PART review is to improve program performance.  The PART assessments help link performance to budget decisions and provide a basis for making recommendations to improve results.

* U.S. Government, Serving the American Public: Best Practices in Performance Measurement, National Performance Review, 1997, http://govinfo.library.unt.edu/npr/library/papers/benchmrk/nprbook.html

This report documents the Performance Measurement Study Team's findings, which are to be used as a tool for public and private leaders and managers in identifying and applying best-in-class performance measurement and performance management practices.  This intergovernmental benchmarking study identifies the processes, skills, technologies, and best practices that can be used by government to link strategic planning with performance planning and measurement by establishing and updating performance measures, establishing accountability for performance, gathering and analyzing performance data, and reporting and using performance information.

Appendix E: Outcome Management Definitions

These definitions are based in part on text from the documents TBS Guide to Realizing Outcomes from Government of Canada Programs (draft of March 2004; Footnote 18) and Evaluation and Aid Effectiveness: Glossary of Key Terms in Evaluation and Results Based Management (Footnote 19).

Accountability for Results:
The responsibility to report fairly and accurately on the attainment or non-attainment of outcomes, in addition to demonstrating that work has been conducted according to existing standards and/or agreements.
Assumption:
A condition for the realization of an outcome or of an initiative, over which the organization has no control.
Benefit:
Direct and indirect positive consequences resulting from an action.  Includes both financial and non-financial information.  (Footnote 20)
Benchmark:
A reference point or standard against which performance or outcomes can be measured.
Best practice:
This concept refers to a proven and reliable technique or methodology for accomplishing a task, formulated by studying business cases, case studies and highly successful organizations performing specific functions.
Final outcome:
The end result expected from an initiative.
Initiative:
A structured grouping of projects designed to produce clearly identified business results or outcomes.
Intermediate outcome:
A capability delivered by a project or a business impact resulting from a group of projects within the initiative.
Lessons learned:
Generalizations based on the evaluation of experiences with projects, programs, or policies including strengths and weaknesses that can apply to broader situations or other initiatives.
Logic Model:
The causal sequence for an intervention that stipulates the necessary sequence to achieve desired objectives, beginning with inputs and ending with outcomes, impacts, and feedback.
Outcome:
The expected result or impact (either positive or negative) at the end of an initiative, intervention or change.
Outcome Management:
A management strategy focused on the achievement of results and outcomes from a set of deliverables, rather than on the creation of those deliverables.
Performance:
The degree to which an intervention is operating with respect to standards and guidelines, or the extent to which it is achieving results in accordance with stated goals or plans.
Portfolio:
A collection of initiatives, programs, or projects.
Program:
A set of initiatives with a broad mandate to deliver value.
Project:
A group of activities concerned with delivering a defined capability based upon an agreed schedule and budget.
Result:
The outcome or impact of an intervention or change.  Results can be intended or unintended, as well as positive and/or negative.
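
To make the Logic Model and Outcomes Register definitions above more concrete, the following is a minimal illustrative sketch in Python, assuming a simple dataclass-based representation; the initiative, outcome names, owners, baseline and target values are hypothetical and are not drawn from the projects in this report.

```python
# Minimal sketch (hypothetical values): a results chain (logic model) and an
# Outcomes Register entry, plus a simple check of whether a target is attained.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class OutcomeEntry:
    """One hypothetical row of an Outcomes Register."""
    name: str
    owner: str                        # accountable outcome owner
    baseline: float                   # value before the initiative
    target: float                     # value the initiative aims to reach
    data_source: str                  # where the measure will be collected
    timeframe: str                    # when the target should be attained
    current: Optional[float] = None   # latest measured value, if any

    def attained(self) -> bool:
        """True when the latest measurement meets or exceeds the target."""
        return self.current is not None and self.current >= self.target


@dataclass
class ResultsChain:
    """A simplified logic model: inputs lead to outputs, then to outcomes."""
    initiative: str
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    intermediate_outcomes: List[OutcomeEntry] = field(default_factory=list)
    final_outcome: Optional[OutcomeEntry] = None


if __name__ == "__main__":
    # Hypothetical example of an online service initiative.
    chain = ResultsChain(
        initiative="Example online service",
        inputs=["project funding", "project team"],
        outputs=["web self-service channel deployed"],
        intermediate_outcomes=[
            OutcomeEntry(
                name="Share of transactions completed online",
                owner="Program director",
                baseline=0.10, target=0.40,
                data_source="Channel usage statistics",
                timeframe="Two years after launch",
                current=0.25,
            )
        ],
        final_outcome=OutcomeEntry(
            name="Client satisfaction score",
            owner="Deputy head",
            baseline=60.0, target=75.0,
            data_source="Annual client survey",
            timeframe="Three years after launch",
        ),
    )

    for outcome in chain.intermediate_outcomes:
        print(outcome.name, "- target attained:", outcome.attained())
```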
