Evaluation of the Information Systems Lifecycle Program

October 2017

1258-3-004 (ADM(RS))

Reviewed by ADM(RS) in accordance with the Access to Information Act. Information UNCLASSIFIED.

Acronyms and Abbreviations

ADM(IM)

Assistant Deputy Minister (Information Management)

ADM(Mat)

Assistant Deputy Minister (Materiel)

ADM(RS)

Assistant Deputy Minister (Review Services)

CA

Canadian Army

CAF

Canadian Armed Forces

CBSA

Canada Border Services Agency

CFTPO

Canadian Forces Task Plans and Operations

CIO

Chief Information Officer

CProg

Chief of Programme

C4ISR

Command and Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance Systems

CSNI

Consolidated Secret Network Infrastructure

DADD

Director Air Domain Development

DAOD

Defence Administrative Orders and Directives

DGEAS

Director General Enterprise Application Services

DLCI

Director Land Command and Information

DND

Department of National Defence

DNIMR

Director Naval Information Management Requirements

DRT

Defence Renewal Team

EITSM

Enterprise Information Technology Service Management

FOC

Full Operational Capability

FTEs

Full Time Equivalents

FY

Fiscal Year

GC

Government of Canada

L1

Level 1

IM

Information Management

IMB

Information Management Board

IS

Information Systems

IT

Information Technology

MCSC

Military Command Software Centre

NATO

North Atlantic Treaty Organization

OGDs

Other Government Departments

PAA

Program Alignment Architecture

PMB

Programme Management Board

PAD

Project Approval Directive

RFC

Request for Change

RCAF

Royal Canadian Air Force

RCMP

Royal Canadian Mounted Police

RCN

Royal Canadian Navy

SIPRNet

Secret Internet Protocol Router Network

SSC

Shared Services Canada

TB

Treasury Board

TBS

Treasury Board Secretariat

Overall Assessment

The IS Lifecycle Program continues to fulfill a critical need for CAF military operations and training and is supported by TB policy. The program is also well aligned with Government of Canada (GC) and departmental priorities.

Effectiveness has been a challenge in the past, but numerous ongoing initiatives are expected to improve performance. Several critical areas still require improvement, and recommendations are made accordingly. Sustained efforts and adequate performance monitoring are required.

The DND/CAF has generally invested fewer resources in the development of information technology than other similar operational departments.

Executive Summary

This report presents the results of an evaluation of the Information Systems (IS) Lifecycle Program, which is sub-program 4.4 of the Department of National Defence (DND) and the Canadian Armed Forces (CAF) Program Alignment Architecture (PAA). The evaluation was undertaken between June 2015 and November 2016 in accordance with the DND/CAF Five-Year Evaluation Work Plan.

The former Treasury Board (TB) Policy on Evaluation was rescinded on July 1, 2016 and replaced with a new Policy on Results. Since the new Policy on Results was effective only as of July 1, 2016, this evaluation was carried out in accordance with the former TB Policy on Evaluation (2009).

The purpose of the evaluation was to examine program relevance and performance for the fiscal years (FY) 2010/11 to 2014/15 and to inform future management decisions related to program and service delivery and resource allocation.

Description

The IS Lifecycle Program is the DND/CAF’s program for life-cycling information systems (information technology infrastructure and applications) for military use. The lifecycle process includes technology acquisition/development, production, maintenance, recapitalization, and disposal and/or retirement. The program delivers information systems that meet the operational requirements of user clients (those that operationalize the systems) and addresses capability gaps and deficiencies, while also ensuring adequate support services are provided. The overarching objective of the IS Lifecycle Program is to ensure that information systems are available in the quantity, mix and condition to enable military operations, readiness and training activities.

Over the evaluation period (FY 2010/11 to FY 2014/15) the IS Lifecycle Program was delivered in a decentralized manner, with Assistant Deputy Minister (Information Management) (ADM(IM)) as the functional authority and main service provider. The Canadian Army (CA), the Royal Canadian Air Force (RCAF), and the Royal Canadian Navy (RCN) have also been delegated responsibility as service providers, mostly to meet the needs of their own organizations, but have increasingly taken on responsibilities for providing user client support to others. Assistant Deputy Minister (Materiel) (ADM(Mat)) provides procurement support to all service providers for major capital projects. Within the past five years, Shared Services Canada (SSC) has also played a role as an external service provider but, for the purposes of this evaluation, is not considered a main stakeholder of the program because it is an organization external to the DND/CAF.

The service providers are also the main clients of the products and services produced by the program.

In FY 2014/15, estimated expenditures attributed to the IS Lifecycle Program were approximately $800 million, including the cost of military and civilian salaries for 2,627 Full Time Equivalents (FTEs) working under the program. Program expenditures in FY 2014/15 represented 4.3 percent of total departmental expenditures.

Key Findings and Recommendations

Relevance

The IS Lifecycle Program remains very relevant. Technology assets are critical to the success of CAF military operations and training activities, as well as the corporate activities that support them. The IS Lifecycle Program is needed to meet the specific technological requirements of the department. Within the GC, no other organization exists that could completely deliver the level of service and products required of such a large, complex, and mixed civilian and military organization.

The objectives and activities of the IS Lifecycle Program are well aligned with the priorities and strategic direction of the GC and of the DND/CAF, as well as aligned with the roles and responsibilities of a federal department. TB policy makes it clear that the DND/CAF has the authority to manage its own IT program. The transfer of resources from the DND/CAF to SSC in FY 2011/12 and in FY 2012/13 created some confusion with regards to roles and responsibilities between the two departments.

Performance

In order to assess the overall performance of the IS Lifecycle Program, the evaluation examined the extent to which the DND/CAF approach met expectations for the timeliness of project delivery and the delivery of client requirements, the extent of systems interoperability and systems integration, the technical condition (“health”) and availability of information systems, and service management. It also assessed the visibility, oversight and control over information systems; the ability to meet operational requirements and fill capability gaps/deficiencies; and the consolidation of systems and services. The impacts on the achievement of expected outcomes resulting from the transfer of resources to SSC were assessed, as well as the efficiency and economy of the program.

Overall, program effectiveness has been a challenge in the past and has impacted operations; however, numerous initiatives currently being implemented by the department are expected to improve program performance. These initiatives have led to some improvements over the last five years, and the program is now moving in the right direction. For example, systems interoperability and systems integration are now being prioritized by the department, and stakeholders acknowledged improvements in both areas. The technical condition of enterprise applications has improved significantly in recent years, and progress is also being made to improve service delivery to user clients.

With regard to efficiency, a comparison of resource spending on IT with other operational-type departments indicates that the DND/CAF has generally invested fewer resources in the development of information technology than other similar departments. There was also evidence that the department is making efforts to improve the overall efficiency of this program, particularly through Defence Renewal Team (DRT) initiatives,[1] in order to strategically reinvest resources in high priority areas. Sustained efforts and adequate resources are required for all initiatives to be completed successfully and for program effectiveness to improve.

Although initiatives are ongoing in some areas, there are several challenges for which initiatives do not currently exist. These challenges are summarized in Table 1 along with their associated recommendations. The evaluation made recommendations only for challenges for which departmental initiatives do not currently exist and for which achievement of the outcome is strategically critical to an effective program. A recommendation is also made to ensure continued prioritization of existing initiatives.

Key Findings

Recommendations

Effectiveness

Major capital information systems projects are not meeting departmental standards for timely delivery.

There is a need for the department to investigate options for a more flexible process for delivering on major capital information system projects, specifically targeting reduced timelines from project definition to Full Operational Capability.

For major capital projects, there may not be sufficient stakeholder consultations conducted with the user client community to ensure operational requirements are being met.

There is a need to ensure that sufficient stakeholder consultations are conducted with the user client community in the delivery of major capital IS projects to ensure operational requirements are met.

A baseline to assess progress toward improving the timeliness and quality of user client services could not be established due to a lack of historical data and client satisfaction feedback. The program could benefit from the collection of relevant service management performance data to adequately assess the extent to which current initiatives have a positive impact on the provision of support to user clients.

There is a need to collect user client satisfaction feedback on the timeliness and quality of services.

The visibility, oversight and control of IS investments and decision-making within the environmental commands require improvements.

There is a need for a more robust process for oversight and IS lifecycle management in order to implement good practices. RCN, RCAF and CA stakeholders should collaborate with ADM(IM) in the development of that process.

The department is undertaking numerous initiatives that need to be completed to improve the effectiveness of the IS Lifecycle Program, particularly in the areas of systems interoperability, systems integration, technical condition, service management, governance, and consolidation of products and services.

There is a need to ensure the completion of the numerous initiatives aimed at improving the program.

The transfer of resources to SSC is impacting some IS Lifecycle Program activities.

There is a need for the DND/CAF to continue to work with SSC partners to finalize and clearly delineate roles and responsibilities and set service level standards in a timely manner.

Efficiency and Economy

The department has not accurately tracked expenditures related to the IS Lifecycle Program. As such, it was difficult to determine the extent to which the department has used cost-effective means in the production of outputs.

There is a need to improve tracking of program expenditures, particularly at the output level (i.e., project management costs, user support costs, in-service support costs).

A Performance Measurement Framework does not exist to sufficiently capture the performance of the IS Lifecycle Program as a whole.

There is a need to develop a Performance Measurement Framework that incorporates performance indicators for all stakeholders delivering the program and targets that consider industry best practices and standards or comparisons with other federal institutions. Tracking of performance should also be prioritized.

Table 1. Summary of Key Findings and Recommendations

Table Summary

This table provides a summary of the evaluation’s key findings and the recommendations made in relation to them. The table has two columns and eleven rows. The left column lists the key findings; the right column lists the corresponding recommendations. The first set of findings and recommendations relates to the effectiveness of the program. The second set relates to the efficiency and economy of the program.

Note: Please refer to Annex A—Management Action Plan for the management responses to the ADM(RS) recommendations.

1.0 Introduction

This report presents the results of an evaluation of the Information Systems (IS) Lifecycle Program, which is sub-program 4.4 of the Department of National Defence (DND) and the Canadian Armed Forces (CAF) Program Alignment Architecture (PAA). This program has not been previously evaluated.

The evaluation was undertaken by Assistant Deputy Minister (Review Services) (ADM(RS)) between June 2015 and November 2016 on behalf of the Chief of the Defence Staff and the Deputy Minister of the DND and in accordance with the DND/CAF Five-Year Evaluation Work Plan. The evaluation addresses the relevance and performance of the program for the period of FY 2010/11 to FY 2014/15, as directed in the 2009 Treasury Board (TB) Directive on the Evaluation Function.

The 2009 TB Policy on Evaluation was rescinded on July 1, 2016 and replaced with a new Policy on Results. Since the new Policy on Results came into effect only as of July 1, 2016, this evaluation was carried out in accordance with the former TB Policy on Evaluation and its associated Directive and Standard.

An Evaluation Advisory Committee, representing all the main stakeholders, was convened to ensure that the evaluation was able to incorporate subject matter expertise and the input of program staff to enable the production of useful evaluation findings and recommendations. The members of the Evaluation Advisory Committee are listed in Annex B – Evaluation Methodology. The findings and recommendations in this evaluation may be used to inform management decisions related to program delivery and resource allocation, and will serve as a baseline for future evaluations.

1.1 Context for the Evaluation

During the conduct of this evaluation, the Defence Information Management/Information Technology (IM/IT) Program was undergoing transformation as part of the Defence Renewal initiative.[2] A new service delivery model, from decentralized to federated,[3] was being implemented. These changes were considered by the evaluation, where applicable, but were still being realized when the evaluation concluded.

Another aspect to be taken into account within the context of this evaluation is the department’s transfer of IT resources and responsibilities to Shared Services Canada (SSC). Following two Orders-In-Council[4] established in FY 2011/12 and FY 2012/13, the DND/CAF transferred $318 million and 761 positions (538 civilian and 223 military) to SSC. With the transfer of these resources, some of the IS Lifecycle Program activities were also transferred and are now the responsibility of SSC. SSC is now responsible for delivering email, data centre, and network services, as well as purchasing IT equipment.

1.2 Program Profile

1.2.1 Program Description

The IS Lifecycle Program is the DND/CAF’s program for life-cycling technology (referred to as information systems)[5] for military use. The lifecycle process includes technology acquisition/development, production, maintenance, recapitalization, and disposal and/or retirement.[6] Activities are similar to those conducted by other organizations that lifecycle IT; however, the activities have been organized under four separate PAA sub-sub programs:

  • 4.4.1 – Info Systems – Portfolio Management;
  • 4.4.2 – Info Systems – Acquisition, Development and Deployment;
  • 4.4.3 – Info Systems – System Management and User Support; and
  • 4.4.4 – Info Systems – Strategic Coordination, Development and Control.

The Logic Model at Annex C, developed in consultation with the Evaluation Advisory Committee members, provides details of the specific activities undertaken within each sub-sub program. The Logic Model also describes the outputs (systems and services) produced and the expected outcomes of the program.

Through major and minor projects, the IS Lifecycle Program produces information systems that support command and control, communications, computers, intelligence, surveillance and reconnaissance (commonly referred to as C4ISR). Systems produced include: applications, networks and architectures; system management technologies;[7] security technologies;[8] and distributed technologies.[9] In addition, services are also provided to user clients of the systems by various DND/CAF service providers, including engineering and break/fix services, as well as various service desks (or service centres) providing technical and functional support. Engineering services are required to proactively maintain the functionality of systems by repairing, testing, upgrading or enhancing systems. Meanwhile, the service centres respond to unplanned interruptions of services (referred to as “incidents”) or specific requests from user clients for information, advice, or access to an IT service (referred to as “service requests”).[10]
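
The distinction between incidents and service requests determines how a service desk routes work. The following minimal sketch illustrates that distinction; the ticket model, field names and sample entries are hypothetical and are not drawn from any actual DND/CAF or EITSM tooling.

    from dataclasses import dataclass
    from enum import Enum


    class RequestType(Enum):
        # An incident is an unplanned interruption of a service; a service
        # request asks for information, advice, or access to an IT service.
        INCIDENT = "incident"
        SERVICE_REQUEST = "service_request"


    @dataclass
    class ServiceDeskTicket:
        ticket_id: str
        request_type: RequestType
        summary: str
        affected_system: str


    def needs_break_fix(ticket: ServiceDeskTicket) -> bool:
        # Incidents trigger break/fix handling; service requests follow a
        # routine fulfillment process instead.
        return ticket.request_type is RequestType.INCIDENT


    # Example: an outage is logged as an incident; an access request is a service request.
    outage = ServiceDeskTicket("T-0001", RequestType.INCIDENT, "Application unavailable", "Monitor MASS")
    access = ServiceDeskTicket("T-0002", RequestType.SERVICE_REQUEST, "Grant user access", "Monitor MASS")
    assert needs_break_fix(outage) and not needs_break_fix(access)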

This program does not include activities related to lifecycle management of IS for corporate use, such as the implementation of financial and human resource systems within the department which support internal services functions only. These expenditures are captured separately under PAA sub-program 6.7 – Information Technology and represent only one percent of the total IT expenditures reported in the PAA in FY 2014/15.

1.2.2 Stakeholders (Service Providers, User Clients and Other Clients)

The main stakeholders of the IS Lifecycle Program can be divided into three categories:

  1. Service Providers. A service provider is any organization that employs resources to produce and sustain IS capabilities and/or provide services to user clients. This includes service providers both internal and external to the DND/CAF. Internal service providers can be further categorized as major service providers (Level one (L1) organizations that deliver services to other L1 organizations) and minor service providers (L1 organizations that deliver services mostly to their own L1 organization).
  2. User Clients. A user client is any organization that operationalizes the information systems and receives the services produced by the IS Lifecycle Program to conduct its business.
  3. Other Clients. Other clients can include any organization that has an interest in the capabilities produced by the IS Lifecycle Program, either because they sponsor or because they direct the projects that produce the systems. These clients may not necessarily be user clients. Examples include Chief of Force Development and some ADM(IM) level two organizations.

Within the IS Lifecycle Program, the major service providers of the program are also user clients, and include: ADM(IM), CA, RCN and RCAF.

Chief of Military Personnel has also increasingly taken on responsibilities as a service provider to other L1s across the department. Other L1s also provide services to their own organizations; however, expenditures specifically attributed to the IS Lifecycle Program by these other L1s are relatively minor.

The specific level two and level three organizations responsible for delivering the service and their responsibilities are listed in Annex E – IS Lifecycle Program Major Service Providers. As user clients, the CA, RCAF, RCN and the Director General Information Management Operations (within ADM(IM)) operationalize information systems to directly support military operations.

Over the past five years (2010/11 to 2014/15), the program has been delivered in a decentralized manner, with ADM(IM) as the functional authority[11] and a major service provider. Meanwhile, the environmental commands (CA, RCN and RCAF) have been delegated responsibility as service providers, mostly to meet the needs of their own organizations, but have recently taken on additional responsibilities for providing user client support to other L1 organizations. All service providers follow functional direction and guidance provided by ADM(IM). ADM(Mat) provides procurement support to all service providers for major capital projects.

Further detail on the roles, responsibilities, and authorities of the various stakeholders of this program is provided in Section 2.1 – Alignment with Roles and Responsibilities.

1.2.4 Objectives

The ultimate objective of the IS Lifecycle Program is to ensure that information systems (IT infrastructure and applications) are available in the quantity, mix and condition to enable military operations, readiness and training activities, and other defence services provided under the direction of the Government of Canada (GC).

The expected immediate and intermediate outcomes of the program are provided in the Logic Model in Annex C.

1.3 Evaluation Scope

1.3.1 Coverage and Responsibilities

As described in the introduction, the evaluation focused on the relevance and performance of the IS Lifecycle Program from FY 2010/11 to FY 2014/15. This included all activities charged to PAA sub-program 4.4 – IS Lifecycle and its four sub-sub programs. Minor and major capital IS projects were in scope, as were all IS lifecycle activities conducted by ADM(IM), CA, RCN and RCAF service providers. The activities of ADM(Mat) (Directorate of Electronic Systems Procurement) were considered important, as they provide procurement support to major capital IS projects managed by ADM(IM), but were not specifically evaluated. The significant impacts (positive and/or negative) of the transfer of activities to SSC on the IS Lifecycle Program were also assessed to a limited extent. SSC’s activities and outputs are not subject to internal audits or evaluations conducted by the DND/CAF, but are considered an external influence on the program.

The evaluation did not focus on lifecycle management of systems for corporate use, an “internal service” normally charged to PAA 6.7 – Information Technology. It is understood that systems specifically produced in 4.4 – IS Lifecycle to enable military objectives are also sometimes used corporately. Information management activities (e.g., records keeping) were not evaluated, as this is a very distinct set of activities charged to PAA 6.6 – Information Management.

IS lifecycle activities conducted by Director General (DG) Land Equipment Program Management, DG Maritime Equipment Program Management, and DG Aerospace Equipment Program Management were scoped out, as these specific level two organizations were the focus of other past evaluations. Governance and the role of ADM(IM) as the Defence Chief Information Officer (CIO)[12] were also not a particular focus, as this is the subject of other defence transformation initiatives, such as Defence Renewal Team (DRT) initiative 3.3.

1.3.2 Resources

In FY 2014/15, expenditures attributed to the IS Lifecycle Program were approximately $800 million, including the cost of military and civilian salaries for 2,627 Full Time Equivalents (FTEs) working under the program. Program expenditures in FY 2014/15 represented 4.3 percent of total departmental expenditures. Financial data for other fiscal years is not presented due to challenges with the accuracy of attributions to the PAA in previous years. For example, RCAF expenditures as a percentage of total program spending may be inflated by as much as $30 million due to inaccurate attributions to PAA 4.4, which only became evident after the data collection phase of the evaluation; the exact figure is not known. Other L1s may also have made errors in attribution, but the discrepancy was greatest for RCAF attributions. The discrepancy supports the finding made in Section 2.5 of the report regarding the inaccuracy of the financial data and the need to better track expenditures. As illustrated in Table 2, expenditures are distributed among the various service providers, reflecting the decentralized model of service delivery. As the main service provider, ADM(IM) spends the majority of the program resources. Expenditures attributed to ADM(Mat) are those related to procurement services for major capital projects managed by ADM(IM).

PAA 4.4 – FY 2014/15 Total Annual Expenditures $801,431,686
Total FTEs (excluding Reserves) 2,627
ADM(IM) 47.1%
ADM(Mat) 12.1%
RCAF[13] 9.9%
CA 5.5%
RCN 3.6%
Other L1s[14] 21.6%

Table 2. Relative Distribution of Program Resources in FY 2014/15—Percentage Breakdown by L1. (Source: DRMIS.)

Table Summary

This table provides a summary of the allocation of total annual expenditures under PAA 4.4 for FY 2014/15. It has two columns. The column on the left contains descriptive text, while the column on the right contains the corresponding figures. Row one states the total annual expenditures of PAA 4.4 for FY 2014/15: $801,431,686. Row two states the total number of Full Time Equivalents (excluding Reserves) in FY 2014/15: 2,627. The six rows that follow illustrate the percentage distribution of program expenditures by L1 organization.

Under PAA 4.4, ADM(IM) accounted for 47.1 percent of expenditures – the highest percentage. ADM(Mat) held the second highest share at 12.1 percent, followed by the RCAF with 9.9 percent, the CA with 5.5 percent, the RCN with 3.6 percent, and all other L1s with a collective 21.6 percent.

Table 3 illustrates the resources used in FY 2014/15 broken down by PAA sub-sub program.

PAA Sub-Sub Program $ Spending % Total Program Spending FTEs

4.4.1 Info Systems – Portfolio Management $246,338,578 30.9% 228
4.4.2 Info Systems – Acquisition, Development, and Deployment $174,380,767 21.8% 449
4.4.3 Info Systems – System Management and User Support $218,916,017 27.3% 1,617
4.4.4 Info Systems – Strategic Coordination, Development and Control $160,670,611 20.0% 333
TOTAL $801,431,686 100% 2,627

Table 3. Financial and Human Resource Allocations by PAA Sub-sub program (FY 2014/15). (Source: DRMIS)

Table Summary

This table illustrates the resources used in FY 2014/15, broken down by PAA sub-sub program. The first column states the PAA sub-sub program. The second column, to its right, designates the spending in dollars. The third column is the percentage of total program spending, and the fourth and final column is the number of Full Time Equivalents. Readers can determine the financial and human resource allocations for each PAA sub-sub program by reading left to right.

1.3.3 Issues and Questions

In accordance with the TB Directive on the Evaluation Function (2009),[15] the evaluation addresses the five core issues related to relevance and performance. An evaluation matrix listing each of the evaluation questions, with associated indicators and data sources, is provided at Annex D. The methodology used to gather evidence in support of the evaluation questions can be found at Annex B.

2.0 Findings and Recommendations

2.1 Relevance—Alignment with Federal Roles and Responsibilities

Key Finding 1: The activities undertaken within the IS Lifecycle Program are aligned with the roles and responsibilities of a federal department as defined by Treasury Board.

Through TB policy, the DND/CAF has been delegated the responsibility for the management of an IT program within the department and is fulfilling this role on the operational side through the delivery of the IS Lifecycle Program. The Treasury Board Secretariat (TBS) Chief Information Officer Branch has been assigned the role of functional authority at the federal level and issues directives and associated standards in the areas of IT governance and IT strategies to all federal departments. The TB Policy on Management of Information Technology (2007) is the instrument which assigns the Deputy Minister of the DND/CAF the responsibility for the effective management of IT within the department. TB has also issued a Directive on Management of Information Technology (2009) to provide guidance to ensure that the DND/CAF IT program is consistent with IT management processes across the GC.

Within the DND/CAF, the management of IS has been conducted in a decentralized manner, with ADM(IM) delegated the role of CIO and Functional Authority for IT and IT security within the department.[16] As the Functional Authority, ADM(IM) issued Defence Administrative Orders and Directives (DAOD) 6002-1 – Management of Information Technology, which establishes the roles and responsibilities of other L1s for the management, delivery and support of IT within their own organizations. In this decentralized model, L1s and Environmental Commands are responsible for overseeing their organization’s compliance with policies, directives and standards provided by the Functional Authority; integrating IT solutions and services into the enterprise architecture; informing ADM(IM) of IT challenges and needs; and monitoring and reporting results to ADM(IM) for assessment. The decentralized model, with some central oversight provided by ADM(IM), ensures the DND/CAF fulfills its roles and responsibilities in accordance with TB policy.

Although roles and responsibilities, as specified in the 2007 TB policy, make it clear that the DND/CAF has the authority to manage its own IT program, the creation of SSC in 2011 and the transfer of resources from the DND/CAF to SSC in 2012 have caused some confusion. The nature of the IT infrastructure, applications and networks required to support military operations (such as command and control systems, and classified systems that connect to allied networks) made the split of roles and responsibilities between the two departments challenging. As a result, SSC and the DND/CAF have yet to reach an agreement on the division of roles and responsibilities. Findings from an ADM(RS) audit in 2015 suggested that, as of March 31, 2014, the only document that had been signed by SSC and the DND/CAF was a Business Arrangement that describes the ongoing relationship between SSC and partner organizations in general terms. The evaluation confirmed that this is still the case, as recent attempts to sign an Operating Protocol describing roles and responsibilities between the two departments have not been successful.

While a formal agreement was being negotiated, the DND/CAF issued interim direction in 2013 to clarify the roles and responsibilities of DND/CAF organizations. Many of the items listed in the interim direction were labelled a “shared responsibility” between the DND/CAF and SSC. This is problematic because “shared responsibility” does not make either party specifically responsible or accountable for some critically important IT functions. Adding to the confusion is the fact that neither the TB policies nor the DND/CAF DAODs have been updated to reflect the change of roles and responsibilities that resulted from the creation of SSC. Although higher level DAODs (6000-0 and 6002-0) were refreshed in 2014, lower level DAODs (DAOD 6002-1 – Management of Information Technology and DAOD 6002-9 – Information Technology Asset Management) have not been refreshed since 2012 and 2013, respectively, and do not reflect the split of responsibilities between the DND and SSC. Evaluation interviews revealed that confusion and delays persist, and there is evidence this is negatively impacting operations on both the corporate and military sides, as detailed in Section 2.4 of this report.

2.2 Relevance—Continued Need for the Program

Key Finding 2: There continues to be a need for the IS Lifecycle Program.

The need for the DND/CAF to continue to operate its own IS Lifecycle Program is supported by evidence of current and future need for the program.

Government institutions are not immune to the overall increased dependence on technology and can no longer operate efficiently without it. As dependence on technology increases, so does the need for systems security and user support that is provided by an integrated program. The IS Lifecycle Program exists within the DND/CAF to meet the specific technological needs of the department.

Technology assets are critical to the success of CAF military operations and training activities, as well as the corporate activities that support them. Such assets are used to collect, store and retrieve information and disseminate it to the right people, at the right time, so that the right decisions can be made in a timely and secure manner. On the military side, information systems produced by the program enable command and control, communications, computers, intelligence, surveillance and reconnaissance (C4ISR) activities. This enables commanders to achieve information and decision superiority by collecting and disseminating the most relevant and accurate information in a timely and secure manner. Technologies produced by the program also contribute to readiness through applications and networks that allow force generators to train and keep individual and unit readiness information current (e.g., Monitor MASS, Defence Resource Management Information System, Human Resources Management System). On the corporate side, technologies provide the business intelligence necessary to make strategic decisions that are important to support military operations and to fulfill obligations to safeguard information assets.

Demand for the products and services produced by the program has also remained high. Major and minor projects managed by all stakeholders within the IS Lifecycle Program produce technological solutions of high business value that directly enable existing military capabilities and/or fill capability deficiencies. ADM(IM), for example, manages an average of 20 major capital projects a year worth approximately $2.28 billion that produce common or joint capabilities for use by multiple stakeholders across the department. Other program stakeholders strategically manage minor projects with sufficient flexibility to meet their specific operational needs.

Demand for user support delivered by the DND/CAF service desks has also remained high, despite the transfer of some user support responsibilities to SSC. Within the DND/CAF, user support is delivered through various service desks managed by CA, RCN, RCAF, and ADM(IM) service providers, with central management by ADM(IM). In FY 2014/15, these service desks responded to approximately 397,000 requests for user support (incidents and service requests).[17]

Within the GC, no other organization exists that could completely deliver the level of IS services and products required of such a large, complex and mixed civilian and military organization that operates around the clock. Since information systems are critical to the success of operations, the risk is too high to allow another organization to have complete control or influence over the DND/CAF information systems and services. In addition, there are security considerations regarding contracting out services to an external organization, particularly since DND’s systems are interoperable with North Atlantic Treaty Organization (NATO) and Five Eyes[18] systems.

2.3 Relevance—Alignment with Priorities and Strategic Direction

Key Finding 3: The Information Systems Lifecycle Program objectives and activities are well aligned with the priorities and strategic direction of the GC and of the DND/CAF.

In federal government policy, the GC committed to ensuring that the DND/CAF has the tools and capabilities (including technological tools) it needs to deliver on its mandate. The DND is called upon to derive maximum benefit from technology and to ensure that the CAF continues to be a state-of-the-art military. The IS Lifecycle Program plays a key role in facilitating this goal, as it is through this program that the department ensures the availability of technology to enable military operations and ensure success in meeting the departmental mandate. Furthermore, IS Lifecycle Program activities also ensure the DND/CAF is making optimal use of new technologies to “work better and smarter,” a priority that is also supported by GC initiatives, such as Blueprint 2020, to transform and modernize federal departments.[19]

As previously mentioned, TB provides guidance to departments through policy, directives and standards on how IT should be managed. Within the DND/CAF, the development of the DND/CAF IM/IT policy and the development of departmental IT plans ensure alignment with GC strategic direction. The 6000 series of DAODs (6000-0 to 6003-3) form the basis for IT governance and management within the DND/CAF, as they are based upon TB policies and directives. The development of an annual departmental IT plan is also required by TB and further ensures alignment with GC direction. To fulfill TB mandated requirements, ADM(IM) goes through an annual process of analyzing and validating planned IT spending of all L1s to ensure continued alignment across the department.

Over the last five years, additional processes/mechanisms have also been implemented to ensure strategic alignment, such as: the shared responsibility of the Vice Chief of the Defence Staff and ADM(IM) for the governance of IT investment priorities; the dissemination of strategic documents providing functional planning guidance[20] to all L1s; and the Defence Renewal Team initiative to increase the visibility of ADM(IM), as the Functional Authority and CIO, through a new governance model.

2.4 Performance—Achievement of Expected Outcomes (Effectiveness)

The effectiveness of the IS lifecycle program was assessed against the following 10 areas, which align directly with the expected outcomes described in the Logic Model in Annex C:

  1. Timeliness of project delivery
  2. Delivery of client requirements
  3. Systems interoperability
  4. Systems integration
  5. Technical condition (“health”) and availability of information systems
  6. Service management (services to user clients)
  7. Visibility, oversight, and control over information systems
  8. Ability to meet operational requirements and fill capability gaps/deficiencies
  9. Consolidation of systems and services
  10. Impacts on the achievement of expected outcomes resulting from the transfer of resources to SSC

The assessment of performance in these areas is summarized in sections 2.4.1 to 2.4.11.

Ensuring IS security (or IT security) was also an expected outcome in the Logic Model, although the evaluation did not specifically assess performance in this area. IT security is an area continuously audited by ADM(RS) to identify and mitigate risk. Numerous ADM(RS) audits have been conducted over the last five years and have provided specific recommendations on areas for improvement. The timely implementation of audit recommendations is expected to contribute to continual improvements to systems security.

2.4.1 Timeliness of Project Delivery

Key Finding 4: Major capital information systems projects are not meeting departmental standards for timely delivery.

The fast pace of technology evolution needs to be considered in assessing the timely delivery of information systems projects. A literature review suggests that technology doubles its capability every 12 to 18 months.[21] At that rate, technological capability after five years is roughly 16 times the original capability, and after six years roughly 32 times. Every year that passes therefore significantly increases the potential for technology to become obsolete. For this reason, the timely delivery of IS projects is critical.
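
As a rough arithmetic check of these figures (a minimal sketch; the 15-month doubling period is an assumed value chosen from within the cited 12-to-18-month range):

    % Capability C(t) after t months, given original capability C_0 and doubling period T:
    \[ C(t) = C_0 \cdot 2^{t/T} \]
    % With T = 15 months:
    %   5 years: C(60) = C_0 \cdot 2^{60/15} = C_0 \cdot 2^{4} = 16\,C_0
    %   6 years: C(72) = C_0 \cdot 2^{72/15} \approx 28\,C_0
    % The cited 32-fold figure at six years corresponds to T of roughly 14.4 months, still within range.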

Within the DND/CAF, all major capital projects are treated equally. The fast pace of technology evolution is not uniquely considered for IS projects and, therefore, all major capital projects are subject to the same standard timelines for delivery. The Project Approval Directive (PAD) targets the delivery of all major capital projects, including information systems projects, within an average of seven years – from project definition to Full Operational Capability (FOC).

The evaluation found that the average length of major capital information systems projects from the definition phase to FOC was 9.6 years, more than two and a half years over the PAD standard. This average is based on the length of the 12 major capital information systems projects closed during the evaluation period (2010/11 to 2014/15). The average length could be longer, as this average does not include projects that are still open, some of which have been open for over 16 years.[22]

The evaluation also found that delays in the identification and options analysis phase of the project were not a key contributor to the long duration of major capital information systems projects. The variance between the planned project definition approval date and the actual project definition approval date for major capital information systems projects was only 6.8 months,[23] suggesting that the delays are occurring after the projects have received approval to enter the project definition phase.

There are also indications that original timelines for major capital information systems projects are often being extended, lengthening the duration of projects, even if they are considered to be “on schedule.” In the 2014/15 Departmental Performance Report, the department reported that 89 percent of information systems capital projects were on schedule and that no projects were delayed over three months. This measure of project timeliness does not take into account approved timeline extensions. The Capability Investment Database milestone data suggests that eight out of 10 major capital IS projects closed in the last five years had approved timeline extensions.[24] Analysis of the variance between planned FOC and actual FOC of seven closed projects with available data indicates that user clients could be waiting an additional three years for the capability to be delivered due to timeline extensions.
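
The schedule arithmetic described above reduces to simple date differences over project milestone records. A minimal sketch follows; the sample projects and field layout are illustrative assumptions only and are not drawn from the Capability Investment Database.

    from datetime import date

    # Each record: (project name, definition approval date, planned FOC, actual FOC).
    projects = [
        ("Project A", date(2003, 6, 1), date(2010, 6, 1), date(2013, 3, 1)),
        ("Project B", date(2005, 1, 1), date(2012, 1, 1), date(2014, 9, 1)),
    ]

    PAD_STANDARD_YEARS = 7.0  # PAD target: project definition to FOC within seven years


    def years_between(start: date, end: date) -> float:
        return (end - start).days / 365.25


    # Average definition-to-FOC duration across closed projects.
    durations = [years_between(approved, foc) for _, approved, _, foc in projects]
    average = sum(durations) / len(durations)
    print(f"Average duration: {average:.1f} years (PAD standard: {PAD_STANDARD_YEARS} years)")

    # Schedule slippage: how far actual FOC fell beyond the planned FOC.
    for name, _, planned, actual in projects:
        print(f"{name}: FOC slipped {years_between(planned, actual):.1f} years beyond plan")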

The long duration of IS projects has resulted in at least three major negative impacts. First, the evaluation found evidence that some projects have produced systems that have not been used in operations due, in part, to the delivery of technology that may have been outdated by the time the project was delivered.[25] Second, evidence suggests that additional resources were spent by user clients to implement temporary solutions while waiting for the permanent solution to be delivered. The Functional Assessment conducted by ADM(IM) in 2011 and evaluation interviews conducted four years later confirm this practice has persisted. As a result, the development of duplicative capabilities has presented a challenge for the acceptance of project deliverables, since the temporary solution is the one sometimes preferred by user clients. Temporary solutions are not ideal, as they are often not developed as projects, with no project oversight and no pre-planning for in-service support. Finally, interviewees also suggested that the length of projects has caused heavy reliance on other programs and capabilities or other organizations to meet capability deficiencies.

Due to scope limitations that prevented a thorough analysis of the procurement process, the evaluation did not explore in depth why major capital IS projects take so long to complete. However, evidence suggests that more flexibility is required with regard to allowing major service providers to make decisions on priorities for information systems investments, as opposed to having to go back for approval when requirements change. This flexibility could help to shorten the duration of major capital information systems projects.

Several examples of more flexible project delivery processes were noted in the delivery of minor capital projects. For example, the Military Command Software Centre (MCSC) is seen by clients as an agile organization that delivers with significantly fewer delays. Clients of MCSC provided very positive feedback on its ability to deliver applications that meet client needs in a timely manner. Director General Enterprise Application Services (DGEAS) is also seeking to improve the timely delivery of application development projects by examining the “Bi-Modal” project management process, whereby existing capabilities are leveraged rather than recreating a system that already exists or has been optimized by others. Regardless of the approach chosen, evidence indicates the department cannot afford to continue with the existing process for managing major capital IS projects, given the financial costs associated with long projects and the risk of delivering outdated technology.

ADM(RS) Recommendation

  1. There is a need for the department to investigate options for a more flexible process for delivering on major capital information system projects, specifically targeting reduced timelines from project definition to Full Operational Capability.

OPI: ADM(IM)

2.4.2 Delivery of Client Requirements

For major capital projects, a process for stakeholder consultation exists to ensure that clients define the preferred option to satisfy operational requirements. Project management functional guidance[26] outlines the responsibilities of the Project Sponsor, Project Director and Project Manager in ensuring that the technical requirements of capabilities produced through major capital projects meet the needs of user clients. While the Project Director is responsible for coordinating input from subject matter experts, including stakeholders from the sponsoring organization and the operational community, the Project Manager is responsible for drafting a project Stakeholder Management Plan. The development of a project Stakeholder Management Plan is a project management best practice, as described in the Project Management Body of Knowledge. At the DND/CAF, the Project Manager is required to analyze all stakeholders’ expectations, assess their impact on the project, and develop appropriate management strategies for effectively engaging stakeholders in project decisions and execution.

Evaluation interviewees also acknowledged the existence of, and their participation in, various project working groups and steering committees to ensure original technical requirements remain relevant to the project sponsors. User acceptance testing is also conducted before project deliverables are accepted. Since technology and requirements may change over the length of the project, user client engagement throughout the entire duration of the project becomes very important.[27]

Key Finding 5: Although a process exists to ensure that major capital project deliverables meet requirements as defined by the client, user clients expressed concern with the level of consultation throughout the entire duration of the project to ensure that requirements remain relevant to those who will operationalize the capability.

Stakeholders expressed concern with using client acceptance as a measure of the degree to which deliverables meet operational requirements. Some interviewees suggested that, in some instances, user clients were pressured to accept the solution: because no other capability existed, because substantial resources had already been invested, or because a different and/or more appropriate solution would have taken a long time to develop. Project management stakeholders added that a particular challenge with IS projects is that the project sponsor (who identifies the original requirements) is often not the user client of the system. Although user testing occurs, no follow-up is conducted to determine whether what was delivered actually met the needs of user clients once the system was operationalized (which could be many years after the initial requirement was identified).

Stakeholder perceptions are supported to some extent by a review of documentation for past projects, which revealed that, although most projects had some degree of stakeholder engagement built into the project plan, none of the projects had a specific Stakeholder Management Plan as an explicit part of the project plan (or as a standalone document) or any similar formal plan for engaging with stakeholders throughout the project. The creation of a Stakeholder Management Plan was introduced as a best practice in the Project Management Body of Knowledge in 2013. As a result, major capital information systems projects launched prior to 2013 would not be expected to include such a requirement, but this does highlight the gap that may have existed with regard to stakeholder engagement.

There are impacts resulting from continuing with lengthy projects without having sufficient discussions with users, including a reluctance to use the system when a more appropriate solution may be available. This is especially true if projects are taking a long time to deliver, as requirements and technology can change over time. It is important to note that the user client community did not express concern with minor capital project deliverables, as these were often conducted in a shorter period of time, with a more flexible approach that allowed for constant stakeholder engagement at all phases of the project.

ADM(RS) Recommendation

  1. There is a need to ensure that sufficient stakeholder consultations are conducted with the user client community in the delivery of major capital IS projects to ensure operational requirements are met.

OPI: ADM(IM)

2.4.3 Systems Interoperability

To assess the achievement of this expected outcome, the evaluation reviewed 19 operational documents from nine operations and one joint exercise. The operations consisted of six international operations involving interactions with allies and three domestic operations involving interactions with other government departments (OGDs). The documents consisted of After Action Reports, End Tour Reports, Lessons Learned Reports, and other reports capturing lessons observed. The specific documents reviewed are listed in Annex B – Evaluation Methodology.

To complement the review of operational reports, the evaluation also interviewed and/or received written submissions from eight service providers and 11 user clients and reviewed relevant strategic documents including Business Plans, Functional Guidance and IT planning documents.

Key Finding 6: Systems interoperability with allies, OGDs and between environmental commands has been a challenge in past operations, but ongoing projects are expected to address major challenges.

Documents from four of the six international operations reviewed specifically mentioned challenges related to information systems interoperability. These challenges resulted in an inability to communicate, extra communications requirements, systemic slowdowns, and subsequent effects on operations. The report on the most recent joint exercise (JOINTEX 15) also mentioned systems interoperability challenges. Evaluation interviews also confirmed that interoperability with allies has been a challenge in the past.

Experience in domestic operations indicates there have also been some information systems capability deficiencies that have impacted systems interoperability with OGDs. Several examples were noted from operational documents reviewed and interviewees confirmed that systems interoperability with OGDs has been a challenge in past operations. However, it is noted that, domestically, temporary work-arounds have been found to limit negative impacts on operations.

Nevertheless, significant improvements have been observed during recent joint exercises and operations. The JOINTEX 15 informal after-action review noted improvements in the Canadian Deployed Mission Network’s interoperability with NATO Mission Secret networks. As well, in a recent major international operation, observers noted that the addition of an Information Exchange Gateway and the addition of SIPRNet both facilitated information sharing with mission partners. Interviewees were also unanimous in stating that improvements have been made, particularly as a result of PEGASUS.

The convergence of the Land Command Support System and CSNI is currently also underway and is expected to improve interoperability within the DND/CAF tactical and operational units, thus eliminating duplication of work and current additional costs.

Initiatives to improve systems interoperability are seen as good practices that must continue to be supported in order to advance effectiveness in this area.

2.4.4 Systems Integration

Stakeholders were nearly unanimous on the importance of systems integration. Interviewees commented that systems integration is important to avoid negative impacts such as:

  1. Inefficiencies caused by double data entry and by the human intervention required to make systems talk to one another. Integration also limits inefficiencies related to acquiring stand-alone systems that may duplicate capabilities provided by existing systems.
  2. Data inaccuracy and loss of integrity, as systems that are not connected and simultaneously updated increase the number of repetitive data entries, thus increasing the risk of data errors.
  3. Interoperability challenges, as a lack of integration may mean that standards for interoperability with allies are not respected.
  4. Challenges with future requirements, as upgrades to unintegrated systems may be difficult to implement without re-engineering.
  5. Unsecured systems, as unintegrated systems may not benefit from cyber security monitoring activities and enterprise security initiatives.

Key Finding 7: Despite understanding the need for systems integration, systems have been developed by multiple stakeholders without prioritizing integration, resulting in duplicative systems and inefficiencies. Initiatives currently underway may address existing challenges.

The evaluation found two main reasons for challenges relating to systems integration in the past five years:

  1. An adequate oversight mechanism was not implemented to ensure newly acquired systems can be integrated into existing operational and corporate environments. The Information Management Configuration Control Board provided oversight and the Request for Change (RFC) process[28] was used as the tool for integrating new systems into the existing environments. However, multiple stakeholders suggested that the RFC process is not an adequate process to ensure systems integration, as it is not up to industry standards, it has not been enforced, it can be easily bypassed, and large projects are introduced too late in the oversight process.
  2. Investments in IT have also occurred in a decentralized manner, without horizontal integration considerations.

Stakeholders also suggested that misunderstandings of the architecture process may have contributed to the lack of priority for systems integration. This has resulted in the development of 177 separate networks and 73 Protected B enclaves, many of them duplicative and very expensive for the department to maintain. This continues to be the case, as there are examples of systems (such as Monitor MASS and the Human Resources Management System) that are not yet integrated but likely should be. In the case of these two examples, there is currently a redundancy in data entry because the systems do not “talk” to each other.

That said, the department has recently recognized these impacts and the need to prioritize integration. Systems integration is now specifically cited as a priority in the Departmental Performance Reports. This priority has resulted in transformation initiatives currently being implemented, including the addition of an Architectural Review Board and a Chief Architect role to the IM/IT program, to ensure standards and architectures are adhered to by all stakeholders so that capabilities are properly engineered to operate within, and interoperate with, existing enterprise capabilities. The roles of the Chief Architect and the Architectural Review Board have yet to be defined and implemented as part of the new ADM(IM) governance structure, but once transformation is complete, both are expected to enable more proactive efforts to ensure the integration of new systems. The Security Assessment and Authorization[29] process may also help to address integration issues.

2.4.5 Technical Condition and Availability of Information Systems

In this evaluation, the terms “technical condition” and “health” are used synonymously to refer to the condition of systems based on criteria such as: age; ability to integrate into future platforms; or availability of user support. For example, applications based on dated software, applications that rely on increasingly rare technical expertise, and applications supported by old infrastructure that is becoming more expensive to operate could be considered to be outdated. Such applications are identified by stakeholders as requiring updates/upgrades or replacement with newer systems.

Technical condition differs from system availability (functioning), although the two may be connected. Sometimes even seemingly healthy systems may be unavailable due to unknown errors or the need to take them off-line to conduct scheduled maintenance or upgrades. The monitoring of systems’ health is required in order to proactively determine which systems will require action to keep them healthy and available.

2.4.5.1 Technical Condition

Key Finding 8: The technical condition of applications managed by DGEAS has significantly improved in recent years. The evaluation was not able to assess improvements to the technical condition of applications or infrastructure managed by other L1s due to the unavailability of a common asset management system.

The percentage of enterprise applications (managed by DGEAS) that were considered to be in low technical condition over the last six years is illustrated in Table 4. The data shows a significant improvement in the overall health of applications during the most recent fiscal year, with only 29 percent of enterprise applications in low technical condition in 2015/16, compared to between 62 percent and 69 percent in prior years. Stakeholders suggested DGEAS was heavily affected by the Deficit Reduction Action Plan between 2013/14 and 2014/15, and thus resources were not available before 2015/16 to support many applications.

Significant progress was also noted in the improvement of the technical condition of enterprise applications considered of “high business value”[30] to the department. The data in Table 4 shows that while 23 percent to 34 percent of enterprise applications of high business value were in low technical condition for most of the period, only seven percent remained in low technical condition in 2015/16. This demonstrates strategic investment of resources to improve the health of the applications most important to the department, a good practice that should be continued.

The technical condition of applications managed by DGEAS had to be used as an approximate assessment of progress in achieving this expected outcome because the department had not yet implemented a universal IS asset management tool to capture all IS assets managed by all stakeholders in the past five years. Although most stakeholders suggested they have a “reasonable” picture of their assets from an inventory perspective, many did not have a good sense of the technical condition of the applications, networks and infrastructure under their control. This issue was highlighted in a 2010 report by the Auditor General[31] and subsequently by TB in its response to the OAG audit.[32]

Fiscal Year  Total # of apps  % of apps in low technical condition  % of apps with high business value and low technical condition[33]

2010/11 397 68% 34%
2011/12 397 68% 34%
2012/13 336 62% 34%
2013/14 290 66% 23%
2014/15 249 69% 0%
2015/16 296 29% 7%

Table 4. Technical Condition of Enterprise Applications Managed by DGEAS over Six Years (2010/11 to 2015/16). (Source: DGEAS.)

Table Summary

This table compares the technical condition of enterprise applications managed by DGEAS over the scope of the evaluation. It has four columns, and seven rows. The first column lists the Fiscal Year for each of the later statistics, starting in 2010/11 and going down to 2015/16. The second column lists the total number of DGEAS enterprise applications, according to the fiscal year. The third column lists the percentage of applications that were found to be in low technical condition. The fourth column lists the percentage of applications with high business value and low technical condition.
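To put the Table 4 percentages in absolute terms, the following minimal sketch converts the reported shares into approximate application counts for the two most recent fiscal years, using only the figures reported above.

    # A minimal arithmetic sketch converting the Table 4 percentages into
    # approximate application counts (rounded to the nearest application).
    rows = {
        "2014/15": (249, 0.69),  # (total apps, share in low technical condition)
        "2015/16": (296, 0.29),
    }
    for fy, (total, low_share) in rows.items():
        print(f"{fy}: ~{round(total * low_share)} of {total} apps in low technical condition")
    # 2014/15: ~172 of 249; 2015/16: ~86 of 296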

The evaluation acknowledges recent progress made by the department in implementing IS asset management tools. On the application side, FY 2015/16 was the first year in which all L1s employed the TBS-mandated application management tool, CA Clarity (Clarity). On the infrastructure side, the Assyst[34] application is expected to be used as an enterprise tool to manage the technical condition of infrastructure. Migration of infrastructure to Assyst is still on-going. The effectiveness of Clarity and Assyst as enterprise asset management tools is highly dependent on the cooperation of all stakeholders to ensure all assets are reflected accurately in these tools.

The next evaluation should assess the effectiveness of these tools to manage and track the technical condition of IS assets in the department.

2.4.5.2 Availability

Key Finding 9: Based on data from the DND/CAF Departmental Performance Reports, networks have a high rate of availability. Information to assess the availability of applications and infrastructure was not available.

The evaluation was unable to assess progress towards improving the availability of applications and infrastructure. Data regarding the availability of networks suggests that systems were functioning and available the majority of the time. As reported in the 2014/15 Departmental Performance Report, networks were available 99 percent of the time. Although the availability of networks has been reported as high and the technical condition of enterprise applications shows some improvement, anecdotal evidence provided by stakeholders suggests that the availability of systems could be compromised by continuing to operate with systems that are several generations behind those being used in industry and public domains, such as the use of Windows 7 and older versions of Internet Explorer. Older generations of systems are at risk of no longer being sustainable, dependable, or secure. The timely completion of initiatives to update or upgrade existing systems was seen as a critical factor in ensuring that systems continue to be available.

2.4.6 Service Management (Support Services to User Clients)

Key Finding 10: A new service delivery model is being implemented that is expected to improve services to user clients.

Until 2014/15, user client services were delivered by the program in a decentralized manner by approximately 120 service desks across the country without the use of common service level standards or a common service management tool. Service providers were utilizing various tools to track and resolve service requests and incident tickets. Service management systems such as Support Magic and Remedy were used by some service providers, while others were not using any formal ticketing system. As a result, the department did not have a common understanding of the quality and timeliness of user services provided.

Since 2014/15, progress has been made to standardize and centralize service delivery using the Information Technology Infrastructure Library[35] processes, services, and standards through the Enterprise IT Service Management (EITSM) project. As part of this project, the department consolidated approximately 120 service desks into roughly 22 Regional Service Management Centres, with central oversight to be provided by a National Service Management Centre. As part of this initiative a Service Catalogue is being developed with service level targets.[36] Previously used tools such as Remedy and Support Magic were decommissioned in 2014/15 and replaced with a new enterprise IT service management tool (Assyst). The tool will be commonly used by all service providers to track performance against service level targets. The ongoing EITSM project is anticipated to be completed in FY 2018/19.

Key Finding 11: A baseline to assess progress toward improving the timeliness and quality of user client services could not be established due to a lack of historical data and client satisfaction feedback. The program could benefit from the collection of relevant service management performance data to adequately assess the extent to which current initiatives have a positive impact on the provision of support to user clients.

The evaluation saw an opportunity to set a baseline for comparison between the former service delivery model and the new consolidated model being delivered through the EITSM project, to determine the extent to which effectiveness has improved. Unfortunately, the decommissioning of legacy service management systems (e.g., Remedy, Support Magic) meant that historical data regarding service requests and incident tickets was not available for the evaluation. In addition, user satisfaction data was not available because the department has not conducted a standardized user client survey in at least the last five years. Collecting user client satisfaction data is an industry best practice that has not been implemented. User client feedback can be used to complement the quantitative data being collected through Assyst on incident and service request resolution rates. These limitations made it very difficult for the evaluation to make a definitive assessment of the quality and timeliness of user support provided by the program in the last five years; some preliminary results are discussed below.

In an attempt to set a baseline, the evaluation obtained data from the new Assyst tool on the number of incident and service request tickets logged in FY 2015/16[37] to determine the following:

  • Average number of days to resolve an incident or service request;
  • Percentage of tickets re-opened; and
  • Percentage of tickets resolved on the first “call.”
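As a rough illustration of how these three indicators can be derived, the sketch below computes each measure from a handful of hypothetical ticket records; the record structure and field names are illustrative assumptions and do not reflect the actual Assyst export format.

    from datetime import date

    # Hypothetical ticket records: opened/resolved dates, whether the ticket
    # was re-opened, and the number of contacts before resolution.
    tickets = [
        {"opened": date(2015, 4, 1), "resolved": date(2015, 4, 20), "reopened": False, "contacts": 3},
        {"opened": date(2015, 4, 2), "resolved": date(2015, 4, 2),  "reopened": False, "contacts": 1},
        {"opened": date(2015, 4, 3), "resolved": date(2015, 5, 1),  "reopened": True,  "contacts": 5},
    ]

    n = len(tickets)
    avg_days = sum((t["resolved"] - t["opened"]).days for t in tickets) / n
    pct_reopened = 100 * sum(t["reopened"] for t in tickets) / n
    # First-time resolution: closed on the initial contact without being re-opened.
    pct_first_time = 100 * sum(t["contacts"] == 1 and not t["reopened"] for t in tickets) / n

    print(f"Average days to resolve: {avg_days:.1f}")
    print(f"% tickets re-opened:     {pct_reopened:.2f}%")
    print(f"% first-time resolution: {pct_first_time:.2f}%")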

The data is summarized in Table 5 and industry averages have been used, where available, as a benchmark for comparison only (not as a target). Without a benchmark for comparison (either past performance data or a standard or average) it is difficult to draw conclusions from one year of data.

Type of Service  Total Tickets Logged  Average Time to Resolve Tickets  % Tickets Re-Opened  % First Time Resolution
Incidents[38]  204,691  17 days  2.16%  17.21%
Service Requests[39]  256,159  18 days  0.73%  2.80%
Industry Averages[40]  -  8 to 24 hrs  2.00%  64.26%

Table 5. Performance Data on Service Delivery for FY 2015/16—Incidents and Service Requests. (Source: Assyst.)

Table Summary

This table compares departmental performance data on service delivery for FY 2015/16 to industry averages. This table has five columns and four rows. The first column lists the type of service compared (incidents, service requests, industry averages). The second column lists the total tickets logged. The third column lists average time to resolve tickets. The fourth column lists the percentage of those tickets that were re-opened. The fifth column provides the percentage of tickets that were first-time resolutions.

Assyst data provided some preliminary, but mixed, results for future comparison. On the one hand, ticket re-opening rates suggest that quality services are being delivered to user clients. The target is a low percentage of re-opened tickets, as this indicates quality in completing tickets and results in less work for the service provider and a satisfied user client. As illustrated in Table 5, the DND/CAF service provider rate of 0.73 percent for service requests is well below the industry average of 2 percent, while the rate of 2.16 percent for incidents is roughly in line with it.

On the other hand, Assyst data on first time resolution rates for both incidents and service requests suggests room for improvement when compared to industry averages. First time resolution refers to the resolution of service requests or incidents on the first attempt. User satisfaction increases when questions are answered right away or issues are addressed while users wait. An IT service provider’s objective should be to have a high percentage of first time resolutions.[41] While the industry average was 64 percent for incidents and service requests combined, the DND/CAF service providers averaged 17.21 percent for incidents and 2.80 percent for service requests, indicating a significant need for improvement in this area.

Based on limited data, the average time to resolve tickets also suggests room for improvement. The DND/CAF averages of 17 and 18 days to resolve incidents and service requests (respectively) were significantly higher than the industry average of 8 to 24 hours, as illustrated in Table 5. Within industry, approximately 91 percent of organizations reported resolving incidents in less than five days, and approximately 86 percent reported resolving service requests in less than five days,[42] suggesting that it may be possible for the DND/CAF to reduce the average time to resolve tickets.

Since these assessments are based on limited data, moving forward, the program could benefit from the collection of other performance data to determine the extent to which timely and quality user client services could be improved.

ADM(RS) Recommendation

  1. There is a need to collect user client satisfaction feedback on the timeliness and quality of services.

OPI: ADM(IM)

OCI: CA, RCN, RCAF

2.4.7 Visibility, Oversight, and Control over Information Systems

Documented evidence suggests that the decentralized model employed by the department to deliver the IS Lifecycle Program in the last five years did not enable service providers to make decisions regarding IS based on risk management and strategic direction. Instead, service providers were working in silos and some negative consequences resulted from this approach, including many previously discussed in the evaluation related to systems integration, systems interoperability and service management. Independent studies commissioned by the department suggested the governance of the IM/IT program needed to change in order to provide ADM(IM) (as the Defence CIO) more central visibility and control and ensure better strategic alignment of IS investments in the department. During the conduct of the evaluation, the new governance model was being implemented and an enterprise risk management framework was being developed. Although this is evidence of progress, no assessment could be made with regard to the effectiveness of the new governance model to improve strategic alignment.

Key Finding 12: The visibility, oversight and control of IS investments and decision-making within the CA, RCN and RCAF require improvement.

Evaluation evidence suggests that efforts to improve the visibility, oversight and control of ADM(IM) as the functional authority and Defence CIO may not be sufficient to ensure all stakeholders are making decisions about IS investment based on risk management and strategic direction. The evaluation conducted this assessment using the indicators of strong leadership and governance over IT developed by TBS. These indicators are listed in Annex B – Evaluation Methodology.

The evaluation found that RCN, CA and RCAF service providers did not meet all of the indicators of strong leadership and governance over IT that would be required in a federated model. CA stakeholders appeared to have better direction and control of IT spending than their RCN and RCAF counterparts, but still fell short in certain areas. The assessment revealed that all stakeholders could improve direction and control of technology choices within their organizations to ensure projects and initiatives do not occur without the authority and guidance of the Defence CIO; this could improve systems integration, interoperability and security. RCAF stakeholders acknowledged that they do not have a solid process for prioritizing IS requirements, nor do they have adequate visibility of IT spending at the Wing level. Likewise, RCN stakeholders suggested that RCN headquarters does not play a central role in prioritizing requirements from both coasts and would normally let the coasts coordinate at their level. Discussions with RCN and RCAF personnel revealed that they could not always easily account for IT spending and consequently were making inaccurate expenditure attributions to the PAA.

Best practices observed by the evaluation included the CA’s SharePoint site for tracking IT infrastructure requests submitted by CA level twos for approval and/or funding and for monitoring locally funded IT projects. The evaluation did not find evidence of a similar approach or process being undertaken by RCN or RCAF service providers. All stakeholders saw the value in maintaining oversight and control and also welcomed the opportunity to collaborate and apply best practices used by others.

The next evaluation should examine the extent to which the new federated model has increased effectiveness of the program through improved strategic alignment.

ADM(RS) Recommendation

  1. There is a need for a more robust process for oversight and IS Lifecycle management in order to implement good practices. RCN, RCAF, and CA stakeholders should collaborate with ADM(IM) in the development of that process.

OPI: ADM(IM)

OCI: CA, RCN, RCAF

2.4.8 Operational Requirements and Capability Gaps and Deficiencies

The evaluation reviewed 19 operational documents[43] from nine operations and one joint exercise to determine the extent to which information systems met the needs of stakeholders during operations. The evaluation also reviewed RFC data for the last five years, with a focus on RFCs labeled “immediate.” Immediate RFCs are those where the requestor considers that “the change must be implemented immediately to resolve serious performance deficiencies or problems,”[44] and the request takes priority over other routine changes. This review assessed why users/operators requested changes. In addition, stakeholder feedback was sought through evaluation interviews.

Key Finding 13: Information systems have not always met past operational requirements. Although some initiatives are already underway to address some of the challenges, this finding highlights the importance of ensuring the timely delivery of major capital projects and sufficient consultations with user clients.

A review of operational reports revealed that IS has not always effectively supported past operations, either because the quantity and/or quality of systems available was insufficient and/or because there were capability gaps. In many cases, work-arounds were implemented that ultimately enabled effective communication and coordination, but with limitations. Examples of information systems deficiencies and impacts on operations were noted in several operational reports and after action reviews of joint exercises. Anecdotally, several examples of impacts on operations were also specifically mentioned by interviewees, including limited ability to communicate, inability to complete planned exercises, and even the cancellation of one planned mission. Work-arounds have generally resulted in successful operations, but the circumstances have not always been ideal.

RFC data provides further evidence to support this finding. Of the 642 RFCs submitted as “immediate” from 2010 to 2016, 51 percent were labeled “new requirement” requests, which could indicate capability gaps/deficiencies with existing systems. The second most common label among immediate RFCs (20 percent of requests) was “improve performance capability,” indicating potential performance deficiencies with existing systems.[45]

The perceptions of a few interviewed user clients suggest that some clients may not be completely satisfied with the quality and utility (usefulness) of existing information systems to support operations.[46] When asked to rate their level of satisfaction with existing IS, these interviewees gave an average score of 3.3 out of 5. User clients commented on a wide variety of reasons for their dissatisfaction with existing IS capability, including: outdated systems, limitations of the DWAN, inadequate back-up systems, inadequate bandwidth to support operations and limitations of the Canadian Deployed Mission Network. Other capability deficiencies may exist but only become evident in the conduct of operations and are subsequently captured in operational reports.

Current on-going projects, such as the Protected Military Satellite Communications Project and the potential for a protected BlackBerry communications capability, are expected to address some of the challenges and capability deficiencies raised by users/operators. The evaluation’s assessment also suggests that timely delivery of IS projects and adequate consultations with users/operators would improve the delivery of IS capability to better meet the needs of military operations.

2.4.9 Consolidation of Systems and Services

A current objective of the DND/CAF is to consolidate information systems and services; that is, to reduce the number of service desks, networks, and applications managed by all service providers in order to eliminate unnecessary duplication and maximize cost savings. This objective resulted primarily from a Defence Renewal Team (DRT) review of the IM/IT program, which determined that there was a need to specifically target inefficiencies of the program.

The consolidation of applications within the DND/CAF is now being formally completed through DRT initiative 3.2, with a goal to eliminate underused or duplicative applications and optimize migration to common enterprise platforms. Meanwhile, the consolidation of service desks is being completed through DRT initiative 3.1, with a goal to reduce the number of service desks from approximately 122 to 22 (as discussed in Section 2.4.6), reassign FTEs, and optimize IT service delivery.[47]

Although the consolidation of networks was not identified as a formal DRT initiative, reduction of the number of networks is also a priority.

It is understood that the IS Lifecycle Program should continue to deliver a consolidated and operationally focused IS lifecycle approach in order to avoid returning to the inefficiencies of operating under the former construct. Thus, this goal was included as an expected outcome in the logic model for the IS Lifecycle Program.

Key Finding 14: Although slow, progress is evident in the intended consolidation of applications and service desks.

DRT trimestral reports suggest that the reorganization of service desks into regional service centres has occurred, even though a significant reduction in the number of service desks has yet to be achieved. The DRT reported that initiatives to scale down the number of service desks from 122 to 22 have not progressed according to the original plan, and thus significant efficiencies have yet to be realized.

With regard to the consolidation of applications, substantial progress has been made. Data provided by program stakeholders, as well as data from DRT trimestral reports, indicates that the number of applications managed by the program has been reduced by at least 3,000 since the initiative was undertaken informally in 2011. Reductions in the number of applications are most notable for CA, RCAF and RCN stakeholders. Estimates indicate that these stakeholders managed an average of 1,700 applications each in 2011/12, compared to an average of 371 each in 2014/15. This effort has so far taken six years and consolidation is still not complete. Evaluation evidence suggests that continued efforts are required to ensure the success of this initiative.

Measuring progress on the consolidation of networks during this evaluation was not possible because the program was still attempting to gain a better picture of the main networks in the department and to develop a standard definition of a “network.” Program stakeholders estimated that 177 separate networks exist, but were only made aware of the actual number during the recent upgrade to Windows 7. Knowing the number of networks, which organization(s) manage them, and which organization(s) they support will be important as responsibility for managing the networks shifts from the DND/CAF to SSC.

2.4.10 Overall Effectiveness of the Program

Key Finding 15: The department is undertaking numerous initiatives that need to be completed to improve the overall effectiveness of the IS Lifecycle Program. Sustained efforts will be required to ensure their success.

As discussed in the preceding sections (2.4.1 to 2.4.9), program performance has been a challenge in the past and has impacted operations. All of the areas assessed for effectiveness revealed some challenges; however, most areas (six out of nine) also presented evidence of existing initiatives already being implemented by the department to improve performance. As such, the evaluation did not make any further recommendations for improvement. Notably, initiatives in the following areas should be continued: systems interoperability, systems integration, technical condition, service management, governance, and consolidation of products and services. Details of the specific initiatives are discussed in the preceding sections. These initiatives will require continued effort and on-going performance monitoring to ensure their successful completion.

ADM(RS) Recommendation

  1. There is a need to ensure the completion of the numerous initiatives aimed at improving the program.

OPI: ADM(IM)

2.4.11 SSC Impacts on the Achievement of Expected Outcomes

Key Finding 16: There are indications that the transfer of resources to SSC is impacting some IS Lifecycle Program activities, but impacts could be minimized with clear delineation of roles and responsibilities and the application of service level standards.

Examples of impacts provided by stakeholders indicate the DND/CAF is experiencing service delays, inefficiencies and service gaps. Stakeholders produced a list of high-level examples of impacts and provided anecdotal examples through evaluation interviews. The examples provided were analysed and determined to generally affect the following:

  • The availability and health of systems, including networks and infrastructure: For example, it was noted that the Defence Resource Management Information System has experienced failures in key network components supporting user access, and these components need replacement. SSC holds the maintenance contract, which has not yet been renewed. On the infrastructure side, several examples were provided in documents and interviews of delays in SSC procuring replacement parts, impacting the technical condition of systems.
  • The timely delivery of IS projects (major and minor capital and other non-capital projects): The DND/CAF has identified over 400 projects/activities in the Strategic IM/IT plan that will require SSC participation. Stakeholders have already provided several examples of DND/CAF projects that have been held up by delays in procuring the necessary infrastructure, now an SSC responsibility.
  • User client support: DND/CAF user clients have already started to experience some service delays. One example of the obstacles encountered by DND/CAF service providers is the absence of necessary administrator rights, which delays services provided to DND/CAF clients.
  • Availability of DND/CAF IT resources: Beyond the IT FTEs transferred to SSC, additional DND/CAF resources are being used to deliver services in order to minimize service gaps, mitigate impacts, and provide oversight of SSC services. This has reduced the DND/CAF resources available to address other DND/CAF information systems priorities. Additional detail is provided in Section 2.5.5.

There is also the potential for systems security to be impacted, but this has been avoided by the DND/CAF personnel stepping in to fill service gaps. Several stakeholders suggested that if roles and responsibilities are not clarified, it could impact the DND/CAF Security Assessment & Authorization process, as it is not currently clear who is responsible for authorizing systems when they are under the control of SSC.

An analysis of the issues indicates almost all of the existing challenges could be addressed by developing and implementing mutually agreeable terms of reference or a final operating protocol. This analysis is also supported by the Office of the Auditor General Report on Information Technology Shared Services (Fall 2015) which noted shortfalls in existing service agreements between SSC and its clients. A recommendation was made to SSC to update existing business arrangements with partners, including clarifying roles and responsibilities, establishing service expectations, service targets, and partner reporting commitments.[48]

A management action plan resulting from a 2015 ADM(RS) internal audit[49] had committed the DND/CAF to finalizing an Operating Protocol with SSC by March 31, 2016. The protocol would include a placeholder for more detailed Service Level Agreements to be developed at a later date. Unfortunately, as of the completion of this evaluation, the Operating Protocol had yet to be approved by both departments. Despite the DND/CAF’s efforts to finalize the protocol, negotiations at the highest levels of both departments continue. Responsibility for Mission Critical/Mission Differentiating C4ISR systems and services, particularly those at the secret level, has not been clarified. The DND/CAF stakeholders emphasized that the DND/CAF remains committed to being a constructive partner with SSC to support their mandate while at the same time ensuring that the DND/CAF is able to deliver on its own mandate. The finalization of this Operating Protocol is expected to “enable both the DND/CAF and SSC to deliver their respective programs and services effectively and efficiently while remaining compliant with their respective mandates, legal requirements, policy obligations and management controls.”[50] Given that some impacts have already resulted, the finalization of roles and responsibilities (through an operating protocol or other such instrument) is critical.

ADM(RS) Recommendation

  1. There is a need for the DND/CAF to continue to work with SSC partners to finalize and clearly delineate roles and responsibilities and set service level standards in a timely manner.

OPI: ADM(IM)

2.5 Performance—Demonstration of Efficiency and Economy

Under the 2009 Treasury Board Policy on Evaluation, efficiency is defined as maximizing the outputs produced with a fixed level of inputs or minimizing the inputs used to produce a fixed level of outputs (paraphrased). Economy is defined as “minimizing the use of resources […] to achieve expected outcomes.”[51] For the purposes of the Policy on Evaluation, the following elements of performance are demonstrated when:

  • Outputs are produced at minimum cost (efficiency); and
  • Outcomes are produced at minimum cost (economy).

Process efficiencies may also be used to qualitatively assess the efficiency and economy of the program. This evaluation used a combination of quantitative assessments based on costs, as well as qualitative assessments based on process (i.e., application of current business practices) to assess overall efficiency. The specific questions and indicators used in this assessment are listed in Annex D.

The findings in this section are based on a review of administrative and financial data from the departmental PAA and from the departmental IT expenditure reports produced for TBS.

2.5.1 Cost-Effectiveness in the Production of Outputs (Efficiency)

Key Finding 17: The department has not accurately tracked expenditures related to the IS Lifecycle Program. As such, it was difficult to determine the extent to which the department has used cost-effective means in the production of outputs.

Financial data analysis and evaluation interviews revealed that attributions to PAA sub-program 4.4 – IS Lifecycle are inaccurate. For instance, it was not clear to the evaluation how some of the activities attributed to this sub-program contributed to life-cycling information systems, and stakeholders were not able to adequately justify the attributions. The evaluation also found that some activities were not attributed to PAA 4.4 – IS Lifecycle but likely should have been. For example, the estimated $800 million in expenditures in 2014/15 does not include the acquisition, development and deployment of the Land Command Support Systems (LCSS) and the Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR) systems that make up the land force C4ISR capability. The acquisition, development and deployment of this capability amounted to $258.2 million in 2014/15. These systems are acquired, developed and deployed by ADM(Mat) (Director General Land Equipment Program Management – Director Land Command Systems Program Management) and are being charged to PAA 4.2 – Materiel Lifecycle. It was not clear to the evaluation why these would not be included in PAA 4.4, which specifically captures information systems for military use.

Another example is that RCAF expenditures as a percentage of total program spending (estimated at 9.9 percent of the total annual PAA 4.4 expenditures of $810 million) may be inflated by as much as $30 million due to inaccurate attributions to PAA 4.4. Other L1s may have made errors in attribution as well. See Section 1.3.2 for further discussion.

The PAA Attribution Rules developed by the Chief of Programme organization, which provide guidance to stakeholders, need refinement and may have been a contributing factor to the inaccuracies. Stakeholders were not always clear on when activities should be charged to PAA 4.4 – IS Lifecycle, as opposed to another sub-program. This was further complicated by several changes to the departmental PAA over the last five years, which presented a challenge for stakeholders attempting to reliably attribute expenditures and subsequently limited the ability of the evaluation to conduct trending analysis. Stakeholders agreed that analysis using PAA attributions over time would not have rendered reliable findings.

In an attempt to assess cost effectiveness using more accurate financial data and to enable trending analysis, the evaluation used the annual DND IT Expenditure Reports produced for TBS, but with limitations and assumptions as described in Annex B – Evaluation Methodology and Limitations. Table 6 compares expenditure data between expenditure attributions to PAA 4.4 – IS Lifecycle and expenditures as captured in the annual DND IT Expenditure Report. This comparison further highlights the need for better expenditure tracking, as costs between the two can vary significantly.

2011/12 2012/13 2013/14 2014/15
Expenditures as per PAA 4.4 – IS Lifecycle $779,884,662 $732,443,275 $653,212,724 $801,431,686
Expenditures as per DND IT Expenditure Reports $1,180,115,633 $935,971,306 $915,990,706 $996,803,927
Variance $400,230,971 $203,528,031 $262,777,982 $195,372,241

Table 6. Comparison of Annual IT Expenditures Reported – PAA vs. DND IT Expenditure Reports. (Source: DRMIS and DND IT Expenditure Reports, 2011/12 to 2014/15.)

Table Summary

This table compares the reported financial figures in the PAA report versus the DND IT Expenditure report. This table has five columns and four rows. The first column specifies the expenditures according to each report in different rows, and the variance between the two figures. The second, third, fourth, and fifth columns specify the year.

Nevertheless, program stakeholders were comfortable with the use of DND IT Expenditure Reports for trending analysis and comparisons, since these costs are validated by all L1s using a standard methodology set by TBS. Although DND IT Expenditure Reports also capture the costs of activities that would be considered lifecycle activities in support of the corporate environment (PAA 6.7 – Information Technology), these costs represent only 1 percent of what the department spends on IT overall, which justifies the use of this data to estimate costs related to this program.

Some analysis was also completed using financial and FTE data attributed to PAA 4.4 – IS Lifecycle for the most current fiscal year (2014/15), which was the most reliable data available.

With limited financial data, the evaluation can only estimate that the DND/CAF spent between $800 million and $996 million in FY 2014/15 on life-cycling IS in support of military operations and training, depending on whether the PAA attributions or the DND IT Expenditure Reports are referenced. This represented between 4.3 percent and 5.4 percent of the DND/CAF’s total annual expenditures.
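As a simple arithmetic check of this range, a minimal sketch using the FY 2014/15 figures reported in Tables 6 and 7:

    # Reproducing the reported 4.3 to 5.4 percent range from FY 2014/15 figures
    # cited in Tables 6 and 7 of this report.
    paa_4_4    = 801_431_686     # expenditures per PAA 4.4 attribution
    it_report  = 996_803_927    # expenditures per DND IT Expenditure Report
    total_dept = 18_453_938_461  # total departmental expenditures (Table 7)

    print(f"PAA basis:       {paa_4_4 / total_dept:.1%}")    # ~4.3%
    print(f"IT report basis: {it_report / total_dept:.1%}")  # ~5.4%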

The trend in IS lifecycle spending as a percentage of total departmental expenditures over the last four years is illustrated in Figure 1 and is based on the DND IT Expenditure Report data. Figure 1 shows a significant decline in spending from 2011/12 to 2012/13, which reflects the initial resource transfer to SSC. The graph also demonstrates a subsequent, steady increase in program expenditures from year to year, which could be attributed to the re-investment in program initiatives to improve the effectiveness of the program or to other expenses incurred by the department since the transfer of resources to SSC. However, without accurate financial data, it is difficult to know for certain.

Figure 1. Trend in DND/CAF IT Expenditures as Percentage of Total Departmental Expenditures. (Source: DND IT Expenditure Reports and Departmental Performance Reports.)

Figure Summary

This figure is a graph demonstrating the trends in DND/CAF IT expenditures as a percentage of the total departmental expenditures. Beginning in FY 2011/12 and continuing up until FY 2014/15, this graph shows a dip and then increase in overall funding. FY 2011/12 saw spending at its highest – with 5.8% of the departmental budget spent. The year following, spending was at its lowest, at 4.7% of departmental spending in FY 2012/13. In FY 2013/14, spending increased to 4.9%. In FY 2014/15, spending increased to 5.4%.

Unfortunately, even with the use of the DND IT Expenditure Report data, some costs of specific interest to the evaluation were not tracked. Costs related to project management, user support and in-service support were not tracked by all stakeholders, nor were they explicitly captured in the DND IT Expenditure Reports. Therefore, it was not possible for the evaluation to assess cost-effectiveness at the activity or outputs level (i.e., managing projects, maintaining, repairing or upgrading systems, strategic coordination, development and control of the department’s decentralized model).

Not accurately tracking expenditures could be problematic, specifically when additional funds are required for project completion, for in-service support, or for adding functionality to the system. Tracking expenditures would enable stakeholders to make better estimates on future requirements and justify requests for additional funds. It would also allow the department to more easily and accurately determine where efficiencies could be gained.

Key Finding 18: Despite not knowing the exact extent of cost-effectiveness, there is evidence indicating that the department is making efforts to improve the overall efficiency of the IS Lifecycle Program.

Through the consolidation of service desks (DRT 3.1) and applications (DRT 3.2), as well as the implementation of a new service delivery model for the IM/IT program, from decentralized to federated (DRT 3.3), departmental initiatives are aiming to identify IS Lifecycle Program resources that can be reinvested in other high priority areas. Since the initiatives have yet to be completed, the evaluation could not determine the extent to which actual cost-savings approximated estimated savings. Regardless, process efficiencies will be gained through consolidation and elimination of duplication of products and services.

The next evaluation could consider the extent to which cost-savings were realized through DRT initiatives, but improved tracking of expenditures will be necessary to conduct a pre- and post-implementation assessment.

ADM(RS) Recommendation

  1. There is a need to improve tracking of program expenditures, particularly at the output level (i.e., project management costs, user support costs, in-service support costs).

OPI: ADM(IM)

OCI: CA, RCN, RCAF

2.5.2 Application of Current Business Practices

Qualitatively, the evaluation assessed the extent to which current business practices are applied to operate the IS Lifecycle Program in the most efficient manner. This included an assessment of the program’s effectiveness in monitoring and improving performance and delivery to ensure efficiencies, as well as of the application of best practices.

Key Finding 19: A Performance Measurement Framework does not exist to sufficiently capture the performance of the IS Lifecycle Program as a whole.

High-level performance indicators developed for the purposes of the Departmental Performance Report in the last five years did not adequately capture the performance of the program, as most indicators only measured the performance of ADM(IM) organizations. As the program is being delivered using a federated model, performance measurement should reflect the activities of all stakeholders and should be based on a common understanding of the objectives of the program.

The department is making progress towards incorporating Information Technology Infrastructure Library best practices and considering the input of industry experts in the development of new standards and processes. However, the evaluation found that industry standards are not always considered in the assessment of program performance. For example, the EITSM initiative is currently developing standards based on past performance and is not considering industry standards in the development of service level targets. The evaluation acknowledges the limitations of comparing a government institution to industry, although in some instances industry standards could be used as a benchmark for comparison. Alternatively, comparisons with other similar federal departments could also be made.[52]

ADM(RS) Recommendation

  1. There is a need to develop a Performance Measurement Framework that incorporates performance indicators for all stakeholders delivering the program and targets that consider industry best practices and standards or comparisons with other federal institutions. Tracking of performance should also be prioritized.

OPI: ADM(IM)

2.5.4 Benchmarking Program Resource Investments

Key Finding 20: The IS Lifecycle Program may benefit from further resource investments, considering past challenges in meeting expected outcomes and comparisons to other departments.

The evaluation conducted a benchmarking exercise to compare the DND/CAF IT expenditures with the IT expenditures of other federal departments. Comparisons with the Royal Canadian Mounted Police (RCMP) and the Canada Border Services Agency (CBSA) were deemed appropriate, as they are large departments that also produce information systems in support of operations. To make accurate comparisons, some assumptions had to be made, particularly because the DND/CAF is the only department in this group that specifically divides the set of activities into two separate programs: information systems lifecycle in support of military operations (PAA 4.4 – IS Lifecycle) and information systems lifecycle in support of the corporate environment (PAA 6.7 – Information Technology). The methodology and limitations of this benchmarking exercise are described in detail in Annex B – Evaluation Methodology and Limitations.

When compared with other operational departments, the DND/CAF’s IT expenditures as a percentage of total departmental spending have been lower than those of CBSA and RCMP over the last four years.[53] As illustrated in Figure 2, the DND/CAF spent 5.4 percent of total departmental spending on life-cycling information systems in 2014/15, while CBSA spent 14.8 percent and RCMP spent 6.6 percent of their respective total departmental expenditures. The DND/CAF also spent less on IT per FTE than CBSA (see Table 7, column (D)) and spent a lower percentage of its IT expenditures on IT salary (see Table 7, column (F)) than both CBSA and RCMP. Analysis of trends in salary expenditures indicates the DND/CAF has also consistently spent less on IT salaries than CBSA and RCMP over the last four years, as illustrated in Figure 3.

Results of this benchmarking exercise, as well as evaluation findings related to challenges with program effectiveness over the last five years, suggest that additional IT resources may be required to improve the overall effectiveness of the program going forward. In many cases, success in achieving some of the expected outcomes hinges on the successful implementation of new initiatives while sustaining the delivery of adequate technologies and services for over 88,000 FTEs. The increased reliance on technology and the need for the department to modernize the way in which it conducts business (as discussed in Section 2.3) further support this finding.

Figure 2. IT Expenditures as Percentage of Total Departmental Expenditures (four year comparison between DND, RCMP and CBSA). (Source: Departmental IT Expenditure Reports and Departmental Performance Reports)

Figure Summary

Figure two is a graph that compares the IT expenditures of three organizations, the Department of National Defence, the RCMP and CBSA. It considers the trend in IT expenditures in relation to total departmental expenditures over the course of four fiscal years, starting in 2011/12 up until 2014/15.

Of the three organizations, DND spent the lowest percentage of total departmental expenditures on IT on average. DND spent the highest percentage of funding on IT in FY 2011/12, at 5.8%. In the following year, FY 2012/13, DND spent 4.7% of total departmental expenditures. In FY 2013/14, DND spent 4.9% of total expenditures on IT. In FY 2014/15, DND spent 5.4%.

In contrast, the RCMP had a much larger range of spending, as in FY 2011/12, it spent 15.5% of total departmental expenditures on IT, whereas in FY 2012/13, it spent the smallest percentage seen on the graph at 4.6%. In FY 2013/14, the RCMP spent 6.3% of total departmental expenditures on IT. In FY 2014/15, the RCMP spent 6.6%.

On the whole, CBSA spent the greatest percentage of departmental funding on IT. In FY 2011/12, it spent 15.8% of total departmental expenditures. In FY 2012/13, it spent 13.8% of expenditures. In FY 2013/14, it spent 16.5% of expenditures, the highest percentage of spending seen in the graph. In FY 2014/15, it lowered its spending on IT to 14.8% of total departmental expenditures.

Department  (A) Total Dept Expenditures  (B) Total IT Expenditures  (C) Total Dept FTEs[54]  (D) IT dollars per FTE (B/C)  (E) IT Salary Expenditures[55]  (F) IT Salary as a % of IT Expenditures (E/B)
DND $18,453,938,461 $996,597,262 88,141 $11,309 $226,144,400 22.7%
RCMP $2,861,888,975 $189,837,911 28,787 $6,595 $65,906,999 34.7%
CBSA $2,170,147,906 $320,594,756 13,768 $23,285 $65,906,999 37.5%

Table 7. Benchmarking IT Salary and FTE Expenditures (comparison for FY 2014/15 between DND, RCMP and CBSA). (Source: 2014/15 Departmental IT Expenditure Reports and 2014/15 Departmental Performance Reports).

Table Summary

Table 7 is a table that compares departmental expenditures on IT. It has seven columns and four rows. The first row is the title header of the table. From the left to right, the headers are: Department; (A) Total Department Expenditures; (B) Total IT Expenditures; (C) Total Department FTEs; (D) IT Dollars per FTE; (E) IT Salary Expenditures; (F) IT Salary as a % of IT Expenditures. The figures in column (D) are calculated by dividing B by C, while the figures in column (F) are calculated by dividing E by B. Row two, three, and four are DND, RCMP and CBSA. By reading left to right alongside the rows, readers can see the distribution of IT salary expenditures.
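For readers reproducing the derived columns, a minimal sketch using the RCMP row of Table 7 (the other rows follow the same formulas):

    # Verifying the derived columns of Table 7 for the RCMP row:
    # (D) IT dollars per FTE = B / C, and (F) IT salary share of IT spending = E / B.
    total_it  = 189_837_911  # (B) total IT expenditures
    dept_ftes = 28_787       # (C) total departmental FTEs
    it_salary = 65_906_999   # (E) IT salary expenditures

    print(f"(D) IT dollars per FTE: ${total_it / dept_ftes:,.0f}")  # ~$6,595
    print(f"(F) IT salary share:    {it_salary / total_it:.1%}")    # ~34.7%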

Figure 3. Salary Expenditures as Percentage of Total IT Expenditures (four year comparison between DND, RCMP and CBSA). (Source: Departmental IT Expenditure Reports.)

Figure Summary

Figure three is a graph. It compares the salary expenditures of three departments, CBSA, RCMP and DND, on IT professionals as a percentage of total IT expenditures. The figure compares spending over the course of four years, starting from FY 2011/12 up until FY 2014/15. Spending percentages are conveyed as a trend, with each department having its own line. In FY 2011/12, the three departments’ salary shares were in the same range. DND spent the highest percentage, with 26.1% of its IT expenditures going towards salary. The RCMP spent 22.6% of IT expenditures on salary, and CBSA spent only 21.5%. However, in the years following there were a number of jumps. In FY 2012/13, CBSA spent the greatest percentage, at 34.7%, whilst the RCMP followed at 26.6% and DND at 21.2%. In FY 2013/14, CBSA continued to increase its spending on IT salaries, spending 40.1% of its IT expenditures on salaries. The RCMP also increased its expenditures, spending 33.4%, whilst DND spent 26.6%. In FY 2014/15, CBSA lowered its percentage, spending 37.5%. The RCMP followed, at 34.7%, and DND was back at 22.7%.

The following specific initiatives will require adequate resources if the program is to improve past performance: DRT initiatives (application and service desk consolidation), implementation of Assyst and Clarity tools, all classified IT security initiatives, major projects to improve interoperability, integration of IS, the creation of service level standards (as part of the EITSM initiative), the implementation of the new service delivery model, network consolidation and the finalization of roles and responsibilities and service standards with SSC. In addition, sufficient resources need to continue to be made available to support other TBS and GC-mandated initiatives and priorities.

2.5.5 Efficiencies Gained Through SSC

Key Finding 21: The DND/CAF has not yet realized significant efficiencies as a result of transferring resources and responsibilities to SSC.

In order to maintain adequate service levels to enable the DND/CAF operations and minimize impacts on the DND/CAF systems and users, substantial coordination has been required on the part of the DND/CAF in order to oversee the delivery of services by SSC over the last three years. The 7 Comm Group (formerly the Canadian Forces Shared Services Group), for instance, was specifically stood up by the DND/CAF to provide an interface with SSC in support of transferred IT services. Currently 10 DND/CAF FTEs are working under 7 Comm Group to support the coordination and alignment of IM/IT service transformation initiatives and to ensure CAF command and control of military personnel working under SSC. Additional resources at the base and wing level have also been put in place to provide oversight and minimize impacts. Furthermore, considerable time and effort has also been devoted by the DND/CAF staff at all levels to delineate roles and responsibilities, an on-going challenge for over three years now. An independent study conducted by a consultant[56] also noted the transfer of resources has necessitated significant oversight and attention on the part of the DND/CAF.

Although the department does not explicitly track how much it has spent on activities that were or are expected to be the responsibility of SSC, document review and stakeholder examples provided through evaluation interviews indicate the DND/CAF has had to use its own resources to fill service gaps. For example, requirements for the DND/CAF staff to remain operational 24 hours a day, seven days a week have necessitated the use of the DND/CAF service desks to provide services to user clients during SSC “off-hours.” DND/CAF resources are also continuing to support classified networks and protected B enclaves now considered the responsibility of SSC. Print services were provided as another example; the DND/CAF staff is still involved with the configuration and user management of print services, an activity that was transferred to SSC.

Services that were to be covered by SSC, using funds transferred from the DND/CAF, are also now being transferred back to the DND/CAF without the return of any resources. As of April 2016, the DND/CAF will need to employ its own resources to support Mobility Services, including:

  • Paying for its portion of the Mobility invoices;
  • Paying for all BlackBerry devices above the departmental threshold;
  • Managing Mobility invoices using SSC and Vendor delivered tools;
  • Addressing abuses and overages by individuals; and
  • Providing first line service support for all BlackBerry10 devices.

These additional tasks create significantly more work for the DND/CAF service desk agents, as the total number of mobility devices that need to be supported is approximately 33,903, with just over 50 percent of them being BlackBerrys.[57] Again, the DND/CAF resources that used to perform this work were transferred when SSC was created. This has resulted in additional work for the DND/CAF without any added resources.

Annex A—Management Action Plan

As a result of much consultation, discussion and study, the need for improved governance, visibility and oversight of the program has been identified. Additionally, attention to the delineation of responsibilities between SSC and DND is occurring at the highest levels of leadership. A great deal of effort has been expended to make the IM/IT Program more relevant, effective and efficient. IM/IT Program stakeholders are all participating in the various initiatives to improve the way the Program is governed, defined, managed and overseen.

In addition to the numerous initiatives mentioned in the evaluation report, program leadership continues to explore all opportunities to ensure the program delivers on its outcomes, namely to deliver capabilities that address operational gaps, improve support to existing capabilities, and more clearly articulate the results that the Program is delivering against DND/CAF requirements. Additional initiatives undertaken since the completion of the evaluation report include:

  • The development of sub-program strategies and roadmaps to set the direction for all stakeholders. Recent examples include: Application Development, enterprise Business Intelligence and Analytics and C4ISR Targeting and Acquisition;
  • A review of the RFC process resulting in recommended improvements including addressing the backlog of requests;
  • A C160 review to improve the oversight and management of funds in support of common program solutions and services;
  • An IMB approved list of prioritized demands, with a better awareness of the full demand on existing allocations and identified pressures;
  • Leveraging the new Departmental Results Framework as an opportunity to reshape our outcomes, activities and how we articulate the results delivered by the Program;
  • Increased discussions about the ability to react to priorities in the short-term, making better use of Minor Capital Project funding to bridge immediate capability gaps; and
  • A Capability Development Board that focuses at the service provider level of governance to provide the “how” in support of improved decision making for sourcing new solutions and capabilities.

By investing in initiatives that support the DND CIO’s functional authority, engaging IM/IT Program governance, and engaging with SSC on roles and responsibilities, Program stakeholders are shaping the way decisions are being made. All stakeholders now have a greater impact on the improvements being made and a better ability to understand and influence the IM/IT Program going forward. This will result in a more relevant, effective and efficient program in which decisions will be made from a holistic DND/CAF perspective. The increased visibility created by stakeholder participation will enable the CIO to better exercise his functional authority over the Program.

Timeliness of Project Delivery

ADM(RS) Recommendation

  1. There is a need for the department to investigate options for a more flexible process for delivering on major capital information system projects, specifically targeting reduced timelines from project definition to Full Operational Capability.

Response: This recommendation highlights a serious issue in IS capability delivery that requires engagement at high levels within the Department.

Management Action: ADM(IM) will work with C Prog and the CFO, and through senior defence management committees, to develop options to reduce the overall time to deliver IS capabilities, whether through process, policy and/or financial delegation changes. Specifically, one option to be explored is additional funding of IT minor capital.

OPI: ADM(IM) - DGIMTSP (DDIMP)

OCI: C Prog, CFO, DGIMPD

Target Date(s):

- Initial meeting to be held by May 31, 2017.

- Options to be presented to IMB/PMB by September 30, 2017.

Delivery of client requirements

ADM(RS) Recommendation

  1. There is a need to ensure that sufficient stakeholder consultations are conducted with the user client community in the delivery of major capital IS projects to ensure operational requirements are met.

Response: For most projects, the IM Group is an Implementer, not a Sponsor. The responsibility to ensure operational requirements are defined for a major capital IS project rests with the Project Sponsor, who is represented by the Project Director. The Project Implementer is not usually represented in the early stages of the project, prior to Project Definition. The responsibility to conduct stakeholder consultation during the early stages of the project also rests with the sponsor organization. The standard for ensuring continued stakeholder engagement during project definition and implementation is defined as Project Stakeholder Management in the latest edition of the Project Management Body of Knowledge (PMBOK, 5th edition). This Knowledge Area was introduced in 2013; prior to that date, stakeholder management was part of the Communications Knowledge Area.

Management Action: In order to ensure continued stakeholder engagement during the definition and implementation phases of the project, as Project Implementers, project management offices will develop a Stakeholder Management Plan as part of their Project Management Plans, as recommended in the latest edition of the PMBOK. This will be implemented for projects approved in 2017. For projects approved prior to 2017, the Communication Management Plans will be reviewed to ensure stakeholder consultation is well defined.

OPI: ADM(IM) – DGIMPD

Target Date(s):

- June 2017 for major capital IS projects approved for Definition in FY 2017/18.

- Review of other major capital IS projects completed by December 2017 for projects approved prior to FY 2017/18.

Service management (support services to user clients)

ADM(RS) Recommendation

  1. There is a need to collect user client satisfaction feedback on the timeliness and quality of services.

Response: DBRM was stood up this year and has been building capacity to work closely with L1 clients. Its intent is to collect client feedback as part of IM Group performance measurement after its first full year of operation (2017/18). L1s are welcome to provide feedback on IT service delivery should they have issues and challenges they believe should be addressed.

Management Action: DBRM plans to lead an L1 client satisfaction survey. In developing the survey questions, DBRM will include opportunities to address the timeliness and quality of service delivery.

OPI: DGIMTSP (DBRM)

OCI: DGIMTSP (DIMCD)

Target Date:

- Baseline survey to be conducted for 2017/18, April 2018.

Visibility, Oversight, and Control over Information Systems

ADM(RS) Recommendation

  1. There is a need for a more robust process for oversight and IS lifecycle management in order to implement good practices. RCN, RCAF and CA stakeholders should collaborate with ADM(IM) in the development of that process.

Response: As part of DND/CAF’s effort to move towards a federated IM/IT Program, DND/CAF is developing and implementing a new IM/IT Program framework that will improve oversight of the Program at both the L0 and L1 levels. The combination of the new governance being established and the implementation of such requirements as the L1s' attestation of their IM/IT Plan will help to identify any remaining structural and procedural deficiencies to be resolved.

Management Actions: L1s, through IT Plans, will improve oversight of IM/IT Program projects and activities. ADM(IM) will develop a new oversight process that includes a service provider review of IT Plan data to provide feedback to IM/IT governance.

OPI: ADM(IM) - DGIMTSP (DDIMP)

OCI: CA, RCN, RCAF

Target Date:

- Implement the service provider review process by March 31, 2017.

- Review by governance by May 31, 2017.

Overall Effectiveness of the Program

ADM(RS) Recommendation

  1. There is a need to ensure the completion of the numerous initiatives aimed at improving the program.

Management Action: The IM Group will develop a list of all the key initiatives aimed at improving the Program and track their status, as well as any other information that will support priority resourcing.

OPI: ADM(IM) – DDIMP

Target Date:

- A list of key initiatives will be developed by March 31, 2017.

- Progress will be tracked semi-annually and reported to DDIMP.

SSC Impacts on the Achievement of Expected Outcomes

ADM(RS) Recommendation

  1. There is a need for the DND/CAF to continue to work with SSC partners to finalize and clearly delineate roles and responsibilities and set service level standards in a timely manner.

Response: The DND/CAF continues to work diligently with SSC towards a common understanding of the delineation of roles and responsibilities. In addition, tracking and discussion of the many issues that the Operating Protocol was meant to address continues. Engagements are occurring from the staff level all the way to the DM level. Further to the Office of the Auditor General audit of SSC, SSC is developing a Business Arrangement document with all partners.

Management Action: The DND/CAF is in the process of working with SSC to develop the Business Arrangement document.

OPI: DGIMTSP (DBRM)

Target Date:

- Business Arrangement document endorsement by IM Group by September 1, 2017.

Cost-effectiveness in the Production of Outputs (Efficiency)

ADM(RS) Recommendation

  1. There is a need to improve tracking of program expenditures, particularly at the output level (i.e., project management costs, user support costs, in-service support costs).

Response: As part of the 2016 TBS Policy on Results, the DND/CAF is required to replace the departmental PAA with a new Departmental Results Framework and Program Inventory as the new mechanisms to track program expenditures.

Management Action: The IM Group will engage C Prog staff and L1s responsible for the development of the Departmental Results Framework and Program Inventory to improve the IT expenditure framework, attribution rules and understanding of attribution across the Program. It is anticipated that the new Program Inventory will be more intuitive for attribution decisions.

OPI: ADM(IM) - DGIMTSP (DDIMP)

OCI: C Prog, DDIMP

Target Date:

- IT Expenditure taxonomy to be briefed to IM Prog WG upon completion of DRF, January 31, 2018.

Application of Current Business Practices

ADM(RS) Recommendation

  1. There is a need to develop a Performance Measurement Framework that incorporates performance indicators for all stakeholders delivering the program and targets that consider industry best practices and standards or comparisons with other federal institutions. Tracking of performance should also be prioritized.

Response: It is acknowledged that, in the past, the IM Group responded to the PMF in isolation, except where data was readily available to report on the entire Program. In many cases there is no common repository (such as Assyst or APM) for the data needed to support the measures in the past PAA PMF (IS system health, the entire inventory of IT capital projects, service metrics). While the repository and data set issues may not be corrected in the near future, through the Departmental Results Framework the IM Group has already begun the logic model process that will result in a Program Inventory with outcomes and indicators that are more relevant to decision makers and that define results more clearly for Parliament. The other L1 service providers will also be engaged through the Program Inventory development to obtain input and agreement on IM/IT Program indicators and measures.

Management Action: As is being done now for Management Accountability Framework reporting, every effort will be made to maximize the use of existing and future data sources (such as CA PPM and Assyst) so that all stakeholder performance is represented. TBS CIOB has already begun to mandate industry health and service metrics for IT, which will be adopted as targets and benchmarks for DND performance.

OPI: ADM(IM) - DGIMTSP (DDIMP)

OCI: C Prog, L1 Stakeholders

Target Date:

- New IM/IT services metrics will be included in the DRF – PMF, November 30, 2017.


Annex B—Evaluation Methodology and Limitations

1.0 Methodology

The evaluation examined issues related to relevance and performance, in accordance with the TB 2009 Policy on Evaluation. Table B-1 identifies the specific evaluation questions that have been developed, based on the core evaluation issues stipulated in the TB Directive on the Evaluation Function. Annex D – Evaluation Matrix includes specific indicators and methodologies for each evaluation question.

Relevance
Alignment with Federal Roles and Responsibilities
  1. How do the IS Lifecycle Program activities undertaken within DND align with federal government roles and responsibilities?
Continued Need
  1. How does the IS Lifecycle Program address a demonstrable need for DND/CAF?
Alignment with Government Priorities
  1. How do the IS Lifecycle Program objectives align with the priorities of the GC and the strategic outcomes of DND/CAF?
Performance
Achievement of Expected Outcomes (Effectiveness)
  1. Is the IS Lifecycle Program meeting the expected outcomes as defined in the Logic Model?

Immediate Outcomes:

  • Info Systems are delivered in a timely manner;
  • Info Systems Projects meet specifications (technical requirements);
  • Info Systems are interoperable across the DND/CAF, with OGDs and with allies;
  • Info Systems are integrated into existing operational and corporate environments;
  • Info Systems are in good technical condition (healthy);
  • Info Systems are available (functioning);
  • Clients have access to timely and quality support/services; and
  • Decisions regarding IS are made by all service providers based on risk management and strategic direction.

Intermediate Outcomes:

  • Info Systems address a capability gap or deficiency;
  • Info Systems meet clients’ operational requirements; and
  • Info Systems Services and the lifecycle approach are consolidated and operationally focused.
Resource Utilization (Efficiency/Economy)
  1. How much is being spent on the IS Lifecycle Program? To what extent are cost effective means being utilized in the production of outputs?
  2. Does the IS Lifecycle Program apply current business practices to operate in the most efficient manner?
  3. Are the IS Lifecycle Program’s expected outcomes being achieved in the most economical manner?

Table B-1. Evaluation Issues and Questions

Table Summary

Table B-one provides information on the evaluation issues identified and the questions developed.

The evaluation was conducted using internal resources. The approach and level of effort for the evaluation were also complementary to other past/present evaluations, audits and/or reviews, as identified during the planning phase. The evaluation leveraged existing information from these studies/reviews.

An Evaluation Advisory Committee (EAC) was convened for this project to ensure that subject matter expertise and program staff input were available to enable the production of useful evaluation findings and recommendations. The EAC represented all major stakeholders and played an important role as an advisory body but did not act in a decision-making capacity. The L1s represented in the EAC and their members included:

  • ADM(IM) – DGIMO, Director Business Management
  • ADM(IM) – DGEAS, Director Applications Development and Support
  • ADM(IM) – DGIMTSP, Director Defence Information Management Planning
  • ADM(IM) – DGIMPD, Director Project Management Info Systems
  • ADM(Mat) – Director Electronic Systems Procurement
  • RCAF – Director Air Domain Development
  • CA – Director Land Command Information
  • RCN – Member representing Director Naval Strategic Management

1.1 Overview of Data Collection Methods

In order to maximize the possibility of generating useful, valid and relevant evaluation findings, a mixed methods approach was used for this evaluation. This approach allowed for triangulation (i.e., convergence of results across lines of evidence) and complementarity (i.e., developing better understanding by exploring different facets of a complex issue) through the use of both qualitative and quantitative methods. Data collected from each research method was collated and analyzed against the evaluation issues and questions.

Data collection methods were selected based on the data required to address the specific performance indicators, as listed in the Evaluation Matrix (Annex D). The following collection methods were used:

  • Literature and document review;
  • Key informant interviews;
  • Comparative analysis (benchmarking) with other government departments; and
  • Administrative and financial data review.

1.2 Details on Data Collection Methods

1.2.1 Literature and Document Review

Information drawn from this line of evidence was synthesized and integrated into the evaluation to provide context and to complement other lines of evidence in the assessment of relevance and performance. A preliminary review was conducted as part of the planning phase of the Evaluation to garner a foundational understanding of the Information Systems Lifecycle Program. A comprehensive review was undertaken as part of the conduct phase of the evaluation focusing on the relevance and performance of program activities. A total of 114 documents were reviewed in the conduct phase. Types of documents reviewed included:

  • Departmental strategic documents;
  • Policy documents;
  • Operational reports;
  • Project documents;
  • Reports and presentations;
  • Industry reports; and
  • Other studies.

Unclassified operational documents were reviewed to specifically assess information systems interoperability and capability gaps and deficiencies. These reports relate to nine different operations and one Joint Exercise (JOINTEX). Specific reports reviewed are listed in Table B-2.

Operation/ Exercise Type of Operation Location Reports Reviewed
Op LENTUS Domestic Alberta
  1. JTFW Post Op Report – Alberta Floods – June 2013
  2. Air Component Coord Element (W) Post Op Report – August 2013
  3. 41 Brigade Commanders Post Op Report – 2013
  4. 1 CMBG Commanders Post Op Report – 2013
  5. Lessons Learned Report – 2013
Op PODIUM Domestic Vancouver
  1. Vancouver – Maritime Component Command POR – V2010
  2. Vancouver – Maritime Component Command POR – V2010 – Annex C – CIS – 2010
  3. Vancouver – LCC Post Operation Report 2010 04 30 – 2010
  4. Vancouver – OpRes West – Lessons Observed – 2010
Op ATTENTION International Afghanistan
  1. Topic Lessons Learned – Cdn Contribution Trg Mission Afghanistan Roto 1 – June 2012
Op ATHENA International Afghanistan
  1. Afghanistan – Theatre Lesson Report – working with the US – July 2010
Op LIBECCIO International Libya
  1. End Of Tour Report – Libya – November 2011
Op MOBILE International Libya
  1. Lesson Finding Report – Libya – October 2011
Op LOBE International Libya
  1. Libya – TFT End of Tour Report – Annex G – CIS – June 2012
Op IMPACT International Iraq
  1. Iraq – Air Task Force – End of Tour Report – Roto 2 – March 26, 2016
Op RENAISSANCE International Philippines
  1. End Tour Report – Philippines Tropical Storm – November 2013
  2. Air Task Force Post Mission Report – 2013
  3. Lessons Learned Report – 2013
JOINTEX 15 N/A N/A
  1. JOINTEX 15 – Hotwash – JOINTEX – November 2015

Table B-2. List of Operational Documents Reviewed.

Table Summary

Table B-two outlines the operational documents reviewed by the evaluation. It has four columns and eleven rows. From the left, column one lists the operation/exercise. Column two lists the type of operation, distinguishing between domestic and international. Column three lists the location, and column four lists the reports reviewed.

1.2.2 Key Informant Interviews

Key informant interviews undertaken with service providers, user clients and other clients served as important sources of qualitative information and provided context to information collected through other methods. During the scoping phase, 13 individual and group interviews were conducted. In the conduct phase, feedback was obtained from 24 stakeholders (12 service providers, nine clients, and three representatives from Other Government Departments (OGDs)). Because of the decentralized nature of the program, some stakeholders were both service providers and clients and responded separately to two different sets of questions. Table B-3 lists the types of interviewees and the number and names of the L1 organizations interviewed.

Individual interviews were conducted in person, by telephone, or in writing. In some cases, written replies to evaluation questionnaires were received directly from interviewees by email and, when necessary, evaluators posed follow-up questions by phone or in person.

Interviews were conducted with OGDs in order to compare effectiveness and benchmark expenditures. Although at least five OGDs were invited to participate in an interview to share their experiences on the effectiveness of their programs and interoperability with the information systems of the DND/CAF, only one department agreed to share its views on these topics. Two departments agreed to participate in the expenditure data benchmarking exercise and take part in discussions regarding IT spending only. A discussion was also held with TBS’s CIO Branch to obtain the central agency’s view on governance over IT spending and Government of Canada policies related to IT management, roles and responsibilities, and IT spending. A discussion with SSC was also sought but yielded only a meeting with representatives from its Office of Audit and Evaluation.

Interviewee Type Number of L1 Organizations Providing Feedback List of Organizations Interviewed
Service Providers 12
  • ADM(IM) (4 L2s)
  • CA (DLCI and MCSC)
  • RCN (MARLANT and MARPAC)
  • RCAF
  • CFINTCOM
  • CJOC
  • SOFCOM
User Clients 7
  • ADM(IM) (DGIMO)
  • CA
  • RCN
  • RCAF
  • CFINTCOM
  • CJOC
  • SOFCOM
Other clients 2
  • CFD
  • ADM(Pol)
OGDs 3
  • CBSA
  • TBS (CIOB)
  • RCMP
Total 24

Table B-3. Organizations Interviewed.

Table Summary

Table B-three outlines the organizations interviewed. It has three columns and six rows. The first column from the left specifies the interviewee type. The second column lists the number of Level One organizations providing feedback within that category. The third column provides a full list of organizations interviewed. Rows two to five list the service providers, user clients, other clients and other government departments interviewed. Row six provides the total number of organizations interviewed – twenty-four.

1.2.3 Comparative Analysis with OGDs

As discussed in Section 2.0 – Limitations, comparison with similar programs of other nations’ militaries was considered during the scoping phase of the evaluation but was deemed not to be appropriate due to external factors that would have resulted in a less useful comparison.

As such, comparisons with OGDs within Canada were deemed to be more appropriate, particularly because Information Technology management and spending within the Government of Canada is centrally governed by the TBS’s CIO Branch. In order to make comparisons with OGDs more relevant, operational-type departments with security-related mandates were selected.

Financial and human resources data about OGDs was obtained directly from the department in question and from publicly available government reports, such as IT Plans and Departmental Performance Reports. Questions regarding the information contained in documents were answered by e-mail directly by representatives from these departments. Representatives from these OGDs also confirmed assumptions and interpretations made by the evaluation to ensure a reasonable comparison.

In comparing the DND/CAF expenditures with OGDs, IT Expenditure Reports produced annually and provided to TBS were used. Using IT Expenditure Reports provided a reasonable comparison, as all departments are required to follow the same TBS methodology for capturing costs. IT Expenditure Reports for OGDs were obtained directly from departments, as these reports were not previously publicly available.

All departmental IT expenditures, including expenditures normally excluded from the IT Expenditure Reports, were used for this benchmarking exercise. Generally, TBS does not require departments to report expenditures related to Top Secret and Protected C (and above) systems. Therefore, for this benchmarking exercise, those costs had to be added. For example, DND’s 2014/15 IT Expenditure Report excluded $411 million of IT expenditures that were confirmed by DND stakeholders and added for this benchmarking exercise.

Exact exclusion figures for DND for FY 2012/13 and FY 2013/14 were not available. An assumption was made that exclusions would normally be similar from year to year. Therefore, the 2014/15 exclusion proportion (70.22 percent of reported expenditures) was applied to DND IT expenditures for FY 2012/13 and FY 2013/14 to more accurately reflect total IT spending in those years. Table B-4 illustrates this calculation and the actual amounts added to previous years under this assumption.

Fiscal Year | (A) Actual IT expenditures reported in IT Expenditure Reports | (B) Actual IT expenditures excluded from IT Expenditure Reports | (C) Proportion of IT exclusions (B)/(A) x 100 | (D) Amount added to estimate total for the year (70.22% of (A)) | (E) Total IT expenditures (including exclusions)
2014/15 | $585,597,262 | $411,206,665 | 70.22% | Not applicable | $996,803,927 (A+B)
2013/14 | $538,121,525 | Not available | Not available | $377,869,180.58 | $915,990,706 (A+D)
2012/13 | $549,859,626 | Not available | Not available | $386,111,680.46 | $935,971,306 (A+D)

Table B-4. Calculation of DND IT Exclusions for 2012/13 to 2014/15.

Table Summary

The following table has four rows and six columns. The first row is a series of headers denoting the contents of each column. Starting from the left, the first column lists the fiscal year. Column two (A) lists the actual IT expenditures reported in IT Expenditure Reports. Column three (B) lists the actual IT expenditures excluded from IT Expenditure Reports. Column four (C) lists the proportion of IT exclusions, calculated as (B)/(A) x 100. Column five (D) lists the amount added to estimate the total for the year, 70.22% of (A). Finally, column six (E) lists the total IT expenditures, including exclusions. The second, third and fourth rows of column one each list a fiscal year: FY 2014/15 in row two, FY 2013/14 in row three, and FY 2012/13 in row four.

To account for IT expenditure exclusions, RCMP advised the evaluation to add 25 percent to the amount of IT expenditures reported in their annual IT Expenditure Reports. CBSA reported that they did not have any IT expenditure exclusions to add.
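As a cross-check, the adjustments described above can be reproduced with a short calculation. The following is a minimal sketch (not a tool used by the evaluation); the DND figures are taken from Table B-4, the 25 percent RCMP uplift from the paragraph above, and the RCMP dollar figure is purely hypothetical.

```python
# Minimal sketch (not an official tool): reproduce the exclusion adjustments
# described in this annex. DND figures are from Table B-4; the RCMP dollar
# figure below is purely hypothetical.

# (A) DND IT expenditures reported in IT Expenditure Reports, by fiscal year
dnd_reported = {
    "2014/15": 585_597_262,
    "2013/14": 538_121_525,
    "2012/13": 549_859_626,
}
dnd_excluded_2014_15 = 411_206_665  # (B), known only for FY 2014/15

# (C) proportion of exclusions observed in 2014/15 (about 70.22 percent)
exclusion_ratio = dnd_excluded_2014_15 / dnd_reported["2014/15"]

for fy, reported in dnd_reported.items():
    if fy == "2014/15":
        total = reported + dnd_excluded_2014_15   # (E) = (A) + (B)
    else:
        total = reported * (1 + exclusion_ratio)  # (E) = (A) + (D)
    print(f"DND {fy}: estimated total IT expenditures ${total:,.0f}")

# RCMP advised adding 25 percent to its reported expenditures;
# CBSA reported no exclusions, so its figures are used as reported.
rcmp_reported = 100_000_000                       # hypothetical figure
rcmp_total = rcmp_reported * 1.25
print(f"RCMP (hypothetical): ${rcmp_total:,.0f}")
```

Run as-is, the sketch reproduces the Table B-4 totals to within rounding of the 70.22 percent figure.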

1.2.4 Analysis of Administrative, Performance, Financial, and Human Resources Data

Information Systems Lifecycle Program administrative, performance, financial and human resource data was used to develop a program profile, as well as to answer evaluation questions related to performance (effectiveness, efficiency, and economy). The data covering FY 2010/11 to FY 2014/15 was extracted from multiple systems and reports including:

  • Defence Resource Management Information System (DRMIS);
  • Assyst;
  • Request for Change (RFC) Database;
  • Defence Renewal Team (DRT) trimestral reports;
  • IT Expenditure Reports; and
  • Capability Investment Database (CID).

Because of the limitations in accessing data from decommissioned systems such as Remedy and Support Magic (as described in Section 2.0 – Limitations), some available and relevant data from Assyst for FY 2015/16 was obtained. Performance data gaps for FY 2010/11 to FY 2013/14 were filled with estimates provided directly by stakeholders. When estimates were provided, efforts were made to compare them with actual data from Assyst to ensure that the estimates could be used reliably. If estimates and actuals were similar or followed a similar trend, the evaluation treated the estimates as the most accurate data available.
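Purely as an illustration of this kind of consistency check (the tolerance threshold and the stakeholder figure below are assumptions, not the evaluation's documented criteria), a comparison of estimates against actuals could be sketched as follows.

```python
# Illustrative sketch only: accept a stakeholder estimate when it falls
# within a tolerance band around the known actual. The 15 percent
# tolerance and the stakeholder figure are assumptions.
def within_tolerance(estimate: float, actual: float, tolerance: float = 0.15) -> bool:
    """True when the estimate deviates from the actual by at most `tolerance`."""
    return abs(estimate - actual) <= tolerance * actual

# Example: a hypothetical stakeholder estimate checked against the
# FY 2015/16 Assyst actual of 391,483 user support requests (Footnote 17).
assyst_actual = 391_483
stakeholder_estimate = 380_000  # hypothetical
print(within_tolerance(stakeholder_estimate, assyst_actual))  # True
```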

Financial attributions to the PAA were used in the analysis of program efficiency and economy but trending analysis using this data was limited, as described in Section 2 - Limitations. Therefore, some analysis was limited to the use of financial and FTE data attributed to PAA 4.4 – Information Systems Lifecycle in FY 2014/15. This was considered the most reliable fiscal year of PAA data.

Efficiency and economy analysis relied heavily on annual IT Expenditure Reports produced for TBS. These reports include costs outside the scope of this evaluation (i.e., IT expenditures considered internal services and normally charged to PAA 6.7 – Information Technology, as well as IT expenditures incurred by L1s who are not directly involved in military operations).[58] Nevertheless, the expenditures outside the scope of the evaluation were minimal, representing only approximately 1 percent of total IT expenditures, which justified the use of IT Expenditure Report data.

2.0 Limitations

The main limitations of this evaluation and mitigation strategies implemented are described in Table B-5.

Limitation Mitigation Strategy
  1. Consistency of Financial Information

Financial attributions to PAA 4.4 across service delivery partners and across years were found to be inaccurate and inconsistent. There were several reasons for this:

  • The introduction of the new departmental PAA structure in FY 2012/13;
  • Challenges experienced by stakeholders with regard to understanding what specific activities to attribute to each specific PAA sub-program; and
  • Difficulty in differentiating between what would be considered IT expenditures in direct support to military operations compared to what would be considered IT expenditures as an internal service.

Consultations during the planning and scoping phase of the evaluation prompted some stakeholders to clean up their financial data, making it slightly more accurate and reliable.

Each department annually submits an IT Expenditure Report to TBS. This provided the ability to conduct some limited trend analysis.

The financial data was discussed with program financial staff to help clarify discrepancies, provide context, and identify misunderstandings by the evaluation team.

  2. Availability of Historical Performance Information

Historical performance information for this program was unavailable due to the decommissioning of databases that were replaced by the EITSM tool (Assyst) in FY 2014/15. Therefore, trend analysis for performance data was also limited.

The report relied upon multiple sources of administrative data, including historical data provided directly by program staff:

  • Stakeholders responded to a questionnaire, providing actuals and estimates for FY 2010/11 to FY 2014/15. They also provided contextual annotations to supplement the information provided.
  • ADM(IM) provided statistics regarding enterprise applications managed by DGEAS, describing the technical condition of applications in relation to their business value.

Assyst data for FY 2015/16 was also used to complement historical data provided by stakeholders.

  3. Decentralized Program Delivery

The IS Lifecycle Program is not delivered by a single service provider, but rather by a number of stakeholders across the department with varying levels of governance and financial resources. Although ADM(IM) provides functional guidance, each stakeholder has a different reporting structure and requirements up to their L1. This made it challenging to evaluate the program as a whole, as each stakeholder has a different process they follow to meet the common objectives of the program. On the other hand, this also presents a unique opportunity for stakeholders to learn from each other and implement best practices across the department.

A common logic model was developed in consultation with EAC members to include all stakeholders and all the major activities involved in the delivery of the program. The evaluation focussed on the conduct of these main activities and their impacts to ensure that the focus was not on individual processes, but rather on the achievement of common outcomes.

All stakeholders were asked to confirm that the logic model included their contributions to the program.

  4. Benchmarking

Benchmarking was a challenge for this evaluation, as the program is unique in many facets, including its decentralized delivery design, its focus on outputs for military use, and its combination of civilian and military stakeholders and clients.

Although the TBS Chief Information Officer Branch produces a comparison of IT spending between departments, the comparison is based primarily on the amount of spending and does not take into consideration the nature of the business of each department.

During the scoping phase of the evaluation, stakeholders were asked for advice on potential comparators. In general, stakeholders recommended benchmarking against other federal government departments but cautioned against making comparisons due to the uniqueness of DND as a combined military and civilian organization with a relatively large mandate and budget.

During the planning phase, the evaluation team conducted preliminary research on IT spending by other federal departments and determined that the best comparators were operational-type departments with security-related mandates. Comparisons with departments such as RCMP, CBSA, Public Safety Canada and Department of Fisheries and Oceans were deemed to be appropriate, although not all departments chose to participate.

Table B-5. Limitations and Mitigation Strategies.


Table Summary

Table B-five has two columns and four rows. The left-hand column lists the limitations and the right-hand column lists the mitigation strategies. Rows one to four list the four limitations identified. Read across each row to determine the limitations identified and their corresponding mitigation strategies.

Annex C—Logic Model

Figure C-1. Logic Model for the Information Systems Lifecycle Program. This flowchart shows the relationship between the program’s main activities, outputs and expected outcomes.


Figure Summary

Figure C-one is a flowchart diagram that is to be read from bottom to top. It is a logic model that demonstrates the relationships between inputs (resources invested), the activities (transformation of inputs into procedure), the outputs (results and tangible deliverables) and the expected immediate, intermediate and ultimate outcomes (real impacts) generated by the program.

The activities of this flowchart are divided into four separate streams of activities that later feed into three clusters of outputs. These three clusters of outputs contribute to four clusters of immediate outcomes, which then contribute to three intermediate outcomes. These three intermediate outcomes all lead to a single ultimate outcome of “IS infrastructure and applications are available in the right quantity, mix, and condition to support military operations, defence services & contributions to Government, and readiness and training activities.”

The logic model can be broken down into specific sections:

The four areas of activities are: PAA 4.4.1 Portfolio Management; PAA 4.4.2 Acquisition, Development and Deployment; PAA 4.4.3 Systems Management and User Support; and PAA 4.4.4 Strategic Coordination, Development and Control.

PAA 4.4.1 Portfolio Management

  • Identify or define client requirements
  • Conduct business intake and demand management for all requirements (prioritization)
  • Coordinate activities with internal (across DND and within the L1) and external organizations (national and international allies)
  • Manage contractors/contracts and license agreements
  • Integrate with other service providers

PAA 4.4.2 Acquisition, Development and Deployment

  • Design and develop IS architectures
  • Design and develop networks
  • Conduct project management activities (from definition to closeout)
  • Develop IS (including software development and hardware/system build, configuration and customization)
  • Procure IS equipment or services
  • Test IS
  • Integrate systems into existing architectures within the operational environment (including installation and configuration)
  • Deploy/implement systems
  • Conduct/prepare security assessments of IS for accreditation/authorization
  • Integrate with other service providers (SSC)

PAA 4.4.3 Systems Management and User Support

  • Conduct in-service support (maintain, repair/fix, test and upgrade/update systems, and minor enhancements)
  • Provide technical expertise and engineering services in the design and build phase of new capabilities, development projects and initiatives (tiers 3 and 4)
  • Integrate with other service providers (SSC)
  • Divest IS

PAA 4.4.4 Strategic Coordination, Development and Control

  • Provide governance, oversight and coordination
  • Develop and promulgate guidance, direction, plans, standards and policies
  • Integrate departmental strategic direction
  • Conduct performance management (monitoring and reporting)
  • Provide advice to other service providers (with respect to IT infrastructure engineering, system design, improved delivery, evaluation, alignment, project management and procurement)
  • Conduct oversight and compliance of the Security Assessment and Authorization Program
  • Conduct resource and business relationship management
  • Integrate with other service providers (SSC)

These four areas of activity directly contribute to three separate sets of outputs. PAA 4.4.1 Portfolio Management and PAA 4.4.2 Acquisition, Development and Deployment contribute to Major and Minor IS Projects, which include System Management Technologies and Security Technologies, systems that support C4ISR & Cyber, Distributed Technologies, Application Technologies (Enterprise and User-Specific), and IS Networks and Architectures. PAA 4.4.3 Systems Management and User Support contributes directly to Engineering/Break/Fix Services and Client Support Services (Service Desk) outputs. PAA 4.4.4 Strategic Coordination, Development and Control contributes to Security Certificates; Business Plans & Reports; Directives, Plans and Functional Planning Guidance; and Expertise and Advice.

These outputs feed directly into immediate outcomes. The outputs categorized under Major and Minor IS Projects contribute directly to “IS Projects meet specifications (technical requirements),” “Info Systems are delivered in a timely manner,” “Info Systems are interoperable across the DND/CAF, with OGDs and with allies,” and “Info Systems are integrated into existing operational and corporate environments.” The Engineering/Break/Fix Services and Client Support Services (Service Desk) outputs contribute to “Info Systems are in good technical condition (‘healthy’),” “Info Systems are available,” and “Clients have access to timely and quality support/services.” The Security Certificates output leads to “Info Systems are secure/meet baseline IT security requirements.” The Business Plans & Reports; Directives, Plans and Functional Planning Guidance; and Expertise and Advice outputs feed into “Decisions regarding IS are made by all service providers based on risk management and strategic direction.”

All of these immediate outcomes contribute directly to three intermediate outcomes: “Info Systems address a capability gap or deficiency (IS evolves),” “Info Systems meet clients’ operational requirements (IS is sustained),” and “IS Services and the IS lifecycle approach are consolidated and operationally focused.” These three intermediate outcomes feed into a single ultimate outcome: “IS infrastructure & applications are available in the right quantity, mix and condition to support military operations, defence services & contributions to Government, and readiness and training activities.”

Annex D—Evaluation Matrix

Evaluation Matrix - Relevance
Evaluation Questions Indicators Literature, Document Review Key Informant Interviews Administrative and Financial Data Comparative Analysis with OGDs
1. How does the IS Lifecycle Program address a demonstrable need for DND/CAF? 1.1 Evidence that outputs produced are necessary for CAF operations. Yes No No No
1.2. Number of Service Desk requests received (trend over time) (incl. incidents and service requests). No Yes Yes No
1.3. Number of break-fix services provided (trend over time) (isolated incidents only). No Yes Yes No
1.4. Number and/or value of IS projects managed by the program (trend over time). No Yes Yes Yes
1.5. Number and/or value of in-service support contracts managed by the program (trend over time). No Yes Yes No
1.6. Existence of other departments and organizations capable of providing appropriate and adequate IS to meet the needs of DND/CAF. Yes Yes No No
2. How do the IS Lifecycle Program objectives align with the priorities of the GC and the strategic outcomes of DND/CAF? 2.1 Degree of alignment between IS Lifecycle Program objectives and federal government priorities. Yes No No No
2.2 Degree of alignment between IS Lifecycle Program objectives and DND/CAF strategic outcomes. Yes No No No
3. How do the IS Lifecycle Program activities undertaken within DND align with federal government roles and responsibilities? 3.1 Degree of alignment between DND/CAF IT policies and TB IT framework and policies. Yes No No No

Table D-1. Evaluation Matrix—Relevance. This table indicates the data collection methods used to assess the evaluation issues and questions for determining the Information Systems Lifecycle Program’s relevance.

Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the Information Systems Lifecycle Program’s relevance. This table has six columns and eleven rows, including the titles. The left-hand column lists the evaluation issues/questions applicable to the determination of the relevance of the program; the second column lists the performance indicators applicable to the questions while the last four columns indicate the potential data collection methods to be used in the evaluation. To learn these data collection methods, select an evaluation question and read the columns to its right to learn the data collection methods that correspond to it. Those data collection methods are: Literature, Document Review; Key Informant Interviews; Administrative and Financial Data; and Comparative Analysis with OGDs. When the word “yes” appears in the matrix, it means that this data collection method has been used to assess the questions and to find evidence pertaining to the stated indicators.

Evaluation Matrix - Performance (Effectiveness)
Evaluation Questions Indicators Literature, Document Review Key Informant Interviews Administrative and Financial Data Comparative Analysis with OGDs
4. Is the IS Lifecycle Program meeting the expected outcomes as defined in the Logic Model? 4.1 Documented acceptance of project deliverables. Yes Yes No No
4.2 Degree of alignment between project deliverable and project specifications. Yes Yes No No
4.3 Client satisfaction and impacts after modifications to initial specifications. Yes Yes No No
4.4 Number of validated client requirements delivered (type 3 or 4). Yes Yes No No
4.5 Percentage of Info Systems Capital Projects (Defence Specific) on Schedule (original or adjusted timeline). No Yes Yes No
4.6 Percentage of Info Systems Program projects (Defence Specific) in "red" status over 3 months. No Yes Yes No
4.7 Impact of delayed IS project delivery on operations. Yes Yes No No
4.8 Impact of IS Project timelines on delivery of appropriate and current IS. Yes Yes No No
4.9 Average length of IS Projects (trend over time). No Yes Yes No
4.10 Variance between initial project timeline and real project timeline. No Yes Yes No
4.11 Degree to which IS enables effective interoperability with allies and its impact on operations. No Yes No No
4.12 Degree to which IS enables interoperability with OGDs and its impact on operations. No Yes No No
4.13 Degree to which IS enables interoperability with other environments and its impact on operations. No Yes No No
4.14 Extent, impact and root causes of not having integrated systems. No Yes No No
4.15 IS acquired or developed that do not fit into existing networks and platforms and associated impact. Yes Yes No Yes
4.16 Existence and effectiveness of oversight mechanisms to ensure integration. Yes No No No
4.17 Percentage of DND Application Portfolio that is considered "healthy." No Yes Yes No
4.18 Percentage of applications with high business value that are in low technical condition. No Yes Yes No
4.19 Client and other stakeholders' perceptions of the reliability of IS. No Yes No No
4.20 Existence and usefulness of an asset management system or tool to assess the age of IS assets. Yes Yes No No
4.21 Percentage of time info system networks (DND owned and operated) are available. No Yes Yes No
4.22 Existence of and adherence to in-service support schedules and impacts on operations. No Yes Yes No
4.23 Time between "break" and "fix" and impacts on operations. No Yes Yes No
4.24 Percentage of tickets resolved within Service Level Agreement standard / industry standards. No Yes Yes Yes
4.25 Number of unresolved tickets (trend over time). No Yes Yes Yes
4.26 Extent of client satisfaction with Service Desk services. No Yes Yes Yes
4.27 Percentage of calls (tickets) re-opened. No Yes Yes Yes
4.28 Percentage of first-time resolution calls. No Yes Yes Yes
4.29 IM/IT security policies are current, relevant, and consistent with Government of Canada directives. Not assessed Not assessed Not assessed Not assessed
4.30 Percentage of IS operating with appropriate SA&A certificates or with Interim Authorization to Process. Not assessed Not assessed Not assessed Not assessed
4.31 Existence and results of assessments and monitoring activities to ensure systems are secure. Not assessed Not assessed Not assessed Not assessed
4.32 Acceptance and use of a risk management framework. Yes Yes No No
4.33 Existence of and alignment of IM/IT plans with functional planning guidance. Yes Yes No No
4.34 Percentage of Defence Info Systems score on the Coordination, Development, and Control Performance Evaluation Index. Yes Yes No No
4.35 Client acceptance of IS. Yes Yes No No
4.36 Extent of client satisfaction with the quality and utility of IS to support operations. Yes Yes No No
4.37 Degree to which technical specifications satisfy operational requirements. Yes Yes No No
4.38 Number of Requests for Change and Un-forecasted Operational Requirements (trend over time). No Yes Yes No
4.39 Extent to which the IS Portfolio (applications, systems and networks) has been consolidated. Yes Yes No No
4.40 Extent to which Service Desk/Desktop Support Services have been consolidated. Yes Yes No No
4.41 Service provider and other stakeholders’ perceptions on the need for consolidated IS products, services and approach and potential impacts (positive or negative) on operations. No Yes No No
4.42 Evidence and perceptions that the IS lifecycle has produced IS adequate/appropriate to operations. No Yes No No

Table D-2. Evaluation Matrix—Performance (Effectiveness). This table indicates the data collection methods used to assess the evaluation issues and questions for determining the Information Systems Lifecycle Program’s performance in terms of achievement of outcomes (effectiveness).

Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the Information Systems Lifecycle Program’s performance in terms of achievement of outcomes (effectiveness). This table has six columns and forty-five rows, including the titles. The left-hand column lists the evaluation issues/questions applicable to the determination of the performance of the program; the second column lists the performance indicators applicable to the questions while the last four columns indicate the potential data collection methods to be used in the evaluation. To learn these data collection methods, select an evaluation question and read the columns to its right to learn the data collection methods that correspond to it. Those data collection methods are: Literature, Document Review; Key Informant Interviews; Administrative and Financial Data; and Comparative Analysis with OGDs. When the word “yes” appears in the matrix, it means that this data collection method has been used to assess the questions and to find evidence pertaining to the stated indicators.

Evaluation Matrix - Performance (Demonstration of Efficiency)
Evaluation Questions Indicators Literature, Document Review Key Informant Interviews Administrative and Financial Data Comparative Analysis with OGDs
5. How much is being spent on the IS Lifecycle Program? To what extent are cost effective means being utilized in the production of outputs?

5.1 Comparison of dollars and resources spent in the IS Lifecycle Program broken down by Sub-Sub-Program (2 – 5 years depending):

  • Trends over time;
  • Comparison of relative trends of SSPs; and
  • Comparison of number of people to run the IS program with other similar organizations.
No Yes Yes No
5.2 Comparison of dollars/resources spent by ECs on IS Lifecycle Program broken down by SSP. No Yes Yes No
5.3 ADM(IM) dollars/resources spent in support of each EC. No Yes Yes No

5.4 Costs per output including:

  • Trends in acquisition costs for major and minor projects (project management costs vs project value);
  • Trends in service desk costs per DND/CAF member;
  • Comparison of selected service desks; and
  • Trends in dollars/resources to provide strategic coordination, development and guidance.
No Yes Yes No
6. Does the IS Lifecycle Program apply current business practices to operate in the most efficient manner? 6.1 Existence of a PMF that is used regularly by the program to monitor performance and improve program delivery. Yes Yes No No
6.2 Existence and use of a review process for updating guidance, direction, plans, standards and policies. No Yes No Yes
6.3 Evidence of a systematic review and application of best practices. No Yes No Yes
6.4 Existence of mechanisms and initiatives to improve efficiencies. No Yes No Yes
7. Are the IS Lifecycle Program's expected outcomes being achieved in the most economical manner? 7.1 Trends in overall program cost as a percentage of the departmental cost. No Yes Yes No
7.2 Trends in program cost per departmental FTE. No Yes Yes No
7.3 Trends in departmental FTE per Program FTE. No Yes Yes No
7.4 Trends in program cost per program FTE. No Yes Yes No
7.5 Trends in program FTE salary per program FTE. No Yes Yes No
7.6 Service provider and other stakeholders’ perceptions on initiatives undertaken to deliver the program as economically as possible No Yes No No
7.7 Evidence of alternative service delivery models (pros and cons). Yes Yes No No
7.8 Benchmarking of IT expenditures: Comparison to OGDs (RCMP, CBSA, DFO); and Comparison with industry. Yes Yes Yes Yes

Table D-3. Evaluation Matrix—Performance (Efficiency and Economy). This table indicates the data collection methods used to assess the evaluation issues and questions for determining the Information Systems Lifecycle Program’s performance in terms of efficiency and economy.


Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the Information Systems Lifecycle Program’s performance in terms of efficiency and economy. This table has six columns and eighteen rows, including the titles. The left-hand column lists the evaluation issues/questions applicable to the determination of the performance of the program; the second column lists the performance indicators applicable to the questions while the last four columns indicate the potential data collection methods to be used in the evaluation. To learn these data collection methods, select an evaluation question and read the columns to its right to learn the data collection methods that correspond to it. Those data collection methods are: Literature, Document Review; Key Informant Interviews; Administrative and Financial Data; and Comparative Analysis with OGDs. When the word “yes” appears in the matrix, it means that this data collection method has been used to assess the questions and to find evidence pertaining to the stated indicators.

Annex E—IS Lifecycle Program Major Service Providers

The specific L2 and L3 organizations responsible for delivering the products and services of the program and their responsibilities are:

  1. ADM(IM). All L2 organizations under ADM(IM) are major service providers, including:
    • Director General Information Management Technology and Strategic Planning: Responsible for engineering and integrating the DND/CAF IM/IT architecture; providing a centralized client portfolio management function (business intake and demand management); coordinating the departmental IT security program; and providing strategic direction and leadership through the implementation of governance structure, policies, guidelines and procedures.
    • Director General Information Management Project Delivery: Responsible for project managing all joint and enterprise IS projects, including planning, designing, developing, deploying, and implementing changes.
    • Director General Information Management Enterprise Application Services: In collaboration with its institutional partners, develops, delivers, and supports information management applications and enterprise-level Resource Management solutions.
    • Director General Information Management Operations: As a service provider, is responsible for delivering network infrastructure support to CAF classified networks, delivering user client support services at the national and regional level, providing network monitoring and defence, and providing an interface with Shared Services Canada. Director General Information Management Operations is also a user client, as the organization spans all levels of command (tactical, operational, and strategic) in order to coordinate, support, and provide the command and control and intelligence capabilities required for military operations. As a user client, this organization operationalizes systems to support signals intelligence, electronic warfare and cyber operations.
  2. CA. Under the Chief of Staff Army Strategy, Director Land Command and Information (DLCI) leads the Army effort to provide services to meet the specific information systems needs of the CA user clients. The CA also has a software development organization known as Military Command Software Centre (MCSC). Originally created to develop and support the specific software requirements of the Army, MCSC services and applications are now accessed by various other L1 organizations across the department. The application suite includes Monitor MASS (now MCS Personnel) and Canadian Forces Task Plans and Operations (CFTPO), among others.
  3. RCAF. Under the Director General Air Force Development, the Director Air Domain Development (DADD) is responsible for providing information systems services to meet the specific operational needs of RCAF user clients.
  4. RCN. Under the Director General Naval Strategic Readiness, the Director Naval Information Management Requirements (DNIMR) is responsible for providing information systems services to meet the specific operational needs of RCN user clients.

Under the environmental commands, DLCI, DADD, and DNIMR are similarly responsible for: Providing strategic direction and leadership for the implementation of departmental IT policies, guidelines and procedures; identifying and prioritizing IS requirements for their respective environments; conducting portfolio management; recapitalizing IT equipment; managing minor projects; sponsoring major capital projects; and providing user client support through regional service centers across Canada.

Other L1 organizations are minor service providers and user clients of the program; traditionally, these L1s have not attributed their IS lifecycle management activities to PAA 4.4 – IS Lifecycle.

Shared Services Canada is considered a major service provider, but is not considered a main stakeholder of the program because it is an organization external to the DND/CAF.



Footnote 1 In particular, DRT initiatives: 3.1 – Optimize IM/IT Service Delivery (i.e. service desk consolidation), 3.2 – Application Portfolio Management (i.e., application consolidation), and 3.3 – Rationalize Defence IM/IT Program (i.e., program governance).

Footnote 2 The Defence Renewal initiative is the DND/CAF effort to comprehensively transform the major business processes of the department, including the IM/IT program, to create a lean and efficient organization that can generate savings to be reinvested in military capabilities and readiness. (Source: Canada. Department of National Defence. Defence Renewal Charter. Consulted October 2013.)

Footnote 3 A federated model tries to strike a balance between a centralized and decentralized model. In a federated model one organization provides central oversight and strategic guidance, while other organizations deliver products and services in alignment with that guidance but with enough flexibility to meet the specific needs of user clients. (Source: KPMG. Defence Renewal Change Management Services: Lean Headquarters Interim Report. Consulted on June 14, 2015.)

Footnote 4 The term “order-in-council” refers to a legal instrument generated by the Governor-in-Council and constitutes a formal recommendation of Cabinet that is approved and signed by the Governor General. Orders-in-council address a wide range of administrative and legislative matters including transactions between departments. (Source: Canada. Library and Archives Canada. Databases: Orders in Council. Online: 2016. http://www.bac-lac.gc.ca/eng/discover/politics-government/orders-council/Pages/orders-in-council.aspx. Consulted on July 25, 2016.)

Footnote 5 For the purposes of this evaluation the term “Information Systems” (IS) will be used and will include all IT infrastructure (hardware), applications (software) and networks that are interconnected for the purposes of storing, organizing and managing information to allow those involved in military operations to retrieve and transmit required information through the appropriate means and methods.

Footnote 6 Source: Canada. Department of National Defence. PAA Attribution Rules. November 2014.

Footnote 7 These are technologies used to manage and monitor the networks, servers, applications and other elements of the IT infrastructure.

Footnote 8 These are technologies to preserve the confidentiality, integrity, availability, intended use and value of electronically stored, processed or transmitted information.

Footnote 9 These are user client technologies including workplace technology devices (desktops, laptops, tablets, etc.) and associated operating systems, corporate and business-specific applications, and services (file, image, print, etc.) that are used in direct support of operations.

Footnote 10 In this evaluation, definitions for incidents and service requests are based on the Information Technology Infrastructure Library. (Sources: ITIL: Continual Service Improvement (CSI). London: TSO, the Stationery Office, 2010. Print.; ITIL: Service Strategy (SS). London: TSO (The Stationery Office), 2007. Print.)

Footnote 11 Functional authority is a specific authority delegated to senior managers, allowing them to issue administrative direction to all organizations across the DND/CAF on matters related to their responsibilities.

Footnote 12 The CIO is a title commonly given to the most senior executive in an enterprise who is responsible for overseeing all the people, processes and technologies within the IT organization to ensure they deliver outcomes that support the goals of the business. (Source: Gartner. Gartner IT Glossary: Chief Information Officer. Online: 2016. http://www.gartner.com/it-glossary/cio-chief-information-officer/ Consulted on July 27, 2016.)

Footnote 13 RCAF expenditures as a percentage of total program spending may be inflated by as much as $30 million due to inaccurate attributions to PAA 4.4, as explained in Section 1.3.2.

Footnote 14 Expenditures attributed to PAA 4.4 by other L1s represent mostly expenditures for military salaries (regular force pay only) for personnel working within all service providers. Regular force pay represents 15 percent of total program expenditures and is attributed to the PAA by the Chief of Military Personnel. Based on PAA data, it was not possible to determine the amount of military pay specifically attributed to each individual service provider. Other L1s also include: Assistant Deputy Minister (Finance)/Chief Financial Officer, Vice Chief of Defence Staff, Assistant Deputy Minister (Science & Technology), Assistant Deputy Minister (Infrastructure & Environment) and Assistant Deputy Minister (Human Resources - Civilian).

Footnote 15 Treasury Board, Directive on the Evaluation Function, April 1, 2009. http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=15681&section=text. Consulted on July 4, 2014.

Footnote 16 See DAOD 1000-0.

Footnote 17 Estimates are based on historical data provided by individual service providers, since historical data from decommissioned systems was not available (see Annex B – Methodology for more detail on the limitations of historical data). The evaluation accepted stakeholders’ estimates as valid, particularly because they were corroborated by actual figures from the Assyst Enterprise (Assyst) system for FY 2015/16, which indicated that 391,483 total requests for user client support were received.

Footnote 18 The “Five Eyes” is an alliance composed of intelligence communities from Australia, Canada, New Zealand, the United Kingdom and the United States of America. These countries are bound by a multilateral treaty for joint cooperation in signals intelligence. (Sources: Canada. Department of National Defence. Defence Terminology Bank. Five Eyes. Online, 2016. Wikipedia, Five Eyes. Online: 2016. https://en.wikipedia.org/wiki/Five_Eyes. Consulted on July 28, 2016.)

Footnote 19 Source: Canada. Clerk of the Privy Council. Blueprint 2020: A Vision for Canada’s Federal Public Service. Ottawa: Government of Canada. Online: 2016. http://www.clerk.gc.ca/eng/feature.asp?pageId=400. Consulted on July 6, 2016.

Footnote 20 Functional Planning Guidance is a means by which ADM(IM), as the Functional Authority, can communicate program goals and priorities. This assists all program stakeholders to prioritize IS lifecycle activities and allocate appropriate resources to align with overall departmental priorities.

Footnote 21 Sources: The Emerging Future. Estimating the Speed of Exponential Technological Advancement. Online: 2012. http://theemergingfuture.com/speed-technological-advancement.htm. Accessed July 6, 2016; Nagy B, Farmer JD, Bui QM, Trancik JE (2013) Statistical Basis for Predicting Technological Progress. PLoS ONE 8(2): e52669. doi:10.1371/journal.pone.0052669.

Footnote 22 For example, the Integrated Information Environment Directory Services project launched in 1997 and the Protected Military Satellite Communications launched in 1999 – both have yet to close.

Footnote 23 Average is based on eight major capital projects closed between 2010 and 2015 for which information was available in the department’s Capability Investment Database.

Footnote 24 Percentage is based on milestone information provided for 10 out of 12 major capital IS projects closed between 2010 and 2015.

Footnote 25 Two specific examples of this include the Secure Mobile Environment Portable Electronic Device capability and the Unclassified Remote-sensing Situational Awareness capability, which took almost nine years to develop but was not used because commercial, off-the-shelf systems were better by the time the project was delivered.

Footnote 26 Including the DND/CAF Project Approval Directive (Chapter 7, Part III) and ADM(Mat) functional guidance documents (i.e., Terms of Reference for Project Managers, Project Directors and Project Sponsors).

Footnote 27 It is important to note, though, that these requirements do not apply to minor capital projects or to other initiatives not conducted as formal projects.

Footnote 28 The RFC process at the DND/CAF is the change management process implemented to plan, authorize and implement changes to an information system in order to ensure that when the change occurs, it will not negatively impact other systems or stakeholders. It is the process that helps to maintain Configuration Control.
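
To illustrate the intent of the RFC process described above, the following is a minimal sketch in Python of how a change record might be reviewed before implementation. All class, field and status names here are hypothetical and do not reflect the department's actual tooling or forms.

from dataclasses import dataclass, field
from enum import Enum

class RFCStatus(Enum):
    SUBMITTED = "submitted"
    APPROVED = "approved"
    REJECTED = "rejected"

@dataclass
class RequestForChange:
    # Hypothetical record of a proposed change to an information system.
    rfc_id: str
    system: str
    description: str
    priority: str                              # e.g., "immediate"
    impacted_systems: list = field(default_factory=list)
    status: RFCStatus = RFCStatus.SUBMITTED

    def review(self, impact_acceptable: bool) -> None:
        # The Approval Authority confirms that the change will not
        # negatively impact other systems or stakeholders before
        # authorizing it.
        self.status = (RFCStatus.APPROVED if impact_acceptable
                       else RFCStatus.REJECTED)

rfc = RequestForChange("RFC-0001", "CSNI", "Apply security patch", "immediate")
rfc.review(impact_acceptable=True)
print(rfc.status.value)                        # approved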

Footnote 29 The Security Assessment and Authorization process is the new process used within the DND/CAF to ensure systems security. Implemented in 2014, the process encompasses a lifecycle approach to ensuring systems security, from acquisition/development to disposal. It is a standard process with clearly defined controls based on the approach used by the United States’ National Institute of Standards and Technology.

Footnote 30 The business value of an application takes into consideration several factors, including: the criticality of the application to the business functions of the department; the number of users supported by the application; the efficiency and contributions of the application; and the current and future effectiveness of the application. Applications labeled “high business value” score high in these areas, based on assessments conducted by the applications’ business owners. The Canadian Forces Health Information System (CFHIS) and the Human Resources Management System – Military are examples of applications that are considered high business value due primarily to their business criticality and the number of users that they support.
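
As a purely illustrative sketch of how a multi-factor assessment of this kind could be expressed, the weighted score below uses hypothetical weights, scales and thresholds; the actual assessments are qualitative judgments made by the applications’ business owners.

# Hypothetical weights for the business value factors named above.
WEIGHTS = {
    "business_criticality": 0.35,
    "users_supported": 0.25,
    "efficiency_contribution": 0.20,
    "future_effectiveness": 0.20,
}

def business_value(scores):
    # Weighted average of factor scores, each rated on a 1-5 scale.
    return sum(WEIGHTS[f] * scores[f] for f in WEIGHTS)

# An application that is business critical and widely used scores high.
example = {"business_criticality": 5, "users_supported": 5,
           "efficiency_contribution": 4, "future_effectiveness": 4}
score = business_value(example)                # 4.60
print("high business value" if score >= 4.0 else "lower business value")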

Footnote 31 Source: Canada. Office of the Auditor General of Canada. 2010 Spring Report of the Auditor General of Canada: Chapter 1, Aging IT Systems.

Footnote 32 Source: Canada. Treasury Board Secretariat. Report on the State of Aging IT in the Government of Canada. 2012.

Footnote 33 Percentage is based on the number of applications that have been assessed for business value.

Footnote 34 Assyst is an application that provides a consolidated approach to IT service management, based on the IT Infrastructure Library. Assyst is now commonly used by all service providers across the DND/CAF and replaces legacy systems formerly used by individual service providers. Source: Canada. Department of National Defence. DIMEI 7: ASSYST Enterprise. Online: 2016. http://dsblcsf.ottawa-hull.mil.ca/apps/details.asp?App=ASSYST&ShowAllDocs=1&. Accessed: July 27, 2016.

Footnote 35 The Information Technology Infrastructure Library is a series of industry best practices and standards applied to IT service management. Originally developed in the 1990s by the UK government as a framework for its departments, it has since been widely adopted by organizations in both the public and private sectors.

Footnote 36 At the time the evaluation was conducted, service level targets had not yet been developed or made available.

Footnote 37 FY 2015/16 was the first year all service providers were using the system to track and resolve service requests and incidents.

Footnote 38 An incident is an unplanned interruption to an IT Service or reduction in the Quality of an IT Service. Failure of a configuration item that has not yet affected service is also an incident. Source: ITIL Continual Service Improvement (CSI). London: TSO, the Stationery Office, 2010. Print.

Footnote 39 A service request is a request from a user for information or advice, for a standard change, or for access to an IT service. For example, password resets or requests for access for new users are considered service requests. Service requests are usually handled by a service desk and do not require an RFC to be submitted. Source: ITIL Continual Service Improvement (CSI). London: TSO, the Stationery Office, 2010. Print.
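
A minimal sketch, assuming hypothetical keyword cues, of how a service desk tool might apply these two ITIL definitions when triaging tickets; this is not Assyst's actual routing logic.

SERVICE_REQUEST_CUES = ("password reset", "new user access",
                        "request for information", "standard change")

def classify(ticket_text):
    # Service requests (e.g., password resets) are handled by the
    # service desk and do not require an RFC; anything reported as an
    # unplanned interruption or degradation is treated as an incident.
    text = ticket_text.lower()
    if any(cue in text for cue in SERVICE_REQUEST_CUES):
        return "service request"
    return "incident"

print(classify("Password reset for a returning user"))   # service request
print(classify("Email outage at the base"))              # incident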

Footnote 40 Sources for industry averages include: Gartner Research. IT Key Metrics Data: 2014 Key Infrastructure Measures: IT Service Desk Analysis: 2015. Online: 2013. http://www.gartner.com/document/2633228. Accessed July 6, 2016; and HDI. Pearls of Wisdom: 2013-2014 Research Brief Compilation. Colorado Springs, Colorado: 2014. http://www.thinkhdi.com/topics/research/research-brief-compilation/2013-compilation.aspx. Accessed July 6, 2016.

Footnote 41 Source: Canada. Department of National Defence. EITSM Executive Service Delivery Scorecard. Ottawa: Government of Canada, 2016. https://eitsm-gstie.forces.mil.ca/web/guest/reports?

Footnote 42 These percentages are based on only one survey and only on those organizations (53 percent of those surveyed) who actually track this statistic. (Source: HDI. Pearls of Wisdom: 2013-2014 Research Brief Compilation. Colorado Springs, Colorado: 2014. http://www.thinkhdi.com/topics/research/research-brief-compilation/2013-compilation.aspx. Accessed July 6, 2016)

Footnote 43 The documents consisted of the same After Action Reports, End Tour Reports, Lessons Learned Reports, and other reports capturing lessons observed that were reviewed to assess systems interoperability. The specific documents reviewed are listed in Annex B – Evaluation Methodology.

Footnote 44 Source: Canada. Department of National Defence. IM Configuration and Change Management Framework, Version 2.0, 2002.

Footnote 45 The RFC process calls for a review of a new RFC by the appropriate Approval Authority to determine its completeness, accuracy and relevancy; however, since no audit was completed to verify the accuracy of the information contained in the RFC Database, the assumption was made that the RFCs met the definition of “immediate” priority and the categories of “new requirement” and “improve performance capability,” as per the IM Configuration and Change Management Framework, Version 2.0, 2002.

Footnote 46 Client feedback on satisfaction with the availability of information systems to support operations has not been formally collected in a statistically significant manner; therefore, all ratings cited in this section reflect only the views of the clients who were interviewed.

Footnote 47 Source: Canada. Department of National Defence. Defence Renewal Plan. October 2013.

Footnote 48 SSC has committed to completing this recommendation by the end of December 2016 for services provided to all clients, including the DND/CAF.

Footnote 49 Source: Canada. Department of National Defence. Assistant Deputy Minister (Review Services). Audit of IM-IT Framework to Support Transition to SSC. Ottawa: Government of Canada, 2011.

Footnote 50 Ibid. Annex A – Management Action Plan, p. A-1.

Footnote 51 Source: Canada. Treasury Board of Canada Secretariat. Policy on Evaluation. Online: April 1, 2009. http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=15024&section=text. Accessed: July 27, 2016.

Footnote 52 The evaluation unsuccessfully attempted to gather performance data from other federal departments for the purposes of comparing achievement of service level targets. At this time, few departments track service management statistics.

Footnote 53 Expenditures for FY 2010/11 were not included in this benchmarking exercise because the methodology used by the DND/CAF for IT expenditure reporting was modified significantly in FY 2011/12. Comparisons with previous years could have been skewed by the change in methodology.

Footnote 54 For the DND/CAF, this number includes Regular Force (66,130) and Civilian (22,011) FTEs only.

Footnote 55 Including Employee Benefit Plan (EBP).

Footnote 56 See KPMG. Defence Renewal Change Management Services: Lean Headquarters Interim Report. Print, June 14, 2015.

Footnote 57 These numbers are current as of October 2015. Mobility devices include BlackBerry devices, cellphones, data devices, and SIM cards. (Source: Canada. Department of National Defence. SSC Mobility Service – PP to IMB, February 2016.)

Footnote 58 According to PAA Attribution Rules, IT expenditures incurred by non-operational L1s are attributed to the PAA program which they support.
