Evaluation of the Defence Science and Technology Program

April 2015

1258-212 (CRS)

Reviewed by ADM(RS) in accordance with the Access to Information Act. Information UNCLASSIFIED.

Acronyms and Abbreviations

ADM(IE) – Assistant Deputy Minister (Infrastructure and Environment)
ADM(Mat) – Assistant Deputy Minister (Materiel)
ADM(S&T) – Assistant Deputy Minister (Science and Technology)
CAF – Canadian Armed Forces
CBRNE – Chemical, Biological, Radiological-Nuclear and Explosives
CFDS – Canada First Defence Strategy
CORA – Centre for Operational Research and Analysis
CPME – Collaborative Planning and Management Environment
CRS – Chief Review Services
CSS – Centre for Security Science
DAOD – Defence Administrative Order and Directive
DG – Director General
DGSTCO – Director General Science and Technology Centre Operations
DND – Department of National Defence
DRDC – Defence Research and Development Canada
DRMIS – Defence Resource Management Information System
Dstl – Defence Science and Technology Laboratory
DSTO – Defence Science and Technology Organization
ERP – Enterprise Resource Planning
FTE – Full-Time Equivalent
FY – Fiscal Year
GC – Government of Canada
HRMS – Human Resources Management System
L1 – Level One
MoU – Memorandum of Understanding
NATO – North Atlantic Treaty Organization
NETE – Naval Engineering Test Establishment
NRC – National Research Council of Canada
OGD – Other Government Department
OPCW – Organization for the Prohibition of Chemical Weapons
OPI – Office of Primary Interest
PAA – Program Alignment Architecture
QETE – Quality Engineering Test Establishment
RCAF – Royal Canadian Air Force
RCMP – Royal Canadian Mounted Police
RCN – Royal Canadian Navy
R&D – Research and Development
S&T – Science and Technology
TBS – Treasury Board Secretariat
UK – United Kingdom
US – United States

Overall Assessment

  • There is an ongoing demand for a departmental S&T Program to deliver niche S&T capabilities in support of DND/CAF, particularly where there are sensitive national security restrictions and in areas lacking industry or academic capacity.
  • Program clients benefit from the Defence S&T Program’s unbiased scientific advice and technical solutions.
  • The Defence S&T Program is entering a period of substantially reduced internal funding and reduced contributions from external sources. Moving forward, funding reductions will place increasing pressure on program activities and available resources, reducing the Program’s capacity to meet DND/CAF requirements.

Executive Summary

This report presents the findings and recommendations of the evaluation study of the Defence Science and Technology (S&T) Program within the Department of National Defence (DND). For this evaluation study, the Defence S&T Program comprises only Defence Research and Development Canada (DRDC) and Assistant Deputy Minister (Science and Technology) (ADM(S&T)), the latter of which provides headquarters and coordinating functions for DRDC. The evaluation study was conducted by Chief Review Services (CRS) between November 2013 and October 2014, as a component of the DND/Canadian Armed Forces (CAF) Five-Year Evaluation Plan (2012/13 to 2016/17), and in compliance with the Treasury Board Secretariat (TBS) Policy on Evaluation (2009). As per the TBS policy, this evaluation examines the relevance and the performance of the Defence S&T Program over a five-year period (fiscal years (FY) 2008/09 to 2013/14).

Program Description

DRDC was created in 2000 as an agency within DND to respond to the S&T needs of the Department and the CAF. The Defence S&T Program has a corporate office in Ottawa and eight research centres across Canada. The Program aims to provide the DND/CAF, other government departments (OGD), and the public safety and national security communities with the knowledge and technological advantage needed to defend and protect Canada’s interests at home and abroad. The Program particularly focuses on scientific work that is classified, sensitive, or strategic.

The Defence S&T Program comprises multi-year projects with activities in research, technology development, analysis, and experimentation. Defence S&T research focuses on eleven areas of expertise that are grouped into three domains: physical, information, and human. The areas of expertise supporting the DND/CAF within specific DRDC research centres may draw on more than one domain.

Relevance and Performance

Relevance

There is an ongoing demand for a departmental S&T program to deliver niche S&T capabilities to support DND/CAF, particularly where there are national security restrictions and in areas lacking industry capacity. Program clients also underlined the continued need for a sovereign and classified defence S&T function within the Department to serve as a secure primary S&T delivery agent. Further, the Defence S&T Program is valued by its clients for the competitive edge it provides to their portfolios through its unique understanding of client needs, unique expertise, and resources. One of the Defence S&T Program’s most important roles is as a “trusted advisor” for DND/CAF. In addition, the Defence S&T Program’s role, particularly its work in intelligence, surveillance, and reconnaissance; cyber and space; and the identification of future capability requirements, provides important contributions in strategic and sensitive areas. Other contributions, such as support to operations, equipment readiness for operations, and leveraging of information with allies—including the North Atlantic Treaty Organization (NATO) and The Technical Cooperation Program—were also identified as critically important to the DND/CAF.

The Defence S&T Program aligns with government acts, legislation, and policies. It provides scientific and technical support to the missions of the Canada First Defence Strategy (CFDS) and is deemed to be a priority for the DND/CAF. Nevertheless, the evaluation study noted gaps in the Program’s ability to strategically assess its niche capabilities and determine which capabilities are better suited for external sources. In addition, the Program requires clear and updated policies concerning its role as the functional authority, and its coordinating role as the S&T delivery agent, within DND/CAF.1 The Defence and Security S&T Strategy (2013) references DRDC “as the primary delivery agent for the departmental S&T investment,” although this role is not clearly explained across existing departmental directions. This gap in the Program’s roles and responsibilities might have led clients to use alternative services without first engaging the Defence S&T Program.

Performance (Effectiveness, Efficiency, and Economy)

Many program clients have benefited from the Defence S&T Program’s unbiased scientific advice and technical solutions. The Program also contributed to strengthening the knowledge, skills, and capabilities of DND/CAF and its partners through publications and technology transfer in the form of licences and patents. The Defence S&T Program’s contributions to the war in Afghanistan and other operational requirements have received high praise, and program staff have earned national and international awards for their work.

The evaluation study noted some opportunities for improvement, particularly in addressing the Program’s reduced capacity in some delivery areas. In addition, regular project feedback and an established update mechanism are lacking.

There are also opportunities to enhance partnerships with academia, industry, and allies. Impediments to leveraging external partnerships include the lack of department-wide procurement and financial mechanisms, policies, and memoranda of understanding (MoU). Some existing government-wide policies were also cited as barriers to collaboration.

The Defence S&T Program would benefit from an overarching corporate capability plan to manage the human resources, infrastructure, and equipment needs of the eight DRDC research centres. Concerns were raised in relation to the loss of considerable senior-level experience and the problems of attracting highly qualified candidates.

The Program should adopt integrated Enterprise Resource Planning (ERP) and expenditure systems to provide more complete financial and human resource information. The current ERP environment contains many costly and overlapping operational and corporate systems that are neither aligned with one another nor take advantage of the technology already resident within the systems in use.

Key Findings and Recommendations

Key Finding 1. There is a need for a departmental S&T program to deliver niche S&T capabilities in support of the DND/CAF, particularly where there are national security concerns and in areas lacking industry capacity.

Key Finding 2. The Defence S&T Program aligns with government acts, legislation, and policies. Nevertheless, gaps were noted in the Program’s ability to assess its niche capabilities strategically and determine which of them are better suited for external suppliers.

Key Finding 3. The Defence and Security S&T Strategy (2013) references DRDC “as the primary delivery agent for the departmental S&T investment.” However, this role is not stipulated clearly in existing departmental directions (DAODs) or in the tasking mechanisms for DRDC activities. In other words, there is a lack of S&T policy that clearly defines departmental S&T roles, responsibilities, and accountabilities. Alignment with the S&T Enterprise is necessary.

Key Finding 4. Opportunities may exist to eliminate S&T duplications provided by similar S&T organizations within DND and OGDs.

Key Finding 5. The Defence S&T Program has not yet identified which specific niche capabilities need to be retained by DRDC and which capabilities might be transferred to external partners. However, work was initiated in 2013/14 as part of the implementation of the Defence and Security S&T Strategy.

Recommendation 1: Develop a DND-wide policy to define and communicate the Defence S&T Program’s mandate, strategy, roles, responsibilities, and accountabilities—i.e., reinforcing ADM(S&T) as the functional authority.

Recommendation 2: Determine the priority and niche S&T capabilities that align to Government of Canada (GC) and DND/CAF priorities and create the capacity to provide client guidance on non-niche areas.

Key Finding 6. The Defence S&T Program aligns with the federal government priority of “an innovative and knowledge-based economy,” and it is also consistent with departmental priorities.

Key Finding 7. The Defence and Security S&T Strategy aims to improve Defence S&T and match the requirements outlined in the federal S&T Strategy, particularly in the entrepreneurial advantage area. However, the Strategy lacks the necessary mechanisms and goals (timelines, targets, or a road map) required to achieve these requirements.

Key Finding 8. Clients benefit from the Defence S&T Program’s unbiased scientific advice and technical solutions. Furthermore, project updates have improved since the implementation of the new reporting system.

Key Finding 9. Clients believe that recent funding reductions have had an impact on the Defence S&T Program’s technical capacity.

Key Finding 10. The annual Functional Planning Guidance set partnership targets for academia, industry, and allies. The evaluation study found that the Program did not fully achieve its partnership and leveraging targets. It also noted gaps in the processes and systems used to monitor, measure, and report on data.

Key Finding 11. The Defence S&T Program lacks appropriate mechanisms or instruments to engage industry, academia, and other partners to fully implement the strategic objectives of the Defence and Security S&T Strategy, particularly those related to partner engagement.

Recommendation 3: Create and implement a comprehensive strategy to increase and strengthen external engagements.

Key Finding 12. The Defence S&T Program implemented a new project selection and validation process in 2013. However, at the time of the Evaluation only a few projects had gone through a formal, standardized project validation system.

Key Finding 13. The Defence S&T Program lacks an integrated project monitoring and control system to ensure the integrity of project-related information and its subsequent use by management. While the S&T Functional Assessment findings were used to influence S&T strategic guidance and activities for the following planning cycle, no documented evidence was found to support how related decisions were made by senior management.

Recommendation 4: Expedite efforts to formalize project management and oversight practices across the Defence S&T Program.

Key Finding 14. The Defence S&T Program lacks a sound corporate-level capability management plan to holistically manage all resource management requirements.

Key Finding 15. The Defence S&T Program depends on capital assets that are not under its control. Furthermore, the state of those assets has implications for the capacity of the Program to do its work and achieve its strategic goals.

Key Finding 16. The Defence S&T Program recognizes the need for scientific and technical collaboration between DRDC research centres.

Key Finding 17. The Defence S&T Program has an external review process in place. However, the evaluation team could not find sufficient evidence to assess to what extent the review recommendations were implemented.

Recommendation 5: Implement a management structure that ensures coordination of activities across the DRDC Centres, including resource sharing and the management and promotion of external partnerships.

Key Finding 18. The Defence S&T Program has contributed significantly to CAF operations and GC needs.

Key Finding 19. There is evidence that technology and knowledge produced by the Program are leveraged by the DND/CAF and OGDs/agencies.

Key Finding 20. Multiple data systems used by the Defence S&T Program create challenges in capturing the “total value of science.”

Key Finding 21. Since FY 2011/12, there has been a significant decrease (about 20 percent) in the ADM(S&T) A-base budget allocations, and a corresponding decrease of approximately 242 full-time equivalents (FTE) (about 20 percent).

Recommendation 6: Implement a formalized and integrated resource management system that provides key resource and financial data to support decision makers.

Recommendation 7: Align ADM(S&T) policy and guidance, including roles, responsibilities, and accountabilities, to the delivery of the Defence S&T Program’s strategic mandate.

Note: Refer to Annex A—Management Action Plan for a complete list of recommendations and management responses.

1.0 Introduction

1.1 Profile of the Defence S&T Program

1.1.1 Background

This report presents the findings and recommendations of the evaluation of the Defence S&T Program. It examines the relevance and the performance of the Program for FYs 2008/09 to 2013/14 inclusive. For the purpose of this evaluation study, the term “Defence S&T” encompasses roles and activities conducted only by ADM(S&T) and DRDC; ADM(S&T) provides headquarters and coordinating functions for DRDC. Following the recommendations of the Strategic Review and Deficit Reduction Action Plan,2 the Defence S&T Program has undergone several business renewal processes. As some of these transformation efforts are still underway, not all results could be assessed by this study. The Evaluation was conducted by CRS from November 2013 to October 2014, as a component of the DND Five-Year Evaluation Plan (2012/13 to 2016/17), and in accordance with the TBS Policy on Evaluation (2009).3

No previous CRS evaluations of the Defence S&T Program as a whole have been conducted, although evaluations of some of its components were undertaken. An evaluation of the Biological and Chemical Defence Review Committee was completed in June 2014, and the Chemical, Biological, Radiological-Nuclear and Explosives (CBRNE) Research and Technology Initiative (CRTI) was evaluated in 2011. Previous evaluations of the CRTI were conducted in 2006 and 2008. In addition, an evaluation of Research and Development in the Department of National Defence and the Canadian Forces was conducted by CRS in 2001. Further, the Internal Audit Division of CRS conducted the following three audits:

  • Audit of Chemical, Biological, Radiological, and Nuclear Material, 2014;
  • Audit of the Chemical, Biological, Radiological and Nuclear Defence Omnibus Project, May 2008; and
  • CRTI Financial Management Audit, November 2006.

Where relevant, this evaluation study uses the results from previous evaluations and audits as a baseline to assess continuous improvement. In addition, an advisory panel composed of senior program staff supported the evaluation team through the different phases of the evaluation.

1.1.2 Program Description

The Department released the first Defence S&T Strategy in 2006. One major outcome of the strategy was the creation of the S&T Enterprise—the integration of DND organizations (including DRDC) involved in defence and security research, with the goal of maximizing the return on the Department’s S&T investment. With the creation of the S&T Enterprise, ADM(S&T) assumed a new role of functional authority for DND’s S&T Program. As such, ADM(S&T)4 provides a coordinating function for Defence S&T research and development (R&D) activities conducted by DRDC.5 The eight research centres operated by the Defence S&T Program are located in five communities: Suffield, Alberta; Toronto, Ontario; Ottawa, Ontario; Valcartier, Quebec; and Dartmouth, Nova Scotia.6

In accordance with the Defence and Security S&T Strategy (2013), the Defence S&T Program aims to meet client requirements through three principal modes of delivery: build, collaborate, and access. While the Program builds and maintains in-house S&T capabilities in areas of strategic and national importance, other areas of S&T lend themselves to collaboration with partners.7 These collaborations are characterized by reciprocal access to facilities, sensitive information, personnel exchanges, joint initiatives, and coordinated investments. For requirements where security and sovereignty issues are not impediments, the Strategy calls for accessing products or services directly from industry or academic sources.8

1.1.3 Program Objectives

The aim of the Defence S&T Program is to contribute directly and indirectly to the successful conduct of defence operations.9 Accordingly, the Defence S&T Program is designed to deliver scientific and technological products and solutions that satisfy the needs of DND/CAF clients10 and partners.11 The Program comprises multi-year projects with activities in research, technology development, analysis, and experimentation. These projects are applied to inform, enable, and respond to Canada’s defence and security priorities over multiple time horizons extending up to a 20-year outlook.12

The Program aims to produce the following outputs:

  • evidence-based advice;
  • research and knowledge;
  • technology products and concepts;
  • leveraged partnerships; and
  • stewardship of the S&T function.

According to the Defence and Security S&T Strategy, S&T plays a critical role in contributing to Canada’s defence and national security, providing the knowledge and technological advantage necessary to develop the right military capabilities and prepare for an uncertain and potentially dangerous future. The defence and security community also relies on S&T to identify potential requirements not currently visible to operators and decision makers, and to focus on the very early stages of scientific work that could eventually have implications for the defence of Canada and the security of Canadians.13 The Program focuses on S&T work that is classified, sensitive, or strategic.

The specific activities, outputs, and outcomes of the Defence S&T Program are illustrated in the program logic model at Annex C.

1.1.4 Stakeholders

Defence S&T Program activities affect the following three broad organizational groups:

  • the end users/clients—those who feel the impacts (i.e., the direct and indirect consequences) of S&T activities, such as members of the CAF;
  • the investors—DND, federal OGDs, industry/academia and allies (i.e., organizations that contribute people, land, buildings, machinery and equipment, and operating funds allocated to S&T activities); and
  • partners and co-delivery agents—who, like investors, share in the risks and benefits of specific R&D activities.

The key stakeholder of the Defence S&T Program is the DND/CAF. There are, however, other organizations that have a stake in the quality and availability of science, research, and technology development activities undertaken by the Defence S&T Program. Other beneficiaries may receive direct or indirect expert opinion or advice from the Program to enhance their programs or decision-making requirements.

1.2 Evaluation Scope

1.2.1 Coverage and Responsibilities

The Defence S&T Program assists the DND/CAF to accomplish activities under the 2009 Program Alignment Architecture (PAA):14 program activity 1.1 Defence Science and Technology and sub-activities 1.1.1 Defence Research, Technology and Analysis and 1.1.2 Public Security Science and Technology. Under the 2014 version of the departmental PAA, the Program corresponds to program activity 5.0 Defence Capability Development and Research and the following sub-activities: 5.1 Capability Design, Development and Integration, and 5.2 Strategic Direction and Planning Support. Also, the Public Security S&T Program (also known as the Canadian Safety and Security Program) is now under program activity 2.0 Defence Services and Contributions to Government and sub-activity 2.2 Defence Services for Canadian Safety and Security.

The scope of the evaluation study was limited to S&T activities undertaken by DRDC and ADM(S&T) only. Furthermore, it excludes the Public Security S&T Program, as it was subject to a separate evaluation in 2011. However, this program’s activities are discussed within the Evaluation to demonstrate a level of coordination and interoperability with OGD and public safety and security stakeholders.

1.2.2 Resources

The Defence S&T Program receives both direct departmental (A-base) funding and funding from other internal DND program areas (Level One (L1)). In addition, it receives funding from other sources, including leveraged funding and in-kind contributions from allies, industry, universities, and OGDs. On average, the total annual funding15 associated with the Program was approximately $363,996,000 over the fiscal years covered by the evaluation, representing 1.8 percent of the overall DND/CAF budget.

From FY 2009/10 to FY 2013/14, the overall expenditures associated with the Defence S&T Program decreased, on average, by 3.1 percent annually. While total expenditures associated with the Program reached $388,069,000 in FY 2010/11, by FY 2013/14 they had decreased by 17 percent to $316,958,000.

Since FY 2010/11, the number of civilian and military employees has decreased by approximately 17 percent.16 Based on the 2013/14 Departmental Performance Report, the Defence S&T Program employs 1,452 people, comprising 1,380 civilian and 72 military personnel. Details on expenditures and employees are presented in Section 2.5.1.

1.2.3 Issues and Questions

In accordance with the Treasury Board Directive on the Evaluation Function (2009), the Evaluation addressed five core issues related to relevance and performance. The methodology used to gather evidence in support of the evaluation questions is provided at Annex B. An Evaluation Matrix listing each of the evaluation questions, with associated indicators and potential data sources, is provided at Annex D.

2.0 Findings and Recommendations

Evaluation findings and recommendations are presented in Sections 2.1 through 2.5.

2.1 Relevance—Continued Need for the Program

Is there a continued need for the DND/CAF to deliver the Defence S&T Program?

This section examines whether there is a continued need for an S&T program within DND/CAF. The findings in this section are based on program documents reviewed, results of a client questionnaire,17 key informant interviews with senior program staff,18 as well as with representatives from client organizations, an outside expert, and industry representatives.

The following three indicators were used to determine the continued need for the Program:

  • evidence that the Defence S&T Program responds to emerging needs and threats and provides the DND/CAF with unique resources, services, and capabilities;
  • perception of a continuing need for the DND/CAF to deliver the Defence S&T Program; and
  • evidence that external partners (including industry) use or leverage Defence S&T Program outcomes.

For key informant interviews and the client questionnaire, the evaluation study used the following scale throughout the report to indicate the relative weight of the responses for each of the respondent groups:

  • almost all: findings reflect the opinions of 90 percent or more of respondents;
  • many: findings reflect the views and opinions of at least 60 percent of respondents;
  • some/several: findings reflect the views and opinions of at least 25 percent of respondents; and,
  • a few: findings reflect the views and opinions of at least two respondents but fewer than 25 percent.

Key Finding 1: There is a need for a departmental S&T program to deliver niche S&T capabilities in support of DND/CAF, particularly where there are national security concerns and in areas lacking industry capacity.

Indicator 2.1.1: Evidence that the Defence S&T Program responds to emerging needs and threats and provides unique resources, services, and capabilities

DND/CAF requires S&T elements to contribute to efforts to counter the full range of threats and challenges to Canada and Canadians. This includes responding to the immediate requirements of deployed CAF members to counter real-time threats; providing scientific, technical, and analytical support to decision making in such areas as readiness, capability development, and acquisition; and anticipating and developing effective responses to emerging threats.19 In support of these areas, the Defence S&T Program, as the primary delivery agent for the departmental S&T investments, provides its clients with products ranging from evidence-based advice, concepts, and doctrine to technology evaluation, engineering, and in-theatre support.20

An evaluation questionnaire was sent to Defence S&T Program clients to gauge the need for the Defence S&T Program within DND and whether its outputs remain relevant to L1 organizations/client groups. In response to the questionnaire topic area of emerging needs and threats, most clients agreed that the organization served their needs in a satisfactory manner. Although many of the respondent comments included positive remarks, some respondents raised the following concerns:

  • funding acquisition processes and timelines have limited the Defence S&T Program’s ability to address new and emerging needs; and
  • Defence S&T might lack expertise or be unable to respond to emerging needs in a timely manner, since building expertise in new areas generally requires considerable time.

Program clients also underlined the continued need for a sovereign and classified Defence S&T function within the Department, serving as a secure primary S&T delivery agent. However, some questionnaire respondents commented that their organization did not solely use Defence S&T services. Further, several of the respondents somewhat agreed that Defence S&T products/services could also be delivered through alternative service delivery (industry, allies, or academia). For example, while the Royal Canadian Navy (RCN) identified the Defence S&T Program as their primary agent to deliver S&T services,21 the RCN has engaged external expertise from allies when the requisite capabilities or capacity could not be provided by DRDC.

Many clients of the Defence S&T Program also valued the organization for its unique understanding of their needs, its unique expertise, and its resources. These clients mentioned that their organizations did not have the capacity to pursue alternative S&T options and considered alternative service delivery options to be biased and to serve competing interests.

One of the Defence S&T Program’s most important roles was identified as being a “trusted advisor” for the DND/CAF. Defence S&T’s role, particularly its work in intelligence, surveillance, and reconnaissance; cyber; space; and the identification of future capability requirements, was also underlined as important in strategic and sensitive areas. Other contributions were also identified as critically important to DND/CAF. These include support to operations (throughout the war in Afghanistan), support to the Vancouver Olympic Games (2010) on security related to CBRNE threats, equipment readiness for operations, ballistic protection, and leveraging of information with allies, including NATO and The Technical Cooperation Program.

Contributions of the Defence S&T Program to emerging needs include jamming the radio controls of improvised explosive devices, including continuous support in radar jamming technology; high-resolution long-range radar imaging for land surveillance missions; threat analysis and countermeasures development to protect soldiers; shielding of vehicles to minimize casualties; and electronic warfare in Libya.

Regarding the Defence S&T Program’s contributions to future operational preparedness, interviewees provided examples that included smart-force generation, evidence-based hiring for the Forces, maintaining technology requirements, and supporting smart procurement. Examples of Defence S&T research capability areas include medical countermeasures; chemical, biological, and ordnance detection; blast and weapons effects; future small arms research; autonomous systems (robotics); strategic analysis; studies to support strategic capability; defence policy analysis; intelligence assessments; baseline threat and health threat assessments; electromagnetic spectrum, radio frequency, radar, and cyber; and multinational modeling and simulation working groups.

The Defence S&T Program has also provided its clients with unique services in a secure environment where industry and academia lack expertise and/or no commercial products or services are available. For example, DRDC has provided services in the area of ballistic protection, where sensitive information needs to be protected. Interviewees also cited unique services from two projects: extensive vehicle survivability knowledge and innovative solutions for the protection of all in-theatre fleets; and the development of remote CBRNE sensing and surveillance capabilities in the semi-autonomous multi-agent tactical sentry vehicle.

However, in terms of potential future threats, interviews with former senior Defence S&T staff underlined the need to reassess the future capabilities of the Defence S&T Program in potential long-term threat areas (Horizon 3). Furthermore, former senior staff noted that recent budgetary constraints might have driven Defence S&T’s expertise towards meeting immediate departmental needs rather than focusing on future threat capabilities. Client interviews and document reviews also confirmed that there was a shift towards meeting the CAF’s immediate operational needs during the war in Afghanistan. It was also underlined that leveraging future threat capabilities from allies or other organizations might create challenges.

Indicator 2.1.2: Evidence that external partners (including industry) use or leverage Defence S&T Program outcomes and products

Use of Program Outputs by OGDs

The Public Security S&T Program, launched in June 2012, harmonized the mandates of three programs led by DRDC’s CSS, in collaboration with Public Safety Canada.22 Through its unique mandate and whole-of-government approach, the Public Security S&T Program has partnered with 21 OGDs and international participants.23 For example, the work on the 2010 Olympics involved the Public Security S&T Program, the Royal Canadian Mounted Police (RCMP) (forensic and explosive), the Public Health Agency of Canada (biological), Environment Canada (chemical), and Health Canada. Other joint project examples include medical countermeasures such as the nerve agent antidote HI-6 (with the Public Health Agency of Canada), the Canadian Animal Health Surveillance System (with the Canadian Food Inspection Agency), and cyber fusion with the RCMP.

Interviews with key Defence S&T Program staff underlined that the Program has provided support to the Canadian Coast Guard for ship failure, corrosion, and fatigue issues. DRDC’s radiological-nuclear detection and forensics work has broad applications across the first responder community, as well as with police forces, border protection, and other national security community organizations. Similarly, work on cyber operations is of broad interest to OGDs and agencies. While the primary focus is on military network security, many of the concepts are fundamental and have broad applications. Similarly, the document review indicated that the work on space systems and space-based radar has been of significant interest to the Canadian Space Agency and, to some extent, Natural Resources Canada, with whom the Defence S&T Program has worked closely.

The evaluation study noted that the role of the Defence S&T Program in public security issues through the Public Security S&T Program is clearly defined through existing mechanisms; that is, partnership agreements are in place to support delivery requirements. In addition, it is evident that through this Program a whole-of-government approach supports clear roles and accountabilities and demonstrates good value, as outputs are used across government.

However, the evaluation study found gaps in the Defence S&T Program’s overall ability to determine which requirements should be delivered by Defence S&T versus OGDs, such as the National Research Council of Canada (NRC). In this regard, clearly identified roles and accountabilities in the delivery of S&T across the government’s S&T organizations would demonstrate better value and more effective use of services.

The evaluation study noted that allied countries have taken steps to ensure that roles are well defined across government science organizations, and structures are in place to allow partnerships. For example, in 2012 the Australian Defence Science and Technology Organization (DSTO) was given the whole-of-government responsibility to coordinate R&D for Australia’s national security. The key driver was to expand opportunities for collaboration with other departments and agencies. Similarly, the United Kingdom’s (UK) Defence Science and Technology Laboratory (Dstl) has done work for approximately 40 government departments and agencies, including the Home Office and Department for Transport. Dstl has established well-defined mechanisms to leverage capabilities to support these partnerships, which include an Interlab partnership of seven government research laboratories.

Use of Program Outputs by External Partners

International leveraging of Defence S&T Program outcomes/products has been significant. DRDC is involved in 300 projects under The Technical Cooperation Program, 150 of NATO’s 200 activities, and 56 collaborations with 36 different countries.

Responding to a request for information, the Australian DSTO underlined that it has benefited from a strong relationship with DRDC. The two organizations are of a comparable size, have similar budgets, and share similar Defence S&T objectives. According to the DSTO response, the S&T cooperation with Canada has contributed to the Australian Defence Force’s capability across a wide spectrum of areas, including soldier combat systems, shock and blast loading structures, ballistic protection and armour material, and landmine detection.

Based on document review and key program staff interviews, the following are examples of other external partners who have leveraged Defence S&T Program outcomes throughout the past five years:

  • Key program staff from DRDC Atlantic explained that the Defence S&T Program conducts long-term research to improve naval ship structures. The Program’s outcomes are also leveraged by the United States (US) and Australia. Similarly, support to academia, such as the University of British Columbia, was provided through a consortium to build knowledge in shipyards. Support was also provided to the Naval Architecture program at Memorial University;
  • DRDC Valcartier has collaborated with Australia’s DSTO, The Technical Cooperation Program, and NATO to leverage capabilities in the Virtual Range for Advanced Platform Protection project. Aside from several other project collaborations, digital simulation tools have been shared with Australia; and
  • Based on interviews with key CSS staff, CBRNE program outcomes have been shared with the RCMP. Other common interest areas included knowledge on tasers, situational awareness, detection, and Olympic security (2010). Natural Resources Canada has shown interest in DRDC research on unmanned autonomous vehicles.

2.2 Relevance—Alignment with Federal Roles and Responsibilities

Does the GC (and the DND/CAF specifically) continue to have a role/responsibilities in delivering the Defence S&T Program?

This section examines the extent to which the S&T Program aligns with departmental and federal roles and responsibilities. The findings in this section are based on documents reviewed and key informant interviews, including senior program staff and representatives from client organizations, industry, and an outside expert.

The following three indicators were used to determine the extent of alignment:

  • alignment with government acts, legislation, and policies;
  • evidence regarding the frequency and reasons why DND/CAF clients of the Defence S&T Program go outside of the Program for S&T service; and
  • existence of other departments, agencies, and/or organizations providing similar resources, services, and capabilities.

Key Finding 2: The Defence S&T Program aligns with government acts, legislation, and policies. Nevertheless, gaps were noted in the Program’s ability to assess its niche capabilities strategically and determine which of them are better suited for external suppliers.

Key Finding 3: The Defence and Security S&T Strategy references DRDC “as the primary delivery agent for the departmental S&T investment.” However, this role is not stipulated clearly in existing departmental directions (DAODs), or in the tasking mechanisms for DRDC activities. In other words, there is a lack of S&T policy that clearly defines departmental S&T roles, responsibilities, and accountabilities. Alignment with the S&T Enterprise is necessary.

Key Finding 4: Opportunities may exist to eliminate S&T duplications provided by similar S&T organizations within DND and OGDs.

Key Finding 5: The Defence S&T Program has not yet identified which specific niche capabilities need to be retained by DRDC and which capabilities might be transferred to external partners. However, work was initiated in 2013/14 as part of the implementation of the Defence and Security S&T Strategy.

Indicator 2.2.1: Alignment with government acts, legislation, and policies

Document review of S&T policies and other literature revealed that the GC supports the continued need for basic and applied research.24 This is due to the range of potential benefits derived from research activities, along with their recognized contribution to a strong knowledge economy. The GC, through the 2007 federal Science and Technology Strategy, Mobilizing Science and Technology to Canada’s Advantage (S&T Strategy), underlined that public support for basic research is justified by its significant benefits to society.

The Defence S&T Program is one of the federal science programs that are included under the Security and Defence domain of the federal S&T outcomes. As such, the Program provides leadership to strengthen Canada’s ability to manage public security emergencies, to defend Canada, and to contribute to international peace and security, while respecting the rights and freedoms of Canadians. Further, the Program enables the DND/CAF to have a relevant and credible capacity to meet defence and security commitments and achieve success in missions that contribute to domestic and international peace and security.25

The existence and application of the Defence S&T function derive from the National Defence Act,26 which defines the responsibility of the Minister of National Defence to conduct R&D activities in support of the defence of Canada. Through the CFDS, the government articulates the role of DRDC in this pursuit, as follows: “DRDC will collaborate with Defence partners to derive maximum benefit from technology and ensure that the Canadian Forces continue to be a ‘state-of-the-art’ military.”27

The CFDS also underlines the economic benefits of the Defence S&T Program through the development of high-tech, high-value sustainable jobs in all regions—directly, through the development of military capabilities, and indirectly, through technological spinoffs and commercial applications. “This will put Canadians to work protecting Canadians. Universities and colleges will also benefit through increased opportunities to undertake cutting-edge research.”28 Based on the recommendations of the Jenkins Report29 and interviews with representatives from industry, it was noted that the Defence S&T Program should continue to strengthen partnerships. This will allow better alignment with the role and responsibility specified by the CFDS.

Indicator 2.2.2: Evidence regarding the frequency and reasons why DND/CAF clients of the Defence S&T Program go outside of the Program for S&T services

As discussed in Section 2.1, several clients of the Program agreed that some Defence S&T services could be delivered through alternative sources. However, most clients also explained that some capabilities were not readily available through industry or academia.

Clients who commented on this issue underlined that industry or academia could provide certain S&T services, as long as an MoU was in place and the alternative sources for S&T services did the following:

  • met security requirements;
  • had relevant background/subject-matter expertise; and
  • were not involved in the business of selling their products.

A few respondents to the client questionnaire commented that the Defence S&T Program’s structure (resources/infrastructure) has been limited, restricting its ability to cover all capability areas. Therefore, various projects have been outsourced (contracted out) either to industry or to other science-based departments or agencies such as the NRC. Other clients have been required to request services from allies (such as the US Navy and the Royal Navy), as Defence S&T no longer has the capacity to support certain projects, such as submarine signature management.

Through interviews, the Defence S&T Program clients further explained that some projects used sources other than the Defence S&T Program for reasons of better suitability, meeting timelines, or unique pre-existing capability.30 The Defence S&T Program’s funding restrictions were also mentioned as one of the factors that have led clients to seek alternate sources for S&T services.

The evaluation study noted that the Defence S&T Program did not track how often clients go outside of the Program for S&T services. Consequently, data was not available regarding the frequency with which other sources are used and/or the reasons for going outside of the Program. Tracking this information would provide the Program with a better understanding of its business environment and allow for the improvement of its services where needed.

Furthermore, the evaluation study found that the Defence S&T Program needs clear and updated policies concerning its role as the functional authority and its coordinating role as the S&T delivery agent.31 Although the Defence and Security S&T Strategy referenced DRDC as “the primary delivery agent for the departmental S&T investment,” this role was not clearly explained across existing departmental directions. This gap in the Program’s roles and responsibilities might lead clients to turn to alternative services without first contacting the Defence S&T Program.

Indicator 2.2.3: Existence of OGDs, agencies, and/or organizations providing similar resources, services, and capabilities

With respect to OGDs and agencies, it was noted that the Defence S&T Program has demonstrated strengths and expertise in S&T in support of collaborative initiatives, as part of the federal S&T Strategy32 that includes eleven OGDs and agencies. For example, CBRNE S&T involves multiple levels of government, including federal departments and agencies, academic and industry partners, and first responders.

The evaluation study found gaps in the Defence S&T Program’s ability to strategically assess its niche capabilities and determine which capabilities are better suited for external suppliers, such as industry and academia. In particular, the Program lacks a formal and comprehensive assessment of its niche capabilities, such as an external engagement strategy identifying which suppliers are best suited to conduct specific activities and why.

The federal S&T Strategy also stated that there may appear to be overlaps where similar S&T activities are described in multiple places because of their contribution to different outcomes. Accordingly, the federal S&T Strategy asserted that “while much of this S&T is performed by partners supported by federal funding (e.g., academia and the private sector), the federal government maintains its own capacity to deliver S&T when no other stakeholder can/will provide the required results.”33

Regarding duplication of effort, most interviewees and select Defence S&T Program clients believed that other organizations, such as the four engineering and testing organizations under the direction of Assistant Deputy Minister (Materiel) (ADM(Mat)),34 complement Defence S&T outputs. These client interviews also revealed that no other entity has the breadth and depth to fully assume any DRDC function. Conversely, a few senior program staff identified areas where service duplications could be found between QETE, NETE, and DRDC. For example, the support provided to the RCN through the dockyard laboratories at Canadian Forces Bases Halifax and Esquimalt is similar to what QETE provides, not so much to the RCN, but to the Army and the RCAF. Nonetheless, one example provided was the routine analysis of composition fuels, which could be done in Halifax by NETE rather than by DRDC Atlantic.35

Some clients acknowledged that there are organizations that provide similar services in some technical areas or have facilities comparable to those offered by the Defence S&T Program. Generally, however, the Program has unique qualities: exceptional technical expertise in certain areas, high standards for a secure environment, and certain efficiencies (e.g., “due to easy access and no need for lengthy contracting process”). Based on the client questionnaire and interviews conducted as part of the evaluation study, some clients indicated that other organizations could likely have met their needs, although it is unknown whether there would have been impacts to the quality or efficiency of the project.

CRS Recommendation

1. Develop a DND-wide policy to define and communicate the Defence S&T Program’s mandate, strategy, roles, responsibilities, and accountabilities—i.e., reinforcing ADM(S&T) as the functional authority.

OPI: ADM(S&T)

CRS Recommendation

2. Determine the priority and niche S&T capabilities that align to GC and DND/CAF priorities and create the capacity to provide client guidance on non-niche areas.

OPI: ADM(S&T)

2.3 Relevance—Alignment with Government Priorities

Are the objectives of the Defence S&T Program consistent with DND strategic outcomes and federal government priorities? The following indicators were used to make this determination:

  • alignment between Defence S&T Program priorities and federal government priorities; and
  • alignment between Defence S&T Program priorities and DND/CAF priorities and strategic outcomes.

This section examines the extent to which the Defence S&T Program is aligned with federal government priorities and DND strategic outcomes. The findings in this section are based on evidence from documents reviewed and interviews conducted for the evaluation study.

Key Finding 6: The Defence S&T Program aligns with the federal government priority of “an innovative and knowledge-based economy,” and it is also consistent with departmental priorities.

Key Finding 7: The Defence and Security S&T Strategy aims to improve Defence S&T and match the requirements outlined in the federal S&T Strategy, particularly in the entrepreneurial advantage area. However, the Strategy lacks the necessary mechanisms and goals (timelines, targets, or a road map) required to achieve these requirements.

Indicator 2.3.1: Alignment between the objectives of the Defence S&T Program and the priorities of the federal government

The GC set its current agenda for supporting S&T in 2007 with the introduction of its federal S&T Strategy. Subsequent Speeches from the Throne and federal budgets have reiterated the government’s commitment to these S&T Strategy priorities.36

The importance of government support for research was highlighted in the Expert Panel Report (referred to as the Jenkins report)37 that reviewed federal support to R&D, which stated the following:

The federal and provincial governments play an important role in fostering an economic climate that encourages business innovation—for example, by supporting basic and applied research and related training of highly qualified, skilled people […] the higher education and government sectors are key players in Canada’s innovation system and complement the role of business.

The basic research performed by universities and other laboratories generates disruptive technologies that open up whole new industries. […] Given the foundational role that a strong post-secondary education sector plays in the Canadian system of innovation, this Panel (like the Expert Panel on Commercialization before it) urges the government to commit to investing in basic research at internationally competitive levels, and also to review and modernize the support for the total institutional costs of research.

As of 2010, the federal government invests approximately $6 billion annually and employs close to 39,000 workers (nearly 14 percent of the public service) to support its S&T activities.38

Defence S&T Program activities also respond to the federal strategic priority of an “innovative and knowledge-based economy” and align with the federal S&T Strategy, which identifies security and defence research as one of the five federal S&T research domains/outcome areas.39 In this regard, several outside experts noted that the Defence S&T Program’s Public Security S&T Program model works very well. This Program is highly valued by, and successful with, external partners, such as other science-based departments and agencies.

Key program staff and clients who were interviewed identified Defence S&T Arctic and northern projects as one of the key project areas aligned to government priorities. Examples of S&T program initiatives in support of Arctic sovereignty include the Cornerstone and Northern Watch projects.40 Specifically, maintaining the expertise to work on ice and to track submarine activity under the ice was underlined as a capability whose loss would be detrimental to Canada’s national interests.

Indicator 2.3.2: Degree of alignment between the Defence S&T Program objectives and DND/CAF priorities and strategic outcomes

In the DND PAA of 2009, Defence S&T falls under the departmental strategic outcome for “Resources are acquired to meet Government Defence Expectations.”

According to the Report on Plans and Priorities (2010/11), the Defence S&T Program has directly supported the DND/CAF priority of “Ensuring Sustainable Operational Excellence both at Home and Abroad” through its contributions before, during, and after CAF missions/operations.41 The Program achieved this outcome by pursuing some 300 S&T projects in collaboration with OGDs, allied nations, and Canadian industry that addressed departmental and governmental priorities.

In 2011, during a meeting with ADM(S&T), clients of the Defence S&T Program voiced concerns about the relevance of Defence S&T projects, specifically, how effectively the projects undertaken matched requirements. While senior representatives of client organizations tended not to disagree with the list of projects being undertaken on their behalf, the projects did not tend to truly address “those critical issues that were of great concern to them.”42 Based on key informant interviews, the project selection process has recently been changed, and projects are now aligned to departmental priorities. Responding to the client questionnaire, clients commented that project alignment has improved compared to previous years. However, only some respondents somewhat agreed that current projects align with organizational and departmental priorities.

According to the 2012/13 DND Report on Plans and Priorities, the Program’s current contributions are mainly linked to the DND/CAF priority area of the CAF post-Afghanistan. The Program contributes to this priority by developing strategic readiness systems, generating knowledge and technologies for defence and security affordability, accessing and managing the S&T required to strengthen the Defence Team, and providing a strategic knowledge and technology advantage for tomorrow’s defence and security.43 In addition, the 2012/13 DND Report on Plans and Priorities emphasized the following as significant areas of focus for the Defence S&T Program and its contribution to CAF capabilities: improving the CAF’s ability to maintain an appropriate presence in Canada’s North, supporting and developing cyber defence and security capabilities, and developing space and underwater surveillance capabilities.

2.4 Performance—Achievement of Expected Outcomes (Effectiveness)

2.4.1 Achievement of Expected Outcomes (Effectiveness)

This section evaluates the achievement of the Defence S&T Program’s expected outcomes, with a focus on the extent to which (1) the Defence S&T Program satisfies the needs of the DND/CAF and its partners; and (2) DRDC R&D projects are well managed and supported.

Intermediate outcomes are considered in this evaluation insofar as they support the assessment of the Defence S&T Program’s effectiveness in enhancing Canadian defence capabilities.

The evaluation study applied key performance indicators against each outcome. Findings are based on Defence S&T Program data and documentation, including the results of a client survey conducted by Defence S&T (2011); departmental documents; interviews with key staff; external interviews, including with industry representatives; a client questionnaire conducted by evaluation staff; and responses received to requests for information from allied nations and other science-based departments and agencies.

2.4.2 Immediate Outcomes

Immediate Outcome 1: The extent to which the Defence S&T Program satisfies the needs of the DND/CAF and its partners

The evaluation used the following indicators to make this determination:

  1. evidence of stakeholder confidence in the Defence S&T Program’s scientific and technical capacity, including degree of client satisfaction (quality, timeliness, responsiveness); and
  2. evidence of effective partnerships and of the Defence S&T Program meeting its targets.

Key Finding 8: Clients benefit from the Defence S&T Program’s unbiased scientific advice and technical solutions. Furthermore, project updates have improved since the implementation of the new reporting system.

Key Finding 9: Clients believe that recent funding reductions have had an impact on the Defence S&T Program’s technical capacity.

Indicator 2.4.2.1: Evidence of stakeholder confidence in the Defence S&T Program’s scientific and technical capacity, including degree of client satisfaction (quality, timeliness, responsiveness)

The evaluation study used six data sources to evaluate the Defence S&T Program’s achievement of immediate expected outcomes: a 2011 client survey conducted by the Defence S&T Program; a client questionnaire conducted by evaluation staff; interviews with senior and key program staff, program clients, an external expert, and former senior staff; Departmental Performance Reports for FY 2013/14; requests for information; and administrative data analysis using the CPME.44

According to the results of a client survey conducted by the Defence S&T Program in 2011, 80 percent of respondents agreed that the Program supported CAF operations. Respondents also rated the Program positively on the relevance of its deliverables (89 percent), its improvement of operational effectiveness (85 percent), and the benefit Canadian defence operations receive from the Program (89 percent). The survey found that, to fully support CAF operations, the Program could improve its ability to:

  • deliver research results on time;
  • help to resolve important operational problems; and
  • provide critical information for decision making.

According to the client questionnaire conducted as part of this evaluation study, most program clients agreed that the Defence S&T Program had the right technical capacity to deliver the products/services required. Some clients indicated that the impacts of the Strategic Review and Deficit Reduction Action Plan funding reductions and the work force adjustments45 have caused capacity constraints in some areas, such as the ability to resolve propeller and propulsion signature management issues and DRDC Atlantic’s at-sea research capacity (Quest). This issue is discussed in the next sections.

On the issue of regular project-related feedback, both Program clients and Program staff commented that past and current client feedback has been gathered on an ad hoc basis. In addition, project updates, including timelines and finances, were based upon individual working-level relationships, and this process was not standardized. However, the evaluation study found that the recently implemented project selection and reporting system should provide regular progress updates and better project outputs. Establishing regular interaction/feedback processes to ensure closer client interaction would contribute to better service delivery.

Indicator 2.4.2.2: Evidence of effective partnerships/meeting targets

To assess this indicator, the evaluation team considered the following criteria:

  • evidence of meeting targets; and
  • impediments to leveraging capabilities.

Criterion: Meeting targets

Key Finding 10: The annual Functional Planning Guidance set annual partnership targets for academia, industry, and allies. The evaluation study found that the Program did not fully achieve its partnership and leveraging targets. The evaluation also noted gaps in the processes and systems needed to monitor, measure, and report on the related data.

Key Finding 11: The Defence S&T Program lacks the appropriate mechanisms or instruments to engage industry, academia, and other partners to fully implement the strategic objectives of the Defence and Security S&T Strategy, particularly the ones related to partner engagement.

According to the S&T Functional Planning Guidance (2013/14), the following parameters are managed strategically on an annual planning cycle for both multi-year and shorter-duration projects46:

  1. Target 1: (50/50) — Half of Program funds are used to support in-house delivery of defence S&T, and the other half is used to engage external S&T performers such as industry; and
  2. Target 2: ($0.75 leverage) — The Program seeks investment by partners in both the defence and security sectors at a rate of at least $0.75 for each dollar of departmental investment.

On the first target, Defence S&T Program salaries were compared to contracting expenditures.47 The evidence demonstrated that, in each reporting year, the Program fell short of the target of an even balance between internal and contracted research. Table 1 presents these figures with the corresponding ratios.

Table 1. Defence S&T Program Contracting Versus Program Salaries. This table represents Defence S&T Program’s salaries versus contracting activities from 2009 to 2013. (Source: DRDC Research Centre program data)

Table Summary

This table represents Defence S&T Program’s salaries versus contracting activities from 2009 to 2013. There are 7 columns and 4 rows, including titles. The first and second rows list, in thousands of dollars, the cost of program salaries and of program contracting, respectively. The third row lists the internal/external cost ratio between these two. The remaining columns list the figures for each year from 2009 to 2013, with the last one stating the totals. Select any of the three categories and read across the columns for the figures from 2009 to 2013, with the total in the last column.

2009 2010 2011 2012 2013 Total
Defence S&T Program Salaries ($000s) 126,145 130,177 129,454 129,937 129,567 645,280
Defence S&T Program Contracting ($000s) 119,366 94,154 100,693 101,945 90,698 506,856
Ratio (Internal/External) 51/49 58/42 56/44 56/44 59/41 56/44
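
The ratios in the bottom row follow directly from the salary and contracting figures. As an illustrative check (the calculation below is ours, not part of the Program’s reporting), the in-house share for each year can be recomputed from the table’s values:

    # Recompute Table 1's internal/external ratios (figures in $000s, from Table 1).
    salaries = [126145, 130177, 129454, 129937, 129567]     # FYs 2009-2013
    contracting = [119366, 94154, 100693, 101945, 90698]
    for year, s, c in zip(range(2009, 2014), salaries, contracting):
        internal = round(100 * s / (s + c))                 # in-house share (percent)
        print(year, f"{internal}/{100 - internal}")         # 2009 -> 51/49 ... 2013 -> 59/41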

For the second target, the evaluation study compared leveraging totals from non-DND partners to ADM(S&T) annual budgets.48 As illustrated in Table 2, the evidence demonstrated that the Defence S&T Program fell significantly short of its leveraging target of $0.75 per dollar in all reported years.

Table 2. Defence S&T Program Funding Versus Leveraging. This table represents the Defence S&T Program’s funding versus the Program’s leveraging activities from 2009 to 2013. (Source: DRDC Research Centre’s CPME database)

Table Summary

This table represents the Defence S&T Program’s funding versus the Program’s leveraging activities from 2009 to 2013. There are 7 columns and 4 rows, including titles. The first and second rows list, in thousands of dollars, the program funding and the leveraging activities, respectively, and the third row lists the leverage per dollar. The columns to the right list the figures per year, between 2009 and 2013, with totals provided in the final column. Note that figures for 2013 were not available for the last two rows, which also leaves gaps in their totals.

2009 2010 2011 2012 2013 Total
Defence S&T Program Funding ($000s) 321,650 314,445 297,629 303,236 279,300 1,516,260
Defence S&T Program Leveraging ($000s) 168,099 199,527 156,961 175,982 N/A *
Leverage per Dollar 0.52 0.63 0.53 0.58 N/A *

* CRS was not able to obtain Defence S&T leveraging figures for 2012/13.
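
The leverage-per-dollar row is simply the ratio of partner leveraging to program funding in each year. A minimal sketch using the Table 2 figures for the four years with available data:

    # Recompute Table 2's leverage per dollar of departmental investment.
    funding = [321650, 314445, 297629, 303236]      # FYs 2009-2012, $000s
    leveraged = [168099, 199527, 156961, 175982]    # partner investment, $000s
    for year, f, l in zip(range(2009, 2013), funding, leveraged):
        print(year, round(l / f, 2))                # 0.52, 0.63, 0.53, 0.58 -- all below the 0.75 target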

Key program staff interviewed were mostly unaware of the extent to which these targets had been achieved. This could partly be due to a lack of formal systems, processes, and procedures for monitoring and reporting on targets at the project, DRDC research centre, and corporate Defence S&T levels.

Criteria: Level and nature of external partnerships and impediments to leveraging capabilities with Canadian industry/academia

Level and Nature of External Partnerships

The Defence S&T Program is committed to relying on a co-investment, co-development model with its partners as it seeks to bring the best ideas forward for exploitation.49 Similarly, the ADM(S&T) Business Plan FY 2013/14 underlines the importance of extensive partnerships with Canadian industry and academia as well as international organizations.50 This includes multi-year projects with activities in research, technology development, and analysis and experimentation. In tandem, the Defence S&T Program seeks partnership engagement to avoid duplication of effort, to draw on best practices, and to benefit from investments already undertaken.51

Interviews conducted with program staff and industry representatives indicated that the Program was able to establish some notable partnerships, especially with international organizations. However, external partnerships have generally been limited in number.

External interviewees, as well as program staff, underlined the importance of improving links with similar defence research organizations, such as the Royal Military College. Similarly, a program assessment conducted by DRDC in 201152 recommended that DRDC further strengthen its interaction and cooperation with QETE. The assessment also underlined that new technologies and techniques developed at NRC were available for use by aircraft defect and accident investigators.53 According to the assessment report, the collaboration between DRDC and NRC was established many years ago through an MoU and was characterized by a relatively informal set of working arrangements.54 The report identified that this agreement provided great advantages for the CAF, granting very quick reaction times in response to requests for assistance; however, it also meant that the monitoring of activities has been somewhat informal.

Industry representatives emphasized the need for the Defence S&T Program to engage with industry and support industrial forums to better shape the future landscape. Interviews with key industry representatives conveyed the industry perspective that Defence S&T needs to identify its priority and niche capabilities. In other words, the representatives believed that the Defence S&T Program has not strategically assessed and identified which capabilities industry should assume. A clear understanding and articulation of niche Defence S&T capabilities would assist in determining which roles, capabilities, or assets Defence S&T should consider acquiring, augmenting, or improving to align with Defence priorities. An interview with a former senior staff member underlined the importance of achieving the right balance between near-term operational needs and future operational risks. Accordingly, an ideal balance could be achieved by keeping capabilities broad enough to address future risks while ensuring they remain relevant to current needs.55

Other comments from industry representatives included the following:

  • Defence S&T should maintain its integrating role in cases where the Department prefers procurement of off-the-shelf products over R&D;
  • Defence S&T should be more involved in operational research to minimize risk, particularly in operations; currently, CORA has limited capability in this area;
  • Capabilities delivered by DRDC Suffield and DRDC Valcartier are highly recognized by industry as unique, and should be retained since they are not marketable by industry; and
  • Defence S&T should position industry to focus on DND/CAF needs by developing a comprehensive understanding of the key industrial capabilities that support major procurement areas, including combat systems, electromagnetic warfare, and land simulation.

Impediments to Partnerships

The main findings concerning impediments to leveraging external partnerships were the lack of an external engagement strategy, the lack of sufficient procurement mechanisms, and the lack of appropriate guidance instruments. The following points elaborate on each of these findings:

  • Lack of an external engagement strategy. The Program lacks the ability to clearly articulate the role of external suppliers and the criteria determining which source or sources should deliver the work and why. For example, evidence from document reviews demonstrated that although many Defence S&T projects had extensive lists of organizations with which DRDC could partner, as well as suggested means for partnering such as MoUs, the extent to which these sources were engaged was unclear. The need for an external engagement strategy was also outlined in a 2013 Program assessment,56 which described the gap as the lack of “a cogent argument as to who should do the work and why.”
  • Lack of sufficient procurement mechanisms. The Defence and Security S&T Strategy placed specific emphasis on improving the Defence S&T Program’s alignment with the requirements outlined in the federal S&T Strategy.57 Nonetheless, the evaluation study could not identify specific activities in place to support the intent of the Defence and Security S&T Strategy. Interviews with industry revealed that the current mechanisms support only small-scale transactions and are neither strategic nor systematic.

As noted in Section 2.2, to further gauge the extent and role of industry’s contribution within the Defence S&T Program, the evaluation team interviewed stakeholders from four industries as well as an additional external expert. The majority of industry representatives identified concerns with the Defence S&T Program’s inability to clearly articulate what capabilities could be assumed by external organizations. An industry stakeholder cited the Edge Innovation Network58 as a collaborative forum to expand the reach of Edge members and innovation centres59 such as Defence S&T. Industry representatives stated that the Defence S&T Program should become involved and leverage existing opportunities.

The evaluation team noted that allies have also been examining requirements for external engagements. ▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓▓

A prominent theme throughout the interviews with program staff was the need to develop more formal mechanisms or instruments to engage academia and other partners, similar to what some allies have achieved. Partnerships with OGDs have not been formalized through agreements, except those formed through the Public Security S&T Program. It was further noted that the Defence S&T Program has developed a level of coordination and trust with international allies such that long-term strategic goals and internal investments are coordinated appropriately.

An example of academic engagement has been established by Dstl in the UK. This model includes eleven academic centres of excellence in cyber security research based at UK universities.60 Furthermore, a virtual centre of excellence in bio-inspired systems was recently created with leading UK universities, providing a basis for the next generation of defence and security scientists and engineers through a national doctoral scheme.61 In Australia, the DSTO established in 2008 a joint venture that brings together defence industry, universities, and government research agencies to develop new materials and manufacturing technologies to enhance Australia’s defence capability.62

  • Lack of appropriate guidance instruments. When asked whether the new Defence Procurement Strategy would strengthen the relationship between industry and the Defence S&T Program, industry representatives commented that the Strategy was too vague and lacked the overall strategic direction needed to exploit and engage industrial capabilities. Industry representatives also mentioned that other GC-led initiatives, such as the Build in Canada Innovation Program, have generally been much more beneficial to industry.63 The Technology Demonstration Program for large-scale demonstration projects was also cited as beneficial to industry’s needs.64

The Departmental Performance Report (2012/13)65 stated that DND worked towards achieving defence affordability by working with Public Works and Government Services Canada to help position Canadian industry for global competitiveness through initiatives including the Defence Industrial Strategy and the Defence Enhanced Priority Technology List, which was implemented as part of the Industrial and Regional Benefits Program.66 However, in January 2013, the Department opted not to move forward with Project Analysis of Concept and Capability Options for Requirements Definition, an initiative geared to enable key players within the broad defence industry and academic and government realms to directly feed into the conception, development, and analysis of future military capabilities for the CAF.67

Currently, the RCAF is embarking on the Emerging Skies Initiative,68 which brings the RCAF together with Canadian industry and academia to jointly explore innovative solutions for Canada’s future air-power needs. The objective of this initiative is for the RCAF to collaborate with industry and academia for concept development purposes. These concepts will focus on the Horizon 3 timeframe and are intended to assist in capability development to respond better to future requirements. The evaluation study noted that these types of collaborations have often clarified DND’s present and future direction, and thus allowed industry to better shape its research and development.

CRS Recommendation

3. Create and implement a comprehensive strategy to increase and strengthen external engagements.

OPI: ADM(S&T)

Immediate Outcome 2: DRDC research centres are well managed and supported.

The evaluation study utilized the following indicators to make this determination:

  • evidence of a project selection process;
  • evidence of a project monitoring process that can support performance measurement and reporting;
  • evidence that a capability management system is in place to achieve strategic goals;
  • evidence of scientific and technical collaboration and communication (e.g., interoperability or leveraging of expertise) between DRDC research centres;
  • evidence regarding the external review processes used by the Defence S&T Program; and
  • evidence of a project validation process.

Indicator 2.4.2.3: Evidence of a project selection/validation process

Key Finding 12: The Defence S&T Program implemented a new project selection and validation process in 2013. However, at the time of the Evaluation, only a few projects had gone through a formal, standardized project validation system.

Project Selection

A departmental review recommended that the Defence S&T Program establish a rigorous review process to focus on projects aligned with DND/CAF priorities by reducing its R&D projects and associated contracts and personnel. Accordingly, the Defence S&T project approval process was modified to introduce new screening criteria intended to further focus on client outcomes, generate a clear line of sight between client outcomes and S&T activities, demonstrate greater accountability, deliver S&T more effectively and efficiently, and align with departmental processes.69 Program formulation70 is now the responsibility of the respective S&T DG in collaboration with their departmental and CAF clients. Since 2012, each program client has had a Defence S&T board through which projects and priorities are identified in concert with the Defence S&T Strategy. Selected projects are then reviewed by the Research and Development Executive Committee71 before being recommended to ADM(S&T).

Interviews conducted with program staff highlighted the following points concerning the project selection process:

  • FY 2013/14 was the first year of application for the new selection process, so some areas might need further strengthening. For example, additional communication between the client, the respective DG, and scientific staff might be required for some programs in later years;
  • Some concerns were raised about the Centre Directors’ lower level of involvement in the process, given their significant role in project development and the overall project lifecycle, and their familiarity with client requirements;
  • While the process bundled projects together and aligned each program to priorities, many legacy programs continued, which suggests that these programs had not previously been far off-track; and
  • Key interviewees also believed that the new selection process is generally focused on client needs. Previously, liaison between projects was driven at a lower level, which limited project visibility, tracking, and coordination at higher levels.

Project Validation

The evaluation team could not find sufficient evidence of standard or synchronized processes between client organizations and the Defence S&T Program to review and validate S&T project results. While some clients, such as Chief of Force Development, reported that they did not have a formalized validation system in place, the RCAF believed that the Air Force Science and Technology Oversight Committee72 has provided sufficient guidance and validation throughout project progress. In addition, in 2013/14, the Defence S&T Program instituted a new process by which projects are presented for endorsement to senior departmental boards (i.e., the Defence Capability Board and the Project Management Board). Although Senior Review Boards were convened for major projects, the appropriate engagement of military staff/sponsors of the S&T work has varied over time.

Interviews with key Defence S&T Program staff underlined that direct client feedback has also been requested through a Directed Client Support form. However, it was reported that clients rarely filled out the form, and there was no formal process to respond to feedback.

Indicator 2.4.2.4: Evidence of a project monitoring process that can support performance measurement and reporting

Key Finding 13: The Defence S&T Program lacks integrated project monitoring and control systems to ensure the integrity of project-related information and its subsequent use by management. While the S&T Functional Assessment findings were used to influence S&T strategic guidance and activities for the following planning cycle, no documented evidence was found to support how related decisions were made by senior management.

The evaluation team examined whether the Defence S&T Program had project monitoring and performance management systems in place to support informed decision making. It was noted that three planning and monitoring systems were used over the period of the Evaluation: CPME, the Defence Resource Management Information System (DRMIS),73 and other tools such as unofficial reporting systems maintained by portfolio directors and project managers.

Although project performance indicators exist and are rolled up at the corporate level, no documented evidence was found that any were used in decision making. The CPME database has been used to monitor how much funding the Defence S&T Program spends, to determine how much has been leveraged from allies, OGDs, industry, and academia, and to track how much FTE time was spent on projects. CPME has not been used as a general practice to support project decision making.

The evaluation study noted that ADM(S&T) has approved the implementation of project management cells (i.e., personnel dedicated to providing project management support) across all DRDC research centres to manage, track, and collect project-related data.74 However, the interviewees were not aware of a particular project management tool to be used by all DRDC research centres.

As a result of program transformation (explained in the Introduction), in 2012 the Defence S&T Program developed an Integrated Corporate Services Performance Measurement Strategy75 with the goal of ensuring that “DRDC [will] sustain and enhance its corporate services and activities.” In order to achieve this, the Defence S&T Program identified the following initiatives, each of which may take up to two years to implement:

  • an annual strategic review—to provide clients and stakeholders with quantitative indicators that give assurance that actual service is meeting standards and expectations;
  • quarterly or semi-annual reports—to assess performance at the DRDC research centre level, using wider performance indicators that provide more frequent and detailed reporting; and
  • automation of business processes.

The evaluation study noted that in FY 2010/11, the Defence S&T Program replaced its traditional annual report,76 which had been published on the Internet, with a more streamlined reporting system. Since then, the Defence S&T Program has produced a streamlined annual report to meet Treasury Board requirements associated with its special operating agency status.

Overall, the evaluation team found that project management systems/tools to monitor, report, and sustain progress on projects have not yet been formalized. Additional challenges are noted in the absence of a roadmap and targets for the Defence S&T Strategy. In this regard, the evaluation team’s examination of international partner organizations found examples of clear and well-established tools and monitoring processes that the Defence S&T Program could use to assess, analyze, and report on the satisfaction of clients/end-users, as well as on program performance. For example, Dstl’s performance measurement/dashboard model provides a simple and precise overall performance review of the British program. The model identifies reported areas with clear and measurable indicators, executive owners, trend data, and plans to address concerns.77 Results for overall client satisfaction and for products delivered on time and to cost are rolled up into quarterly and annual reports.

CRS Recommendation

4. Expedite efforts to formalize project management and oversight practices across the Defence S&T Program.

OPI: ADM(S&T)

Indicator 2.4.2.5: Evidence that a capability management system is in place to achieve strategic goals

Key Finding 14: The Defence S&T Program lacks a sound corporate-level capability management plan to holistically manage all resource management requirements.

Key Finding 15: The Defence S&T Program depends on capital assets that are not under its control. Furthermore, the state of those assets has implications for the capacity of the Program to do its work and achieve its strategic goals.

Interviews with ADM(S&T) Corporate Services and DRDC research centre staff suggested that centralizing services is an effective and efficient means of delivery. By centralizing some services, the organization can rely on the expertise and critical mass built within its corporate services. This model may avoid some duplication between DRDC research centres, allowing the latter to focus primarily on the research at hand.

In 2012, the Defence S&T Program started a business transformation process to streamline processes and gain efficiencies. The process was completed in April 2014. It was observed during the evaluation study that DRDC research centre operations are still largely silo-based.

Interviews with program staff and document reviews also provided examples of ongoing challenges in the following areas of human resources and resource management:

  • Human resources. Program staff underlined their concerns about losing considerable senior experience and about the risk of not being able to attract highly qualified candidates, due to the Strategic Review, the Deficit Reduction Action Plan, and Work Force Adjustment. Another area of concern was the lengthy hiring procedures, followed by the training requirements for new scientists.

Interviewees noted that succession planning has not been in place, even as the technical community (scientists, researchers, engineers) has declined in numbers. The process of recruiting and training scientists takes time.

Follow-up interviews underlined the need for an overarching capability management plan and/or mechanism to assess required current and future capabilities, matched against the existing human resources plan.78 For example, if the only person who worked on a specific scientific capability left the organization, the capability management plan would assess the need to keep that capability and whether a succession plan should be in place. In this regard, the current strategic human resources plan would be a subcomponent of the capability management plan.

  • Resource management. The evaluation study noted that a lack of resource capability planning discussions within and between the Defence S&T Program and the organizations in DND/CAF has led to uncertainties about some research outcomes. For example, the Quest79 and the Barge80 are owned and operated by ADM(Mat) and have provided research capabilities for the Defence S&T Program. The impact of not having this resource capability due to maintenance-related issues will be significant for many ongoing and future Defence S&T projects.

In addition, a document review revealed that the relationship between Assistant Deputy Minister (Infrastructure and Environment) (ADM(IE)) and ADM(S&T) with respect to infrastructure issues needed to be clarified. An infrastructure strategy prepared by ADM(S&T) was never implemented due to real property centralization by ADM(IE). Moving forward, it is important to ensure that the infrastructure strategy remains aligned with the strategic needs of ADM(S&T) and the DND/CAF for science.

In terms of inventory management, the evaluation team noted that the lack of an overarching equipment inventory might have led to duplicated purchasing efforts. Based on interviews, labs are managed in silos to serve the unique equipment needs of each DRDC research centre. For example, licences for modelling and simulation tools have been purchased based on specific project requirements. In the absence of a corporate equipment inventory, DRDC research centres cannot determine whether equipment is available at another research centre. An overarching equipment inventory would enable efficiencies in the sharing of assets and equipment across research groups.

Indicator 2.4.2.6: Evidence of scientific and technical collaboration and communication (e.g., interoperability or leveraging of expertise) between DRDC research centres

Key Finding 16: The Defence S&T Program recognizes the need for scientific and technical collaboration between DRDC research centres.

Based on interviews conducted with Defence S&T Program staff, there is evidence of scientific and technical collaboration between DRDC research centres. For example, although most of the blast effects work and the research on active protection systems and rockets were conducted with DRDC Valcartier, DRDC Suffield provided the calibration site for DRDC Valcartier’s hyperspectral research. Other examples of scientific and technical collaboration between research centres are as follows:

  • DRDC Toronto shared its expertise on human factors with DRDC Suffield in a testing collaboration: while DRDC Suffield measured trainee stress during a training test, DRDC Toronto assessed performance and the factors affecting it. DRDC Suffield and Toronto also collaborated in the area of underwater explosion effects on divers, combining Suffield’s expertise in blast technology with Toronto’s expertise in human physiology;
  • Similarly, DRDC Suffield conducted blast impact tests with DRDC Atlantic, whose ship materials group investigates blast and impact resistance. The tests were conducted at Suffield using materials from DRDC Atlantic;
  • A deployable radiation assessment vehicle created by DRDC Ottawa was also tested in DRDC Suffield; and
  • DRDC CORA and Suffield have collaborated on a future threats assessment project.

During interviews with program staff, some staff expressed concerns that the new project selection process might not explicitly provide opportunities for cross-research centre engagement. However, it was also mentioned that efforts have been underway to identify cross-research centre issues, which are expected to lead to explicit direction for cross-research centre engagement.

Table 3 displays the engagement between DRDC research centres.81 DGSTCO is tasked to deliver 75 of 80 projects in total: 61 percent of those projects are led by multiple DRDC research centres, and only 23 percent are led by a single DRDC research centre.

Table 3. The Extent of Scientific Collaboration and Communication between DRDC Research Centres. This table represents the extent of scientific collaboration and communication between DRDC research centres. (Source: DRDC research centre program data)

Table Summary

This table represents the extent of scientific collaboration and communication between DRDC research centres. There are 9 columns and 5 rows, including titles. The top row lists the 7 DRDC research centres: Atlantic, CORA, Ottawa, Suffield, Toronto, Valcartier, and DGSTCO’s office. The column at left lists the number of single- and multi-centre projects involved, the number of single- and multi-centre projects led, the number of single-centre projects led, and the number of multi-centre projects led. Select a category from the left column and read across the rows to learn the amounts for each of these DRDC centres, with the total for the category provided in the last column.

Atlantic CORA Ottawa Suffield Toronto Valcartier DGSTCO Total
Number of Single and Multi-centre Projects Involved 19 21 22 15 40 38 1
Number of Single and Multi-centre Projects Led 13 13 12 11 5 20 1 75
Number of Single-centre Projects Led 3 7 2 3 2 0 0 17
Number of Multi-centre Projects Led 9 5 8 8 2 13 1 46
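
The collaboration shares cited before Table 3 follow from the table’s “projects led” rows. A small illustrative check (values taken from Table 3; the variable names are ours):

    # Shares of multi-centre versus single-centre leadership, from Table 3.
    led_total, led_multi, led_single = 75, 46, 17
    print(round(100 * led_multi / led_total))    # 61 -> percent led by multiple centres
    print(round(100 * led_single / led_total))   # 23 -> percent led by a single centre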

Indicator 2.4.2.7: Evidence of external review processes used by the Defence S&T Program

Key Finding 17: The Defence S&T Program has an external review process in place. However, the evaluation team could not find sufficient evidence to assess to what extent the review recommendations were implemented.

The Defence S&T Program’s benchmarking process comprises two components: an annual internal S&T capability assessment and an external peer review of the S&T capability expertise area.82 Numerous external assessments of the Defence S&T Program were conducted from 2009 to 2012. The intent of these assessments was to evaluate the following:

  • the expertise and knowledge of the scientific staff;
  • the infrastructure, equipment, and tools to conduct the S&T research;
  • the management of the S&T capability and its R&D program; and
  • the required internal and external partnerships and collaboration.

The reviews also considered alignment between the S&T projects and departmental and government priorities.

In 2012, a Focus on Priorities Working Group conducted a full review of the portfolio of DRDC S&T capabilities. This Working Group also based its review on previous years’ peer review reports conducted by DRDC research centres. The criteria used for its assessment included strategic alignment, scientific leadership, impact, and imperative.

The Working Group report offered recommendations/solutions such as the following:

  • consolidations of certain program elements with other DRDC research centres that conduct similar programs;
  • examination of cost-sharing of infrastructure with partners;
  • reconsideration of whether certain program components ought to be led by ADM(S&T); and
  • examination of consolidations of certain programs or program components between DRDC research centres and whether possible savings would result.

Nonetheless, it was not clear if or to what extent these recommendations were considered or implemented.83 Finally, due to the unavailability of some peer review reports, the evaluation study could not determine if there was a standard peer review process across all DRDC research centres.

CRS Recommendation

5. Implement a management structure that ensures coordination of activities across the DRDC Centres, including resource sharing and management, and promotion of external partnerships.

OPI: ADM(S&T)

2.4.3 Intermediate Outcome—Canadian Defence and Public Safety and Security capabilities enhanced through the Defence S&T Program

The following two indicators were used to make this determination:

  • evidence of direct support to operations; and
  • evidence of improved Canadian defence and public safety and security capabilities through integration of knowledge and technology: publications, awards and recognitions, licenses, and patents.

Key Finding 18: The Defence S&T Program has contributed significantly to CAF operations and GC needs.

Key Finding 19: There is evidence that technology or integration of knowledge produced by the Defence S&T Program is leveraged by the DND/CAF and OGDs/agencies.

As highlighted throughout the relevance and effectiveness sections of this report, work performed by Defence S&T Program scientists has been viewed as valuable by many of its clients. The Defence S&T Program’s contributions to the war in Afghanistan and to other operational requirements have received high praise. During this time, as a “trusted adviser,” the Defence S&T Program focused its work on the unique needs of the DND/CAF that are classified and sensitive. National and international awards were received as a result of the dedicated work performed by Program staff. For example, in 2014, several Defence S&T scientists received NATO awards in recognition of their achievements. The evaluation study noted 26 national/international awards obtained in 2013/14. Defence S&T also contributed to strengthening the knowledge, skills, and capabilities of the DND/CAF and its partners through its publications and through technology transfer in the form of licences and patents. The following are some noteworthy accomplishments over the evaluation period:

  • The Canadian Forces Medallion for Distinguished Service was awarded for work in support of the CF-18 electronic countermeasures systems used throughout Operation MOBILE. That work has been credited with saving the lives of two pilots and with preventing the loss of two CF-18 aircraft due to enemy attack during combat missions over Libya in 2011.
  • The Organization for the Prohibition of Chemical Weapons (OPCW), whose work was supported by a former Defence S&T Program employee, was awarded the 2013 Nobel Peace Prize.
  • The Canadian Interoperability Technology Interest Group was formed by the DRDC Public Security S&T Program and Canadian Police Research Centre in partnership with the Canadian Association of Chiefs of Police, the Canadian Association of Fire Chiefs, and Emergency Medical Services Chiefs of Canada. This group received the Excellence in Technology Award from the International Association of Chiefs of Police for helping to advance first-responder interoperability in Canada.

Table 4 presents the Defence S&T Program’s intellectual property data for FYs 2012/13 and 2013/14.

Table 4. Defence S&T Program’s Intellectual Property. This table represents the Defence S&T Program’s contribution to intellectual property for FY 2012/13 and 2013/14. (Source: DRDC research centre program data.)

Table Summary

This table represents the Defence S&T program’s contribution to intellectual property: patents, license agreements, royalties generated and program’s contribution to Public Service Inventor Awards. There are 5 rows and 3 columns, including titles. The column at left lists the kinds of Intellectual Property and amounts involved, namely, Patents, License Agreements, Licensees’ reported royalties generated, and the Defence S&T Program’s contribution to Public Service Inventor Awards. The two other columns are for fiscal years 2012/13 and 2013/14. Select a kind of intellectual property or an associated amount from the left column and read across to learn the relevant figures for either of the two mentioned fiscal years.

Intellectual Property 2012/13 2013/14
Patents 11 13
Licence Agreements 11 7
Licensees’ Reported Royalties Generated $2,215,033 $452,418
Defence S&T Program’s Contribution to the Public Service Inventor Awards $623,029 $123,801

As for scientific and technical publications, a review of program data (Tables 5 and 6) reveals a steady increase in conference presentations between 2008/09 and 2010/11. In 2012/13, marginal decreases were noted for contractor reports, scientific reports, and conference presentations. However, there was a significant decrease (approximately 66 percent, from 179 to 60) in the number of scientific literature publications between FYs 2011/12 and 2012/13. This could partly be attributed to the Controlled Goods Program84 introduced in 2011; since the introduction of this initiative, a considerable backlog of publications has developed.

Table 5. The Defence S&T Program Scientific and Technical Reports, Presentations, and Publications. This table shows the number of the Program’s scientific and technical publications and partial* figures on conference presentations for FYs 2011/12 to 2012/13. (Source: DRDC research centre program data.)

Table Summary

This table shows the number of the Program’s scientific and technical publications, partial figures on conference presentations, and national and international awards in FYs 2011/12 to 2012/13. The column at left lists the mentioned fiscal years. For each of these fiscal years, read across to learn the number of items relevant to contractor reports, scientific reports, scientific literature and conference presentations. The final cell at right directs the reader to consult Annex E for details about national and international awards. A special note cautions readers that figures for conference presentations are only for seven of the eight DRDC research centres.

FY Contractor Reports Scientific Reports Scientific Literature Conference Presentations
2011/12 152 202 179 665
2012/13 145 185 60 604

* Figures are only for seven of the eight DRDC research centres.
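
The year-over-year changes discussed above can be read directly from Table 5. As an illustrative check (labels and values taken from the table):

    # Percent change between FYs 2011/12 and 2012/13, per Table 5.
    rows = {"contractor reports": (152, 145), "scientific reports": (202, 185),
            "scientific literature": (179, 60), "conference presentations": (665, 604)}
    for name, (prev, curr) in rows.items():
        print(name, f"{round(100 * (curr - prev) / prev)}%")  # literature: -66%; others: -5% to -9%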

Table 6. Scientific and Technical Publications. This table represents the Program’s miscellaneous scientific and technical publications for FYs 2008/09 to 2010/11. (Source: DRDC annual reports.)

Table Summary

This table represents the Program’s miscellaneous scientific and technical publications for FYs 2008/09 to 2010/11. There are 4 rows and 6 columns, including titles. The column at far left lists the 3 mentioned fiscal years, and the next columns list figures for the following categories: contractor reports, scientific reports, scientific literature, conference presentations and, finally, international and national awards.

FYs Contractor Reports Scientific Reports Scientific Literature Conference Presentations International and National Awards
2008/09 300 500 600 330 18
2009/10 350 450 750 420 36
2010/11 255 690 455 575 11

2.5 Demonstration of Efficiency and Economy

The following section examines the extent to which the Defence S&T Program provides value for money by using the most efficient and economical means to achieve the outcomes expected of it. The Evaluation Policy (2009) defines efficiency as “maximizing the outputs produced with a fixed level of inputs.” Economy is defined as “minimizing the use of resources to achieve expected outcomes.”85 Economy also considers whether the resources allocated to the Program are reasonable and sustainable.

The evaluation team considered processes and mechanisms in place for managing, monitoring, measuring, and reporting expenditures, as well as for ensuring efficiency and economy of resource use by the Defence S&T Program during the period of FYs 2008/09 to 2013/14. This information was gathered through examination of multiple lines of evidence, including the following:

  • interviews with key Program personnel to understand their roles and responsibilities;
  • review of program and external documents and reports, including: Departmental Performance Reports; DRDC Annual Reports; ADM(S&T) Business Plans; Defence S&T Program submissions to annual Statistics Canada Federal Scientific Expenditures and Personnel Survey;
  • analysis of business processes and procedures and analysis of financial recording/reporting by S&T projects and DRDC research centres; and
  • analysis of financial and human resources data86 relating to Defence S&T Program projects, DRDC research centre and client portfolios, and the Program as a whole.87

2.5.1 Economy

Are resources allocated to the Defence S&T Program reasonable, economical, and sustainable?

The following indicators were used to make this assessment:

  • total value of science;
  • trends in program resources; and
  • reporting mechanisms.

Total Value of Science

Key Finding 20: Multiple data systems used by the Defence S&T Program create challenges in capturing the “total value of science.”

Sound internal program information is required to determine if the Defence S&T Program is delivering its products and services to address client needs in an effective and economical manner. In order to calculate the total cost/value of science, the evaluation study sought to identify all sources of financial and in-kind support. The Program receives an A-base funding allocation from DND, in addition to funding from other groups within the DND/CAF and from external sources. In addition, the total human resources FTE allocations for the Defence S&T Program were considered. In this regard, the evaluation team compiled financial and human resources data from the two departmental systems: DRMIS and Human Resources Management System (HRMS). For other financial contributions to the Program and in-kind contributions, the CPME database was used.88

DRMIS reporting for ADM(S&T) includes A-base funding and some, but not all, financial transfers from other DND/CAF L1 organizations. It also accounts for funding from Treasury Board to operate the Public Security S&T Program. Funding from other sources, including leveraged funding and in-kind contributions from allies, industry, universities, and OGDs, can include the contribution of people, facilities, information, equipment, and software. These in-kind contributions were recorded only in the CPME, by project managers. CPME data did not record the salary and wage envelope or operations and maintenance expenditures, but it did record funding from other sources. CPME also recorded financial contributions from other L1 organizations in the form of earmarked funds, which were not recorded in ADM(S&T) expenditures in DRMIS. Other funds received include infrastructure funding from ADM(IE) and CAF Regular Force salaries received from Chief of Military Personnel.

Trends in Program Resources

Key Finding 21: Since FY 2011/12, there has been a significant (about 20 percent) decrease in the ADM(S&T) A-base budget allocations, and a corresponding decrease of approximately 242 FTEs (about 20 percent).

The Program’s peak period for total financial and human resources was FYs 2010/11 to 2011/12 (Tables 7 and 8). This period coincided with DND/CAF commitments in Afghanistan, as well as with support to events in Canada, such as the 2010 Winter Olympics and the summit meetings of the Group of 8 and Group of 20 governments (commonly known as the G8 and G20).

As outlined in Table 7, since FY 2011/12 the number of civilian FTEs has been reduced by 242 employees89 (about 20 percent).

Table 7. FTE Cohort in the Program. This table represents civilian FTEs reported as per the system of record. (Source: HRMS)

Table Summary

This table represents civilian FTEs reported by system of record. There are 7 columns and 3 rows, including titles. The columns list fiscal years 2008/09 to 2013/14, while the rows list the numbers of FTEs and the change over the period in percentages. Select either of these two last categories and read across to learn the values for each of the FYs in scope.

2008/09 2009/10 2010/11 2011/12 2012/13 2013/14
FTEs 1,581.42 1,626.15 1,662.97 1,634.29 1,474.19 1,351.02
Change Over Period n/a 2.8% 2.2% -2.3% -9.2% -8.3%
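
As an illustrative check of this trend, the sketch below recomputes year-over-year changes and the peak-to-2013/14 decline from Table 7’s FTE counts; small differences from the published row for two years likely reflect rounding or in-year averaging in the source data:

    # FTE trend from Table 7 (civilian FTEs per HRMS), FYs 2008/09-2013/14.
    ftes = [1581.42, 1626.15, 1662.97, 1634.29, 1474.19, 1351.02]
    for prev, curr in zip(ftes, ftes[1:]):
        print(f"{100 * (curr - prev) / prev:+.1f}%")   # +2.8, +2.3, -1.7, -9.8, -8.4
    peak = max(ftes)                                   # the 2010/11 peak
    print(f"peak to 2013/14: {100 * (ftes[-1] - peak) / peak:+.1f}%")  # about -19 percent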

Since FY 2011/12, A-base budget allocations (Table 8) have decreased significantly (about 20 percent), coinciding with departmental reviews (Strategic Review and Deficit Reduction Action Plan). Over the same period, contributions to the Program recorded at the project level (CPME non-DRDC), both financial and in-kind, have decreased substantially (65 percent). Of the overall decrease in in-kind contributions, 92 percent was from external partners. While DND’s contributions decreased by 17 percent, external partners’ contributions decreased by 84 percent.

Table 8. Defence S&T Program Total Expenditures. This table summarizes the total estimated Program expenditures and breakout of the Program’s support sources.

Table Summary

This table represents the total estimated Program expenditures and a breakout of the Program’s support sources; figures are in thousands of dollars except where percentages are indicated. There are seven columns and seventeen rows, including titles. The total value of the resources used by the Defence S&T Program combines A-base funds reported via DRMIS, other non-DRDC contributions as reported via CPME, pay for CAF Regular Force members, and infrastructure funds. The first row lists fiscal years 2008/09 through 2013/14. The first column outlines the program allocation of resources: “1. DRMIS Allocation,” including its percentage change between years; “2. CPME—non-DRDC,” including other DND L1 financial and in-kind (assets) contributions and their total, followed by external partner financial and in-kind (assets) contributions, their total, and their percentage change between years; “3. Regular Force Pay (CMP PAA 1.1)”; “4. Infrastructure (ADM(IE) PAA 1.1)”; and, last, the “Total Estimated Cost (=1+2+3+4).” The last three rows list figures for DND expenditures on the S&T Program (excluding partner contributions and the value of DND assets leveraged), and percentage values for both the average annual growth rate and DND expenditures as a share of the DND budget.

2008/09 2009/10 2010/11 2011/12 2012/13 2013/14
1. DRMIS—Allocation 321,323 314,704 333,500 332,039 293,946 263,336
Change in FYs (%) n/a -2.1 6.0 -0.4 -11.5 -10.4
2. CPME—non-DRDC 194,458 204,165 190,289 202,361 119,814 70,871
Other DND L1—Financial 32,280 35,839 45,412 36,544 46,391 26,593
Other DND L1—In-kind (assets) 18,033 22,832 16,067 20,641 15,021 20,753
Other DND L1—Contributions 50,313 58,671 61,479 57,185 61,412 47,346
External Partners—Financial 6,354 6,396 11,155 16,470 6,471 1,577
External Partners—In-kind (assets) 137,791 139,098 117,655 128,706 51,931 21,948
External Partners—Contributions 144,145 145,494 128,810 145,176 58,402 23,525
Change in FYs (%) n/a 0.9 -11.5 12.7 -59.8 -59.7
3. Regular Force Pay (Chief of Military Personnel—PAA 1.1) 8,683 9,104 3,269 3,670 4,873 4,884
4. Infrastructure (ADM(IE)—PAA 1.1) 12,298 6,524 5,888 8,350 12,379 22,145
Total Estimated Cost (=1+2+3+4) 536,762 534,497 532,946 546,421 431,012 361,236
DND Expenditures on S&T Program (excluding partner contributions and value of DND assets leveraged) 374,584 366,171 388,069 380,604 357,589 316,958
Average Annual Growth Rate (%) n/a -2.2 6.0 -1.9 -6.0 -11.4
DND Expenditures as Share of DND Budget (%) 2.0 1.8 1.9 1.9 1.8 1.7
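
Several of the percentage changes cited in this subsection can be reproduced from Table 8. The following sketch (figures in $000s; the function name is ours) compares FY 2011/12 with FY 2013/14:

    # Trend checks against Table 8 (FY 2011/12 versus FY 2013/14, $000s).
    def pct_change(start, end):
        return round(100 * (end - start) / start)

    print(pct_change(332039, 263336))   # -21 -> A-base (DRMIS) allocation, "about 20 percent"
    print(pct_change(202361, 70871))    # -65 -> non-DRDC project-level contributions (CPME)
    print(pct_change(57185, 47346))     # -17 -> other DND L1 contributions
    print(pct_change(145176, 23525))    # -84 -> external partner contributions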

Program staff commented that, despite declining in-kind support figures, the Program has continued to manage its expertise effectively. The rationale presented to the evaluation team to explain the decline in in-kind contributions from external sources included the following:

  • new departmental rules and regulations, which no longer allow National Procurement funding to be used to support FTEs. This has led to a significant decrease in FTEs supported by in-kind contributions;
  • Strategic Review/Deficit Reduction Action Plan personnel cuts, coupled with rules imposed by Treasury Board on project management, which place any R&D effort into a higher class of risk. This may have led to a drop in support and a commensurate increase in requests for unfunded direct client support for technical problems. In addition, oversight and diligence in tracking leveraging may also have been impacted; and
  • R&D being treated as a tactical issue rather than a strategic enabler. Although project formulation was largely bottom-up at the start of the evaluation period, recent direction has been towards a top-down alignment of the Program to meet key DND/CAF requirements and priorities.

This suggests that the Program is entering a period with substantially reduced internal funding and contributions from external sources. Moving forward, funding reductions will place increasing pressure on program activities and available resources. A sound performance measurement and management system would provide the information necessary to identify which research portfolios and/or lines of business are becoming unsustainable, and where alternative service delivery may be the preferred option. Since the Program is organized to deliver S&T products to client groups, reporting may be more meaningful if it aligned resource use/expenditures by client portfolios rather than by DRDC research centres.

CRS Recommendation

6. Implement a formalized and integrated resource management system that provides key resource and financial data to support decision makers.

OPI: ADM(S&T)

Reporting Mechanisms

The FY 2010/11 DRDC Business Plan stated that the ERP systems were not integrated and did not permit seamless information access and flow between functional areas such as finance, human resources, and materiel. Similarly, program staff who were interviewed underlined the need for integrated ERP and expenditure systems that permit access to financial and human resources data. The current ERP environment contains many costly and overlapping operational and corporate systems, which are neither aligned with one another nor take advantage of the technology already resident within the systems in use.90

Prior to 2011, senior management was provided, about every two months, with the financial updates needed to make financial and operational decisions. However, during the Strategic Review/Deficit Reduction Action Plan, and since the DRDC Transformation, financial updates based on DRMIS reports91 have been provided only once a year. As of July 2014, senior management began receiving monthly civilian human resources reports through HRMS. However, there was no evidence of reporting on the status of military members employed across DRDC research centres or of contracted employees.

The CPME was used mostly as a planning tool to record the Program’s in-kind contributions, as well as project-related financial and human resources. However, CPME was not used as a regular reporting tool for financial and human resources; reporting through CPME was ad hoc, pulled by management upon request. Therefore, the evaluation team could not identify the means by which Defence S&T Program senior management had visibility of their project, research centre, or program-wide activities, allocations, and expenditures.

The evaluation study noted several impediments in the Program’s financial and performance reporting, including issues with the organizational structure and the lack of documented business processes and procedures, as follows:

  • Organizational structure. During the evaluation period, the Program changed its organizational model from one focused on DRDC research centres led by DGs with multiple roles, responsibilities, and accountabilities, to one where those roles, responsibilities, and accountabilities have been focused on program leadership. In the previous centre-focused model, the DGs operating from DRDC research centres across the country performed three integrated functions,92 namely, scientific advisor, portfolio manager, and resource manager. The evaluation study noted that in 2009, a survey of senior management (from levels 1 to 4) had identified financial, project, procurement, information, and asset management issues. Specific recommendations arising from the survey were related to improving S&T project documentation, strengthening project management practices and culture in DRDC, encouraging stronger budgeting and forecasting practices within the management cadre, and auditing milestone failures in S&T projects.93

“The building blocks of a good management model for an organization call for a system that integrates all the components of decision making with management practices.”94 The Program is now moving to a model that reorganizes and separates S&T planning from delivery, giving the DGs single roles as portfolio managers. In this new model, the DRDC centres are managed by directors rather than DGs. Management and staff told the evaluation team that, because not all roles, responsibilities, and accountabilities related to program performance have been clearly identified during this transition, issues have persisted and in some cases worsened. This has created impediments to properly functioning organizational mechanisms for gathering and integrating information. The evaluation study found that, at times, unclear delegation of responsibilities also led to impediments in reporting.

  • Lack of documented business processes and procedures. The evaluation team found that, as of 2013, business processes and procedures were not consistent across the DRDC research centres. As a result, alignment and integration across projects, research centres, and program-wide reporting have been problematic. Program staff noted that transformation and reorganization efforts were ongoing and that further work would be necessary in this area to ensure a fully functional and effective performance measurement and monitoring system to inform decision making.

CRS Recommendation

7. Align ADM(S&T) policy and guidance, including roles, responsibilities, and accountabilities, to the delivery of the Defence S&T Program’s strategic mandate.

OPI: ADM(S&T)

2.5.2 Demonstration of Efficiency

Are the most appropriate and efficient means being used by DND/CAF to deliver the Defence S&T Program?

The following indicator was used to make this assessment:

  • assessment of resources used to produce the program outputs.

The evaluation study attempted to compare the relative proportions of expenditures on research and on administrative support. In the absence of complete and accurate financial information, this analysis serves as a provisional assessment of whether resources were used efficiently. Due to significant changes in the Program’s governance structure, only the expenditures for FY 2013/14 were used for this assessment. Table 9 presents the relative shares of all resources dedicated to management and administrative functions versus the Program’s research activities, while Table 10 presents the same calculation based only on the A-base funding.

Table 9. Defence S&T Program Share of Resources in FY 2013/14. This table represents the share of resources dedicated to research activities versus management and administration, calculated against total estimated costs.

Assumptions:

  • All DRMIS allocations outside of Chief of Staff and DG Corporate Services are attributed to research activities.
  • Non-DRDC contributions are attributed to research activities.
  • Regular Force pay is attributed to research activities.
Table Summary

This table represents the share of resources dedicated to research activities versus management and administration, calculated against total estimated costs. There are 3 columns and 11 rows, including titles. The left column lists three overall types of resource disbursals and their total, while the two columns on its right list each disbursal's amount in thousands of dollars and its percentage of the Program's total. The left column lists Management and Administration costs, a DRMIS Allocation, which is subdivided into two: Chief of Staff and, in the next row, DG Corporate Services. The next row and the four that follow it are for Research Activities; the four subdivisions are: DRMIS Allocation; Other DND L1 Contributions—Financial and In-kind (Assets) (CPME); External Partner Contributions—Financial and In-kind (Assets) (CPME); and Regular Force Pay (Chief Military Personnel—PAA 1.1). The next row is Infrastructure (PAA 1.1), followed, at the bottom of the table, by Total Estimated Cost. Read across any of these disbursements for the total in thousands of dollars expended during FY 2013/14 and, in the last column on the right, for the percentage of the total monies disbursed.

FY 2013/14 ($000s) % of Total
1. Management & Administration (DRMIS Allocation) 56,781 15.7
Chief of Staff 8,404 2.3
DG Corporate Services 48,377 13.4
2. Research Activities 282,310 78.2
DRMIS Allocation 206,555 57.2
Other DND L1 Contributions—Financial and In-kind (Assets) (CPME) 47,346 13.1
External Partner Contributions—Financial and In-kind (Assets) (CPME) 23,525 6.5
Regular Force Pay (Chief Military Personnel—PAA 1.1) 4,884 1.4
3. Infrastructure (PAA 1.1) 22,145 6.1
Total Estimated Cost 361,236 100.0

Table 10. The Defence S&T Program’s Share of Resources. This table shows the DRMIS funding allocations and the percent of total. (Source: DRMIS)

Assumption:

1. All DRMIS allocations outside of Chief of Staff and DG Corporate Services are attributed to research activities.

Table Summary

This table presents the share of resources dedicated to management and administration, calculated using only the A-base allocation. The table has four rows and three columns. The left column is headed by the global DRMIS allocation and lists, in the rows that follow, Management and Administration costs, costs for Research Activities, and the Total. The next column lists disbursals in thousands of dollars for FY 2013/14, and the last column on the far right provides the percentage of the total. Read across each category to learn how much was disbursed in FY 2013/14 and, in the last column, the percentage of global DRMIS allocations represented by that category.

DRMIS Allocation 2013/14 ($000s) % of Total
1. Management and Administration 56,781 21.6
2. Research Activities 206,555 78.4
Total 263,336 100.0

* The share of resources dedicated to management and administration is calculated using only the A-base allocation.

Accordingly, in FY 2013/14, the value of resources dedicated to research activities represented approximately 78 percent of the total value of the Program’s resources. Management and administration costs were interpreted to be those costs associated with the Chief of Staff and the Corporate Services division. Management and administration was 15.7 percent of the total value, and 21.6 percent of the value of the A-base allocation, as reported in DRMIS. Given that some administrative support costs are provided by other parts of the Department, such as the ADM(IE), Assistant Deputy Minister (Information Management), and Assistant Deputy Minister (Human Resources – Civilian), the share of costs represented by administration and management is at the higher end of accepted levels of practice.95
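For readers who wish to trace the arithmetic, the short Python sketch below (illustrative only and not part of the evaluation's methodology; all dollar figures are copied directly from Tables 9 and 10) reproduces the reported shares.

    # Illustrative check of the shares reported in Tables 9 and 10
    # (all figures in $000s for FY 2013/14, copied from the tables above).
    mgmt_admin = 56_781      # Chief of Staff + DG Corporate Services (DRMIS)
    research = 282_310       # DRMIS allocation, L1 and external contributions, Regular Force pay
    infrastructure = 22_145  # PAA 1.1
    total = mgmt_admin + research + infrastructure  # 361,236

    a_base_research = 206_555                    # A-base (DRMIS) research allocation only
    a_base_total = mgmt_admin + a_base_research  # 263,336

    print(f"Research share of total estimated cost:     {research / total:.1%}")          # 78.2%
    print(f"Management and administration, total:       {mgmt_admin / total:.1%}")        # 15.7%
    print(f"Management and administration, A-base only: {mgmt_admin / a_base_total:.1%}") # 21.6%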

Annex A—Management Action Plan

DRDC is the primary delivery agent for DND’s S&T investment, accounting for approximately two-thirds of the Department’s total S&T investment. Over the past several years, ADM(S&T) has sought to optimize this important role by evolving the Defence and Security S&T Program so that it is focused on strategic outcomes of critical importance to the Department. As part of this work, DRDC is now focussing on a smaller number of priorities, primarily in the strategic, classified, and sensitive domains, while building and harnessing the innovation capacity and capability residing with national and international partners.

The recent transformation of DRDC means that ADM(S&T) is now well-placed to work towards maximizing the impact of the Department’s total S&T investment. This is set to occur through greater coordination and oversight across DND’s entire S&T enterprise, which also includes ADM(Mat), Assistant Deputy Minister (Information Management), RCAF, Canadian Army, RCN, and Chief of Military Personnel.

CRS Recommendation

1. Develop a DND-wide policy to define and communicate the Defence S&T Program’s mandate, strategy, roles, responsibilities, and accountabilities—i.e., reinforcing ADM(S&T) as the functional authority.

Management Action

Action 1.1: ADM(S&T) will submit, for departmental consideration, options related to the implementation of the Departmental Chief Scientist functional role. (September 2015)

Action 1.2: ADM(S&T) will release a DND-wide policy/DAOD aimed at clarifying the role of ADM(S&T) in relation to the overall departmental investment in S&T and test and evaluation. (September 2016)

OPI: ADM(S&T)

Target Date: September 2016

CRS Recommendation

2. Determine the priority and niche S&T capabilities that align with GC/DND/CAF priorities and create the capacity to provide client guidance on non-niche areas.

Management Action

Action 2.1: ADM(S&T) will finalize the comprehensive DRDC S&T Capability Review started in 2014 to prioritize defence and security S&T capabilities and inform build-collaborate-access program and investment decisions. (September 2015)

Action 2.2: ADM(S&T) will develop a plan to implement the results of the Capability Review. (December 2015)

Action 2.3: ADM(S&T) will implement the plan. (April 2016)

Action 2.4: ADM(S&T) will communicate to clients its role as departmental interface with external partners to deliver on S&T departmental needs and requirements. (September 2016)

OPI: ADM(S&T)

Target Date: December 2015

CRS Recommendation

3. Create and implement a comprehensive strategy to increase and strengthen external engagements.

Management Action

Action 3.1: ADM(S&T) will build upon the partnership component of the 2013 Defence and Security S&T Strategy to create an engagement plan comprising those activities required to optimize relationships with allies, industry, academia, and OGDs in support of defence and security S&T outcomes. (December 2015)

Action 3.2: ADM(S&T) will implement the engagement plan. (September 2016)

OPI: ADM(S&T)

Target Date: September 2016

CRS Recommendation

4. Expedite efforts to formalize project management and oversight practices across the Defence S&T Program.

Management Action

Action 4.1: ADM(S&T) will augment its performance management practices with a project management tool that tracks project and program performance and informs DND S&T strategy and investment decisions. (September 2015)

Action 4.2: ADM(S&T) will enhance its annual S&T Program assessment report to include annual performance measures that assess progress against planned performance targets and program outcomes. (April 2016)

OPI: ADM(S&T)

Target Date: April 2016

CRS Recommendation

5. Implement a management structure that ensures coordination of activities across the DRDC Centres, including resource sharing and management and promotion of external partnerships.

Management Action

Action 5.1: ADM(S&T) will review and address potential management structure gaps in order to finalize the DRDC Transformation and optimize the efficiency of DRDC-wide S&T resource management. (September 2015)

OPI: ADM(S&T)

Target Date: September 2015

CRS Recommendation

6. Implement a formalized and integrated resource management system that provides key resource and financial data to support decision makers.

Management Action

Action 6.1: ADM(S&T) will identify the requirements to enhance its management processes and existing tools for regular review of its key resource, financial, and performance information, as well as for reporting that informs ADM(S&T) program and capability decisions. (March 2016)

Action 6.2: In collaboration with ADM(S&T), key S&T enablers—Assistant Deputy Minister (Information Management), Assistant Deputy Minister (HR – Civilian), ADM(IE), and Assistant Deputy Minister (Finance and Corporate Services)—will develop implementation plans for the most cost-effective solution that meets departmental S&T program requirements. (March 2017)

OPI: ADM(S&T)

Target Date: March 2017

CRS Recommendation

7. Align ADM(S&T) policy and guidance, including roles, responsibilities, and accountabilities, to the delivery of the Defence S&T Program’s strategic mandate.

Management Action

Action 7.1: ADM(S&T) will communicate developed and agreed upon roles, responsibilities, and accountabilities to ADM(S&T) staff, DND/CAF stakeholders, and external partners for the departmental S&T Program. (April 2016)

OPI: ADM(S&T)

Target Date: April 2016

Annex B—Evaluation Methodology and Limitations

Methodology

The evaluation team used multiple lines of evidence and complementary qualitative and quantitative research methods to help ensure the reliability of the information and data supporting findings. The methodology established a consistent approach to the collection and analysis of data in support of evaluation findings, conclusions, and recommendations. Based on evidence from available sources, the evaluation team reviewed the achievement of expected outcomes, and the Program’s efficiency and economy, to develop a balanced picture of the relevance and performance of the Defence S&T Program. Information and data were mapped to each evaluation question and its corresponding indicators.

Overview of Data Collection Methods

Data collection methods were selected based on the data required to address the performance indicators in the Evaluation Framework (Annex D). The following data collection methods were used to gather the qualitative and quantitative data for the Evaluation:

  • literature and document reviews;
  • client questionnaire;
  • key informant interviews;
  • expert opinion;
  • comparative research analysis with allies; and
  • administrative, financial, and human resources data reviews.

Literature and Document Review

A preliminary document review was conducted, as part of the planning phase of the Evaluation, to garner a foundational understanding of the Defence S&T Program. A comprehensive document review was undertaken as part of the conduct phase of the Evaluation, focusing on the relevance and performance of S&T activities.

The following documents were reviewed during the conduct phase of the Evaluation:

  • foundation documents: Defence and Security S&T Strategy, The Future Security Environment (2008–2030), Report on Transformation (2011), DRDC Annual Reports (2008–2012), ADM S&T Business Plans (FYs 2009/10–2013/14);
  • corporate documents: Reports on Plans and Priorities, Departmental Performance Reports, Canada First Defence Strategy, and Speeches from the Throne;
  • legal documents: Acts and Regulations, such as the National Defence Act; and
  • other documents: evaluations, audits, and other internal studies, such as previous CRS audits.

The document review was conducted using a customized template organized according to the evaluation questions and indicators.

Client Questionnaire

A bilingual (English and French) survey to collect information and opinions on the Program was provided to all L1 organizations/client groups to determine the following:

  • the need for the Defence S&T Program within DND;
  • if program outputs (research and knowledge, evidence-based expert advice, and technology products and concepts) remain relevant for L1 organizations/client groups; and
  • if Defence S&T outputs have met L1 organizational needs during the last five years.

A five-point scale was presented for each question, along with a section for comments. Twenty-one responses were received.

Key Informant Interviews

Key informant interviews and information sessions undertaken with S&T stakeholders directly or indirectly involved in the program’s delivery served as an important source of qualitative information.

Interviewees were e-mailed an interview guide prior to the interview. Individual interviews were conducted in person or by telephone. Follow-up questions were posed and answered by e-mail. The interview guide was explained to interviewees before the beginning of the interview, at which time they were encouraged to be open and candid in their responses to the questions.

Notes were taken by the evaluators during interviews, with the consent of the interviewees. The evaluators transcribed the notes taken during the interviews and compared them with one another, with a view to establishing a common record.

Interviews were also conducted with industry stakeholders (Canadian Association of Defence and Security Industries, Canadian Aviation Electronics, General Dynamics and ING Engineering) to determine the nature and degree of effectiveness of partnerships between the Program and Canadian industry. These interviews also helped to assess alternative program delivery options.

Table B-1 lists the organizations interviewed, the number of interviewees from each, and the number of interview sessions that took place.

Table B-1. Number of Interviewees by Organization. This table shows the number of interviews conducted and the total number of interviewees per organization.

Table Summary

This table shows the number of interviews conducted and the total number of interviewees contacted per organization. There are 16 rows and 3 columns. The column on the left lists the organizations that were interviewed. The second column lists the number of interviewees, and the third lists the number of interviews that were held, and the last row provides the totals for each of these two groups. Select an organization from the left column and read across to learn the number of interviewees who were approached, and the number of interviews that were held. For totals, read across the last row.

Organizations Interviewees Interviews
Defence S&T senior and key program staff 17 15
Canadian Army S&T staff 1 1
RCN S&T staff 2 2
RCAF S&T staff 1 1
Chief Force Development S&T staff 2 3
DRDC Ottawa 3 3
DRDC Valcartier 6 6
DRDC Halifax 6 6
DRDC Toronto 6 6
DRDC Suffield (Teleconference) 3 3
Canadian Defence Liaison staff (Washington) 1 1
Industry representatives 4 4
External expert 1 1
Former senior program staff 2 2
Total 55 54

Expert Opinion

The evaluation study interviewed an expert (the Canada Research Chair in Radiochemistry and Environmental Health, University of Ottawa) to inquire about his views on the Program’s contributions to and collaborations with academia.

Comparative Research Analysis with Allies

Information from international partner organizations (UK, US, and Australia) was solicited to assess their views on the relevance and performance of the Program. A request for information was distributed to partner organizations via Canadian Defence Liaison staff in Washington. Follow-up questions were raised and answered by e-mail.

Review of Financial and Human Resource Data

S&T financial and human resource data was reviewed to determine the degree of efficiency and economy of program activities. The data, covering FYs 2008/09 to 2013/14, was extracted from multiple official and unofficial systems and reports. The evaluation study used data provided through the A-base allocation and in-kind/other resources to compile a more complete cost-per-value assessment of program outputs.

Limitations

Table B-2 lists the limitations associated with this evaluation study, and the mitigation strategies that were applied to address them.

Table B-2. Evaluation Limitations and Mitigation Strategies. This table lists the limitations of the evaluation and the corresponding mitigation strategies.

Table Summary

The table has 4 rows and 2 columns, including titles. The left column lists three kinds of limitations to the evaluation—including those that are methodological, about financial systems, and regarding attribution to outcomes. The column to the right lists the mitigation strategies that were adopted to address each limitation. Select a kind of limitation from the column on the left and read across to learn the mitigation strategies that were adopted to counter it.

Limitations Mitigation Strategies

Methodological

  • The Program is highly complex and involves many players. Many initiatives have only recently been put in place, so their impacts have yet to be measured.
  • The number of interviews was somewhat limited by time and resource pressures.
  • The response rate for the client questionnaire was low.

To explore further lines of evidence, questionnaires were sent to client organizations.

Follow-up interviews were conducted with client organizations.

Financial Systems

  • The financial analysis was based on various systems, causing variations in data.
  • Low confidence on the part of the Defence S&T Program in its Collaborative Planning and Management Environment (CPME) prevented any detailed review of program quality or status.
  • Defence S&T is in the midst of major changes to the planning and execution of the Program throughout its Business Transformation.

The financial data was compared to other lines of evidence, such as interviews and further consultations with stakeholders.

Only a provisional assessment of program strengths and weaknesses could be made at this time.

Attribution to Outcomes

  • Attribution of activities and outputs of the Program to intermediate and ultimate outcomes was difficult due to the high level of influence of external factors and lack of data.
More focus was placed on measuring the immediate outcome (viz., that Defence S&T Program clients’ needs are served). From these findings, deductions were made to assess whether intermediate outcomes had been achieved. Follow-up questions raised through stakeholder interviews provided explanations that helped mitigate the attribution problem for the intermediate and ultimate outcomes, as the data act as a validity check96 on the findings.

Annex C—Logic Model

Figure C-1. Logic Model for the Defence S&T Program. This flowchart depicts specific S&T activities, outputs, and outcomes.

Text description for Figure C-1

Starting Point A:

Inputs

  • DRDC facilities
  • ADM(S&T)
  • Chief of Staff
  • Corporate Services

Activities

DRDC Research Centre Activities:

  • Conduct S&T research, development and analysis
  • Conduct technology development and evaluation
  • Provide strategic, operational and tactical S&T expert advice
  • Engage/build partnerships (academia, industry, allies, OGDs)
  • Anticipate and advise on future S&T trends, threats and opportunities
  • Provide S&T for external customers
  • Manage S&T program delivered by partners

Outputs

  • Evidence-based advice
  • Research and knowledge
  • Technology products and concepts
  • Leveraged partnerships
  • Stewardship of the S&T function

Research centres include:

  • DRDC Suffield, DRDC Ottawa, DRDC Toronto, DRDC CORA, DRDC Valcartier, DRDC Atlantic, Director General Military Personnel Research and Analysis, DRDC CSS, Valcartier Research Centre and Atlantic Research Centre, and partners

Research centre outputs include:

  • DRDC Suffield: weapons effects; casualty management; military autonomous systems; chemical and biological defence
  • DRDC Ottawa: radar, electromagnetic warfare, space systems, information operations, communications
  • DRDC Toronto: military medicine, individual effectiveness, command and control, human factors
  • DRDC CORA: operational research, strategic analysis, S&T intelligence
  • DRDC Valcartier: electro-optical surveillance and countermeasures, weapons systems, personnel and platform protection, command and control and intelligence systems, mission-critical cyber systems
  • DRDC Atlantic: underwater warfare, naval platforms, maritime information systems
  • Director General Military Personnel Research and Analysis: personnel generation, personnel and families support, operational and organizational dynamics
  • DRDC CSS: CBRNE; surveillance, intelligence and interdiction; emergency management systems; and interoperability

Partners include:

  • OGDs/agencies, industry, academia, allies, communities of practice

Immediate Outcomes

  • Defence S&T Program enables concepts, products and solutions to satisfy the needs of DND/CAF and its partners

Intermediate Outcomes

  • Improved Canadian defence capabilities
  • Improved Canadian public safety and security capabilities

Long-term Outcome

  • DND/CAF missions are enabled through S&T

Starting Point B:

Inputs

  • DRDC facilities
  • ADM(S&T)
  • Chief of Staff
  • Corporate Services

Activities

Defence S&T Governance and Support:

  • Chief of Staff: planning and reporting quality assurance; and managing external partnerships
  • Corporate Services: provide corporate services support to DRDC and ensure compliance with acts, regulations, and policies

Outputs

Outputs for stewardship of the S&T function include:

  • Corporate reports (S&T Strategy, Annual Report, Departmental Performance Report, Report on Plans and Priorities); and peer reviews
  • Plans and policies; advice, guidance, and tools regarding internal processes; and corporate stewardship and accountability

Immediate Outcome

  • Well-managed and supported DRDC research centres

Intermediate Outcome

  • S&T Program aligns with defence, safety, and security priorities

Long-term Outcome

  • DND/CAF missions are enabled through S&T

Annex D—Evaluation Matrix

Table D-1. Evaluation Matrix—Relevance. This table indicates the data collection methods used to assess the evaluation issues/questions for determining the relevance of the Program.

Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the relevance of the Program. The study used 3 evaluation questions to determine the Program’s relevance; these questions appear listed in the left-hand column, each under its own sub-heading. Select a question and read across that row to learn, first, the Program’s relevance according to the study’s indicators. The 4 columns that follow list the sources of evidence that support each of the indicators. These sources of evidence are: administrative data; document review; a questionnaire; and key informant interviews.

Evaluation Issues/Questions Indicators Program Administrative Data Document Review Questionnaire Key Informant Interviews

1.0 Continued need for the Program

  • Is there a continuing need for the DND/CAF to deliver the Defence S&T Program?
1.1 Evidence that the Defence S&T Program responds to emerging needs and threats/provides unique resources, services, and capabilities No Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
1.2 Evidence that OGDs and external partners use Program outputs Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion

2.0 Alignment with government priorities

  • Are the objectives of the Program consistent with DND strategic outcomes and federal government priorities?
2.1 Alignment with government Acts, legislation, and policies No Yes No
  • Program staff
  • External expert opinion
2.2 Evidence regarding the frequency and reasons for when DND/CAF clients of the Program go outside of the Program for S&T service Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
2.3 Existence of OGDs, agencies, and/or organizations providing similar resources, services, and capabilities No Yes Yes
  • Program staff
  • OGDs/academia/industry
  • External expert opinion
3.0 Alignment with federal government priorities

  • Does the GC (and DND/CAF specifically) continue to have a role/responsibility in delivering the Program?
3.1 Alignment between the Defence S&T Program priorities and federal government priorities No Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
3.2 Alignment between Program priorities and DND/CAF priorities and strategic outcomes No Yes Yes
  • Program staff
  • External expert opinion

Table D-2. Evaluation Matrix—Performance (Effectiveness). This table indicates the data collection methods used to assess the evaluation issues/questions for determining the performance in terms of achievement of outcomes (effectiveness) of the Program.

Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the Program’s performance in terms of achievement of outcomes (effectiveness). This table covers both immediate and intermediate outcomes, which divide the table’s left column in two. The immediate outcomes of the Program’s performance or effectiveness were determined by means of 2 evaluation questions, whereas its intermediate outcomes were determined through 1 question. These questions appear in the second column. Select a question and read across that row to learn, first, the Program’s performance according to the study’s indicators. The next 4 columns list the sources of evidence that support each of the indicators, which are: administrative data; document review/benchmarking; a questionnaire; and key informant interviews.

Evaluation Issues/Questions Indicators Program Administrative Data Document Review Questionnaire Key Informant Interviews
Immediate Outcomes

4.1 To what extent does the Defence S&T Program satisfy the needs of the DND/CAF and its partners?

4.1.1 Evidence of stakeholder confidence in DRDC scientific and technical capacity, including degree of client satisfaction (quality, timeliness, responsiveness) Yes Yes (excludes benchmarking) Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
4.1.2 Evidence of effective partnerships/meeting targets Yes Yes No
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion

4.2 Are DRDC research centres well managed and supported?

4.2.1 Evidence of a project selection process Yes Yes Yes
  • Program staff
  • External expert opinion
4.2.2 Evidence of a project monitoring process that can support performance measurement and reporting Yes Yes Yes
  • Program staff
4.2.3 Evidence that a capability management system is in place to achieve strategic goals Yes Yes No

n/a

4.2.4 Evidence of scientific and technical collaboration and communication (e.g., interoperability or leveraging of expertise) between DRDC centres Yes Yes No
  • Program staff
  • Partners
4.2.5 Evidence regarding the external review processes used by Defence S&T Yes Yes No
  • Program staff
  • Partners
4.2.6 Evidence of a project validation process Yes Yes Yes
  • Program staff
  • Partners
Intermediate Outcomes
4.3 To what extent are Canadian defence and public safety and security capabilities enhanced through Defence S&T?

4.3.1 Evidence of direct support to operations Yes Yes Yes
  • Program staff
  • Partners
4.3.2 Evidence of improved Canadian defence and public safety and security capabilities through integration of knowledge and technology: publications, awards and recognitions, licences, and patents Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry

Table D-3. Evaluation Matrix—Performance (Efficiency and Economy). This table indicates the data collection methods used to assess the evaluation issues/questions for determining the performance in terms of efficiency and economic use of Program resources.

Table Summary

This table indicates the data collection methods used to assess the evaluation issues/questions for determining the Program’s performance in terms of efficiency and economy. There were 2 evaluation questions, listed in the left-hand column, relating to the Program’s performance in terms of efficiency and economy. Select one and read across the rows to its right to learn, first, the indicators used to assess these questions and then the 4 lines of evidence used for the indicators and questions—namely, administrative data; document review/benchmarking; a questionnaire; and key informant interviews.

Evaluation Issues/Questions Indicators Program Administrative Data Document Review Questionnaire Key Informant Interviews

5.1 Are the most appropriate and efficient means being used by DND/CAF to deliver the Defence S&T Program?

5.1.1 Resources used to produce outputs Yes Yes No
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
5.1.2 Assessment of resources used (efficiently and economically) to produce the program outputs and outcomes Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
5.2 Are resources allocated to the Defence S&T Program reasonable, economical, and sustainable?

5.2.1 Level of completeness and accuracy of reporting systems Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion
5.2.2 Mechanisms in reporting Yes Yes Yes
  • Program staff
  • Partners
  • OGDs/academia/industry
  • External expert opinion

Annex E—DRDC Publications, Presentations, and Awards Received

Table E-1. Defence S&T Scientific and Technical Publications and Awards for FYs 2008/09 to 2012/13. This table lists the publications and presentations produced by the Program and the awards received from FYs 2008/09 to 2012/13.

Table Summary

This table lists the number of the Defence S&T Program’s scientific and technical publications, as well as the number of awards it received, from FYs 2008/09 to 2012/13. There are 6 rows and 6 columns, including titles. The column at far left lists the mentioned fiscal years, while the columns to its right list the entities, namely, Contractor Reports, Scientific Reports, Scientific Literature, Conference Presentations, and International/National Awards. Select a fiscal year from the left column and read across to learn the number of publications and awards achieved during that year. Conversely, trends over the years in scope may be discerned by reading down any of the columns. Note that figures for awards, in the column at far right, were not obtained for the last two years in scope. Note also that conference presentation statistics for FYs 2011/12 and 2012/13 do not include DRDC Valcartier, since they could not be obtained.

FY Contractor Reports Scientific Reports Scientific Literature Conference Presentations International/National Awards
2008/09 300 500 600 330 18
2009/10 350 450 750 420 36
2010/11 255 690 455 575 11
2011/12 152 202 179 665* Information not available
2012/13 145 185 60 604* Information not available

* The Program did not track international and national awards in FYs 2011/12 and 2012/13. Conference presentation statistics for FYs 2011/12 and 2012/13 do not include DRDC Valcartier (since these could not be obtained).

Table E-2. Defence S&T’s Contributions to Intellectual Property. This table lists numbers of patents and licence agreements, as well as amounts earned in royalties and the Program’s contribution to the Public Service Inventor Awards in FYs 2012/13 and 2013/14.

Table Summary

This table lists numbers of patents and licence agreements, as well as amounts earned in royalties and the Program’s contribution to the Public Service Inventor Awards, in both FYs 2012/13 and 2013/14. This table has 5 rows and 3 columns, including titles. The column at far left lists data types relating to intellectual property—patents, licence agreements, licensees’ report of royalties earned, and the Defence S&T contribution to the Public Service Inventor Awards. The other two columns list the data for the two mentioned fiscal years. Select a data type from the left column and read across to learn the corresponding figures for 2012/13 and 2013/14.

Intellectual Property 2012/13 2013/14
Patents 11 13
Licence Agreements 11 7
Licensees’ Report of Royalties Earned $2,215,033 $452,418
Defence S&T Contribution to Public Service Inventor Awards $623,029 $123,801

___________________________________________________________________________________________________________________________

Footnote 1 Defence Administrative Order and Directive (DAOD) 1000, dated September 26, 1997, includes ADM(S&T) as the responsible authority for the leadership in development and maintenance of the Defence S&T Enterprise Charter and related policies, as well as for coordination and facilitation of external S&T partnerships in support of the Enterprise. This DAOD is currently being revised.

Footnote 2 ADM(S&T) Business Plan, FY 2013/14. Examples of business renewal processes include change management, program formulation, program assessment, and corporate services transition.

Footnote 3 TBS. Policy on Evaluation, April 1, 2009. See http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=15024&section=text. Last retrieved December 2, 2013.

Footnote 4 Defence S&T Enterprise Charter, 2008.

Footnote 5 DAOD 1000-ADM(S&T).

Footnote 6 Defence S&T research is delivered across Canada in eight research centres: Valcartier Research Centre, Toronto Research Centre, Ottawa Research Centre, Atlantic Research Centre, Suffield Research Centre, Centre for Operational Research and Analysis (CORA), Centre for Security Science (CSS), and Director General Military Personnel Research and Analysis.

Footnote 7 Defence S&T partners include OGDs and international partners (allied nations, international organizations such as the Technical Cooperation Program (Five Eyes), the NATO S&T Organization and other nations, industry, and academia).

Footnote 8 Defence and Security S&T Strategy. Science and Technology in Action: Delivering Results for Canadians, 2013. http://www.drdc-rddc.gc.ca/en/publications/defence-st-strategy.page#Delivering-ST-Solutions. Last retrieved September 17, 2014.

Footnote 9 DND and the CAF—Report on Plans and Priorities 2014-1.

Footnote 10 Defence S&T clients include Portfolio 0: Strategic Decision Support, Portfolio 1: Navy, Portfolio 2: Army, Portfolio 3: Air Force, Portfolio 4: Personnel, Portfolio 5: Joint Force Development, and Portfolio 6: Force Employment.

Footnote 11 Defence S&T partners include OGDs and international partners (allied nations, international organizations such as the Technical Cooperation Program (Five Eyes), and the NATO S&T Organization), as well as other nations, industry, and academia.

Footnote 12 Report on Plans and Priorities 2013/14 defines time horizons as follows: Horizon 1 is short term (1 to 5 years), Horizon 2 is medium term (5 to 10 years) and Horizon 3 is long term (10 to 30 years).

Footnote 13 S&T Functional Planning Guidance, 2013/14.

Footnote 14 Program performance or financial data was not available based on the new version of the PAA (2013). Therefore, the Evaluation used the 2009 version of the Departmental PAA.

Footnote 15 These calculations are based on financial data available in the Defence Resource Management Information System (DRMIS) and the Collaborative Planning and Management Environment (CPME).

Footnote 16 Departmental Performance Reports (FYs 2009/10 to 2013/14).

Footnote 17 As per the Logic Model (Annex C), clients of the Defence S&T Program include Strategic Decision Support, the RCN, Canadian Army, Royal Canadian Air Force (RCAF), Personnel, Force Employment, Joint Force Development, as well as other national and international partners.

Footnote 18 Key program interviewees included the ADM(S&T), directors general, all DRDC Research Centre directors, chief scientists, scientists, and key financial personnel.

Footnote 19 Defence and Security S&T Strategy. For example, during the Afghanistan operations, in response to a Command Wire Remote Control Improvised Explosive Device threat, Defence S&T rapidly established a fundamental S&T capability that resulted in identifying and testing mitigating technologies. In addition, ongoing research in the area of automation and robotics is intended to minimize risks to personnel and enable rapid analyses of large data streams. The design and development of next generation CBRN protection systems are helping to counter future threats.

Footnote 20 Ibid.

Footnote 21 Naval Orders 3771-1, paragraph 4.1.

Footnote 22 DRDC CSS improves the protection of critical infrastructure and emergency preparedness and response and enhances the anti-terrorism capacity of law enforcement agencies and the military. There are 16 portfolio managers connected with communities of practice across government. Programs encompass a broad range of subjects under four themes: (1) The CBRNE Defeat initiative, (2) Critical Infrastructure Protection, (3) Surveillance, Intelligence and Interdiction, and (4) Emergency Management and System Integration.

Footnote 23 International agreements are established with the US and the UK. Nationally, there are 16 portfolio managers connected with communities of practice across government. Programs encompass a broad range of subjects under four themes: (1) CBRNE Defeat, (2) Critical Infrastructure Protection, (3) Surveillance, Intelligence and Interdiction, and (4) Emergency Management and System Integration.

Footnote 24 Selected references that support the need for S&T activities in both basic and applied research include: the Canadian S&T Strategy (2007) and Progress Report (2009); and the Review of Federal Support to Research and Development Expert Panel reports (also known as the Jenkins Report, 2012).

Footnote 25 The Federal S&T Map of Outcomes and Activities.

Footnote 26 National Defence Act. Revised Statutes of Canada, 1985, Number 5. Current to October 1, 2013.

Footnote 27 CFDS.

Footnote 28 Ibid.

Footnote 29 Expert Panel Review of Federal Support to Research and Development (2011). Innovation Canada: A Call to Action. http://rd-review.ca/eic/site/033.nsf/vwapj/R-D_InnovationCanada_Final-eng.pdf/$FILE/R-D_InnovationCanada_Final-eng.pdf. Last retrieved July 23, 2014.

Footnote 30 The Medium Earth Orbit Search and Rescue project was provided as an example that used two different sources other than Defence S&T. The expertise of the Communications Research Centre was used for the ground segment of the project, and the company COM DEV, a leading manufacturer of space hardware subsystems, was used for the space segment. In this case, the Communications Research Centre was better suited to the task of building the experimental ground station. COM DEV already had a pre-existing search and rescue repeater, so it was uniquely suited to the R&D task for Medium Earth Orbit Search and Rescue, but its selection was also largely due to time factors.

Footnote 31 DAOD 1000, dated September 26, 1997, identifies ADM(S&T) as the responsible authority for the leadership in development and maintenance of the Defence S&T Enterprise Charter and related policies, as well as the coordination and facilitation of external S&T partnerships in support of the Enterprise. This DAOD is currently being revised.

Footnote 32 The mission of the federal S&T Strategy is to “lead in bringing S&T to bear in supporting the federal government to address national challenges.” The goals of the federal S&T Strategy or map are for: 1) “Canadians to have the opportunity to be informed about the social implications of S&T, and 2) for Canada’s innovation system to be engaged effectively in supporting national priorities.”

Footnote 33 Ibid.

Footnote 34 The four organizations under ADM(Mat) are Quality Engineering Test Establishment (QETE), Naval Engineering Test Establishment (NETE), Land Engineering Test Establishment, and Aerospace Engineering Test Establishment.

Footnote 35 Currently, a Defence S&T review is underway to further identify areas of overlaps. Building on the “access model,” the review will also look to identify the potential partners. The intent of the review is also to take into consideration the significant resource/budget constraints the Program has been under as well as the developments in the external innovation systems. Consequently, in the future there may be more opportunities for external partnerships.

Footnote 36 See Speech from the Throne 2007, and Budget 2007; Speech from the Throne 2008; Speech from the Throne 2011.

Footnote 37 Expert Panel Review of Federal Support to Research and Development (2011). Innovation Canada: A Call to Action.

Footnote 38 Science and Technology in the GC. http://science.gc.ca/default.asp?lang=en. Last retrieved on September 17, 2014.

Footnote 39 Federal S&T map of outcomes and activities diagram. See http://www.science.gc.ca/47DABAB7-A1F3-4D7B-B431-FC50071C1C39/overview_eng_Final.pdf. Last retrieved on October 15, 2014.

Footnote 40 The objective of the Cornerstone project is to leverage autonomous underwater vehicle technology to execute a unique mission into uncharted waters—data collection to support Canada’s submission to the United Nations Convention on the Law of the Sea. Northern Watch seeks to determine the best combination of sensors for comprehensive, cost-effective situational awareness in the Arctic.

Footnote 41 DND. Report on Plans and Priorities, 2010/11.

Footnote 42 Research and Development Executive Committee, meeting records, May 2011.

Footnote 43 DND. Report on Plans and Priorities, 2012/13.

Footnote 44 CPME is a DRDC developed system to support internal business planning, project management, In-Year Resource Management and Performance Management processes.

Footnote 45 Following the Strategic Review and Deficit Reduction Action Plan reduction initiatives, in April 2013 ADM(S&T) embarked on a transformation of its organization. The transformation led to a restructuring of the organization: realigning director general functions to better support program clients and grouping similar functions under one director general, replicated in six different centres.

Footnote 46 S&T Functional Planning Guidance, 2013/14.

Footnote 47 CPME database.

Footnote 48 CPME database.

Footnote 49 Defence and Security S&T Strategy. See http://www.drdc-rddc.gc.ca/en/publications/defence-st-strategy.page#Key-ST-Program-Enablers. Last retrieved on September 17, 2014.

Footnote 50 ADM(S&T) Business Plan 2013/14.

Footnote 51 Defence and Security S&T Strategy, 2013.

Footnote 52 DRDC S&T Capability External Peer Review, Mobile Systems, Part 1, Air Platforms, November 2011.

Footnote 53 As further explored in section 2.4.2 of this report, the evaluation study could not find sufficient evidence to assess the extent to which these recommendations were implemented.

Footnote 54 DRDC S&T Capability External Peer Review, Air Platforms, 2011.

Footnote 55 An explicit example provided by one of the former senior staff concerned the capability reduction decisions of the 1990s. Following a program review, the Defence S&T Program was reduced by 35 percent over two years. At the time, program staff identified a couple of areas that were considered mature technology by CAF staff and not worthy of further development. Those two areas were vehicle protection and robotics.

Footnote 56 DRDC Program Assessment Final Report, 2013.

Footnote 57 The federal S&T Strategy seeks to foster Canada’s competitiveness through investments and activities in three key areas: (1) entrepreneurial advantage, (2) knowledge advantage, and (3) people advantage.

Footnote 58 The Edge Innovation Network is a model developed by General Dynamics where industry, academia, non-profit organizations, and government entities collaborate to deliver new capabilities to defence, security, and public safety organizations.

Footnote 59 Edge Network Innovation Overview. See http://edge-innovation.ca/documents/012200_EDGE_overview_2012-01-30_Standard.pdf. Last retrieved on June 29, 2015.

Footnote 60 Cyber Security Research Capability—Academic Centres of Excellence. See https://www.gov.uk/government/publications/cyber-security-research-capability-academic-centres-of-excellence. Last retrieved on June 29, 2015.

Footnote 61 The UK’s Dstl. See https://www.gov.uk/government/organisations/defence-science-and-technology-laboratory. Last retrieved on September 17, 2014.

Footnote 62 As part of its National Engagement Strategy, the DSTO, through the Defence Science Partnerships Program, has a common engagement mechanism with all universities by accessing and leveraging collaborations with academia and industry. The engagement mechanism consists of bilateral partnering agreements with each university and a governance framework. Request for Information for International Partner Organizations, Canadian Defence Liaison Staff Australia.

Footnote 63 The Build in Canada Innovation Program helps companies bridge the pre-commercialization gap by procuring and testing late-stage innovative goods and services within the federal government before taking them to market. See: https://buyandsell.gc.ca/initiatives-and-programs/build-in-canada-innovation-program-bcip/overview-of-bcip. Last retrieved on August 6, 2014.

Footnote 64 The Technology Demonstration Program provides non-repayable contributions in support of large-scale technology demonstration projects in the aerospace, defence, space, and security sectors. http://www.ic.gc.ca/eic/site/ito-oti.nsf/eng/h_00837.html. Last retrieved on August 6, 2014.

Footnote 65 National Defence Departmental Performance Report, 2012/13.

Footnote 66 Enhanced Priority List. See: http://www.ic.gc.ca/eic/site/042.nsf/eng/00063.html. Last retrieved on September 17, 2014.

Footnote 67 National Defence Departmental Performance Report, 2012/13.

Footnote 68 Emerging Skies Initiative. http://www.rcaf-arc.forces.gc.ca/en/cf-aerospace-warfare-centre/emerging-skies.page. Last retrieved on November 11, 2014.

Footnote 69 ADM(S&T). S&T Program Approval Process, 2013. As a result of the new approval process, projects were reduced from 200 to 80.

Footnote 70 For the purpose of DRDC program formulation, program is defined as “a group of related projects managed in a coordinated way to obtain synergies, benefits, and control not available from managing them individually.”

Footnote 71 The committee provides context and advice to support decision making by the ADM(S&T) as Chief Executive Officer for the strategic management of DRDC and the provision of advice to the Deputy Minister of National Defence and the Chief of the Defence Staff.

Footnote 72 This committee has been in operation since 2012, providing guidance at the program formulation, execution, and endorsement phases.

Footnote 73 DRMIS is DND/CAF’s integrated engineering, maintenance, supply, and finance information system. https://www304.ibm.com/events/wwe/grp/grp011.nsf/vLookupPDFs/Canada%20DND%20DRMIS%20Update/$file/Canada%20DND%20DRMIS%20Update.pdf. Last retrieved on October 7, 2014.

Footnote 74 This initiative is still at the implementation stage. When completed, each project will have a project manager who is responsible for tracking how well the project is being delivered.

Footnote 75 DRDC Integrated Corporate Services Performance Measurement Strategy, 2013.

Footnote 76 Annual reports included highlights of DRDC activities, including main partnership engagements as well as DRDC’s financial statements.

Footnote 77 The full report is only available as a paper document and not available externally; however, the Annual Report and Accounts contains a section on performance that tells the story (with metrics) and is available in publications on the UK government website.

Footnote 78 With regard to the current hiring process, interviews with program staff revealed that when hiring a new scientist (whether someone has left or there is room for growth), the respective section head discusses the request with the Director General Science and Technology Centre Operations (DGSTCO). If the DGSTCO is convinced, the hiring request is taken to the ADM(S&T). In this regard, the ability to expand expertise rests primarily with the DGSTCO.

Footnote 79 Canadian Forces Auxiliary Vessel Quest has been the primary research vessel of the Defence Research Establishment Atlantic in Dartmouth, Nova Scotia since 1969.

Footnote 80 DRDC Atlantic Acoustic Calibration Barge is located in Bedford Basin, about 5 km by water from DRDC Atlantic. The main function of the Barge is to conduct acoustic calibrations of sonar transducers, such as hydrophones and projectors, in a free field salt water environment.

Footnote 81 Program presentation, DRDC Suffield, May 2014.

Footnote 82 DRDC External Peer Review Manual for the Evaluation of a Science and Technology expertise area. External peer reviews occur twice annually over a five-year period. Each process uses the same evaluation methodology and criteria, and their outputs are compared against each other, when applicable, to ensure congruence.

Footnote 83 The evaluation study requested feedback from program staff regarding the implementation of the recommendations of the Working Group and previous program assessments. According to the responses received, the implementation was carried out under Strategic Review/Deficit Reduction Action Plan, and ADM(S&T) regularly reviewed its own programs/capabilities. Currently, a new overall assessment of the Defence S&T Program is underway to inform decisions that will be made in terms of new divestments/investments before December 2014.

Footnote 84 The Controlled Goods Program is an industrial security program authorized by the Defence Production Act. The Controlled Goods Directorate administers this program to prevent the proliferation of tactical and strategic assets and build up Canada’s defence trade controls. Public Works and Government Services Canada is the federal agency responsible for this program. The Department of Foreign Affairs, Trade and Development Canada is the authority that determines what is and what is not a controlled good and/or a controlled technology.

Footnote 85 TBS Policy on Evaluation, April 1, 2009.

Footnote 86 These were the key official and unofficial systems with which the Defence S&T Program tracked and reported activities, from the project level to the corporate level, and on which it based strategic decision making. Similarly, these were the systems the Defence S&T Program relied upon to determine what work was being done in the delivery of program outputs and the costs associated with delivery.

Footnote 87 The database systems analyzed included DRMIS, CPME, Oracle/PeopleSoft Human Resources Information System 8.9 for DND civilian employees, and 7.5 for members of the CAF.

Footnote 88 The Evaluation’s estimate of the total cost/value represents a more complete and accurate picture of program resources but should be seen as an indicator or a rough order of magnitude. It is not a comprehensive study due to the nature of how the in-kind contributions and military FTEs are identified, compiled, and reported.

Footnote 89 The exact number of civilian FTE reductions was provided by the Defence S&T Program.

Footnote 90 DRDC Business Plan FY 2010/11.

Footnote 91 DRMIS reports included the A-base funding, but not all financial transfers from other DND/CAF L1 organizations and funding from other sources, representing only partial information.

Footnote 92 Janus: A Proposal for the DRDC Client Group Alignment, DRDC CORA Technical Report 2007-07, May 2007.

Footnote 93 A DRDC Management Accountability Framework: Results of Cycle 2, Contract Report DRDC Atlantic Centre 2009-135, September 2009.

Footnote 94 Measuring and ensuring excellence in Government Science and Technology: Canadian Practices, Prepared for Government of Canada Council of Science and Technology Advisors, KPMG Consulting, March 13, 2001.

Footnote 95 The general standard accepted in CRS evaluations is for administration and management to represent 15 to 20 percent of overall costs. In comparison, the administration and management costs of the Program amount to approximately 22 percent of its A-base allocation.

Footnote 96 Source: Frans Leeuw and Jos Vaessen. Impact Evaluations and Development—NONIE Guidance on Impact Evaluation, 2009. NONIE is the Network of Networks on Impact Evaluation. http://www.worldbank.org/ieg/nonie/index.html. Last retrieved on July 16, 2012.
