The launch of Defence Program Analytics (DPA) in 2017 started a coordinated effort to increase adoption of analytics, but the scope of this effort was largely focused on business intelligence (BI) leveraging corporate data.  The use of data analytics to support decision-making within DND/CAF was not new; there have been Defence organizations providing such data-driven decision-support for decades.  The aim of DPA was to institutionalize the use of analytics, and engage all Defence Team members in leveraging data as part of their business and decision-making processes, so that the entire Defence Enterprise would benefit by finding efficiencies, exploiting opportunities and enabling better decisions.

There have been a number of successes as a result of DPA, such as significant investments in training for business intelligence dashboard authors and consumers, the development of several analytics products across Defence, the establishment of Analytics Support Centres (ASCs) in most Level One (L1) organizations, and increased awareness and support from Defence executives.  Despite these successes, the adoption of analytics in Defence remains relatively low because of a number of shortcomings, including:

  • A narrow scope of analytics almost entirely focused on BI and corporate applications, with limited appreciation of DND/CAF requirements for supporting decisions related to force development, force generation and force employment;
  • An enterprise BI toolset that is not easily accessible and is often not connected to the data that members want to report on, including a limited ability to integrate varied data sources, to stage data, and to leverage unstructured data. Additionally, the enterprise BI toolset is not suitable for advanced analytics, which tends to require more specialized software (including open-source tools) and more powerful computing infrastructure (generally cloud-based);
  • A lack of infrastructure to transfer, integrate and analyze classified data. Additionally, there is little capability to access analytics in low-bandwidth situations, or in disconnected modes;
  • A lack of capacity and expertise within L1s to enable and support analytics efforts;
  • A lack of value realization metrics to demonstrate how much benefit Defence has received from using analytics to inform decision-making, whether through identifying efficiencies or exploiting opportunities, and to track progress over time across L1s; and,
  • A lack of policy changes and of a concept of operations, embraced by the department as a whole, to authorize connecting various data sources and to authorize various users to perform analytics in accordance with data protection policies and directives.

With the launch of the DND/CAF Data Strategy in September 2019, accompanied by the CDS/DM Joint Directive on Data Management, some of the issues hindering the adoption of analytics, such as data quality, will be addressed.  The implementation of the Data Strategy is an opportunity to revisit the departmental vision for analytics and to present an updated vision that addresses the shortcomings listed above.


Purpose of this document

This document presents a proposed vision and guiding principles for analytics at DND/CAF, as well as a high-level roadmap.


Scope of Analytics

In DAOD 6500-0, analytics is defined as the:

Systematic computational transformation of data into insights, for the purpose of making better decisions.

That transformation of data into insights can be achieved through a number of approaches and techniques.  The diagram below represents an analytics maturity curve.  As the maturity of analytics capabilities increases, new techniques and approaches become available to unlock greater value for the organization.  The maturity curve includes four types of analytics, described below:

Figure 2: Analytics maturity curve, adapted from Gartner.  The curve spans traditional BI through advanced analytics and covers four types of analytics: descriptive ("What happened?": dashboards, reports, historical trend analysis); diagnostic ("Why did it happen?": advanced data visualization, hypothesis testing); predictive ("What will happen?": regression models, machine learning / AI, time series forecasts); and prescriptive ("What should we do?": optimization, simulation, automated decision making).

Table 1: Definition and examples of the types of analytics

Descriptive: A form of analytics that leverages data to answer the question "What happened?", mainly through simple reports and visualizations (e.g. bar charts, line graphs).
DND/CAF example use cases:
  • PowerBI dashboards tracking the progress of capital projects in the Defence Services Program Portal.
  • Dashboards developed by many L1 organizations using SAP Web Intelligence (WebI) to track their expenditures during the course of a year.

Diagnostic: A form of analytics that leverages data to answer the question "Why did it happen?" It is characterized by more advanced visualization techniques (e.g. drill-down, interactive data visualization and data discovery) and often by the use of statistical techniques (e.g. correlation analysis, hypothesis testing).
DND/CAF example use cases:
  • Statistical analyses of the annual Public Service Employee Survey to identify themes, questions and organizations demonstrating exceptionally positive or negative results, and to identify the potential causes of certain results.

Predictive: A form of advanced analytics that leverages data to answer the question "What is likely to happen?", characterized by techniques such as regression analysis, forecasting, multivariate statistics, pattern matching, predictive modeling, and machine learning.
DND/CAF example use cases:
  • The use of time series models by the RCN to feed monthly recruitment dashboards and forecast enrolment in different occupations.
  • The use of regression modelling to predict COVID-19 hot spots in support of Operation LASER.

Prescriptive: A form of advanced analytics that leverages data to answer the question "What should be done?", characterized by techniques such as simulation, optimization, recommendation engines, heuristics, and machine learning.
DND/CAF example use cases:
  • The Visual Investment Planning Optimization & Revision (VIPOR) tool used by CFD, a portfolio optimization tool that informs decisions related to capital investments and capability development.
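To make the predictive category above concrete, the sketch below fits a simple linear trend to monthly enrolment counts and extrapolates one month ahead.  This is a minimal illustration only, using invented numbers and ordinary least squares; operational forecasting, such as the RCN time series models mentioned above, would use more sophisticated methods.

```python
# Minimal sketch of predictive analytics: fit a linear trend y = a + b*t
# to hypothetical monthly enrolment counts and forecast the next month.
# All data here are invented for illustration.

def linear_trend_forecast(counts):
    """Fit y = a + b*t by ordinary least squares; return forecast at t = len(counts)."""
    n = len(counts)
    t_mean = (n - 1) / 2
    y_mean = sum(counts) / n
    b = sum((t - t_mean) * (y - y_mean) for t, y in enumerate(counts)) \
        / sum((t - t_mean) ** 2 for t in range(n))
    a = y_mean - b * t_mean
    return a + b * n  # one-step-ahead forecast

# Hypothetical enrolments over six months
monthly_enrolments = [40, 42, 45, 47, 50, 52]
print(round(linear_trend_forecast(monthly_enrolments), 1))  # → 54.6
```

The same question, "What is likely to happen?", is what distinguishes this from the descriptive dashboards above, which would only report the six historical values.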

For the purposes of this document, the scope of analytics includes:

  • The creation and consumption of analytics products, including reports, dashboards, and advanced analytics models;
  • The tools that allow the creation and consumption of analytics; the environments where analytics products are developed and tested; the data lakes and data warehouses where copies of data are stored for the purpose of analytics; and IT support for those tools, environments, data warehouses and data lakes;
  • User support for analytics tools and processes for stakeholders including clients and enablers.

For the purposes of this document, analytics does not include:

  • The capture of transactional data, or of Internet of Things or sensor data, that occurs as a result of employing defence capabilities, or the storage of such data into data repositories that are not for the unique purpose of analytics;
  • Other data management areas such as data quality, data security, or metadata.  Where there are relationships to other data management areas, they will be addressed in this document as dependencies.

