MAF 2020 to 2021 Information Management/Information Technology and Service Management

Methodology Overview

The COVID-19 pandemic has reinforced the importance of implementing the Government of Canada’s transition to digital. In this context, the 2020-21 IM/IT/Service Management methodology focuses on advancing the delivery of services and the effectiveness of government operations through the strategic management of government information and data and the leveraging of information technology. The three retained questions focus on client service experience and the capacity to effectively deliver services. In addition, two questions have been added that seek to understand the impact of COVID-19 on service delivery and how data was leveraged to support response efforts.

Use of MAF Results

The results reported on IM/IT/Service will provide insight into the government’s digital direction. The results serve two purposes: to identify opportunities and best practices as they relate to key digital tools and initiatives, and to demonstrate how the three AoMs will be integrated under the future Policy on Service and Digital.

Deputy Heads

  • Deputy Heads will be presented with a clear view of how their department is performing in the digital era and will be able to trace their overall performance to key indicators.

Departmental Functional Communities

  • Departmental Functional Communities will see their work reflected in the results and will be able to use MAF to improve their activities in their respective fields.

Treasury Board of Canada Secretariat

  • assess departmental digital maturity in the planning, implementation and operation of past, present and future projects.
  • assess the substance and performance of services across government, including the level of policy compliance and areas for improvement.

Period of assessment

While the period of assessment for each indicator may vary depending on the information required, the overall timeframe for this year’s assessment spans April 1, 2019 to October 31, 2020. Interview questions focused on pandemic response are specifically intended for the period from March 2020 until the date of the interview, thereby allowing for the submission of the most up-to-date information possible. Remaining questions will retain the standard reporting period of April 1, 2019 to March 31, 2020.

Impact on Departments

Below is a summary of the impact on departments in terms of the number of questions and the submission of evidence, which illustrates a substantial reduction overall when compared to 2019-20.

IM/IT/Service Management
  • Total number of questions: 20 in 2019-20; 5 in 2020-21
  • Total number of questions which require the submission of evidence: 8 in 2019-20; 3 in 2020-21 (2 interview questions, 1 other)
  • Three core questions have been preserved from 2019-20 pertaining to priority services and IT maturity.
  • There are also two new questions:
    • One that seeks to understand the actions taken by organizations to ensure that services continued to be delivered or scaled up during the pandemic, and another exploring how data enabled the response to COVID-19.
    • It is important to note that the two new questions are for learning purposes.
  • Of the 5 questions, 2 are for TBS to answer, 1 is for the department to answer (i.e., one of five elements), and 2 are interview questions (see the FAQ document for more information on the interview process).

Overall outcomes

The IM/IT/Service Area of Management MAF 2020-21 cycle provides insight into service standards, the availability of online services, and overall IT maturity. This allows the AoM to focus on key indicators for ensuring the continuity of service to Canadians.

MAF 2020-21 IM/IT/Service Management Questionnaire

Question #1 Indicators Preserved from MAF 2019-20

What is the extent to which a department has mature service standards for priority services (composite indicator)?

Rationale

Service standards are essential for good client service and effectively managing performance. They help clarify expectations for clients and employees and drive service improvement. The assessment of whether service standards have been established and whether services are meeting their service standard targets provides a comprehensive composite indicator on the maturity of service standards and service performance for the department’s priority services.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target (where applicable)

High, ≥80%

Calculation method (where applicable)

Indicator: Extent to which departments have mature service standards for priority services (composite indicator)

A result of high, medium, or low for this composite indicator will be determined by analysis of the following elements derived from the GC Service Inventory:

Composite indicator composed of elements:

  • Percentage of priority services that have service standards
    • Calculation: Number of priority services that have at least one service standard / Total number of priority services
  • Percentage of priority service standards that met their target
    • Calculation: Number of priority service standards that met their target / Total number of priority service standards

Calculation Formula:

(Element A) (0.50) + (Element B) (0.50)

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]
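
For illustration only, below is a minimal sketch of how this composite score and rating could be computed from service inventory counts. The function name and inputs are hypothetical; the authoritative calculation uses data from the GC Service Inventory.

    # Illustrative sketch of the Question 1 composite indicator (hypothetical inputs).
    def q1_composite(services_with_standard, total_priority_services,
                     standards_met, total_standards):
        # Element A: percentage of priority services with at least one service standard
        element_a = 100 * services_with_standard / total_priority_services
        # Element B: percentage of priority service standards that met their target
        element_b = 100 * standards_met / total_standards
        # Composite score: equal 0.50 weighting of each element
        score = element_a * 0.50 + element_b * 0.50
        if score >= 80:
            rating = "High"
        elif score >= 50:
            rating = "Medium"
        else:
            rating = "Low"
        return score, rating

    # Example: 18 of 20 priority services have a standard; 36 of 45 standards met their target.
    print(q1_composite(18, 20, 36, 45))  # (85.0, 'High')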

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: GC Service Inventory

Date of data extraction: December 21, 2020

Data collection method: Documentary evidence (Outside of MAF Portal)

Evidence: Data from the Service Inventory will be used for the calculation.

Document limit: N/A

Period of assessment: Fiscal Year 2019-2020 Service Inventory

Department or agency to answer

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #2 Indicators Preserved from MAF 2019-20

What is the level of maturity towards making priority services available online (composite indicator)?

Rationale

By assessing the extent to which priority services can be completed online end-to-end, and the extent to which applicable interaction points for priority services are online, this composite indicator assesses a department’s overall progress in making their services available online to the fullest extent possible.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target (where applicable)

N/A

Calculation method (where applicable)

Indicator: Level of maturity towards making priority services available online (composite indicator)

A result of high, medium, or low for this composite indicator will be determined by analysis of the following data elements derived from the GC Service Inventory:

Elements:

  • Percentage of priority services that can be completed online from end-to-end
    • Calculation: Number of priority services that can be completed online from end-to-end / Total number of priority services
    • Note: Calculation is based on information in the department or agency’s service inventory and takes into consideration interaction points that are not applicable (N/A) to the service. Rationales for any interaction points identified as “N/A” must be provided in the GC Service Inventory.
  • Percentage of the department or agency’s client interaction points that are available online for priority services
    • Calculation: Number of client interaction points available online for priority services / Total number of applicable client interaction points for priority services
    • Note: Calculation is based on information in the department or agency’s service inventory and takes into consideration interaction points that are not applicable (N/A) to the service. Rationales for any interaction points identified as “N/A” must be provided in the GC Service Inventory.

Calculation Formula:

(Element A) (0.43) + (Element B) (0.57)

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]
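
For illustration only, below is a minimal sketch of the Question 2 calculation, including how interaction points marked “N/A” are excluded. The data structure and field names are hypothetical assumptions and do not reflect the GC Service Inventory format.

    # Illustrative sketch of the Question 2 composite indicator (hypothetical data).
    # Each priority service lists its client interaction points as
    # "online", "offline", or "n/a" (not applicable, with a rationale in the inventory).
    services = [
        {"name": "Permit X", "points": ["online", "online", "n/a"]},
        {"name": "Benefit Y", "points": ["online", "offline", "online"]},
        {"name": "Licence Z", "points": ["offline", "offline", "n/a"]},
    ]

    # Element A: percentage of priority services that can be completed online end-to-end
    # (every applicable interaction point is online).
    end_to_end = sum(all(p == "online" for p in s["points"] if p != "n/a") for s in services)
    element_a = 100 * end_to_end / len(services)

    # Element B: percentage of applicable client interaction points that are online.
    applicable = [p for s in services for p in s["points"] if p != "n/a"]
    element_b = 100 * sum(p == "online" for p in applicable) / len(applicable)

    # Composite score with the 0.43 / 0.57 weighting, mapped to the rating scale.
    score = element_a * 0.43 + element_b * 0.57
    rating = "High" if score >= 80 else "Medium" if score >= 50 else "Low"
    print(round(score, 1), rating)  # 46.9 Low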

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: GC Service Inventory

Date of data extraction: December 21, 2020

Data collection method: Documentary evidence (Outside of MAF Portal)

Evidence: Data from the Service Inventory will be used for the calculation.

Document limit: N/A

Period of assessment: Fiscal Year 2019-2020 Service Inventory

Department or agency to answer

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #3 Indicators Preserved from MAF 2019-20

What is the department’s level of overall IT maturity (composite indicator)?

Rationale

To provide an overall view of a department’s IT maturity by looking at many indicators across a variety of IT themes (e.g., IT applications, architecture review processes, IT investment planning, cloud maturity).

Category

  1. Policy compliance
  2. Performance
  3. Other

Target (where applicable)

N/A

Calculation method (where applicable)

Indicator: Level of overall IT maturity (composite indicator)

Elements:

  • Element A - The business value, technical condition, Aging IT and mission criticality of existing IT applications
    • Calculation: Application Portfolio Health Indicator (APHI) score
  • Element B - Existence of a Departmental Architecture Review Board (DARB)
    • Submission of the last 3 Records of Decision from the DARB, demonstrating the existence of a valid and functioning DARB
  • Element C - Percentage of SSC Business Requirements aligned to IT Plan investments (%)
    • Calculation: total number of active Business Requirements (BR) aligned to IT Plan investments/total number of active BR (%)
    • An active BR is defined as a BR that is neither cancelled nor in service
  • Element D - Cloud maturity as self-assessed in the existing 2020-23 IT Plan narratives
    • Calculation: Percentage of “transformational activities” (column 3) identified in the Cloud Adoption Maturity Level table in the 2020-23 IT Plan
  • Element E - Level of management practice maturity of IT plans based on quality and completeness
    • Calculation: Degree to which the online 2019-20 IT Plan and the total value of 2019-20 IT Expenditure report match (%)

Calculation Formula:

(Element A) (0.2) + (Element B) (0.2) + (Element C) (0.2) + (Element D) (0.2) + (Element E) (0.2)

Rating Scale:

  • High: [71 – 100]
  • Medium: [51 – 70]
  • Low: [0 – 50]
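
For illustration only, below is a minimal sketch of how the five element scores could be combined under the equal 0.2 weighting and mapped to this rating scale. The element values are hypothetical, and the assumption that each element is normalized to a 0 – 100 score before weighting is illustrative; in practice each element comes from the sources listed under “Data source” below.

    # Illustrative sketch of the Question 3 composite indicator (hypothetical scores).
    # Each element is assumed to have been normalized to a 0-100 score before weighting.
    elements = {
        "A (APHI score)": 75,
        "B (DARB records of decision)": 100,
        "C (BRs aligned to IT Plan investments)": 60,
        "D (cloud maturity)": 40,
        "E (IT Plan / IT Expenditure match)": 80,
    }

    # Equal 0.2 weighting across the five elements.
    score = sum(0.2 * value for value in elements.values())

    # Question 3 uses a different rating scale than Questions 1 and 2.
    if score >= 71:
        rating = "High"
    elif score >= 51:
        rating = "Medium"
    else:
        rating = "Low"

    print(round(score, 1), rating)  # 71.0 High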

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source:

  • Element A: APM (Clarity)
  • Element C: Online IT Plan (Clarity) and SSC BITS
  • Element D: 2020-23 IT Plan narratives
  • Element E:
    • 2019-20 IT Expenditure Reports
    • 2019-20 Online IT Plan (Clarity)

Date of data extraction:

  • Element A: July 2020
  • Element B:  MAF Portal close date (i.e., December 10, 2020)
  • Element C:  MAF Portal close date (i.e., December 10, 2020)
  • Element D:  MAF Portal close date (i.e., December 10, 2020)
  • Element E: IT Plans from August 21, 2019, IT Expenditures from November 30, 2020

Data collection method:

Documentary Evidence

  • Elements A, C, D and E: via the source tools (Outside of the MAF Portal)
  • Element B: MAF Portal

Department or agency to answer

This question applies to the following organizations:

  • Large departments and Agencies

Evidence: Departments are to upload Element B (minutes of meetings of the DARB) to the MAF Portal

Document limit: Element B: 1

Period of assessment:

  • Element A: state as of July 2020
  • Elements B, C, D and E: current state at portal close date

Reference materials

Treasury Board policy reference or Government of Canada priority

Note: The Policy on Service and Digital came into effect on April 1, 2020, after the indicated assessment period for MAF 2020-21.

Question #4 New Indicators for MAF 2020-21

If at all, how did the department modify the way it delivered services during the pandemic? What were the underlying factors that enabled successful service delivery? Would you expect these services to continue into the future? How did your department account for client feedback?

Rationale

Seeks to understand the actions taken by organizations to ensure that services could continue to be delivered or scaled up, with a focus on adding value for clients during the pandemic. Lessons learned can be used to improve the service delivery efforts of federal organizations in the future.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target (where applicable)

N/A

Calculation method (where applicable)

N/A

Evidence source and document limit

TBS to answer
Department or agency to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: Department official(s)

Date of data extraction: N/A

Data collection method: Interview(s) with department official(s)

Evidence: N/A

Document limit: N/A

Period of assessment: March 2020 until the date of the interview

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #5 New Indicators for MAF 2020-21

In responding to the pandemic (e.g., developing new programs, adjusting internal operations), did your department face any challenges about the availability, collection, use, or sharing of key data that would have assisted in the response? If yes, what were the challenges and how did you adapt to them?

Rationale

Seeks to understand any common challenges, opportunities, and innovations related to the availability of data that could have supported pandemic response and recovery, and assessment of the impacts of implemented response initiatives.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target (where applicable)

N/A

Calculation method (where applicable)

N/A

Evidence source and document limit

TBS to answer
Department or agency to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: Department official(s)

Date of data extraction: N/A

Data collection method: Interview(s) with department official(s)

Evidence: N/A

Document limit: N/A

Period of assessment: March 2020 until the date of the interview.

Reference materials

Treasury Board policy reference or Government of Canada priority

Glossary

Client:
individuals, businesses or their representatives served by or using either internal or external services provided by the Government of Canada. When describing interactions with information technologies, clients can be referred to as users.
Client-centric:
an approach that focuses on addressing client or user expectations, needs, challenges and feedback. It enables the creation of a positive experience for the client or user, considering a broad range of factors such as access, inclusion, accessibility, security, privacy, simplicity, and choice of official language.
Critical Service:
a service or activity whose disruption would result in a high or very high degree of injury to the health, safety, security or economic well-being of Canadians or to the effective functioning of the Government of Canada.
Data:
set of values of subjects with respect to qualitative or quantitative variables representing facts, statistics, or items of information in a formalized manner suitable for communication, reinterpretation, or processing.
Digital:
processes, practices and technologies related to the production, storage, processing, dissemination and exchange of electronic information and data.  It refers to, among other things, information and communications technologies, infrastructures, and the information and data they produce and collect.
External services:
a service where the intended client is external to the Government of Canada.
GC Service Inventory:
a consolidated service inventory containing service data from all departments and agencies subject to MAF Service Management published via the Open Government Registry.
Information:
knowledge captured in any format, such as facts, events, things, processes, or ideas, that can be structured or unstructured, including concepts that within a certain context have particular meaning. Information includes data.
Information management:
a discipline that directs and supports effective and efficient management of information and data in an organization, from planning and systems development to disposal or long-term preservation.
Information technology:
any equipment or system that is used in the acquisition, storage, manipulation, management, movement, control, display, switching, interchange, transmission, or reception of information or data. It includes all matters concerned with the design, development, installation and implementation of information systems and applications.
IT applications:
subclass of software that employs the capabilities of an electronic device directly and thoroughly for a task that the user wishes to perform.
Internal Enterprise Services:
a service provided by a Government of Canada department to other Government of Canada departments on a government-wide basis.
Management of information technology:
planning, acquiring, building, implementing and operating of IT assets, systems or services, measuring their performance, and arranging their disposal.
Mission Critical Application:
business application that is, or supports, a critical service.
Online end-to-end:
services available on the internet from beginning to end, without having to move off-line to complete a step in the process. For example, the ability to receive a service online from application through to receipt of the final output and the provision of feedback.
Priority Service(s):
external and internal enterprise services, determined by each department considering one or more of the following: volume (e.g., transactions per year), importance of service to clients (e.g., entitlements, permits, benefits, authorizations, mission-critical services), use of sensitive personal or commercial information, cost-benefit analysis, and affordability.
Service:
provision of a specific final output that addresses one or more needs of an intended recipient and contributes to the achievement of an outcome.
Service Inventory:
a catalogue of external and internal enterprise services that provides detailed information based on a specific set of elements (e.g., channel, client, volume, etc.).
Service Standard:
public commitment to a measurable level of performance that clients can expect under normal circumstances.

Acronyms

AoM
Area of Management
DARB
Departmental Architecture Review Board
DOSP
Digital Operations Strategic Plan
GC
Government of Canada
IM
Information Management
IT
Information Technology
MAF
Management Accountability Framework
N/A
Not applicable
TBS
Treasury Board of Canada Secretariat
