MAF 2021 to 2022 Information Management/Information Technology and Service Management

Methodology Overview

The COVID-19 pandemic has reinforced the importance of implementing the Government of Canada’s transition to digital. In this context, the 2021-22 IM/IT/Service Management methodology focuses on advancing the delivery of services and the effectiveness of government operations through the strategic management of government information and data and the leveraging of information technology. The three retained questions focus on client service experience and the capacity to effectively deliver services. In addition, three questions have been added: two seek to understand departmental adoption of elements of the new Policy on Service and Digital that the AoM considers foundational, and one determines whether users and work points in GC-managed facilities have the technology needed to enable a digital workspace, and whether technologies not aligned to a digital workspace are being retired in a timely manner.

Use of MAF Results

The results reported on IM/IT/Service will provide insight into the government’s digital direction. They serve two purposes: to identify opportunities and best practices as they relate to key digital tools and initiatives, and to demonstrate how the three AoMs are now integrated under the Policy on Service and Digital.

Deputy Heads

  • Deputy Heads will be presented with a clear view of how their department is performing in the digital era and will be able to trace their overall performance to key indicators.

Departmental Functional Communities

  • Departmental Functional Communities will see their work reflected in the results and will be able to use MAF to improve their activities in their respective fields.

Treasury Board of Canada Secretariat

  • The Treasury Board of Canada Secretariat will assess departmental digital maturity in the planning, implementation and operation of past, present and future projects.
  • The Secretariat will also assess the substance and performance of services across government, including the level of policy compliance and areas for improvement.

Period of assessment

While the period of assessment for each indicator may vary depending on the information required, the overall timeframe for this year’s assessment falls between April 1, 2020 and October 31, 2021. Most questions retain the standard reporting period of April 1, 2020 to March 31, 2021.

Impact on Departments

Service and IM/IT

  • Total number of questions
    • 2020-21: 2 Service, 1 IM/IT
    • 2021-22: 3 Service, 3 IM/IT
  • Total number of questions which require the submission of evidence by departments
    • 2020-21: 0 Service, 1 IM/IT (minor)
    • 2021-22: 0 Service, 1 IM/IT

Changes for 2021-22

  • Three questions were repeated from 2020-21; however, one element was added to Question #1. Compared with 2020-21, there is one additional Service question, one additional IT question and one new IM question.
  • Of the six questions in 2021-22, one requires evidence from departments. The remaining questions are answered by TBS.

Overall outcomes

The IM/IT/Service Area of Management MAF 2021-22 cycle provides insight into service standards, the availability of online services, service improvement and overall IT maturity. This will allow the AoM to focus on key indicators in ensuring the continuity of service to Canadians.

MAF 2021-22 AoM Questionnaire

Question #1 preserved

What is the extent to which a department has mature service standards for services (composite indicator)?

Rationale

Setting and meeting service standards is essential for good client service and effectively managing performance. Service standards help clarify expectations for clients and employees and drive service improvement. The assessment of whether service standards have been established and whether services are meeting their service standard targets provides a comprehensive composite indicator on the maturity of service standards and service performance for the department’s services. Services that don’t have any standards cannot be managed effectively and services that don’t meet their standards are not meeting client expectations, thereby eroding trust in government.

Directive on Service and Digital - Canada.ca (tbs-sct.gc.ca)

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

High, ≥80%

Calculation method (where applicable)

Indicator: Extent to which departments have mature service standards for services (composite indicator)

A result of high, medium, or low for this composite indicator will be determined by analysis of the following elements derived from the GC Service Inventory:

Composite indicator composed of elements:

  • Element A: Percentage of services that have service standards
    • Calculation: Number of services that have at least one service standard / Total number of services
  • Element B: Percentage of service standards that met their target
    • Calculation: Number of service standards that met their target / Total number of service standards
  • Element C: Percentage of service standards which have been reviewed in the last five years.
    • Calculation: Number of service standards which have been reviewed using the GC Service Standard Development and Assessment Tool in the last five years / Total number of service standards

Note: Services applicable for this measure are client-based, external and internal enterprise services. Five years covers the 2016-17 fiscal year through to 2020-21.

Calculation Formula:

(Element A) (0.45) + (Element B) (0.35) + (Element C) (0.20)

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]
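
The following is a minimal, illustrative Python sketch (not part of the methodology) showing how the weighted composite and rating bands above could be computed; the function name and input values are hypothetical.

```python
# Illustrative sketch only: function name and inputs are hypothetical,
# not drawn from any departmental data.
def question1_composite(pct_with_standards, pct_targets_met, pct_reviewed):
    """Apply the Question #1 weights (0.45 / 0.35 / 0.20) to element percentages (0-100)."""
    score = (0.45 * pct_with_standards
             + 0.35 * pct_targets_met
             + 0.20 * pct_reviewed)
    if score >= 80:
        rating = "High"
    elif score >= 50:
        rating = "Medium"
    else:
        rating = "Low"
    return round(score, 2), rating

# Hypothetical example: 90% of services have a standard, 75% of standards met
# their target, 60% were reviewed in the last five years.
print(question1_composite(90, 75, 60))  # (78.75, 'Medium')
```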

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: GC Service Inventory

Date of data extraction: MAF Portal close date

Data collection method: Documentary evidence (Outside of MAF Portal)

Evidence: Data from the Service Inventory will be used for the calculation.

Document limit: N/A

Period of assessment: April 1, 2020-March 31, 2021

Department or agency to answer

Other

TBS use only new

Government Wide Average:

Will this indicator be used in the determination of the Government Wide Average?

Yes

Year over Year Analysis:

Can the indicator be used in a year over year analysis?

Yes

Departmental Results Framework (DRF)

Is this indicator used in the TBS DRF?

Yes

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #2 preserved

What is the level of maturity towards making services available online? (composite indicator)

Rationale

By assessing the extent to which services can be completed online end-to-end, and the extent to which applicable interaction points for services are online, this composite indicator assesses a department’s overall progress in making their services available online to the fullest extent possible. Services offered online end-to-end give clients options to interact with government services remotely and electronically, and are a policy requirement in the Policy on Service and Digital. Services with partial or no online interaction points risk not providing client-centric service and increasing costs to access and process service applications.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

High, ≥80%

Calculation method (where applicable)

Indicator: Level of maturity towards making services available online (composite indicator)

A result of high, medium, or low for this composite indicator will be determined by analysis of the following data elements derived from the GC Service Inventory:

Elements:

  • Element A: Percentage of services that can be completed online from end-to-end
    • Calculation: Number of services that can be completed online from end-to-end / Total number of services
    • Note: Calculation is based on information in the department or agency’s service inventory and takes into consideration interaction points that are not applicable (N/A) to the service. Rationales for any interaction points identified as “N/A” must be provided in the GC Service Inventory.
  • Element B: Percentage of client interaction points that are available online for services
    • Calculation: Number of client interaction points available online for services / Total number of applicable client interaction points for services
    • Note: Calculation is based on information in the department or agency’s service inventory and takes into consideration interaction points that are not applicable (N/A) to the service. Rationales for any interaction points identified as “N/A” must be provided in the GC Service Inventory.

Note: Services applicable for this measure are client-based, external and internal enterprise services.

Calculation Formula:

(Element A) (0.50) + (Element B) (0.50)

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]
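
For illustration only, a minimal Python sketch of the equal-weight calculation above, using hypothetical element values and a hypothetical function name:

```python
# Illustrative sketch only: inputs are hypothetical, not departmental data.
def question2_composite(pct_online_end_to_end, pct_interaction_points_online):
    """Equal-weight composite for Question #2 (element percentages, 0-100)."""
    return 0.50 * pct_online_end_to_end + 0.50 * pct_interaction_points_online

# Hypothetical example: 70% of services can be completed online end to end and
# 85% of applicable client interaction points are online -> 77.5 (Medium).
print(question2_composite(70, 85))
```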

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: GC Service Inventory

Date of data extraction: MAF Portal close date

Data collection method: Documentary evidence (Outside of MAF Portal)

Evidence: Data from the Service Inventory will be used for the calculation.

Document limit: N/A

Period of assessment: April 1, 2020-March 31, 2021

Department or agency to answer

Other

TBS use only new

Government Wide Average:

Will this indicator be used in the determination of the Government Wide Average?

Yes

Year over Year Analysis:

Can the indicator be used in a year over year analysis?

Yes

Departmental Results Framework (DRF)

Is this indicator used in the TBS DRF?

Yes

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #3 new

What is the extent to which the department is taking steps to manage and improve their services in accordance with the Policy on Service and Digital? (composite indicator)

Rationale

By assessing the extent to which departments manage their services through coordinated engagement, integrated planning, and continuous improvement, this indicator assesses compliance with the Policy on Service and Digital. Achieving a High rating shows that departments have established a strong foundation in implementing the policy with respect to governance, planning and reporting, and client-centric service design and delivery. A low rating would indicate that the department should review whether service delivery is sufficiently prioritized, and that greater efforts are required in leading integrated service management and service improvement activities across the organization.

Element 1: Each department is required by the Policy on Service and Digital to have officials designated for service management and cyber security. Not meeting these policy requirements would indicate that a department does not take a coordinated approach to managing its services.

Element 2: A service inventory is a useful tool to manage services and demonstrates an organization’s commitment to transparency and to service excellence.

Element 3: Integrated planning and reporting across service, information, data, IT and cyber security supports effective planning and better decision-making. It helps ensure that services are client-centric and that resources are managed effectively across different functional areas, with a focus on service improvement through client feedback, service standards and service reviews.

Element 4:  Seeking client feedback and using it to improve services is a foundational element in ensuring client centricity. Departments must use the various tools at their disposal, such as in-service client feedback, client satisfaction surveys, and user experience testing, to improve their services.

These elements are considered some of the foundational service requirements of the Policy on Service and Digital.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

High, ≥80%

Calculation method (where applicable)

Indicator: Extent to which the department is taking steps to manage and improve its services in accordance with the Policy on Service and Digital (composite indicator)

A result of high, medium, or low for this indicator will be determined by identification of the following elements derived from various sources available to TBS:

  • Element A: Departmental official(s) have been designated for service management and cyber security. (yes/no)
  • Element B: Service inventory for the 2020-21 fiscal year was approved by the Deputy Head and submitted to TBS. (yes/no)
  • Element C: Integrated plan for the 2020-21 fiscal year (Departmental Plan on Service and Digital) includes service improvement priorities and was submitted to TBS. (yes/no)
  • Element D: Percentage of services which have used client feedback to improve services in the last year.
    • Calculation: Number of services which identified 2020-21 as the year in which the service was improved based on client feedback / Total number of services.
    • Note: Calculation is based on information in the department or agency’s service inventory.

Note: Services applicable for this measure are client-based, external and internal enterprise services.

Calculation Formula:

(Element A) (0.25) + (Element B) (0.25) + (Element C) (0.25) + (Element D) (0.25)

Rating Scale:

  • High: [80 – 100]
  • Medium: [50 – 79]
  • Low: [0 – 49]
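
A minimal Python sketch of the calculation above, assuming (this is an assumption, not stated in the methodology) that each yes/no element contributes 100 when met and 0 otherwise, with hypothetical inputs:

```python
# Illustrative sketch only. Assumption: yes/no elements score 100 or 0.
def question3_composite(officials_designated, inventory_submitted,
                        integrated_plan_submitted, pct_improved_with_feedback):
    elements = [
        100 if officials_designated else 0,       # Element A
        100 if inventory_submitted else 0,        # Element B
        100 if integrated_plan_submitted else 0,  # Element C
        pct_improved_with_feedback,               # Element D (0-100)
    ]
    return sum(0.25 * value for value in elements)

# Hypothetical example: all three yes/no elements met and 40% of services
# improved using client feedback in 2020-21 -> 85.0 (High).
print(question3_composite(True, True, True, 40))
```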

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source: GC Service Inventory, Departmental Plans on Service and Digital and Designated Officials for service survey (list published on GCPedia)

Date of data extraction: MAF Portal close date

Data collection method: Documentary evidence (Outside of MAF Portal)

Evidence: Data from the Service Inventory will be used for the calculation.

Document limit: N/A

Period of assessment: Fiscal Year 2020-2021

Department or agency to answer

Other

TBS use only new

Government Wide Average:

Will this indicator be used in the determination of the Government Wide Average?

Yes

Year over Year Analysis:

Can the indicator be used in a year over year analysis?

No

Departmental Results Framework (DRF)

Is this indicator used in the TBS DRF?

No

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #4 preserved

What is the department’s level of overall IT maturity (composite indicator)?

Rationale

To provide an overall view of a department’s IT maturity. The elements of this indicator have been chosen to gain insight on the department’s management of mission critical applications, investment in cloud technologies, and IT planning maturity. This view will help Deputy Heads and CIOs to identify progress year-to-year. Recognising progress contributes to the generation and sharing of best practices.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

N/A High

  • High: [71– 100]
  • Medium: [51 – 70]
  • Low: [0 – 50]

Calculation method (where applicable)

Level of overall IT maturity (composite indicator):

Elements:

  • Element A: The business value, technical condition, Aging IT and mission criticality of existing IT applications
    • Calculation: Current Application Portfolio Health Indicator (APHI) score for mission critical applications
  • Element B: Percentage of SSC Business Requirements aligned to IT Plan investments (%)
    • Calculation: total number of active business requirements (BR) aligned to IT Plan investments/total number of active BR (%)
  • Element C: Cloud maturity
    • Calculation: total cloud expenditures / annual operating expenditure on IT (%)
  • Element D: level of management practice maturity of IT plans based on quality and completeness
    • Calculation: degree to which the online 2020-2021 IT Investments and the total value of 2020-2021 IT Expenditure report match (%)

Calculation Formula:

(Element A)(0.25) + (Element B)(0.25) + (Element C)(0.25) + (Element D)(0.25)
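
For illustration, a minimal Python sketch of the equal-weight composite above, using hypothetical element scores and a hypothetical function name:

```python
# Illustrative sketch only: inputs are hypothetical element scores (0-100).
def question4_it_maturity(aphi_score, pct_br_aligned, pct_cloud_spend, pct_plan_match):
    """Equal-weight average of the four IT maturity elements."""
    return 0.25 * (aphi_score + pct_br_aligned + pct_cloud_spend + pct_plan_match)

# Hypothetical example -> 63.75, which falls in the Medium band [51-70].
print(question4_it_maturity(80, 70, 15, 90))
```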

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies

Data source:

  • Element A: APM (TBS Clarity)
  • Element B: online IT Investments (TBS Clarity) and SSC BITS
  • Element C: 2020-2021 IT Expenditure Reports
  • Element D:
    • 2020-2021 IT Expenditure Reports
    • 2020-2021 online IT Investments (TBS Clarity)

Date of data extraction:

  • Element A: GC APM (March 29, 2021 extract)
  • Element B:
    • SSC BR: BITS extract on the MAF 2021 portal close date
    • IT Investments: IT Plan 2021-2022 Q2 extract on the MAF 2021 portal close date
  • Element C: 2020-2021 IT Expenditure Report due to TBS on October 29, 2021
  • Element D:
    • IT Investments: IT Plan 2020-2021 Q2 extract
    • IT Expenditure: 2020-2021 IT Expenditure Report due to TBS on October 29, 2021

Data collection method: N/A

Evidence: N/A

Document limit: N/A

Period of assessment:

  • Element A: state as of July 2021
  • Elements B, C and D: current state at portal close date

Department or agency to answer

Other

TBS use only new

Government Wide Average:

In previous cycles a GC wide average has been produced for this indicator.

Year over Year Analysis:

Year over year analysis cannot be done for this composite indicator since elements have changed between cycles. Some of the elements have remained constant between cycles, so year over year analysis is possible at the element level.

Departmental Results Framework (DRF)

No

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #5 new

What is the department’s compliance with the Directive on Service and Digital Standards for IT provisions? (composite indicator)

Rationale

The Directive on Service and Digital, Appendix G (Standard on Information Technology Provisions), defines standards intended to ensure that users and workpoints have the tools needed to enable a digital workspace. At the same time, this question helps ensure that technologies not aligned to a digital workspace are being retired in a timely manner.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

100%

Calculation method (where applicable)

A result of high, medium, or low for this composite indicator will be determined by analysis of the completion of departmental planning submissions.

Elements:

Completion of the Departmental Plan on Service and Digital (DPSD) is important for forecasting the rate at which departments will become compliant with the policy standard [source: departmental DPSD submissions]

  • Element A: The DPSD was submitted by the April 1, 2021 deadline or the OCIO-approved extension.
  • Element B: A compliance plan for fixed-line telephony service was completed.
  • Element C: A compliance plan for mobile phone service was completed.
  • Element D: A compliance plan for either the Your Email System (YES), Legacy, or Office 365 email system was completed.
  • Element E: A plan for the Cloud Brokering service was completed.
  • Element F: A compliance plan for the Conferencing & Collaboration service was completed.

The following Element G is used in this evaluation cycle to establish a baseline, and results from this year will be compared against a target to be established at a future date. Departments will not be assessed on Element G this year.

  • Element G: Adherence to Provision Targets [source: SSC actuals for end of March 2021 and OCHRO FTE counts]
    • All WebEx accounts have been retired.
    • The ratio of all telephone plans (fixed-line + mobile) to FTEs is 1.2:1 or less.
    • Less than 10% of total telephone plans (fixed-line + mobile) are fixed-line.
    • The ratio of personal inboxes to FTEs is 1.2:1 or less.

Calculation Formula:

  1. Submission of the plan before the deadline or granted extension: 50 points (otherwise 0)
  2. Each compliance plan submitted: 10 points, for a possible total of 50 points

Rating Scale:

  1. High: [80 – 100]
  2. Medium: [50 – 79]
  3. Low: [0 – 49]
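
For illustration only, a minimal Python sketch of the point scheme above, with hypothetical inputs and a hypothetical function name:

```python
# Illustrative sketch only: inputs are hypothetical.
def question5_score(dpsd_on_time, compliance_plans_completed):
    """50 points for an on-time DPSD, plus 10 points per compliance plan (max 5 plans)."""
    score = 50 if dpsd_on_time else 0
    score += 10 * min(compliance_plans_completed, 5)
    return score

# Hypothetical example: DPSD submitted by the deadline and four of the five
# compliance plans completed -> 90 (High).
print(question5_score(True, 4))
```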

Note:

  1. Next year, the assessment will continue to include the existing data completeness measures (Elements A to F), and service consumption measures (Element G) will be added.

Evidence source and document limit

TBS to answer

This question applies to the following organizations:

  • Large departments and Agencies
  • Small departments and Agencies
  • Other (see guidance)

Data source:

Departmental Plan on Service and Digital (DPSD)

SSC Enterprise Service Model (ESM) Consumption Report

OCHRO

Date of data extraction:

  • ESM Consumption Report: March 2022
  • DPSD Service Demand: date submitted to TBS
  • OCHRO FTE counts

Data collection method: Departmental Plan on Service and Digital (DPSD)

Evidence:

No evidence required from departments. TBS/SSC will draw data from identified sources.

Document limit: N/A

Period of assessment:

2020-21

Department or agency to answer

Other

TBS use only new

Government Wide Average:

Will this indicator be used in the determination of the Government Wide Average?

No

Year over Year Analysis:

Can the indicator be used in a year over year analysis?

Yes

Departmental Results Framework (DRF)

Is this indicator used in the TBS DRF?

No

Reference materials

Treasury Board policy reference or Government of Canada priority

Question #6 new

With the rapid deployment of systems and remote work, does your department have an approach for managing information and data throughout their lifecycles across multiple systems?

  • Yes, the approach includes the use of standardized metadata across all systems
  • Yes, but the approach does not include the use of standardized metadata across all systems
  • No, but standardized metadata is used in some of the department's corporate repositories
  • No, no standardized metadata is used in any of the department's corporate repositories

Rationale

TBS is currently drafting a new Standard on Systems that Manage Information, which will be inclusive of both ‘information’ and ‘data’, will be technology agnostic, and will be based on principles and business outcomes. Once this new standard is in place, it will rescind the current Standard on EDRMS. One key proposed principle of this new standard is the use of standardized metadata and data taxonomies at a departmental level. Using standardized metadata across all systems that manage information will make information and data more findable, accessible, interoperable and reusable, as is currently required in the Directive on Service and Digital:

  • requirement 4.3.1.3: Ensuring information and data are managed to enable data interoperability, reuse and sharing to the greatest extent possible within and with other departments across the government to avoid duplication and maximize utility, while respecting security and privacy requirements. 
  • requirement 4.3.1.5: Establishing and maintaining taxonomies or classification structures to manage, store, search, and retrieve information and data in all formats according to prescribed enterprise-wide standards. 

The use of standardized metadata across all systems would also align with current requirements within the Standard on Metadata: 

  • requirement 6.1.3: Ensuring that GC standardized metadata and value domains are incorporated in the design and implementation of departmental systems managing information resources.  

This question will provide baseline information to better understand where departments are in their journey of managing information in place, and whether standardized metadata is already included within their departmental IM approach in preparation for this new standard.

Category

  1. Policy compliance
  2. Performance
  3. Other

Target

N/A

Calculation method (where applicable)

N/A

Evidence source and document limit

TBS to answer
Department or agency to answer

This question applies to the following organizations:

  • Large departments and Agencies
  • Small departments and Agencies
  • Other (see guidance)

Data source: N/A

Date of data extraction: N/A

Data collection method: N/A

Evidence: N/A

Document limit: N/A

Period of assessment: N/A

Other

TBS use only new

Government Wide Average:

Will this indicator be used in the determination of the Government Wide Average?

No

Year over Year Analysis:

Can the indicator be used in a year over year analysis?

Yes

Departmental Results Framework (DRF)

Is this indicator used in the TBS DRF?

No

Reference materials

Treasury Board policy reference or Government of Canada priority
