MAF 2017 to 2018 security management methodology

From: Treasury Board of Canada Secretariat

Methodology Overview

The Government of Canada has the responsibility to ensure that information, assets and services are protected against compromise, and that individuals are protected against workplace violence. Proactive and rigorous management of government security enables departments and agencies to adapt to the evolving risk environment and to ensure timely and coordinated responses to security incidents. The overall objectives of the 2017-18 Security Management Accountability Framework (MAF) methodology are to:

  • Assess compliance and monitor progress towards implementation of key policy requirements of the Policy on Government Security;
  • Provide deputy heads and security officials with an integrated view of their organization’s security management practices; and,
  • Strengthen oversight and management practices in departments and agencies to encourage conversations within departments and with other organizations.

The 2017-18 Security MAF methodology focuses on two key areas:

  • Effective and Integrated Security Management; and
  • Security Controls and Practices.

The methodology includes indicators designed to assess aspects of integrated security planning processes as well as the implementation of management practices and security controls, including business continuity planning, physical security and information technology security. The outcome statements and the rationale for the indicators are included as part of the MAF questionnaire.

The methodology will generate management performance information that provides insight into a department’s or agency’s security planning, security control framework and security management practices, all of which contribute to improving the overall security posture of the Government of Canada. This information is important for validating and informing security management decisions and direction, observing trends and changes, identifying areas of strength and areas that need attention, and sharing leading security management practices.

The 2017-18 MAF results will provide the following to three key audiences:

Deputy Heads:

  • Ensure that business planning and resource allocation align with government security management priorities;
  • Indicate the extent to which departmental security management practices are compliant with the Policy on Government Security; and,
  • Support the secure delivery of programs and services.

Security Community:

  • Assist in benchmarking;
  • Inform the development of departmental guides, procedures and tools; and,
  • Drive actions to improve security.

TBS:

  • Assess the level of compliance and maturity levels of organizations;
  • Identify government-wide risks and systemic issues; and,
  • Inform the development of policy instruments, guidance and tools.

The intent of the security management methodology is to assess the current state of performance and practice. As a result, most questions are asked from a “point-in-time” perspective to provide the most up-to-date representation possible. The approach is to capture information on security performance as strategic priorities evolve. To ensure that the questions capture pertinent information on security performance, previous MAF findings were reviewed and the methodology was refined and re-scoped so that relevant information on management practices and performance in this area can be provided to deputy heads.

The timeframe for the information requested is indicated for each question. In addition to the expected types of evidence, the maximum number of documents that can be provided as evidence for each question is also provided. TBS may refer to internal or external evaluations and audits (including Office of the Auditor General audits), the Application Portfolio Management process and associated Clarity tool, as well as other relevant documents to inform the MAF security management assessment and reporting.

The methodology has been developed in consultation with the security functional community.

Questionnaire

Effective and integrated security management

Outcome statement: Departments and agencies demonstrate a strong security planning cycle.

Each indicator below is presented with its calculation method (where applicable), expected result, policy requirements, evidence source and document limit, and category.

  1. Does the department or agency have a Departmental Security Plan (DSP) that covers the present fiscal year (2017-18) and is approved by the Deputy Head?

    • Yes
    • No

    Rationale: A DSP provides an integrated view of security risks and requirements, and outlines the strategies, priorities and timelines for strengthening and monitoring departmental security practices and controls. An approved and current plan is a tool that communicates the priorities of the organization and the means for addressing risks, and against which achievement of objectives is measured and reported.

Expected Result: An approved DSP is in place and priorities are established for the present fiscal year.

Policy Requirements:

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source:

DSP approved by the deputy head (i.e. signature) with established priorities for fiscal year 2017-18.

If the approved DSP does not establish priorities for fiscal year 2017-18, a DSP progress report or other document with established priorities for fiscal year 2017-18, together with evidence that these priorities were approved by the deputy head before mid-year (i.e. before October 2017), is acceptable.

Evidence Limit: 3 documents

Category: Management Practice (Policy Compliance)

  2. Did the department or agency report to the Deputy Head on the effectiveness of the DSP, including implementation of activities planned for completion in the previous fiscal year (2016-17)?

    • Yes
    • No

    Rationale: Annual reporting of DSP implementation progress and effectiveness supports the organization and the Deputy Head in responding to implementation challenges and provides the opportunity to course-correct or re-prioritize activities as needed.

Expected Result: The Deputy Head was provided with information on the progress of implementing the DSP as well as the effectiveness of this plan.

Policy Requirements:

Policy on Government Security 6.1.1

Directive on Departmental Security Management, 6.1.3

Evidence Source:

DSP progress report covering all areas of departmental security priority activities for fiscal year 2016-17, as well as the overall effectiveness of the plan, with evidence that the progress report was submitted to the deputy head.

Evidence Limit: 3 documents (generally the same as Q1)

Category: Management Practice (Policy Compliance)

  3. What percentage (%) of activities identified in the approved DSP for the previous fiscal year (2016-17) was completed as planned?

    Calculation of the measure:

    # of planned activities completed in FY (2016-17) ÷ the total # of activities planned for completion in FY (2016-17)

    (A worked example follows this entry.)

    Note: Priorities cannot be retroactively approved (i.e. priorities need to be approved before mid-year in order to provide a basis for measuring progress in that year).

    Rationale: The extent to which activities identified in the Departmental Security Plan were completed provides an indication of the success in managing department-wide activities to achieve planned improvements. This result provides an indication of the effectiveness of the plan in driving desired security outcomes.

Expected Result: Most activities identified in the DSP for completion in 2016-17 are completed as planned.

Policy Requirements:

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source:

The evidence source must clearly identify the approved priorities planned for the last fiscal year (2016-17) and indicate which were achieved in that year.

Evidence of priority activities planned for the last fiscal year, as identified in:

  • the implementation strategy section of the approved Departmental Security Plan (or other section where priorities are consolidated), or
  • a DSP progress report or other document establishing priorities for the last fiscal year, with evidence that it had been approved by the deputy head before mid-year.

Evidence of priority activities achieved over the last fiscal year, as identified in:

  • an annex to a recently approved Departmental Security Plan outlining progress achieved since the last time the DSP was approved, or
  • a DSP progress report or other document outlining progress achieved, with evidence that it had been submitted to, and approved by, the deputy head.

Evidence Limit: 3 documents (generally the same as Q1)

Category: Management Performance (Performance Indicator)
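
A worked example of the measure above, using hypothetical figures (no departmental data is implied): if 10 activities were planned for completion in FY 2016-17 and 8 of them were completed as planned, the result is 80%.

```latex
% Hypothetical figures for illustration only.
\[
  \frac{\#\ \text{planned activities completed in FY 2016-17}}
       {\#\ \text{activities planned for completion in FY 2016-17}}
  = \frac{8}{10} = 80\%
\]
```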

Security controls and practices

Business continuity planning

Outcome statement: Departments and agencies demonstrate preparedness in the event of a disruption.

Each indicator below is presented with its calculation method (where applicable), expected result, policy requirements, evidence source and document limit, and category.

  1. What is the number (#) of critical services identified by the department or agency?

    Rationale: Information gathered from the response to this question provides the number of critical services and ensures that each has been identified.

Expected Result: It is expected that 100% of the organization’s critical services are identified.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source:

Business continuity planning fields from the Clarity tool through the Application Portfolio Management (APM) process.Footnote 1

Category: Descriptive Statistic

  2. What is the percentage (%) of the department’s or agency’s critical services with a business continuity plan (BCP) in place?

    Note: For this response it is recognized that a critical service may have its own BCP, be included within a broader BCP, or be supported by multiple BCPs.

    Rationale: Information gathered from the response to this question provides insight into the extent of progress being made to ensure BCPs are in place for all critical services. (A sketch of the calculations for Questions 2 to 4 follows this table.)

Expected Result: It is expected that 100% of the organization’s critical services have a BCP in place.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source:

Business continuity planning fields from the Clarity tool through the Application Portfolio Management (APM) process.Footnote 1

Category: Management Performance (Performance Indicator)

  3. What percentage (%) of the department’s or agency’s critical services with a BCP had updates made to the BCPs within the last two years?

    Note: For this response it is recognized that a critical service may have its own BCP, be included within a broader BCP, or be supported by multiple BCPs.

    Rationale: Information gathered from the response to this question provides insight into the extent of progress being made to ensure up-to-date BCPs are in place for all critical services.

Expected Result: It is expected that, for all critical services of the organization with a BCP in place, 100% of the BCPs have had updates made within the last two years.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source:

Business continuity planning fields from the Clarity tool through the Application Portfolio Management (APM) process.Footnote 1

Category: Management Performance (Performance Indicator)

  4. What is the percentage (%) of critical services with a BCP which was tested or exercisedFootnote 2 within the last two years?

    Note: For this response, it is recognized that a critical service may have its own BCP, be included within a broader BCP, or be supported by multiple BCPs.

    Rationale: Information gathered from the response to this question provides insight into the extent of progress being made to ensure that BCPs are effective and procedures and measures are in place in the event of a disruption.

Expected Result: It is expected that, for all critical services of the organization with a BCP in place, 100% of the BCPs have been tested or exercised within the last two years.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source:

Business continuity planning fields from the Clarity tool through the Application Portfolio Management (APM) process.Footnote 1

Category: Management Performance (Performance Indicator)
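
To make the percentage measures concrete, here is a minimal sketch of how Questions 2 to 4 could be computed from business continuity planning records. The field names and sample data are illustrative assumptions; the actual source is the business continuity planning fields in the Clarity/APM tool.

```python
# Minimal sketch of the BCP measures (Questions 2 to 4 above).
# Field names and sample data are illustrative assumptions.
from datetime import date, timedelta

critical_services = [
    {"service": "Payments",  "has_bcp": True,
     "bcp_updated": date(2017, 5, 1), "bcp_exercised": date(2016, 11, 15)},
    {"service": "Benefits",  "has_bcp": True,
     "bcp_updated": date(2014, 3, 1), "bcp_exercised": None},
    {"service": "Licensing", "has_bcp": False,
     "bcp_updated": None,             "bcp_exercised": None},
]

def pct(part, whole):
    """Percentage, guarding against an empty denominator."""
    return 100.0 * part / whole if whole else 0.0

# "Within the last two years", measured here from the start of FY 2017-18
# (an assumption made for the example).
cutoff = date(2017, 4, 1) - timedelta(days=2 * 365)
with_bcp = [s for s in critical_services if s["has_bcp"]]

q2 = pct(len(with_bcp), len(critical_services))
q3 = pct(sum(1 for s in with_bcp
             if s["bcp_updated"] and s["bcp_updated"] >= cutoff), len(with_bcp))
q4 = pct(sum(1 for s in with_bcp
             if s["bcp_exercised"] and s["bcp_exercised"] >= cutoff), len(with_bcp))

print(f"Q2 BCP in place: {q2:.0f}%")               # 67%
print(f"Q3 updated in last two years: {q3:.0f}%")   # 50%
print(f"Q4 exercised in last two years: {q4:.0f}%") # 50%
```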

Physical security

Outcome statement: Departments and agencies demonstrate sound risk management practices related to facilities.

Each indicator below is presented with its calculation method (where applicable), expected result, policy requirements, evidence source and document limit, and category.

  1. Does the department or agency have a mechanism to identify facilities with higher security requirements?

    • Yes
    • No

    Note 1: The scope of the question includes facilities where the department is the custodian or has operations (buildings, floors within a building, space in a foreign embassy, agricultural fields, etc.).

    Note 2: Facilities with higher security requirements are defined by the department’s or agency’s environmental scan (internal and external factors, potential impacts of compromise, the defined threats and current security controls effectiveness) and risk tolerance.

    Rationale: Modernization of the workplace and an evolving threat environment necessitate assessment of security risks to facilities.

Expected Result: It is expected that departments or agencies have a mechanism to identify facilities with higher security requirements.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source:

Document describing the mechanism to identify facilities with higher security requirements.

Evidence Limit: 2 documents

Category: Management Practice (Practice)

  2. What is the percentage (%) of facilities with higher security requirements that have an up-to-date security assessment?

    Note 1: The scope of the question includes facilities where the department is the custodian or has operations (buildings, floors within a building, space in a foreign embassy, agricultural fields, etc.).

    Note 2: Facilities with higher security requirements are defined by the department’s or agency’s environmental scan (internal and external factors, potential impacts of compromise, the defined threats and current security controls effectiveness) and risk tolerance.

    Note 3: Up-to-date is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or three years in the absence of an established departmental timeframe.

    Rationale: Modernization of the workplace and an evolving threat environment necessitate up-to-date assessment of security risks to facilities. (A worked example follows this entry.)

Expected Result: It is expected that all facilities with higher security requirements have an up-to-date security assessment.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source:

Departmental or Agency tracking tool including:

  • List of facilities
  • Type of assessment
  • Date of last security assessment
  • Frequency of assessment cycle
  • Identification of facilities with higher security requirements

Evidence Limit: 2 documents

Category: Management Performance (Performance Indicator)
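
A worked example of the measure above, using hypothetical figures: if an organization identifies 15 facilities with higher security requirements and 12 of them have a security assessment within the applicable renewal timeframe, the result is 80%.

```latex
% Hypothetical figures for illustration only.
\[
  \frac{\text{facilities with an up-to-date security assessment}}
       {\text{facilities with higher security requirements}}
  = \frac{12}{15} = 80\%
\]
```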

Information technology security

Outcome statement: Key security risk mitigation measures are effectively established within the department or agency to ensure the protection of Government of Canada assets and to contribute to continued program/service delivery.

Each indicator below is presented with its calculation method (where applicable), expected result, policy requirements, evidence source and document limit, and category.

  1. With respect to the department’s or agency’s mission critical applications currently in operation: What is the percentage (%) of mission critical applications that have been authorized for operation by the Program and Service Delivery Manager?

    Calculation of the measure:

    Number of mission critical applications that are authorized ÷ Number of mission critical applications

    Rationale: Focusing on authorization of applications deemed to be “mission critical” to business operations is important because this category of applications, if compromised or unavailable, has the potential to cause a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of government.

Expected Result: As a best practice, 100% of the organization’s mission critical applications have been authorized.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source:

Application Portfolio Management (APM) system.Footnote 3

Category: Management Performance (Performance Indicator)

  2. With respect to the department’s or agency’s mission critical applications currently in operation: What is the percentage (%) of mission critical applications that have been authorized with conditions (formerly known as Interim Authority to Operate)?

    Calculation of the measure:

    Number of mission critical applications that are authorized with conditions ÷ Number of mission critical applications

    Rationale: Provides information about the number of mission critical applications that have conditions requiring remediation.

Expected Result: As a best practice, 100% of the organization’s mission critical applications have been authorized and have a plan in place to address any residual risks.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source:

Application Portfolio Management (APM) system.Footnote 3

Category: Management Performance (Performance Indicator)

  3. With respect to the department’s or agency’s mission critical applications currently in operation: What is the percentage (%) of mission critical applications authorized with conditions that have a remediation plan to fulfill the authorization conditions?

    Calculation of the measure:

    Number of mission critical applications authorized with conditions that have a remediation plan ÷ Number of mission critical applications authorized with conditions

    Rationale: Demonstrates that tracking of remediation activities is in place for mission critical applications.

Expected Result: As a best practice, 100% of the organization’s mission critical applications have been authorized and have a plan in place to address any residual risks.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source:

Application Portfolio Management (APM) system.Footnote 3

Category: Management Performance (Performance Indicator)

  4. With respect to the department’s or agency’s mission critical applications currently in operation, authorized with conditions, and that have a risk remediation plan or equivalent: what is the percentage (%) of remediation conditions that have been met?

    Calculation of the measure:

    Number of remediation activities met ÷ Number of remediation activities

    (A worked example covering Questions 1 to 4 follows this entry.)

    Rationale: Demonstrates progress against planned remediation activities (i.e. year-over-year improvement) for mission critical applications.

Expected Result: As a best practice, it is expected that organizations are tracking progress on 100% of the mission critical applications that have risk remediation activities.

Policy Requirements:

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source:

Departmental or Agency summary document identifying the mission critical applications and the status of risk remediation actions (i.e. dashboard reporting).

Evidence Limit: 1 document

Category: Management Performance (Performance Indicator)

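A worked example of the four measures above, using hypothetical figures (no departmental data is implied): suppose an organization has 20 mission critical applications; 18 are authorized; 5 are authorized with conditions; 4 of those 5 have remediation plans; and 9 of the 12 remediation activities in those plans have been met.

```latex
% Hypothetical figures for illustration only.
\begin{align*}
\text{Q1 (authorized):}                 && 18/20 &= 90\%\\
\text{Q2 (authorized with conditions):} && 5/20  &= 25\%\\
\text{Q3 (with a remediation plan):}    && 4/5   &= 80\%\\
\text{Q4 (remediation activities met):} && 9/12  &= 75\%
\end{align*}
```
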
  5. For vulnerabilities assessed as Extreme Risk, what percentage of systems or services have been patched within the Communications Security Establishment (CSE)’s recommended deployment time frames for fiscal year 2017-18?

    Assessed risk: Extreme risk (patch within 48 hours).

    Calculation of the measure:

    Number of extreme risk remediation activities met ÷ Number of systems or services requiring patching

    (A sketch of this calculation across all four risk levels follows the Low Risk entry below.)

    Note: CSE’s IT Security Bulletin (ITSB-96 Security Vulnerabilities and Patches Explained) provides information on vulnerability-patch risk assessments and examples of the above ratings. https://www.cse-cst.gc.ca/en/node/241/html/25220

    Rationale: Demonstrates the effectiveness of patch management processes. Results that are lower than expected may leave the GC exposed to known vulnerabilities for which a patch is available.

Expected Result: Although organizations have indicated their own internal targets and timeframes for patch deployment, they should strive to have 100% of patches deployed and installed on all their systems or services within CSE’s targets for fiscal year 2017-18.

Policy Requirements:

Security Vulnerabilities and Patches Explained - IT Security Bulletin for the Government of Canada

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source:

Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment.

Evidence Limit: 1 document

Category: Management Performance (Performance Indicator)

  6. For vulnerabilities assessed as High Risk, what percentage of systems or services have been patched within the Communications Security Establishment (CSE)’s recommended deployment time frames for fiscal year 2017-18?

    Assessed risk: High risk (patch within 2 weeks).

    Calculation of the measure:

    Number of high risk remediation activities met ÷ Number of systems or services requiring patching

    Note: CSE’s IT Security Bulletin (ITSB-96 Security Vulnerabilities and Patches Explained) provides information on vulnerability-patch risk assessments and examples of the above ratings. https://www.cse-cst.gc.ca/en/node/241/html/25220

    Rationale: Demonstrates the effectiveness of patch management processes. Results that are lower than expected may leave the GC exposed to known vulnerabilities for which a patch is available.

Expected Result: Although organizations have indicated their own internal targets and timeframes for patch deployment, they should strive to have 100% of patches deployed and installed on all their systems or services within CSE’s targets for fiscal year 2017-18.

Policy Requirements:

Security Vulnerabilities and Patches Explained - IT Security Bulletin for the Government of Canada

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source:

Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment.

Evidence Limit: 1 document

Category: Management Performance (Performance Indicator)

  7. For vulnerabilities assessed as Medium Risk, what percentage of systems or services have been patched within the Communications Security Establishment (CSE)’s recommended deployment time frames for fiscal year 2017-18?

    Assessed risk: Medium risk (patch at the next major update or within three months).

    Calculation of the measure:

    Number of medium risk remediation activities met ÷ Number of systems or services requiring patching

    Note: CSE’s IT Security Bulletin (ITSB-96 Security Vulnerabilities and Patches Explained) provides information on vulnerability-patch risk assessments and examples of the above ratings. https://www.cse-cst.gc.ca/en/node/241/html/25220

    Rationale: Demonstrates the effectiveness of patch management processes. Results that are lower than expected may leave the GC exposed to known vulnerabilities for which a patch is available.

Expected Result: Although organizations have indicated their own internal targets and timeframes for patch deployment, they should strive to have 100% of patches deployed and installed on all their systems or services within CSE’s targets for fiscal year 2017-18.

Policy Requirements:

Security Vulnerabilities and Patches Explained - IT Security Bulletin for the Government of Canada

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source:

Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment.

Evidence Limit: 1 document

Category: Management Performance (Performance Indicator)

  8. For vulnerabilities assessed as Low Risk, what percentage of systems or services have been patched within the Communications Security Establishment (CSE)’s recommended deployment time frames for fiscal year 2017-18?

    Assessed risk: Low risk (patch at the next major update or within one year).

    Calculation of the measure:

    Number of low risk remediation activities met ÷ Number of systems or services requiring patching

    Note: CSE’s IT Security Bulletin (ITSB-96 Security Vulnerabilities and Patches Explained) provides information on vulnerability-patch risk assessments and examples of the above ratings. https://www.cse-cst.gc.ca/en/node/241/html/25220

    Rationale: Demonstrates the effectiveness of patch management processes. Results that are lower than expected may leave the GC exposed to known vulnerabilities for which a patch is available.

Expected Result: Although organizations have indicated their own internal targets and timeframes for patch deployment, they should strive to have 100% of patches deployed and installed on all their systems or services within CSE’s targets for fiscal year 2017-18.

Policy Requirements:

Security Vulnerabilities and Patches Explained - IT Security Bulletin for the Government of Canada

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source:

Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment.

Evidence Limit: 1 document

Category: Management Performance (Performance Indicator)

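Here is a minimal sketch of how the four patch-timeliness percentages (Questions 5 to 8) could be computed from a departmental patch log. The log fields and sample data are illustrative assumptions; the deployment windows are those stated in the assessed-risk notes above, with "three months" and "one year" simplified to fixed day counts.

```python
# Minimal sketch of the patch-timeliness measures (Questions 5 to 8 above).
# Log fields and sample data are illustrative assumptions, not an actual
# departmental log format.
from datetime import datetime, timedelta

WINDOWS = {
    "extreme": timedelta(hours=48),
    "high":    timedelta(weeks=2),
    "medium":  timedelta(days=90),   # simplification of "three months"
    "low":     timedelta(days=365),  # simplification of "one year"
}

patch_log = [
    {"system": "web-gateway", "risk": "extreme",
     "released": datetime(2017, 6, 1), "patched": datetime(2017, 6, 2)},
    {"system": "hr-portal", "risk": "high",
     "released": datetime(2017, 7, 10), "patched": datetime(2017, 8, 20)},
]

def timeliness(records, risk):
    """% of systems requiring a patch at this risk level that were
    patched within the recommended deployment window."""
    in_scope = [r for r in records if r["risk"] == risk]
    if not in_scope:
        return None  # no patches required at this level
    met = sum(1 for r in in_scope
              if r["patched"] is not None
              and r["patched"] - r["released"] <= WINDOWS[risk])
    return 100.0 * met / len(in_scope)

for risk in WINDOWS:
    print(risk, timeliness(patch_log, risk))  # extreme 100.0, high 0.0
```
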
  9. Does the department verify that patches have been successfully installed?

    • Yes
    • No

    Rationale: Verifying the successful installation of patches is recognized as common practice within the security community and provides an indication of the department’s or agency’s due diligence in protecting GC assets (e.g. information). (A minimal sketch of such a check follows this entry.)

Expected Result: It is expected that all organizations have a documented patch management plan that details the verification process used to ensure patches were properly installed. Evidence could be a spreadsheet, output from a patch management tool, or a vulnerability assessment scan.

Policy Requirements:

Security Vulnerabilities and Patches Explained - IT Security Bulletin for the Government of Canada

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source:

Departmental or Agency log or other document identifying that patches have been successfully installed and verified.

Evidence Limit: 1 document

Category: Management Practice (Practice)
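
As an illustration of the kind of verification evidence referred to above, here is a minimal sketch that compares the patches expected on each system against those a scan reports as installed. System names and patch identifiers are hypothetical.

```python
# Minimal sketch of a patch-verification check (Question 9 above).
# System names and patch identifiers are hypothetical.
expected = {"web-gateway": {"KB-401", "KB-407"}, "hr-portal": {"KB-401"}}
installed = {"web-gateway": {"KB-401", "KB-407"}, "hr-portal": set()}

for system, patches in expected.items():
    missing = patches - installed.get(system, set())
    status = "verified" if not missing else f"missing: {sorted(missing)}"
    print(f"{system}: {status}")
```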

Glossary

Critical service
A service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.

CSE ratings of vulnerability/patch risk assessments

Extreme risk:

  • vulnerability allows remote code execution;
  • critical business system/information affected;
  • exploits exist and are in use; and
  • system is connected to the Internet without having mitigating controls in place.

High risk:

  • vulnerability allows remote code execution;
  • critical business system/information affected;
  • exploits exist and are in use; and
  • system is in a protected enclave with strong access controls.

Medium risk:

  • vulnerability allows an attacker to impersonate a legitimate user on a remote access solution;
  • system is exposed to unauthenticated users; and
  • system requires two-factor authentication and administrator-level remote login is disallowed.

Low risk:

  • vulnerability requires authenticated users to perform malicious actions, such as SQL injection;
  • affected system contains non-sensitive, publicly available information; and
  • mitigating controls exist that make exploitation unlikely or very difficult.

Facility
A physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)

Facilities with higher security requirements
Defined by the department’s or agency’s environmental scan (internal and external factors, potential impacts of compromise, the defined threats and current security controls effectiveness) and risk tolerance.

Mission critical application
A business application that is, or supports, a critical service.

Security assessment
The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to reduce risks to an acceptable level.

Up-to-date
Within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or three years in the absence of an established departmental timeframe.

Acronyms

APM: Application Portfolio Management
BCP: Business continuity plan
CSE: Communications Security Establishment
DSP: Departmental Security Plan
EX: Executive
FY: Fiscal year
GC: Government of Canada
MAF: Management Accountability Framework
SQL: Structured Query Language
TBS: Treasury Board of Canada Secretariat
