MAF 2015 to 2016 security management methodology

Table of contents

Methodology overview

Introduction

The Security Management questionnaire is intended to generate management performance information that provides insight into a department’s or agency’s security management practices for the benefit of the organization and to contribute to improving the overall situational awareness and security posture of the Government of Canada (GC).

This information will be important for validating and informing security management decisions and direction, observing trends and changes, identifying areas of strength and areas that need attention, benchmarking, and sharing leading security management practices. In addition, from the responses, deputy heads will be informed of the extent to which departmental management practices are aligned with GC priorities for security and of the department’s or agency’s progress supporting secure delivery and modernization goals for GC programs and services.

The three key areas of the methodology focus on:

  • Practices to ensure the effectiveness, integration and inclusion of all security activity areas,
  • Key security risk mitigation measures that enable the protection of GC assets and continued program and service delivery, and
  • Alignment to GC security policy priorities that underpin secure delivery and modernization goals for GC programs and services.

The questionnaire will assess security management practices, performance and milestones, using a variety of means for benchmarking. These means may evolve annually from establishing baseline information to determining targets based on GC averages.

Security management three-year approach to the Management Accountability Framework

The Security Management approach to the three-year Management Accountability Framework cycle will ensure that the questions used for the Security Management methodology capture information on security performance and transformation as strategic security priorities, security policy evolution and implementation, and GC initiatives continue to advance. Capturing this information will be achieved by building on the Year 1 questions on policies and initiatives for information and information technology security, physical security, personnel security screening and cyber authentication, and by capturing supplementary information on cybersecurity, enterprise security architecture, and internal identity, credential and access management priorities in Years 2 and 3.

TBS will continue to review government-wide findings of the new and ongoing questions used in this methodology to ensure stability when conducting a three-year trend analysis of findings following the 2016-17 MAF. Where appropriate, TBS may revise questions to continue implementing the principles of MAF 2.0 including posing questions that provide deputy heads and TBS with useful information to improve management performance.

Security management methodology for 2015-16

To ensure that the questions used in the Security Management methodology capture pertinent information on security performance and transformation, the Area of Management reviewed the 2014-15 MAF findings in order to refine the methodology and contribute to providing clear and valuable information to deputy heads on management practices and performance in this area.

Changes in the 2015-16 Security methodology include the following:

  • Questions and sub-questions that collected contextual and statistical information were removed to allow for a more streamlined questionnaire and to reduce the departmental reporting burden. Examples are contextual questions on the nature of security incidents and the total number of facilities in the context of physical security.
  • Contextual information is incorporated into the evidence source.
  • Refinement to definitions and descriptions of the measures, criteria, rationale, and scope for each question is provided to enable consistent interpretations and measurement across departments and agencies.
  • New questions to specifically assess compliance concerning responsibilities related to the Departmental Security Plan and the effectiveness of processes that support this management tool have been incorporated.
  • Questions on the security screening of individuals have evolved to assess the implementation requirements of the new Standard on Security Screening, effective .
  • New questions on IT security and cybersecurity have been introduced to reflect lessons learned from recent GC cyberincidents and to focus on high-dividend IT security actions. These questions will help create a baseline of practices for the implementation of the GC Cyber Security Event Management Plan coming into effect in Fiscal Year 2015-16.

Response timeframes, evidence and validation

The intention of the Security Management methodology is to reflect upon the current state of performance and practice. As a result, most questions are from a “point-in-time” perspective to provide the most up-to-date representation possible. In other instances, a timeframe for the information requested will be indicated.

In addition to the expected types of evidence, the maximum number of documents that may be submitted as evidence for each question is indicated in the table. TBS may refer to internal or external evaluations and audits as well as other relevant documents to support the MAF Security Management assessments and reporting.

Questionnaire

Effective and integrated department or agency security management

Outcome statements: Departments and Agencies establish integrated governance, planning, monitoring and reporting processes that address all security activity areas.

Governance and monitoring/reporting

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  1. Governance mechanisms for departmental/agency security management;

    • 1.1

      Does the department or agency have established governance mechanisms involving the deputy head at which departmental security and/or departmental security management is discussed or considered at least once a year?

      • Yes
      • No - go to Q2
    • 1.2

      Do the established governance mechanisms involve the appointed Departmental Security Officer?

      • Yes
      • No

    Rationale: 

    Security Management addresses risks and threats to the department or agency and GC-wide operations and is distributed within and across organizations.

    Established governance mechanisms that involve the Deputy Head where security management is considered/discussed at least once a year, and which include the participation of the Departmental Security Officer, ensure that departmental security management involves both business and security expertise to contribute to its effective management.

    Definitions: 

    Governance mechanisms involving the deputy head
    Can be a committee chaired by the deputy head or an established process that involves the deputy head.

    Note: 

    The discussion (or consideration) referred to in the question is in addition to the approval of the Departmental Security Plan. It should support the development and the implementation of the security objectives and priorities established in the Departmental Security Plan.

Policy on Government Security, 6.1.1, 6.1.2

Directive on Departmental Security Management 6.1.6

Evidence Source: 

Departmental or Agency

  • Agendas
  • Terms of Reference
  • Memorandum of Understanding (MOU)
  • Briefing notes/decks
  • Reports (monitoring, progress, results, DPR, RPP)

Evidence Limit: 

4 Documents

Practice

Planning

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  2. Does the department or agency have a Departmental Security Plan covering the present fiscal year (2015-16) that has been approved by the Deputy Head?

    • Yes
    • No

    Rationale: 

    A Departmental Security Plan identifies security risks and outlines the strategies, objectives, priorities and timelines for improving departmental security. An approved and current plan is a tool which communicates the priorities of the organization and the means for addressing risks, against which achievement of objectives are measured and reported.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source: 

Departmental Security Plan

Evidence Limit: 

1 Document

Practice

  3. Does the department or agency review its Departmental Security Plan annually?

    • Yes
    • No

    Rationale: 

    In a constantly evolving security risk environment, an annual review of the Departmental Security Plan supports the organization and the Deputy Head to respond to changes in risks and their impacts, by providing an opportunity to re-evaluate and re-prioritize activities in order to maintain secure and continued programs and service delivery.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source: 

Most recent annual review (i.e. progress report)

Evidence Limit: 

1 Document

Practice

  4. Does the department’s or agency’s process for developing or maintaining the Departmental Security Plan consider all programs and activities identified in the department’s or agency’s Program Alignment Architecture (i.e. an integrated view)?

    • Yes
    • No

    Rationale: 

    A comprehensive Departmental Security Plan is enabled by and references the processes which ensure that all of the program activities as described in the organization’s Program Alignment Architecture (PAA) have been reviewed to provide an integrated view of security risks and mitigations.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1.1

Evidence Source: 

Departmental Security Plan

Evidence Limit: 

1 Document

Practice

Protection of Government of Canada assets and continued program/service delivery

Outcome statements: Key security risk mitigation measures are effectively established within the department or agency to ensure the protection of Government of Canada assets and to contribute to continued program/service delivery.

Information technology security 

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
Table 3 note 1: TBS will populate the department’s or agency’s response.

  5. With respect to the department’s or agency’s mission critical applications;

    • 5.1

      What is the percentage (%) of mission critical applications currently in operation that have been authorized for operation by the Program and Service Delivery Managers?

    • 5.2

      What percentage (%) of authorized mission critical applications (Q5.1) are operating under an interim authority to operate (IAO) or equivalent?

    Rationale: 

    Focusing on authorizations of those applications deemed to be “mission critical” to business operations is important as this category of applications, if compromised or unavailable, has the potential to cause a high degree of injury to the health, safety, security or economic well-being of Canadians, or the effective functioning of government.

    Calculation of measure 5.1: 

    Number of mission critical applications authorized

    (divided by)

    Total number of mission critical applications

    Calculation of measure 5.2: 

    Number of mission critical applications with IAO

    (divided by)

    Total number of mission critical applications authorized
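
    Illustrative example (hypothetical figures, not drawn from any department): an organization with 20 mission critical applications in operation, 16 of which are authorized, would report 16 / 20 = 80% for measure 5.1; if 4 of those 16 authorized applications are operating under an IAO or equivalent, measure 5.2 would be 4 / 16 = 25%.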

    Definitions: 

    Mission critical application
    Is a business application that is, or supports, a critical service. A critical service is a service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.

    Note: 

    This response will be aligned with the applications which have been identified as mission critical by departments or agencies when completing their submission for the TBS-CIOB Application Portfolio Management (APM).

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 9.6 & 12.3.3

Evidence Source: 

Application Portfolio Management system (see table 3 note 1)

Performance

  6. With respect to the department’s or agency’s mission critical applications operating under an interim authority to operate (IAO) or equivalent;

    • 6.1

      What percentage (%) have a plan in place to address the residual risks that resulted in the requirement for the interim authority to operate?

    • 6.2

      What percentage (%) have resources available to address the residual risks that resulted in the requirement for the interim authority to operate?

    Rationale: 

    Focusing on authorizations of those applications deemed to be “mission critical” to business operations is important as this category of applications, if compromised or unavailable, has the potential to cause a high degree of injury to the health, safety, security or economic well-being of Canadians, or the effective functioning of government.

    Further focus on authorized applications operating under an interim authority to operate or equivalent is important as these applications have outstanding risks identified requiring management and resourcing to address.

    Measuring the performance of establishing plans and resources to address residual risks provides an indication of the department’s or agency’s undertaking to protect GC assets (e.g. information).

    Calculation of measure 6.1: 

    Number of IAO applications that have a plan to address the residual risks

    (divided by)

    Number of mission critical applications with IAO

    Calculation of measure 6.2: 

    Number of IAO applications that have resources available to address the residual risks

    (divided by)

    Number of mission critical applications with IAO
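
    Illustrative example (hypothetical figures): if 4 mission critical applications are operating under an IAO or equivalent and 3 of them have a plan in place to address the residual risks, measure 6.1 is 3 / 4 = 75%; if 2 of the 4 have resources available for fiscal year 2015-16, measure 6.2 is 2 / 4 = 50%.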

    Definitions: 

    Mission critical application
    Is a business application that is, or supports, a critical service. A critical service is a service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.
    Resources
    Means that salaried and non-salaried funds are in place for the present fiscal year (2015-16).

    Note: 

    This response will be aligned with the applications which have been identified as mission critical by departments or agencies when completing their submission for the TBS-CIOB Application Portfolio Management (APM).

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source: 

Application Portfolio Management system (see table 3 note 1)

Performance

  7. Does the department or agency have a documented vulnerability assessment process or plan?

    • Yes
    • No

    Rationale: 

    Vulnerability assessments are recognized as a best practice by governments and industry for addressing security gaps in IT systems and applications.

    Departments and agencies must conduct and document vulnerability assessments regularly on highly sensitive or highly exposed systems and on a discretionary basis on other systems.

Management of Information Technology Security (MITS) Standard, 12.5.1

Evidence Source: 

Departmental or Agency document describing the process or plan

Evidence Limit: 

1 Document

Practice

  8. Does the department or agency keep a record of identified vulnerabilities including whether they have been addressed?

    • Yes
    • No

    Rationale: 

    Vulnerability assessments are recognized as a best practice by governments and industry for addressing security gaps in IT systems and applications.

    Departments and agencies must document vulnerability assessments, subsequent decisions and remedial actions.

Management of Information Technology Security (MITS) Standard 12.5.1

Evidence Source: 

Departmental or Agency log or other document identifying vulnerabilities and whether they have been addressed

Evidence Limit: 

1 Document

Practice

  9. Departmental or Agency patch management process or plan;

    • 9.1

      Does the department or agency have a documented patch management process or plan?

      • Yes
      • No – go to Q11
    • 9.2

      Does the departmental or agency process or plan include targets or timeframes for patch deployment prioritized by severity and impact?

      • Yes
      • No – go to Q11

    Rationale: 

    Vulnerability assessments are recognized as a best practice by governments and industry for addressing security gaps in IT systems and applications.

    Departments and agencies must establish a systematic, documented patch management process to ensure they apply security-related patches in a timely manner. Implementing timely patch management will reduce departmental exposure to threats that could exploit known vulnerabilities and provides an indication of the department’s or agency’s undertaking to protect GC assets (e.g. information).

Management of Information Technology Security (MITS) Standard 12.5.2

Evidence Source: 

Departmental or Agency documents describing the process or plan and outlining patch management targets or timeframes

Evidence Limit: 

2 Documents

Practice

  10. For departments or agencies with a documented patch management process or plan (Q9), what percentage of systems or services have been patched within the department’s or agency’s established targets or timeframes for fiscal year 2015-2016?

    Rationale: 

    Measuring the performance of systems or services that have been patched in a timely manner provides an indication of the department’s or agency’s undertaking to protect GC assets (e.g. information).

    Calculation of measure: 

    Number of systems or services that have been patched within established targets/timeframes

    (divided by)

    Number of systems or services requiring patching
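
    Illustrative example (hypothetical figures): if 150 systems or services required patching during the fiscal year and 120 of them were patched within the department’s established targets or timeframes, the measure is 120 / 150 = 80%.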

Management of Information Technology Security (MITS) Standard 12.5.2

Evidence Source: 

Departmental or Agency list of systems or services requiring patches and patch status

Evidence Limit: 

1 document

Performance

Physical security

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  11. What is the percentage (%) of the department’s or agency’s facilities that have an up-to-date security risk assessment?

    Rationale: 

    Modernization of the workplace and evolving threats necessitate up-to-date assessment of security risks to facilities.

    Assessment of the extent to which the department or agency is assessing the security risks of its facilities is important because facilities can also impact or introduce risk to information, people, and other GC assets.

    Calculation of measure: 

    Number of facilities with an up-to-date assessment

    (divided by)

    Total number of facilities
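
    Illustrative example (hypothetical figures): a department with 60 facilities, 45 of which have a security risk assessment that is within the departmental renewal timeframe (or within 3 years where no timeframe is established), would report 45 / 60 = 75%.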

    Definitions: 

    Facility
    A physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)
    Up‑to‑date
    Is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or 3 years in the absence of an established departmental timeframe.
    Security risk assessment
    The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to reduce risks to an acceptable level.

    Note: 

    The scope of the question includes both 1) facilities where there are departmental or agency operations (buildings, floors within a building, space in a foreign embassy, agricultural fields, etc.) and 2) facilities where the department is a custodian and may not have departmental or agency operations.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency documentation providing the total number of facilities

Evidence Limit: 

1 document

Performance

  12. With respect to the department’s or agency’s facilities which contain sensitive information and assets;

    • 12.1

      Has the department or agency identified the facilities which contain the most sensitive information and assets?

      • Yes
      • No – go to Q13
    • 12.2

      What percentage of the facilities identified in Q12.1 have an up-to-date security risk assessment?

    Rationale: 

    Modernization of the workplace and evolving threats necessitate up-to-date assessment of security risks to facilities. Assessment of the security risks of facilities is important because facilities can also impact or introduce risk to information, people, and other GC assets.

    Further assessment of identified facilities which contain the most sensitive information and assets, and the performance of up-to-date security risk assessments on those facilities, provides an indication of the organization’s knowledge of the risks of greatest impact to the organization and GC operations.

    Calculation of measure 12.2: 

    Number of facilities with an up-to-date assessment

    (divided by)

    Total number of facilities identified in Q12.1

    Definitions: 

    Facility
    A physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)
    Up‑to‑date
    Is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or 3 years in the absence of an established departmental timeframe.
    Security risk assessment
    The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to reduce risks to an acceptable level.

    Note: 

    The scope of the question includes both 1) facilities where there are departmental or agency operations (buildings, floors within a building, space in a foreign embassy, agricultural fields, etc.) and 2) facilities where the department is a custodian and may not have departmental or agency operations.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency list of facilities which contain sensitive information and assets.

Evidence Limit: 

1 document

  • Practice
  • Performance
  13. Presently, of the department’s or agency’s facilities with material residual security deficiencies;

    • 13.1

      What percentage (%) has an approved and funded (where required) risk mitigation action plan?

    • 13.2

      For what percentage (%) has any deficiency been outstanding for greater than 12 months?

    Rationale: 

    Measuring the performance of an organization’s ability to ensure all material residual security deficiencies are being managed with an approved and funded risk mitigation plan provides an indication of the department’s or agency’s undertaking to protect individuals and assets.

    Measuring the extent to which material residual security deficiencies remain outstanding after one year provides an indication of the organization’s ability to resolve these deficiencies in a timely manner.

    Calculation of measure 13.1: 

    Number of facilities with deficiencies with a plan/funded

    (divided by)

    Total number of facilities with deficiencies

    Calculation of measure 13.2: 

    Number of facilities with deficiencies outstanding > 12 months

    (divided by)

    Total number of facilities with deficiencies
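
    Illustrative example (hypothetical figures): of 10 facilities with material residual security deficiencies, 7 with an approved and funded (where required) risk mitigation action plan would give measure 13.1 = 7 / 10 = 70%; 2 with any deficiency outstanding for more than 12 months would give measure 13.2 = 2 / 10 = 20%. Both measures share the same denominator.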

    Definitions: 

    Residual security deficiencies
    Are those where base building and/or departmental or agency security requirements (identified through risk assessments) have not been fully addressed.
    Material
    Meaning that the deficiency caused or could reasonably be expected to cause serious injury or harm (i.e. medium or high impact) to the health and safety of an individual, a high value government asset, the delivery of a critical service or the interests of individuals or businesses.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency documentation providing the total number of facilities with material residual deficiencies and the number with any deficiency that has been outstanding for greater than 12 months

Evidence Limit: 

1 document (same document for Q13 and Q13.1)

Performance

Security screening of individuals

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  14. Which of the 6 main components of the new Standard on Security Screening, effective , has the department or agency started to address/implement?

    Check all that apply: 

    Rationale: 

    The information gathered from responses to this question provides an indication of the progress (by requirement) being made by departments and agencies in implementing the main components of the new Standard on Security Screening. This will be informative for organizations to measure progress on a comparative basis, to see overall GC progress, and to potentially highlight areas requiring additional support.

Standard on Security Screening 6.2.2

Evidence Source: 

Department or Agency security policy or procedures

Evidence Limit: 

1 Document

Milestones

Security awareness

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  15. Departmental or agency security awareness program;

    • 15.1

      Does the department or agency have an established security awareness program which includes requirements or activities that are mandatory for all employees?

      • Yes
      • No
      • N/A – the department or agency does not have an established security awareness program
    • 15.2

      Does the departmental or agency security awareness program define targets and timeframes for ensuring individuals have been informed (initially and on an on-going basis) of security issues and of their security responsibilities?

      • Yes
      • No
    • 15.3

      Does the department or agency use performance indicators to measure if individuals understand and are complying with security responsibilities?

      • Yes, please provide
      • No

    Rationale: 

    Departments and agencies are required to establish a security awareness program. The departmental and agency responses will provide a view of the mechanisms organizations use to deliver, monitor and measure their program and outcomes.

    Evidence provided to support responses can be used to highlight and share notable practices.

Directive on Departmental Security Management (Appendix C)

Evidence Source: 

Department or Agency:

  • Copy of core security awareness curriculum (Briefing, Handbook, Deck)
  • Policy or procedural document
  • Schedules, etc.

Evidence Limit: 

3 Documents

Practice

Business continuity planning

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
Table 7 note 1: The template, along with guidance, will be provided to departments and agencies.

Table 7 note 2: Tests can include communication trees (direct call, text or email contact/reply), alternate site access and function capability, the ability to access critical applications remotely, a supplier’s BCP capability, and several others. Discussion exercises can include activities such as a seminar, workshop or tabletop. Operational exercises can include a drill, or functional or full-scale simulation exercises.

  16. Business continuity plans (BCPs) for departmental or agency critical services; the following sub-questions relate to critical services and whether an updated business continuity plan is in place;

    • 16.1

      What is the percentage of the department’s or agency’s critical services without a business continuity plan (BCP) in place?

    • 16.2

       What is the percentage of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between and ?

    • 16.3

      What is the percentage of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between and ?

    • 16.4

      What is the percentage of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between and , or earlier?

    Rationale: 

    Information gathered from responses to this question provides insight into the extent of progress being made to ensure business continuity plans are in place for all critical services, so that business continuity requirements are met in the event of a disruption.

    Note 1: 

    The sum of 16.1 to 16.4 must equal 100%.

    Note 2: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.
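
    Illustrative example (hypothetical figures): a department with 10 critical services might report 1 service without a BCP (16.1 = 10%), 5 services with BCPs last updated in the most recent period (16.2 = 50%), 3 in the middle period (16.3 = 30%) and 1 in the earliest period (16.4 = 10%); 10% + 50% + 30% + 10% = 100%.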

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source: 

Completed template (see table 7 note 1) or equivalent

Evidence Limit: 

1 Document

Performance

  17. Testing and exercising of business continuity plans (BCPs) for departmental or agency critical services;

    • 17.1

      What is the percentage of critical services with a business continuity plan (BCP) which has not been tested or exercised?

    • 17.2

      What is the percentage of critical services with a business continuity plan (BCP) which was last tested or exercised (see table 7 note 2) between and ?

    • 17.3

      What is the percentage of critical services with a business continuity plan (BCP) which was last tested or exercised (see table 7 note 2) between and ?

    • 17.4

      What is the percentage of critical services with a business continuity plan (BCP) which was last tested or exercised (see table 7 note 2) between and ?

    Rationale: 

    Information gathered from responses to this question provides insight into the extent of progress being made to ensure business continuity plans and measures are in place, and ready for all critical services, so that business continuity requirements are met in the event of a disruption.

    Note 1: 

    The sum of 17.1 to 17.4 must equal 100%.

    Note 2: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Same evidence as Q16

Performance

  18. Has the department or agency developed an audit cycle for its Business Continuity Planning Program?

    • Yes
    • No

    Rationale: 

    As the basis of regular reporting to the Treasury Board Secretariat, departments and agencies are required to develop an audit cycle for their Business Continuity Planning (BCP) Program. This is important as it provides information on the state of readiness of the department or agency specific to its most critical services.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.4 d)

Evidence Source: 

Departmental or Agency policy or program document

Evidence Limit: 

1 Document

Practice

Security incidents

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  19. Following a security incident, are the facts, root causes, and actions taken recorded in a consistent manner?

    • Yes
    • No

    Rationale: 

    An objective of the Policy on Government Security is that the management of security incidents is effectively coordinated within departments and government-wide.

    Using standardized practices for recording security incidents indicates that the organization has the foundation necessary to produce analytic information to plan security improvements and to contribute to the GC-wide analysis and strengthen response to security incidents.

    Definitions: 

    Security incident
    Any workplace violence toward an employee or any act, event or omission that could result in the compromise of information, assets or services.

    Note: 

    For a “Yes” response, evidence should demonstrate that tools are in place to ensure consistency within the department or agency concerning the information on security incidents that is recorded, thereby facilitating its analysis for monitoring and reporting purposes.

Policy on Government Security, 5.2

Evidence Source: 

Departmental or Agency

  • Incident/event templates (spreadsheet, e-mail, etc.)
  • Forms

Evidence Limit: 

1 Document

Practice

Government of Canada security policy priority alignment

Outcome statements: Departments and Agencies are aligning to Government of Canada security policy priorities that underpin and support secure delivery and modernization goals for Government of Canada programs and services.

Canada’s cyber security strategy

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  20. Does the department or agency keep a record of all of its cybersecurity incidents?

    • Yes
    • No – go to Q22

    Rationale: 

    Keeping a record of all cybersecurity incidents is recognized as a best practice by governments and industry for highlighting anomalies and patterns of attempted and successful breaches. Information gathered from the responses can be used to highlight and share notable practice.

    Definitions: 

    Cybersecurity incident
    Is any cybersecurity event (or collection of security events) or omission that results in the compromise of an IT system. A compromise is the unauthorized access to, disclosure, modification, use, interruption, removal, or destruction of information or assets, causing a loss of confidentiality, integrity, availability and/or value.

    Note: 

    For a “Yes” response, evidence should demonstrate that templates are in use as outlined in the GC IT Incident Management Plan (Appendix E) and/or the GC Cyber Security Event Management Plan (Cyber Security Incident Reporting Template, 5.2.2), and that the information from these templates is being recorded.

Government of Canada IT Incident Management Plan (GC IT IMP) (Annex E)

Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2.2

Evidence Source: 

Departmental or Agency extract or sample of a log, etc.

Evidence Limit: 

1 Document

Practice

  21. For fiscal year 2015-2016, what percentage of recorded cybersecurity incidents were reported to GC-CIRT?

    Rationale: 

    As per the reporting requirements outlined in section 5.2 of the Government of Canada Cyber Security Event Management Plan (GC CSEMP), departments and agencies will carry out monitoring and detection activities established on department-owned IT infrastructure and notify the GC-CIRT upon detection of a cybersecurity event.

    Calculation of measure: 

    Number of reported cybersecurity incidents

    (divided by)

    Total number of recorded incidents

Government of Canada IT Incident Management Plan (GC IT IMP) (Annex E)

Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2

No evidence required

Performance

Federating identity

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  22. What is the percentage (%) of the department’s or agency’s online external services which require a client to sign in with a username and password (i.e. client authentication) that have a completed Assurance Level assessment?

    Rationale: 

    The Government of Canada is committed to providing online services that enable trusted transactions between clients and the government. Online transactions provide challenges to departments and agencies for establishing or verifying the identity of the requester and also ensuring secure transactions.

    Prior to the delivery of any service to a client, departments and agencies are required to assess the identity and credential risks of the service or transactions. Departments and agencies assess these risks by completing an assurance level assessment.

    The assessment is then used by the organization to select the appropriate processes and technologies to ensure identified risks are being addressed in order to enable the provision of the service online. This standardized process is also the foundation for federating identity.

    Calculation of measure: 

    Number of online external services with assessments

    (divided by)

    Total number of online external services

    Definitions: 

    External services
    Are those delivered to citizens and businesses.

Standard on Identity and Credential Assurance, 6.1.1.

Evidence Source: 

Departmental or Agency list of online external services.

In the list please include information regarding whether an assurance level assessment was completed, if the service requires client authentication, and if the service uses the mandatory External Credential Management service

(GCKey/Credential Broker Service)

Evidence Limit: 

1 Document

Performance

  23. What is the percentage (%) of the department’s or agency’s online external services which require a client to sign in with a username and password (i.e. client authentication) that use the mandatory External Credential Management service (GCKey/Commercial Broker Service)?

    Rationale: 

    The Government of Canada is committed to providing online services that reduce the duplication of login and passwords by encouraging the use of those that the client has already established or by using one common login and password across GC services and transactions. This is provided through the mandatory External Credential Management service (GCKey/Commercial Broker Service).

    This indicator measures the uptake of the mandatory service across the assessed departments and agencies, focusing on online services to citizens and businesses, and provides an indication of the alignment of the department or agency with GC-wide modernization goals.

    Calculation of measure: 

    Number of online external services using the ECMS

    (divided by)

    Total number of online external services

    Definitions: 

    External services
    Are those delivered to citizens and businesses.

Directive on Identity Management, 5.2

Standard on Identity and Credential Assurance, 5.2

Same evidence as Q22.

Performance

GC-wide security awareness

Indicators and Calculation Method (where applicable) | Policy Reference | Evidence Source and Document Limit | Category
  24. Does the department or agency include the foundational Security Awareness Course (course A230) offered by the Canada School of Public Service within its departmental security awareness program?

    • Yes
    • No

    Rationale: 

    As an employer, a goal of the Government of Canada is to ensure that all employees receive a common baseline of security awareness information. This core skill is important in a working environment where there is high mobility of employees.

    In addition, the GC is committed to optimizing investments, such as foundational materials provided by common service providers.

N/A

No evidence required

Practice
