MAF 2016 to 2017 security management methodology

Methodology overview

Introduction

The security management questionnaire is intended to generate management performance information that provides insight into a department’s or agency’s security management practices for the benefit of the organization and to contribute to improving the overall situational awareness and security posture of the Government of Canada (GC).

This information will be important for validating and informing security management decisions and direction, observing trends and changes, identifying areas of strength and areas that need attention, benchmarking, and sharing leading security management practices. In addition, the responses will inform deputy heads of the extent to which departmental management practices are aligned with GC priorities for security and of the department’s or agency’s progress in supporting secure delivery and modernization goals for GC programs and services.

The methodology focuses on three key areas: 

  • Practices to ensure the effectiveness, integration and inclusion of all security activity areas,
  • Key security risk mitigation measures that enable the protection of GC assets and continued program and service delivery, and
  • Alignment to GC security policy priorities that underpin secure delivery and modernization goals for GC programs and services.

The questionnaire will assess security management practices, performance and milestones, using a variety of means for benchmarking. These means may evolve annually from establishing baseline information to determining targets based on GC averages.

Security management three-year approach to the management accountability framework

The security management approach to the three-year cycle for the Management Accountability Framework is to use the questions within the methodology to capture information on security performance and transformation as strategic priorities for security, security policy evolution and implementation, and GC initiatives continue to advance. Building on year one and year two indicators, TBS will continue to review government-wide findings from new and ongoing questions used in this methodology to ensure stability when conducting a three-year trend analysis of findings following the 2016-17 MAF.

Security management methodology for 2016-17

To ensure that the questions used in the security management methodology capture pertinent information on security performance and transformation, the Area of Management reviewed the 2015-16 MAF findings. The methodology was then refined to provide deputy heads with clear and valuable information on management practices and performance in this area.

Overall, the 2016-17 security methodology is consistent with last year’s. The following updates were made to improve clarity, reflect current priorities, and align with the direction of the Security Policy reset.

  • Removal of questions: 

    • Q1 on governance mechanisms for departmental security management and the appointment of a Departmental Security Officer;
    • Q12 on facilities containing sensitive information and assets, and up-to-date security risk assessment;
    • Q19 on security incident standardized practices; and
    • Q21 on reporting of recorded cybersecurity incidents to the GC-CIRT.
  • Modification of questions: 

    • Q3 on departmental security plan annual review to new Q2 on progress reporting to the Deputy Head regarding the implementation of the departmental security plan;
    • Q4 on consideration of program and activities identified in their Program Alignment Architecture to new Q3 on percentage of activities completed as planned in the departmental security plan; 
    • Q9 and 10 on patch management process or plan will be incorporated into Q8;
    • Q15 on security awareness program and Q24 on the foundational security awareness course (A230) offered by the Canada School of Public Service merged into Q14; and
    • Q18 on business continuity planning program.
  • Addition of questions: 

    • Q12 on the implementation of mandatory credit checks into the security screening process; and
    • Q20 on departmental event management plan alignment with the GC Cyber Security Event Management Plan (GC CSEMP).

Response timeframes, evidence and validation

The intention of the security management methodology is to assess the current state of performance and practice. As a result, most questions are asked from a “point-in-time” perspective to provide the most up-to-date representation possible. In other instances, a timeframe for the information requested will be indicated.

In addition to the expected types of evidence, the maximum number of documents that can be provided as evidence for each question is provided in the table. TBS may refer to internal or external evaluations and audits as well as other relevant documents to support the MAF security management assessments and reporting.

Questionnaire

Effective and integrated department or agency security management

Planning

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  1. Does the department or agency have a Departmental Security Plan that covers the present fiscal year (2016-17) which is approved by the Deputy Head?

    • Yes
    • No

    Rationale: 

    A Departmental Security Plan provides an integrated view of security risks and requirements, and outlines the strategies, priorities and timelines for strengthening and monitoring departmental security practices and controls. An approved and current plan is a tool which communicates the priorities of the organization and the means for addressing risks, against which achievement of objectives are measured and reported.

An approved Departmental Security Plan is in place and priorities for the current fiscal year are established.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source: 

DSP approved by deputy head and, if the approved DSP does not cover the present fiscal year, a DSP progress report or addendum that establishes priorities for the present fiscal year, with evidence that these priorities have been approved by the deputy head.

Evidence Limit: 

2 documents

  • Management Practice
    • Policy Compliance
  2. During Fiscal Year 2015-16, did the department or agency report to the Deputy Head on progress in implementing the Departmental Security Plan and on the effectiveness of the plan?

    • Yes
    • No

    Rationale: 

    Annual reporting of DSP implementation progress and effectiveness supports the organization and Deputy Head to respond to implementation challenges as well as to changes in risks or requirements, by providing an opportunity to re-evaluate and re-prioritize activities in order to maintain secure and continued programs and service delivery.

The Deputy Head was provided with information on the progress of implementing the Departmental Security Plan as well as the effectiveness of the plan.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1

Evidence Source: 

DSP approved by deputy head and DSP progress report covering all areas of departmental security priority activities with evidence that it has been submitted to the deputy head during the fiscal year 2015-16.

Evidence Limit: 

2 documents

  • Management Practice
    • Policy Compliance
  3. What percentage (%) of activities identified in the Departmental Security Plan, planned for completion in the previous fiscal year (2015-16), was completed as planned?

    Rationale: 

    The extent to which activities identified in the Departmental Security Plan were completed provides an indication of the effectiveness of the plan in driving desired security outcomes.

    Calculation of measure (an illustrative sketch follows this question): 

    # of planned activities completed in FY (2015-16)

    divided by

    total # of activities planned for completion in FY (2015-16)

All activities identified in the DSP for completion in 2015-16, completed as planned.

Policy on Government Security 6.1.4

Directive on Departmental Security Management, 6.1.1.1

The evidence source must clearly identify the approved priorities planned for the last fiscal year and indicate which were achieved in that year.

Evidence Source: 

Evidence of priority activities planned for the last fiscal year, as identified in: 

  • the implementation strategy section of the approved Departmental Security Plan, or
  • a DSP progress report or addendum establishing priorities for the last fiscal year with evidence that it had been approved by the deputy head.

Evidence of priority activities achieved over the last fiscal year, as identified in: 

  • an annex to the approved Departmental Security Plan outlining progress achieved since the last time the DSP was approved, or
  • a DSP progress report or addendum with evidence that it had been submitted to, and/or approved by the deputy head.

Evidence Limit: 

2 documents

  • Management Performance
    • Performance Indicator
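
As an illustration of this measure, the following is a minimal sketch in Python; the activity list and completion flags are hypothetical placeholders, not a prescribed format.

```python
# Hypothetical illustration of the Q3 measure: percentage of DSP activities
# planned for completion in FY 2015-16 that were completed as planned.
planned_activities = [
    {"name": "Update threat and risk assessments", "completed": True},
    {"name": "Deliver security awareness briefings", "completed": True},
    {"name": "Implement visitor access controls", "completed": False},
]

completed = sum(1 for a in planned_activities if a["completed"])
percentage = 100 * completed / len(planned_activities)
print(f"{percentage:.1f}% of planned activities completed")  # 66.7%
```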

Protection of Government of Canada assets and continued program/service delivery

Information technology security

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  4. With respect to the department’s or agency’s mission critical applications: 

    • 4.1

      What is the percentage (%) of mission critical applications currently in operation that have been authorized for operation by the Program and Service Delivery Managers?

    • 4.2

      What is the percentage (%) of mission critical applications currently in operation that have been authorized with conditions (formerly known as Interim Authority to Operate)?

    • 4.3

      What is the percentage (%) of mission critical applications currently in operation, authorized with conditions, that have a remediation plan to fulfill the authorization conditions?

    Rationale: 

    Focusing on authorization of applications deemed to be “mission critical” to business operations is important as this category of applications, if compromised or unavailable, has the potential to cause a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of government.

    Calculation of measure 4.1 (an illustrative sketch covering 4.1 to 4.3 follows this question): 

    Number of mission critical applications currently in operation that are authorized

    divided by

    Number of mission critical applications currently in operation

    Calculation of measure 4.2: 

    Number of mission critical applications currently in operation that are authorized with conditions

    divided by

    Number of mission critical applications currently in operation

    Calculation of measure 4.3: 

    Number of mission critical applications currently in operation that are authorized with conditions that have a remediation plan

    divided by

    Number of mission critical applications currently in operation that are authorized with conditions

    Definitions: 

    Mission critical application
    Is a business application that is, or supports, a critical service.
    Critical service
    Is a service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.

As a best practice, 100% of the organization’s mission critical applications have been authorized and have a plan in place to address any residual risks.

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Application Portfolio Management (APM) system.

TBS will populate the department’s or agency’s response.

  • Management Performance
    • Performance Indicator
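
The three measures can be illustrated with a minimal Python sketch; the status labels, and the assumption that “authorized” and “authorized with conditions” are disjoint categories, are hypothetical rather than prescribed by the Directive.

```python
# Hypothetical illustration of measures 4.1-4.3 over mission critical
# applications currently in operation.
apps = [
    {"name": "Payroll", "status": "authorized", "remediation_plan": None},
    {"name": "Case management", "status": "authorized_with_conditions",
     "remediation_plan": True},
    {"name": "Licensing portal", "status": "authorized_with_conditions",
     "remediation_plan": False},
    {"name": "Grants system", "status": "not_authorized", "remediation_plan": None},
]

total = len(apps)
authorized = sum(a["status"] == "authorized" for a in apps)
with_conditions = [a for a in apps if a["status"] == "authorized_with_conditions"]

m_4_1 = 100 * authorized / total
m_4_2 = 100 * len(with_conditions) / total
m_4_3 = (100 * sum(bool(a["remediation_plan"]) for a in with_conditions)
         / len(with_conditions))
print(f"4.1: {m_4_1:.0f}%  4.2: {m_4_2:.0f}%  4.3: {m_4_3:.0f}%")
```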
  5. With respect to the department’s or agency’s mission critical applications with a risk remediation plan (or equivalent), what is the percentage (%) of remediation conditions that have been met?

    Rationale: 

    Demonstrates progress against planned remediation activities (i.e. year over year improvement) for mission critical applications.

    Calculation of measure: 

    Number of remediation activities met

    divided by

    Number of remediation activities

As a best practice, it is expected that organizations track progress on 100% of the mission critical applications that have risk remediation activities.

Directive on Departmental Security Management (Appendix C)

Management of Information Technology Security (MITS) Standard, 12.3.3

Evidence Source: 

Departmental or Agency summary document identifying the mission critical applications and the status of risk remediation actions (i.e. dashboard reporting).

Evidence Limit: 

1 Document

  • Management Performance
    • Performance Indicator
  6. Does the department or agency have a documented IT vulnerability assessment process or plan which, at a minimum, includes the following elements: 

    1. Scheduling and frequency of scans
    2. Targets (e.g., workstations, applications)
    3. Impact assessment (High, Medium, Low vulnerabilities)
    4. Prioritization of detected vulnerabilities for remediation
    • Yes
    • No

    Rationale: 

    Vulnerability assessments are recognized as a best practice by governments and industry for addressing security gaps in IT systems and applications.

    Departments and agencies must conduct and document vulnerability assessments regularly on highly sensitive or internet facing systems and on a discretionary basis on other systems.

It is expected that all organizations have a documented vulnerability assessment process or plan.

Management of Information Technology Security (MITS) Standard, 12.5.1

Evidence Source: 

Departmental or Agency document describing the process or plan.

Evidence Limit: 

1 Document

  • Management Practice
    • Policy Compliance
  7. Does the department or agency keep a record of identified IT vulnerabilities, including the following elements: 

    1. Vulnerable systems
    2. Priority (High, Medium, Low)
    3. Status (Open, Closed or In Progress)
    • Yes
    • No

    Rationale: 

    Vulnerability assessments are recognized as a best practice by governments and industry for addressing security gaps in IT systems and applications.

    Departments and agencies must document vulnerability assessments, subsequent decisions and remedial actions. (An illustrative sketch of such a record follows this question.)

It is expected that all organizations keep a record of identified vulnerabilities including whether they have been addressed.

Management of Information Technology Security (MITS) Standard, 12.5.1

Evidence Source: 

Departmental or Agency log or other document identifying vulnerabilities and whether they have been addressed.

Evidence Limit: 

1 Document

  • Management Practice
    • Policy Compliance
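
As a minimal sketch of the record keeping Q7 describes (the field names and sample entries are assumptions, not prescribed by the MITS Standard):

```python
# Hypothetical vulnerability record containing the elements listed in Q7.
from dataclasses import dataclass

@dataclass
class VulnerabilityRecord:
    vulnerable_system: str
    priority: str  # "High", "Medium" or "Low"
    status: str    # "Open", "Closed" or "In Progress"

log = [
    VulnerabilityRecord("Web server cluster", "High", "Closed"),
    VulnerabilityRecord("HR workstations", "Medium", "In Progress"),
]
open_high = [r for r in log if r.priority == "High" and r.status != "Closed"]
print(f"{len(open_high)} high-priority vulnerabilities not yet closed")
```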
  8. What is the percentage (%) of high priority patches that have been deployed within the department’s or agency’s established targets or timeframes for fiscal year 2016-2017?

    Rationale: 

    Measuring the performance of systems or services that have been patched in a timely manner provides an indication of the department’s or agency’s due diligence to protect GC assets (e.g. information).

    Calculation of measure (an illustrative sketch follows this question): 

    Number of high priority patches installed within a department’s or agency’s established targets or time frames

    divided by

    Total number of high priority patches identified for deployment

Organizations must have a documented patch management process or plan that includes targets and timeframes for patch deployment. These organizations should strive to have 100% of high priority patches deployed and installed on their systems or services within established targets or timeframes for fiscal year 2016-17.

Management of Information Technology Security (MITS) Standard, 12.5.2

Evidence Source: 

Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment.

Evidence Limit: 

1 Document

  • Management Performance
    • Performance Indicator
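
A minimal sketch of this measure, assuming a hypothetical 30-day departmental target for high priority patches:

```python
# Hypothetical illustration of the Q8 measure: share of high priority patches
# deployed within the organization's established target timeframe.
from datetime import date

TARGET_DAYS = 30  # assumed departmental target; actual targets vary

patches = [
    {"identified": date(2016, 5, 2), "deployed": date(2016, 5, 20)},
    {"identified": date(2016, 6, 1), "deployed": date(2016, 8, 15)},
    {"identified": date(2016, 7, 10), "deployed": date(2016, 7, 25)},
]

on_time = sum((p["deployed"] - p["identified"]).days <= TARGET_DAYS
              for p in patches)
print(f"{100 * on_time / len(patches):.0f}% deployed within target")  # 67%
```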

Physical security

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  9. What is the percentage (%) of the department’s or agency’s facilities that have an up-to-date security assessment?

    Rationale: 

    Modernization of the workplace and an evolving risk and threat environment necessitate up-to-date assessment of security risks to facilities.

    Assessment of the extent to which the department or the agency is assessing the security risks of facilities is important, as these risks can also impact or introduce risks to information, people, and other GC assets.

    Calculation of measure (an illustrative sketch follows this question): 

    Number of facilities with an up-to-date security assessment

    divided by

    Total number of facilities

    Definitions: 

    Facility
    Physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)
    Up-to-date
    Is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or 3 years in the absence of an established departmental timeframe.
    Security assessment
    The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to mitigate risks to an acceptable level.

    Note: 

    The scope of the question includes both 1) facilities where there are departmental or agency operations (buildings, floors within a building, space in a foreign embassy, agricultural fields, etc.); and 2) facilities where the department is a custodian and may not have departmental or agency operations.

It is expected that all facilities have an up-to-date security assessment.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency tracking tool including: 

  • List of facilities
  • Type of assessment
  • Date of last Security Assessment
  • Frequency of Assessment Cycle

Evidence Limit: 

1 Document

  • Management Performance
    • Performance Indicator
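
A minimal sketch of this measure, applying the default 3-year renewal window from the definition of “up-to-date”; the facility list and dates are hypothetical:

```python
# Hypothetical illustration of the Q9 measure: share of facilities with an
# up-to-date security assessment, using the 3-year default window.
from datetime import date

AS_OF = date(2017, 3, 31)
RENEWAL_YEARS = 3  # default in the absence of a departmental timeframe

facilities = [
    {"name": "HQ, floors 4-6", "last_assessed": date(2015, 9, 1)},
    {"name": "Regional lab", "last_assessed": date(2012, 4, 1)},
]

def up_to_date(last_assessed: date) -> bool:
    # Approximate 3-year window; ignores leap days for simplicity.
    return (AS_OF - last_assessed).days <= RENEWAL_YEARS * 365

current = sum(up_to_date(f["last_assessed"]) for f in facilities)
print(f"{100 * current / len(facilities):.0f}% of facilities up to date")  # 50%
```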
  10. Presently, of the department’s or agency’s facilities with material residual security deficiencies:

    • 10.1

      What percentage (%) has an approved and funded (where required) risk mitigation action plan?

    Rationale: 

    Measuring the performance of an organization’s ability to ensure all material residual security deficiencies are being managed with an approved and funded risk mitigation plan provides an indication of the department’s or agency’s due diligence to protect individuals, information and assets.

    Measuring the extent to which material residual security deficiencies remain outstanding after one year provides an indication of the organization’s ability to resolve these deficiencies in a timely manner.

    Calculation of measure: 

    Number of facilities with material residual deficiencies with a funded plan

    divided by

    Total number of facilities with material residual deficiencies

    Definitions: 

    Residual security deficiencies
    Are those where base building and/or departmental or agency security requirements (identified through risk assessments) have not been fully addressed.
    Material
    Meaning caused or could reasonably be expected to cause serious injury or harm (i.e. medium or high impact) to the health and safety of an individual, a high value government asset, the delivery of a critical service or the interest of individuals or businesses.

It is expected that 100% of the department or agency’s facilities with material security deficiencies have an approved and funded risk mitigation action plan.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency documentation providing the total number of facilities with material residual deficiencies and indicating which have any deficiency outstanding for greater than 12 months.

Evidence Limit: 

1 document

  • Management Performance
    • Performance Indicator
  10. Presently, of the department’s or agency’s facilities with material residual security deficiencies:

    • 10.2

      What percentage (%) has had any deficiency outstanding for greater than 12 months?

    Rationale: 

    Measuring the performance of an organization’s ability to ensure all material residual security deficiencies are being managed with an approved and funded risk mitigation plan provides an indication of the department’s or agency’s due diligence to protect individuals, information and assets.

    Measuring the extent to which material residual security deficiencies remain outstanding after one year provides an indication of the organization’s ability to resolve these deficiencies in a timely manner.

    Calculation of measure (an illustrative sketch covering 10.1 and 10.2 follows this question): 

    Number of facilities with deficiencies outstanding greater than 12 months

    divided by

    Total number of facilities with deficiencies

    Definitions: 

    Residual security deficiencies
    Are those where base building and/or departmental or agency security requirements (identified through risk assessments) have not been fully addressed.
    Material
    Meaning caused or could reasonably be expected to cause serious injury or harm (i.e. medium or high impact) to the health and safety of an individual, a high value government asset, the delivery of a critical service or the interest of individuals or businesses.

It is expected that a low percentage of the department’s or agency’s facilities have any deficiency outstanding for greater than 12 months.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard on Physical Security

Evidence Source: 

Departmental or Agency documentation providing the total number of facilities with material residual deficiencies and indicating which have any deficiency outstanding for greater than 12 months.

Evidence Limit: 

1 document

  • Management Performance
    • Performance Indicator
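
Both measures can be illustrated with one minimal sketch; the facility data and field names are hypothetical:

```python
# Hypothetical illustration of measures 10.1 and 10.2 over facilities with
# material residual security deficiencies.
facilities = [
    {"name": "HQ", "funded_plan": True, "months_outstanding": 8},
    {"name": "Warehouse", "funded_plan": False, "months_outstanding": 18},
    {"name": "Lab", "funded_plan": True, "months_outstanding": 14},
]

total = len(facilities)
m_10_1 = 100 * sum(f["funded_plan"] for f in facilities) / total
m_10_2 = 100 * sum(f["months_outstanding"] > 12 for f in facilities) / total
print(f"10.1: {m_10_1:.0f}%  10.2: {m_10_2:.0f}%")  # 10.1: 67%  10.2: 67%
```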

Security screening of individuals

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  11. Which of the 6 main components of the Standard on Security Screening has the department or agency implemented?

    Check all that apply

    • Roles and responsibilities of Delegated Security Officials, Managers, and Individuals, including security screening decisions for which authority should be delegated (SSS section 6)
    • Departmental or agency security screening requirements of positions have been determined in accordance with the security screening model (SSS Appendix B)
    • Collection, use, disclosure, retention and disposition of personal information for security screening (SSS Appendix C)
    • Evaluation, decision making, and review for cause (SSS Appendix D)
    • Review and rights of redress (SSS Appendix E)
    • Aftercare (SSS Appendix F)

    Rationale: 

    The information gathered from response to this question provides an indication of the progress (by requirement) being made by departments and agencies in implementing the main components of the Standard on Security Screening. This will be informative for organizations to measure progress on a comparative basis, to ascertain overall GC progress and to potentially highlight areas requiring additional support, guidance and tools.

It is expected that organizations progress towards implementing the 6 main components of the Standard on Security Screening.

Standard on Security Screening, 6.2

Evidence Source: 

Department or Agency security policy or procedures.

Evidence Limit: 

3 Documents

Management Milestones

  12. Has the requirement to conduct mandatory credit checks as part of the security screening process been fully implemented?

    • Yes
    • No

    Rationale: 

    To achieve policy compliance, departments and agencies must perform credit checks on new hires, renewals and upgrades.

It is expected that credit checks are conducted for all new hires, renewals and upgrades as part of the organization’s security screening process.

Standard on Security Screening 6.2.2

Evidence Source: 

  • Copy of their agreement with one or both credit reporting agencies indicating that formal arrangements have been made to process credit checks; and/or
  • Record of how many credit checks were processed for 2014/2016; and/or
  • Formal documentation indicating that credit checks have been formally implemented across the department, e.g. notice to all employees. 

Evidence Limit: 

3 Documents

  • Management Practice
    • Policy Compliance

Security awareness

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  13. Which of the following security awareness topics are addressed in the department’s or agency’s initial security briefing?

    Check all that apply

    • Workplace security, including building access, the requirement to display security badges, and practices related to monitoring work-related behaviour through access controls
    • Physical and information-technology security practices related to the proper handling, marking, transport, transmittal, storage and destruction of sensitive information or assets
    • Emerging security concerns, including security threats from malware, phishing, or social engineering
    • Minimizing risks inherent in working with sensitive information away from the official workplace (e.g., telework, mobile computing, travel)
    • Obligations to report significant changes in personal circumstances that may warrant a review of the status or clearance granted

    Rationale: 

    To ascertain the type and scope of security awareness and security responsibilities information provided to new hires.

New hires are provided with comprehensive information on their security responsibilities.

Directive on Departmental Security Management (Appendix C)

Evidence Source: 

Departmental or Agency security briefing policy or procedural documents.

Evidence Limit: 

2 Documents

Descriptive Statistic

  14. Inclusion of the foundational Security Awareness Course (CSPS course A230): 

    • 14.1

      Recognizing that the course is not a mandatory requirement, does the department or agency include the foundational Security Awareness Course (course A230) offered by the Canada School of Public Service (CSPS) within its departmental security awareness program?

      • Yes
      • No - Go to Question 16
    • 14.2

      How does the department or agency include CSPS course A230 within its departmental security awareness program?

      Check only one of the following

      • As a discretionary or optional component for all employees
      • As a mandatory component for all employees
      • As a mandatory component for some employees (please provide context)

    Rationale: 

    As an employer, a goal of the Government of Canada is to ensure that all employees receive a common baseline of security awareness information, initially and on an on-going basis. This core skill is important in a working environment where there is high mobility of employees.

    In addition, the GC is committed to optimizing investments, such as foundational materials provided by common service providers.

Directive on Departmental Security Management (Appendix C)

No evidence required

  • Management Practice
    • Practice (14.1)
  • Descriptive Statistic (14.2)
  15. Implementation of the foundational Security Awareness Course (CSPS course A230): 

    • 15.1

      What is the department’s timeframe for new employees to successfully complete the course? (response in # of weeks)

    • 15.2

      For FY 2015-16 - What is the percentage (%) of new employees that met the department’s timeframe to successfully complete the course?

    • 15.3

      Does the department or agency have a plan with timeframes for existing employees (i.e. excluding new employees) to successfully complete the course?

      • Yes – provide plan
      • No

Baseline indicators

Directive on Departmental Security Management (Appendix C)

Evidence Source: 

Department or Agency: 

  • Copy of core security awareness curriculum (Briefing, Handbook, Deck)
  • Policy or procedural document
  • Schedules, etc.

Evidence Limit: 

3 Documents

  • Management Practice
    • Practice (15.3)
  • Management Performance
    • Performance Indicator (15.2)
  • Descriptive Statistic (15.1)

Business continuity planning

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category

Table 6 Notes

Table 6 Note 1

Tests can include communications trees (direct call, text or email contact/reply), alternate site access and functional capability, the ability to access critical applications remotely, a supplier’s BCP capability, and several others. Exercises can also include activities such as seminars, workshops or tabletop exercises. Operational exercises can include drills, or functional or full-scale simulation exercises.

  16. What is the # of critical services identified by the department or agency?

    • 16.1

      What is the percentage (%) of the department’s or agency’s critical services with a business continuity plan (BCP) in place?

    Rationale: 

    Information gathered from response to this question provides insight into the extent of progress being made to ensure business continuity plans are in place for all critical services.

    Note: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.

It is expected that 100% of the organization’s critical services have a business continuity plan in place.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source: 

Business continuity planning fields from the Application Portfolio Management (APM) system.

TBS will populate the department’s or agency’s response.

  • Management Practice
    • Policy Compliance
  16. What is the # of critical services identified by the department or agency?

    • 16.2

      What is the percentage (%) of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated since April 1, 2016?

    • 16.3

      What is the percentage (%) of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between April 1, 2015 and March 31, 2016?

    • 16.4

      What is the percentage (%) of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between April 1, 2014 and March 31, 2015?

    • 16.5

      What is the percentage (%) of the department’s or agency’s critical services for which business continuity plans (BCPs) are in place and last updated between April 1, 2013 and March 31, 2014, or earlier?

    Rationale: 

    Information gathered from response to this question provides insight into the extent of progress being made to ensure business continuity plans are in place for all critical services.

    Note 1: 

    The sum of 16.2 to 16.5 must equal the percentage (%) reported in 16.1 (an illustrative consistency check follows this question).

    Note 2: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.

These are additional performance indicators allowing for an understanding of the cycle of review and revision of the plans.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source: 

Business continuity planning fields from the Application Portfolio Management (APM) system.

TBS will populate the department’s or agency’s response.

Descriptive Statistic
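
The consistency requirement in Note 1 amounts to a simple check; a minimal sketch with hypothetical values:

```python
# Hypothetical check that the update-recency breakdown (16.2-16.5) sums to
# the overall BCP coverage reported in 16.1.
m_16_1 = 80.0                        # % of critical services with a BCP
breakdown = [45.0, 20.0, 10.0, 5.0]  # reported values for 16.2 through 16.5

assert abs(sum(breakdown) - m_16_1) < 0.01, "16.2 to 16.5 must sum to 16.1"
print("Breakdown is consistent with 16.1")
```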

  17. Testing and exercising of business continuity plans (BCPs) for departmental or agency critical services: 

    • 17.1

      What is the percentage (%) of critical services with a business continuity plan (BCP) which has been tested or exercised (see Table 6 Note 1)?

    Rationale: 

    Information gathered from responses to this question provides insight into the extent of progress being made to ensure business continuity plans and measures are in place, and ready for all critical services to ensure business continuity requirements are met in the event of a disruption.

    Note: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.

It is expected that, for all critical services of the organization with a business continuity plan in place, 100% have been tested or exercised.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source: 

Business continuity planning fields from the Application Portfolio Management (APM) system.

TBS will populate the department’s or agency’s response.

  • Management Performance
    • Performance Indicator (17.1)
  17. Testing and exercising of business continuity plans (BCPs) for departmental or agency critical services: 

    • 17.2

      What is the percentage (%) of critical services with a business continuity plan (BCP) which was last tested or exercised (see Table 6 Note 1) since April 1, 2016?

    • 17.3

      What is the percentage (%) of critical services with a business continuity plan (BCP) which was last tested or exercised (see Table 6 Note 1) between April 1, 2015 and March 31, 2016?

    • 17.4

      What is the percentage (%) of critical services with a business continuity plan (BCP) which was last tested or exercised (see Table 6 Note 1) between April 1, 2014 and March 31, 2015?

    • 17.5

      What is the percentage (%) of critical services with a business continuity plan (BCP) which was last tested or exercised (see Table 6 Note 1) between April 1, 2013 and March 31, 2014?

    Rationale: 

    Information gathered from responses to this question provides insight into the extent of progress being made to ensure business continuity plans and measures are in place, and ready for all critical services to ensure business continuity requirements are met in the event of a disruption.

    Note 1: 

    The sum of 17.2 to 17.5 must equal the percentage (%) reported in 17.1.

    Note 2: 

    For this response it is recognized that a critical service may have its own business continuity plan, be covered within a broader business continuity plan, or be supported by multiple business continuity plans.

These are additional performance indicators allowing TBS to capture and understand the interpretation by departments and agencies of the timeframe for “regular testing and validation of all plans”.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4

Evidence Source: 

Business continuity planning fields from the Application Portfolio Management (APM) system.

TBS will populate the department’s or agency’s response.

Descriptive Statistic

  18. Does the department or agency review its business continuity planning program on a cyclical basis?

    • Yes
    • No

    Rationale: 

    Regular reporting to the Treasury Board Secretariat requires that departments and agencies develop a review cycle for their Business Continuity Planning (BCP) Program; it is therefore important that departments and agencies review their business continuity planning program on a cyclical basis.

All organizations are expected to have developed a cyclical approach for their Business Continuity Planning (BCP) Program.

Directive on Departmental Security Management (Appendix C)

Operational Security Standard - Business Continuity Planning Program, 3.4 d)

Evidence Source: 

Departmental or Agency policy or program document.

Evidence Limit: 

1 Document

  • Management Practice
    • Policy Compliance

Government of Canada security policy priority alignment

Canada’s cyber security strategy

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  19. Does the department or agency keep a record of all of its cybersecurity incidents which, at a minimum, includes the following elements: 

    1. Date
    2. Type
    3. Incident priority
    4. Assignment
    5. Status
    6. Date completed
    7. Reported to GC-CIRT
    • Yes
    • No

    Rationale: 

    Keeping a record of all cybersecurity incidents is recognized as a best practice by governments and industry for highlighting anomalies and patterns of attempted and successful breaches. Information gathered from the responses can be used to highlight and share notable practice. (An illustrative record layout follows this question.)

    Definitions: 

    Cybersecurity incident
    Is any cybersecurity event (or collection of security events) or omission that results in the compromise of an IT system. A compromise is the unauthorized access to, disclosure, modification, use, interruption, removal, or destruction of information or assets, causing a loss of confidentiality, integrity, availability and/or value.

    Note: 

    For a “Yes” response, evidence should demonstrate that templates are in use as outlined in the GC Cybersecurity Event Management Plan (Cyber Security Incident Reporting Template, 5.2.2) and the information from these templates is being recorded.

All organizations are expected to keep a record of all their cybersecurity incidents. This reporting requirement has been in place since May 2010 (GC IT IMP).

Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2.2

Evidence Source: 

Departmental or Agency extract or sample of a log, etc.

Evidence Limit: 

1 Document

  • Management Practice
    • Practice
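
A minimal sketch of a record with the elements listed in Q19; the field names and the sample entry are assumptions, not the GC CSEMP template itself:

```python
# Hypothetical cybersecurity incident record with the Q19 elements.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class IncidentRecord:
    date_opened: date
    incident_type: str
    priority: str                   # e.g. "High", "Medium", "Low"
    assignment: str                 # team or person handling the incident
    status: str                     # e.g. "Open", "In Progress", "Closed"
    date_completed: Optional[date]  # None while the incident remains open
    reported_to_gc_cirt: bool

incident = IncidentRecord(date(2016, 11, 3), "Phishing", "High",
                          "IT Security Operations", "Closed",
                          date(2016, 11, 5), True)
```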
  20. Did the department or agency update its event management processes so that they are in alignment with the GC Cyber Security Event Management Plan (GC CSEMP)?

    • Yes
    • No

    Rationale: 

    Departments are responsible for developing, maintaining and testing departmental cyber security event management plans and processes, and for ensuring alignment with GC-wide direction, plans and processes.

It is expected that all organizations have a documented event management plan that is in alignment with the GC Cyber Security Event Management Plan.

Management of Information Technology Security (MITS) Standard

Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2.2

Evidence Source: 

Departmental or Agency event management processes document.

Evidence Limit: 

1 Document

  • Management Practice
    • Practice

Federating identity

Indicators and Calculation Method (where applicable) Expected Result Policy Reference Evidence Source and Document Limit Category
  21. Online external services which require a client to sign-in with a username and password (i.e. client authentication): 

    • 21.1

      What is the number of online external services which require client authentication?

    • 21.2

      What is the percentage (%) of the department’s or agency’s online external services which require client authentication that have a completed Assurance Level Assessment?

    Rationale: 

    The Government of Canada is committed to providing online services that enable trusted transactions between clients and the government. Online transactions provide challenges to departments and agencies for establishing or verifying the identity of the requester and also ensuring secure transactions.

    Prior to the delivery of any service to a client, departments and agencies are required to assess the identity and credential risks of the service or transactions. Departments and agencies assess these risks by completing an assurance level assessment.

    The assessment is then used by the organization to select the appropriate processes and technologies to ensure identified risks are being addressed in order to enable the provision of the service online. This standardized process is also the foundation for federating identity.

    Calculation of measure: 

    Number of online external services which require client authentication that have a completed assessment

    divided by

    Total number of online external services which require client authentication

    Definitions: 

    External services
    Are those delivered to citizens and businesses.

It is expected that an assurance level assessment has been completed for all online external services which require a client to sign-in with a username and password.

Directive on Identity Management, 6.1.2, 6.1.3

Standard on Identity and Credential Assurance, 6.1.1, 6.1.2

Evidence Source: 

Departmental or Agency list of all online external services which require client authentication providing the date for when an assurance level assessment was completed, and indicating if the service uses the mandatory External Credential Management service (GCKey/Credential Broker Service).

Evidence Limit: 

1 Document

  • Management Performance
    • Performance Indicator (21.2)
  • Descriptive Statistic (21.1)
  22. What is the percentage (%) of the department’s or agency’s online external services which require a client to sign-in with a username and password (i.e. client authentication) that use the mandatory External Credential Management service (GCKey / Credential Broker Service)?

    Rationale: 

    The Government of Canada is committed to providing online services that enable secure Single Sign-On (SSO) and reduce the duplication of logins and passwords by encouraging the use of those that the client has already established or by using one common login and password across GC services and transactions. This is provided through the mandatory External Credential Management service (GCKey / Credential Broker Service). This enterprise service also reduces the number of attack surfaces and improves the security posture of the government as a whole.

    This indicator measures the uptake of the mandatory service across the assessed departments and agencies, focusing on online services to citizens and businesses, and provides an indication of the alignment of the department or agency with GC-wide modernization goals.

    Calculation of measure (an illustrative sketch covering 21.2 and this measure follows this question): 

    Number of online external services which use ECMS

    divided by

    Total number of online external services which require client authentication

    Definitions: 

    External services
    Are those delivered to citizens and businesses.

It is expected that all online external services which require a client to sign-in with a username and password are using the mandatory External Credential Management service.

Directive on Identity Management, 6.1.4

Same evidence as Q21.

  • Management Performance
    • Performance Indicator
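
Since Q21 and Q22 draw on the same list of services, one minimal sketch can illustrate both measures; the service names and flags are hypothetical:

```python
# Hypothetical illustration of measures 21.2 and 22 over online external
# services that require client authentication.
services = [
    {"name": "Benefits portal", "assessment_done": True, "uses_ecm": True},
    {"name": "Permit renewal", "assessment_done": True, "uses_ecm": False},
    {"name": "Business registry", "assessment_done": False, "uses_ecm": False},
]

total = len(services)
m_21_2 = 100 * sum(s["assessment_done"] for s in services) / total  # assessments done
m_22 = 100 * sum(s["uses_ecm"] for s in services) / total           # ECM uptake
print(f"21.2: {m_21_2:.0f}%  22: {m_22:.0f}%")  # 21.2: 67%  22: 33%
```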

Glossary

Critical service
Is a service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.
Cybersecurity incident
Is any cybersecurity event (or collection of security events) or omission that results in the compromise of an IT system. A compromise is the unauthorized access to, disclosure, modification, use, interruption, removal, or destruction of information or assets, causing a loss of confidentiality, integrity, availability and/or value.
External services
Are those delivered to citizens and businesses.
Facility
A physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)
Material
Meaning caused or could reasonably be expected to cause serious injury or harm (i.e. medium or high impact) to the health and safety of an individual, a high value government asset, the delivery of a critical service or the interest of individuals or businesses.
Mission critical application
Is a business application that is, or supports, a critical service.
Residual security deficiencies
Are those where base building and/or departmental or agency security requirements (identified through risk assessments) have not been fully addressed.
Security assessment
The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to mitigate risks to an acceptable level.
Up-to-date
Is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or 3 years in the absence of an established departmental timeframe.
