MAF 2016 to 2017 security management methodology
Table of contents
Methodology overview
Introduction
The security management questionnaire is intended to generate management performance information that provides insight into a department’s or agency’s security management practices for the benefit of the organization and to contribute to improving the overall situational awareness and security posture of the Government of Canada (GC).
This information will be important for validating and informing security management decisions and direction, observing trends and changes, identifying areas of strength and areas that need attention, benchmarking, and sharing leading security management practices. In addition, the responses will inform deputy heads of the extent to which departmental management practices are aligned with GC priorities for security, and of the department’s or agency’s progress in supporting secure delivery and modernization goals for GC programs and services.
The three key areas of the methodology focus on:
- Practices to ensure the effectiveness, integration and inclusion of all security activity areas,
- Key security risk mitigation measures that enable the protection of GC assets and continued program and service delivery, and
- Alignment to GC security policy priorities that underpin secure delivery and modernization goals for GC programs and services.
The questionnaire will assess security management practices, performance and milestones, using a variety of means for benchmarking. These means may evolve annually from establishing baseline information to determining targets based on GC averages.
Security management three-year approach to the management accountability framework
The security management approach to the three-year Management Accountability Framework cycle is to use the questions in the methodology to capture information on security performance and transformation as strategic priorities for security, the evolution and implementation of security policy, and GC initiatives continue to advance. Building on the year one and year two indicators, TBS will continue to review government-wide findings from new and ongoing questions used in this methodology to ensure stability when conducting a three-year trend analysis of findings following the 2016-17 MAF.
Security management methodology for 2016-17
To ensure that the questions used in the security management methodology capture pertinent information on security performance and transformation, the Area of Management reviewed the 2015-16 MAF findings. The methodology was then refined to provide deputy heads with clear and valuable information on management practices and performance in this area.
Overall, the 2016-17 security methodology is consistent with last year’s. The following updates were made to improve clarity, reflect current priorities, and align with the direction of the Security Policy reset.
- Removal of questions:
  - Q1 on governance mechanisms for departmental security management and the appointment of a Departmental Security Officer;
  - Q12 on facilities containing sensitive information and assets, and up-to-date security risk assessments;
  - Q19 on standardized security incident practices; and
  - Q21 on reporting of recorded cybersecurity incidents to the GC-CIRT.
- Modification of questions:
  - Q3 on the departmental security plan annual review to new Q2 on progress reporting to the Deputy Head regarding the implementation of the departmental security plan;
  - Q4 on consideration of programs and activities identified in their Program Alignment Architecture to new Q3 on the percentage of activities completed as planned in the departmental security plan;
  - Q9 and Q10 on the patch management process or plan, incorporated into Q8;
  - Q15 on the security awareness program and Q24 on the foundational security awareness course (A230) offered by the Canada School of Public Service, merged into Q14; and
  - Q18 on the business continuity planning program.
- Addition of questions:
  - Q12 on the implementation of mandatory credit checks into the security screening process; and
  - Q20 on departmental event management plan alignment with the GC Cyber Security Event Management Plan (GC CSEMP).
Response timeframes, evidence and validation
The intention of the security management methodology is to assess the current state of performance and practice. As a result, most questions are from a “point-in-time” perspective to provide the most up-to-date representation possible. In other instances, a timeframe for the information requested will be indicated.
In addition to the expected types of evidence, the maximum number of documents that can be provided as evidence for each question is provided in the table. TBS may refer to internal or external evaluations and audits as well as other relevant documents to support the MAF security management assessments and reporting.
Questionnaire
Effective and integrated department or agency security management
Planning
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | An approved Departmental Security Plan is in place and priorities for the current fiscal year are established. | | Evidence Source: DSP approved by the deputy head and, if the approved DSP does not cover the present fiscal year, a DSP progress report or addendum that establishes priorities for the present fiscal year, with evidence that these priorities have been approved by the deputy head. Evidence Limit: 2 documents | |
| | The Deputy Head was provided with information on the progress of implementing the Departmental Security Plan as well as on the effectiveness of the plan. | | Evidence Source: DSP approved by the deputy head and a DSP progress report covering all areas of departmental security priority activities, with evidence that it was submitted to the deputy head during fiscal year 2015-16. Evidence Limit: 2 documents | |
| | All activities identified in the DSP for completion in 2015-16 were completed as planned. | | The evidence source must clearly identify the approved priorities planned for the last fiscal year and indicate which were achieved in that year. Evidence Source: Evidence of priority activities planned for the last fiscal year, as identified in: Evidence of priority activities achieved over the last fiscal year, as identified in: Evidence Limit: 2 documents | |
Protection of Government of Canada assets and continued program/service delivery
Information technology security
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | As a best practice, 100% of the organization’s mission critical applications have been authorized and have a plan in place to address any residual risks. | Directive on Departmental Security Management (Appendix C); Management of Information Technology Security (MITS) Standard, 12.3.3 | Application Portfolio Management (APM) system. TBS will populate the department’s or agency’s response. | |
| | As a best practice, it is expected that organizations are tracking progress on 100% of the mission critical applications that have risk remediation activities. | Directive on Departmental Security Management (Appendix C); Management of Information Technology Security (MITS) Standard, 12.3.3 | Evidence Source: Departmental or Agency summary document identifying the mission critical applications and the status of risk remediation actions (i.e. dashboard reporting). Evidence Limit: 1 document | |
| | It is expected that all organizations have a documented vulnerability assessment process or plan. | Management of Information Technology Security (MITS) Standard, 12.5.1 | Evidence Source: Departmental or Agency document describing the process or plan. Evidence Limit: 1 document | |
| | It is expected that all organizations keep a record of identified vulnerabilities, including whether they have been addressed. | Management of Information Technology Security (MITS) Standard, 12.5.1 | Evidence Source: Departmental or Agency log or other document identifying vulnerabilities and whether they have been addressed. Evidence Limit: 1 document | |
| | Organizations must have a documented patch management process or plan that includes targets and timeframes for patch deployment; organizations should strive to have 100% of high priority patches deployed and installed on their systems or services within established targets or timeframes for fiscal year 2016-17. | Management of Information Technology Security (MITS) Standard, 12.5.2 | Evidence Source: Departmental or Agency log or other document identifying patches and whether they have been addressed within established timeframes for patch deployment. Evidence Limit: 1 document | |
Physical security
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | It is expected that all facilities have an up-to-date security assessment. | | Evidence Source: Departmental or Agency tracking tool including: Evidence Limit: 1 document | |
| | It is expected that 100% of the department’s or agency’s facilities with material security deficiencies have an approved and funded risk mitigation action plan. | | Evidence Source: Departmental or Agency documentation providing the total number of facilities with residual material deficiencies and those with any deficiency that has been outstanding for greater than 12 months. Evidence Limit: 1 document | |
| | It is expected that a low percentage of the department’s or agency’s facilities have deficiencies that have been outstanding for greater than 12 months. | | Evidence Source: Departmental or Agency documentation providing the total number of facilities with residual material deficiencies and those with any deficiency that has been outstanding for greater than 12 months. Evidence Limit: 1 document | |
Security screening of individuals
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | It is expected that organizations are progressing towards implementing the 6 main components of the Standard on Security Screening. | | Evidence Source: Department or Agency security policy or procedures. Evidence Limit: 3 documents | Management Milestones |
| | It is expected that credit checks are conducted for all new hires, renewals and upgrades as part of the organization’s security screening process. | | Evidence Source: Evidence Limit: 3 documents | |
Security awareness
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | New hires are provided with comprehensive information on their security responsibilities. | | Evidence Source: Departmental or Agency security briefing policy or procedural documents. Evidence Limit: 2 documents | Descriptive Statistic |
| | | | No evidence required | |
| | Baseline indicators | | Evidence Source: Department or Agency: Evidence Limit: 3 documents | |
Business continuity planning
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | It is expected that 100% of the organization’s critical services have a business continuity plan in place. | Directive on Departmental Security Management (Appendix C); Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4 | Evidence Source: Business continuity planning fields from the Application Portfolio Management (APM) system. TBS will populate the department’s or agency’s response. | |
| | These are additional performance indicators allowing for an understanding of the cycle of review and revision of the plans. | Directive on Departmental Security Management (Appendix C); Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4 | Evidence Source: Business continuity planning fields from the Application Portfolio Management (APM) system. TBS will populate the department’s or agency’s response. | Descriptive Statistic |
| | It is expected that, for all critical services of the organization with a business continuity plan in place, 100% have been tested or exercised. | Directive on Departmental Security Management (Appendix C); Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4 | Evidence Source: Business continuity planning fields from the Application Portfolio Management (APM) system. TBS will populate the department’s or agency’s response. | |
| | These are additional performance indicators allowing TBS to capture and understand how departments and agencies interpret the timeframe for “regular testing and validation of all plans”. | Directive on Departmental Security Management (Appendix C); Operational Security Standard - Business Continuity Planning Program, 3.3, 3.4 | Evidence Source: Business continuity planning fields from the Application Portfolio Management (APM) system. TBS will populate the department’s or agency’s response. | Descriptive Statistic |
| | All organizations are expected to have developed a cyclical approach for their Business Continuity Planning (BCP) Program. | Directive on Departmental Security Management (Appendix C); Operational Security Standard - Business Continuity Planning Program, 3.4 d) | Evidence Source: Departmental or Agency policy or program document. Evidence Limit: 1 document | |

Table 6 Notes
Government of Canada security policy priority alignment
Canada’s cyber security strategy
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | All organizations are expected to keep a record of all their cybersecurity incidents. This reporting requirement has been in place since May 2010 (GC IT IMP). | Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2.2 | Evidence Source: Departmental or Agency extract or sample of a log, etc. Evidence Limit: 1 document | |
| | It is expected that all organizations have a documented event management plan that is in alignment with the GC Cyber Security Event Management Plan. | Management of Information Technology Security (MITS) Standard; Government of Canada Cyber Security Event Management Plan (GC CSEMP), 5.2.2 | Evidence Source: Departmental or Agency event management processes document. Evidence Limit: 1 document | |
Federating identity
| Indicators and Calculation Method (where applicable) | Expected Result | Policy Reference | Evidence Source and Document Limit | Category |
|---|---|---|---|---|
| | It is expected that an assurance level assessment has been completed for all online external services that require a client to sign in with a username and password. | | Evidence Source: Departmental or Agency list of all online external services that require client authentication, providing the date when an assurance level assessment was completed and indicating whether the service uses the mandatory External Credential Management service (GCKey/Credential Broker Service). Evidence Limit: 1 document | |
| | It is expected that all online external services that require a client to sign in with a username and password are using the mandatory External Credential Management service. | | Same evidence as Q21. | |
Glossary
- Critical service
- Is a service whose compromise in terms of availability or integrity would result in a high degree of injury to the health, safety, security or economic well-being of Canadians, or to the effective functioning of the government.
- Cybersecurity incident
- Is any cybersecurity event (or collection of security events) or omission that results in the compromise of an IT system. A compromise is the unauthorized access to, disclosure, modification, use, interruption, removal, or destruction of information or assets, causing a loss of confidentiality, integrity, availability and/or value.
- External services
- Are those delivered to citizens and businesses.
- Facility
- A physical setting used to serve a specific purpose. A facility may be part of a building, a whole building, or a building plus its site; or it may be a construction that is not a building. The term encompasses both the physical object and its use (for example, weapons ranges, agriculture fields). (Operational Security Standard on Physical Security, Appendix A)
- Material
- Means caused, or could reasonably be expected to cause, serious injury or harm (i.e. medium or high impact) to the health and safety of an individual, a high value government asset, the delivery of a critical service, or the interests of individuals or businesses.
- Mission critical application
- Is a business application that is, or supports, a critical service.
- Residual security deficiencies
- Are those where base building and/or departmental or agency security requirements (identified through risk assessments) have not been fully addressed.
- Security assessment
- The process of identifying and qualifying security-related threats, vulnerabilities and risks, to support the definition of security requirements and the identification of security controls to mitigate risks to an acceptable level.
- Up-to-date
- Is considered to be within the departmental timeframe, as established in policy or procedure, for renewing security risk assessments, or 3 years in the absence of an established departmental timeframe.