Executive (EX) Group Job Evaluation Standard 2022

Introduction

The Executive Group Job Evaluation Standard (formerly known as the Executive Group Position Evaluation Plan) has been used to evaluate Executive Group positions in the Federal Public Service since 1980. The plan is based on the Korn Ferry Hay Guide Chart – Profile Methodology and is more commonly known as the Hay Plan. This methodology is used by over 12,000 organizations in over 90 countries in both public and private sector jurisdictions across a variety of industries and sectors. 

This latest version of the Executive Group Job Evaluation Standard came into effect on October 3, 2022, and replaces the September 2005 version. Changes made in this latest version include:

  • Revised guide charts and updated language to facilitate the use of the standard and reduce potential bias.
  • Revised benchmark positions to reflect the full range of modern Executive Group jobs.
  • Revised Accountability Magnitude Index (AMI) from 8.0 to 9.0.

This guide is designed to:

  • Provide guidelines that will foster consistency in the evaluation of Executive Group jobs while retaining the flexibility required to properly reflect the diverse nature of these jobs.
  • Provide an overview of the basic concepts and principles underlying the job evaluation process.
  • Serve as an adjunct to the materials and experience received during basic job evaluation training or refresher courses.

Executive (EX) Group Definition

This guide is intended for classification advisors and those involved in the evaluation of jobs allocated to the Executive (EX) Group.

Group Definition

The Executive Group comprises jobs located no more than three hierarchical levels below the Deputy or Associate Deputy level and that have significant executive managerial or executive policy roles and responsibilities or other significant influence on the direction of a department or agency. Jobs in the Executive Group are responsible and accountable for exercising executive managerial authority or providing recommendations and advice on the exercise of that authority.

Inclusions

Notwithstanding the generality of the foregoing, it includes jobs that have, as their primary purpose, responsibility for one or more of the following activities:

  • Managing programs authorized by an Act of Parliament, or an Order-in-Council, or major or significant functions or elements of such programs.
  • Managing substantial scientific or professional activities.
  • Providing recommendations on the development of significant policies, programs or scientific, professional, or technical activities.
  • Exercising a primary influence over the development of policies or programs for the use of human, financial or material resources in one or more major organizational units or program activities in the Public Service.
Exclusions

Jobs excluded from the Executive Group are those whose primary purpose is included in the definition of any other group.

Introduction to Job Evaluation using the Korn Ferry Hay Guide Chart Profile Method

Job Evaluation Fundamentals

Purpose

Job evaluation provides a foundation for an organization to:

  • Establish the appropriate rank order of jobs
  • Establish the relative distance between jobs within the ranking
  • Provide a systematic measurement of job size relative to other positions, to make salary comparisons possible
  • Provide a reliable basis for connecting to external market data
  • Provide a source of information on the work being done in a unit prior to making restructuring decisions

Fundamental Premises

The evaluation of Executive (EX) Group jobs is based on the Korn Ferry Hay Guide Chart Profile Method. The logic behind the Method is:

  • Every organization exists to produce identifiable end results
  • An organization is created when more than one individual is required to accomplish the tasks to produce those end results
  • Every viable job in an organization has been designed to make some contribution toward reaching those end results
  • That contribution can be systematically measured using a common set of factors across all executive jobs

The Ranking / Validation Process

The Method is a process to determine the relative value of EX Group positions. In other words, value determinations are based on the degree to which any position, competently performed, contributes, within a specific organizational context, to what its unit has been created to accomplish, relative to other EX positions.

The notion of competent performance in job evaluation: Job evaluation measures the contribution made by a position, not the contribution an incumbent may or may not make in the position. Since jobs are designed on the assumption that they can and will be competently performed, the evaluator assumes that competent performance exists and makes no judgements about performance.

The contribution the position makes to the organization is determined by measuring job content, as set out in the job description, using three factors:

  • Know-How
  • Problem Solving
  • Accountability

The three factors are inter-dependent. We are concerned with the Know-How that is required by the role to Solve the Problems which must be overcome to achieve the results for which the role is Accountable.

The Method uses these factors and their sub-factors in combination to determine the value of positions using numerical points. The Korn Ferry Hay Guide Charts are the tools used to determine the degree to which the factors (or dimensions), are found in one job relative to the degree to which they are found in another, and the numerical values (points) to be assigned. See Appendix A – Guide Charts.

It is important to remember that there are no absolutes. It is simply a matter of determining how much more or less of each factor any job has relative to others around it. As a result, three key steps in the evaluation process are:

  1. Looking at jobs within their organizational context, not in isolation.
  2. Looking at the descriptions and definitions of each sub-factor and applying evaluation rules on a consistent basis to establish the appropriate rating for each sub-factor.
  3. Validating the evaluations for each of the factors, through precise Benchmark comparisons.

Overview of the Evaluation Process

1. Understanding the Job

An accurate job description is an essential component of the job evaluation process. It provides much of the necessary information from which to construct an evaluation of the job. To do that, it must provide a clear and succinct description of:

  • The job's purpose and the end results for which it is accountable (found in the General and Specific Accountability statements)
  • Where the job sits in organizational terms (found in the Organization Structure statement and the organization charts)
  • The dimensions of the job (found in the Dimensions statement)
  • The nature of the job/role (found in the Nature and Scope of duties statements)

Three key concepts which govern the use of job descriptions in arriving at a valid evaluation are:

The need for up-to-date job descriptions: The job description should be up to date so that the job can be evaluated as it is, not as it was and not as it might be or could be. It should describe what is required of the job. Jobs change, and it is important to have accurate, complete, and current information.

Avoiding title comparisons: The title of a position can provide a strong clue about where to look for appropriate Benchmark comparators. However, by themselves, title comparisons can distort valid evaluations, because what the job holder does and what occurs in another job with a similar title may not be the same at all. For this reason, titles are never adequate for making proper evaluations.

Examining the meaning behind the words: There are no “magic words”. Job descriptions may contain words/phrases that are designed to impress evaluators (e.g., strategic, complex, transformational); evaluators must look beyond the words used to understand the reality of the job.

2. Understanding the Job Context: Using the Organization Charts

It is important to avoid viewing the job as though it exists in isolation. Organization Charts show two things:

  • Where the position fits within the unit structure (its hierarchical level). This is very important information for identifying potential Benchmark comparators.
  • The impact and influence of other jobs on the position. Organizational interrelationships, particularly where one job provides functional guidance to another, have a strong influence on job size. Organizational interrelationships can also indicate potential overlaps or duplications, which the job descriptions, taken in isolation, could mask. The evaluator should always consider which other positions are also involved in the work and their contribution to the work.

A key concept for weighing the influence of organizational relationships is:

The need to recognize both lateral and vertical relationships: Both vertical and lateral relationships affect job size. It is a common mistake to overlook the lateral relationships between peer positions and overemphasize the vertical ones between superior and subordinate. It is important to look at both equally critically.

3. Evaluating the Job: Using the Three Factors

The three evaluation factors provide a common yardstick which makes it possible for actual job comparisons to be made. The three factors represented on the charts are:

  • Know-How, which encompasses three scaled sub-factors:
    • Depth and scope of practical/technical/specialized Know-How
    • Planning, organizing, and integrating knowledge
    • Communicating and influencing skills
  • Problem Solving, which encompasses two scaled sub-factors:
    • Thinking environment
    • Thinking challenge
  • Accountability, which encompasses three scaled sub-factors:
    • Freedom to act
    • Area of Impact (Magnitude)
    • Nature of Impact

Three key concepts which underlie the Know-How, Problem Solving and Accountability factors are:

Comparing jobs according to universal factors: Work is a process through which skills, knowledge and abilities are applied to challenges, issues, or problems in order to achieve outcomes or deliverables for which the job is accountable. It is possible to evaluate diverse jobs using the three factors of the Method and Working Conditions, because they incorporate the fundamental characteristics that researchers have found are common to the nature of work and are therefore present to some degree in every job. These factors form a common measure that can be appropriately applied to any job in order to evaluate the work done in the job.

The need to focus on job content: The purpose of job evaluation is to establish, as objectively as possible, each job's relationship to others in terms of content and requirements. This is particularly difficult if the current classification level, rating, or historical relationship is referred to during evaluation. The evaluator must take pains to ignore the related assumptions that may go with knowing the suggested organizational level of the job, the incumbent, or the (likely) salary connected with the position.

The interrelationships of the factors: The Korn Ferry Hay Guide Chart Methodology differs from most job evaluation tools used by the Federal Government in that the factors are not evaluated or assessed independently. The Know-How, Problem Solving and Accountability factors and their evaluations are linked. This is described further in the application guidelines for each factor and should be kept in mind throughout the job evaluation process. There are logical combinations of factor ratings that reflect sound job and organization design.

4. Using the Numbering Pattern of the Guide Charts

The numbering system on the Guide Charts is geometric, with point values increasing in steps of approximately 15%. For example: 100, 115, 132, 152, 175, 200 and so on, with the value doubling every five steps. This numbering progression runs through all Guide Charts.

Step differences: the building blocks of job evaluation

In developing the Guide Charts, a geometric progression was adopted because, when comparing objects, we generally perceive relative (rather than absolute) differences. Fifteen percent (rather than 10, 12 or 20%) was chosen because, in tests, it gave the best inter-rater reliability and reflects the human capability to perceive a noticeable difference in relative size; each 15% Step difference represents the smallest perceptible difference on which consensus can be established.

Conveniently, it also gives a pattern of numbers that is easy to remember. Starting at 100 and continuing in (rounded) 15% Steps, the pattern of numbers doubles every five Steps.

The notion of Step differences is critical because it provides a framework for consistent, quantified judgements to be made based on the minimum perceptible difference (just noticeable difference) that well-informed and experienced evaluators can discern between jobs or elements of jobs.
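The Step pattern described above can be verified with a short script. This is purely an illustrative sketch (the Standard itself contains no code); the values are the rounded 15% progression starting at 100.

```python
# The canonical rounded 15% Step pattern described above, starting at 100.
STEPS = [100, 115, 132, 152, 175, 200, 230, 264, 304, 350, 400]

# Each Step is approximately 15% above the previous one...
for prev, curr in zip(STEPS, STEPS[1:]):
    ratio = curr / prev
    assert 1.14 < ratio < 1.16, f"{prev} -> {curr} is not a ~15% Step"

# ...and, because of the rounding chosen, the value doubles exactly
# every five Steps (100 -> 200, 115 -> 230, and so on).
for i in range(len(STEPS) - 5):
    assert STEPS[i + 5] == 2 * STEPS[i]
```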

The Guide Charts used for EX Group positions are a subset of the Korn Ferry Hay Master Charts. As such, they have been sized to include only the relevant portions for evaluating EX Group positions, plus a suitable floor and ceiling to provide the outer parameters for the evaluation context.

The structure of the Guide Charts also allows for an evaluation to be nuanced or “shaded.” Modifiers (pull-up or pull-down) illustrated as + or -, can be added to several of the factors (as indicated in the respective factor sections below) to support the selection of a higher or lower point value. Modifiers allow the evaluator to recognize that some jobs that might have the same general factor rating are in fact, closer to the top or bottom of the value range provided for that rating. 

5. Validate the Evaluation Logic

There are several ways to ensure the quality of EX Group position evaluations. The first of these quality assurance measures involves double checking that the combination and value assigned to each factor makes evaluation sense. Factor-specific validation checks are included in the explanation of each factor in this manual.

6. Ranking the Position

Once points have been assigned to all three factors, it is a straightforward matter to make a preliminary ranking of the position based on the sum of those points. The position's classification is indicated by where that total falls between the minimum and maximum points established for each level in the EX Group.
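As a sketch, the ranking step is a simple sum-and-lookup. The point bands below are hypothetical placeholders chosen only to illustrate the mechanics; the actual minimum and maximum points for each EX level are defined by the Standard, not by this example.

```python
# Preliminary ranking: sum the three factor scores, then find the EX level
# whose point band contains the total. The bands below are HYPOTHETICAL
# placeholders, not the Standard's actual values.
HYPOTHETICAL_BANDS = {
    "EX-01": (500, 690),
    "EX-02": (691, 920),
    "EX-03": (921, 1220),
}

def preliminary_rank(know_how, problem_solving, accountability):
    """Return (level, total) for the summed factor points."""
    total = know_how + problem_solving + accountability
    for level, (low, high) in HYPOTHETICAL_BANDS.items():
        if low <= total <= high:
            return level, total
    return None, total  # total falls outside the illustrated bands
```

For example, a position rated 400 / 200 / 264 totals 864 points and, under these placeholder bands, would rank at the middle level.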

7. Reconciling the Evaluation with the Benchmarks

The most important test of the validity of the evaluation is finding comparable reference evaluations in the standardized continuum of the Benchmarks.

The method for validating against the continuum is to “prove” the evaluation by finding several comparable reference positions from the standardized Benchmark positions. Generally, the benchmark validation step is done after the position has been evaluated against all three factors.

8. Reconciling the Evaluation with Others Around It (Classification Relativity)

The other aspect of quality assurance involves ensuring that the evaluation makes sense within the continuum of EX Group evaluations. This means checking the evaluation against the evaluations of other positions around it in the unit (based on the organization charts), and more broadly checking the evaluation against similar roles in other organizations (relativity).

9. Job Profiles

A key step in validating the evaluation logic is the use of short profiles: a check on the relative proportions of Know-How, Problem Solving and Accountability in a position. Positions that exist to deliver operations and services will score relatively high in Accountability, while those focused on long-term planning and policy development will weight more heavily toward Problem Solving. The combination of the points should reflect the nature of the work.

10. Documenting the Job Evaluation

The final evaluation should be supported by a written rationale.

Measuring Know-How

Know-How is the sum of every kind of knowledge and skill, however acquired, that is required for fully competent job performance. It can be thought of as “how much skill and/or knowledge about how many things and how complex each is”.

Know-How has three sub-factors:

  • Practical, Technical, Specialized Know-How measures the depth and scope of the subject matter knowledge and recognizes increasing specialization (depth) and/or the requirement for greater breadth (scope) of knowledge.
  • Planning, Organizing and Integrating Knowledge measures the knowledge required for integrating and managing programs/functions. It involves combining the elements of planning, organizing, coordinating, directing, executing, and controlling over time and is related to the size of the organization, the functional and geographic diversity, the diversity and complexity of stakeholder relationships, and the time horizon.
  • Communicating and Influencing measures the criticality of interpersonal relationships with individuals and/or groups internal and/or external to the organization in achieving objectives.

Practical, Technical, Specialized Know-How

There are three important concepts to grasp in order to apply this sub-factor scale correctly:

  • Equivalency of depth and breadth
  • The Know-How required to manage specialist positions
  • Equivalency of work experience and formal education
Equivalency of depth and breadth in Practical, Technical, Specialized Know-How

It is important to appreciate the combination of both the breadth and the depth of the varieties of Know-How required by a role. Generalists may need a broader range of knowledge than their specialist colleagues, who may well require more depth of knowledge in their area of specialization.

It is important to recognize that the demands for Practical Know-How in operational/service positions such as line management and business partner roles (e.g., human resources or communications), can be as great as the Technical/Specialized knowledge requirements of professional jobs such as engineering, science, law, or information technology.

The Know-How required to manage specialist positions

Executives do not necessarily need the same depth of subject-specific Technical or Specialized Know-How as those working below them. This is because managers are not required to do their subordinates’ jobs. However, they do require sufficient understanding of their subordinates’ areas of expertise to be able to manage their activities and evaluate their performance (e.g., the position may not be the subject matter expert but would still need enough knowledge to be able to evaluate the work of their subordinates).

Furthermore, executives are likely to require a broader range of knowledge than their subordinates, as subordinates are likely to focus on a sub-set of the activities that the manager needs to understand.

Equivalency of work experience and formal education

While it is true that some Know-How can only be gained formally (e.g., an M.D. in Medicine), it is important to focus on the knowledge and skill required to do the work, not on how an incumbent might come to possess that knowledge.  Know-How can be acquired in multiple ways and does not necessarily need specialized formal education. 

  • Phrases in a job description such as “requires an MBA” are helpful but may cause the evaluator to inflate the Know-How requirements.
  • In addition, there are aspects of Practical/Technical/Specialized Know-How that can be gained through either formal education or work experience (e.g., negotiation skills) or that can be learned only through experience. For example: a Regional Director of Economic Development must understand not only the socio-economic conditions of the region but also the operations of the legislature and the government, and the names of key legislators and their political positions and issues. This Specialized Know-How cannot be learned in school but must receive its proper value.

EX Group positions fall in the range of F to G for Practical, Technical, Specialized Know-How. An EX-Group position is unlikely to be below F, as this applies to positions requiring a seasoned professional having command of their subject area. Similarly, EX Group positions are unlikely to be above G as this would suggest a global authority on a complex domain.

Note: A modifier (pull-up or pull-down) can be used on the Practical, Technical, Specialized Know-How sub-factor.

The following provides some additional guidance on this sub-factor for EX Group work:

F-
  • Deep Expert (Depth): Senior practitioners providing guidance to more junior colleagues but that are not the approval authority for complex/technical advice. Requires deep/specialized knowledge of an area of a sub-domain (e.g., Costing as an accounting sub-function).
  • Generalist Leader (Breadth): Roles requiring a breadth of knowledge to provide leadership to multiple disciplines, but where technical authority is provided by another role.

F
  • Deep Expert (Depth): Roles that provide technical leadership and guidance to other specialists. Roles are regarded as technical specialists when they do not require additional review or approval (from a technical perspective). Requires complete knowledge of a domain and common interrelationships with other relevant domains (e.g., Accounting). E.g., Director, Financial Accounting.
  • Generalist Leader (Breadth): Roles requiring a breadth of knowledge to provide leadership to multiple disciplines. E.g., Director, Health Program Integrity and Control.

F+
  • Deep Expert (Depth): Roles requiring complete knowledge of a domain, its common interrelationships with other relevant domains, and new/emerging applications of the subject domain (e.g., application of nascent technologies to substantially change the work of the domain). E.g., Director, Drug Policy.
  • Generalist Leader (Breadth): Roles requiring a breadth of knowledge to provide leadership to multiple disciplines within multiple domains (e.g., Corporate Services roles providing leadership to distinctly different disciplines). E.g., Director, Contracting, Materiel Management and Systems.

G-
  • Deep Expert (Depth): Roles requiring very deep specialization in complex fields of knowledge and that provide authoritative and determinative knowledge and insights to the organization, but limited by direction from a central authority or peer position with specialization in an interrelated field. E.g., Assistant Deputy Commissioner, Correctional Operations, Prairie Region.
  • Generalist Leader (Breadth): Roles requiring substantial comprehension of all relevant fields of knowledge within a complex organization and that provide leadership to multiple disciplines in a large and complex organization, with considerable organizational guidance. E.g., Chief Executive Officer, Canadian Forces Housing Agency.

G
  • Deep Expert (Depth): Roles requiring very deep specialization in complex fields of knowledge and that provide authoritative and determinative knowledge and insights to the organization. E.g., Vice-President, Health Security Infrastructure.
  • Generalist Leader (Breadth): Roles requiring substantial comprehension of all relevant fields of knowledge within a complex organization and that provide leadership to multiple disciplines in a large and complex organization. E.g., Assistant Deputy Minister, Americas.

G+
  • Deep Expert (Depth): Pull towards H as a “Global Authority”, which is relevant in fields where Canada has a global leadership role. This position is the primary Government of Canada expert. E.g., Assistant Deputy Minister, International Trade and Finance.
  • Generalist Leader (Breadth): Appropriate for first hierarchical level roles expected to be the subject matter authority within the branch/sector versus an integrator of expertise from subordinate roles. E.g., Assistant Deputy Minister, Science and Technology.

The depth and breadth of Practical, Technical, Specialized Know-How is measured on the vertical axis of the Chart.

Planning, Organizing and Integrating Know-How

EX Group positions must know how to do such things as plan, organize, motivate, co-ordinate internally and externally, direct, develop, control, evaluate or check the results of others' work. These skills are required in directing activities, such as those inherent with daily operational activities (line management); or through consultative activities, such as those inherent with directing policy development; or both (as in positions which direct secondary business activities that support the business). The following key concept should be considered when evaluating this sub-factor:

Glossary of Terms – Planning, Organizing, and Integrating

Function: a group of diverse activities which, because of common objectives, similar skill requirements, and strategic importance to an organization, are usually directed by a member of top management.

Subfunction: a major activity which is part of and more homogeneous than a function.

Element: A part of a subfunction, usually very specialized in nature, and restricted in scope or impact.

The more complex the job, the broader the management skills required.

The following affect the complexity of the job and the degree to which planning, organizing, and integrating skills are required:

Functional Diversity – the range of programs/functional activities and the complexity of their inter-relatedness, requiring planning, organizing and integrating to achieve unit objectives and the extent to which interests and objectives are aligned.

Stakeholder Diversity – the number and diversity of stakeholders internal to the department, horizontally across one or more departments and external to the Government of Canada, and the frequency and complexity of these interactions, as well as the extent to which interests and objectives are aligned.

Time Horizon – the degree to which jobs deal with long or short-term issues, for example, building / evolving organizational capacity for future challenges rather than delivering today’s services.

Physical Scale – the size of the operation required to achieve expected results and geographic dispersion of the work.

Consideration should be given to the combination of these elements and a balanced judgment made.

Note: A modifier (pull-up or pull-down) can be applied to the Planning, Organizing and Integrating Know-How sub-factor.

The selection of a reasonable degree for this sub-factor is based on the collective weight of the above indicators and should not be driven by the highest or lowest indicator. Benchmark jobs are to be referenced to validate the evaluation. The chart below provides guidance on how to assess and measure Planning, Organizing, and Integrating (POI).

Functional Diversity
  • II+/III-: Related – Management of operations or services which are generally related in nature and objective, where there may be a requirement to coordinate with associated functions.
  • III: Differing – Management of a major program with activities differing in objectives, or management and integration of function(s) with multiple business lines and/or of differing elements that have an impact on the whole organization.
  • III+/IV-: Diverse – Integration and management of multiple programs with divergent purposes, or management of a significant function critical to the achievement of a broader mandate (e.g., the entire Public Affairs program in a substantial organization).
  • IV: Broad – Strategic integration and business/functional leadership of programs with significant differences and divergent purposes (e.g., a sector that is a major component of the departmental mandate), or of strategic functions that affect the whole organization (e.g., complete Corporate Services).
  • IV+: Heterogeneous – Strategic integration and business/functional leadership for the national delivery of multiple major programs/functions that are broad in scope, or government-wide leadership of broad key initiatives.

Stakeholder Diversity
  • II+/III-: Local / internal and aligned; consider effect on other groups.
  • III: Regional / some external and differing priorities.
  • III+/IV-: National / significant external and cross-government; consensus building is key.
  • IV: International / all levels of government; competing objectives to reconcile.
  • IV+: Global / alignment around key government priorities; conflicting objectives to reconcile.

Time Horizon
  • II+/III-: Months to annual.
  • III: Current year and next.
  • III+/IV-: Two to three years.
  • IV: Three to five years.
  • IV+: Five years or more.

Physical Scale (FTEs)
  • II+/III-: 20-60
  • III: 60-200
  • III+/IV-: 200-600
  • IV: 600-2000
  • IV+: 2000-6000+

Examples of Benchmarks
  • II+/III-: Director, Program Evaluation
  • III: Director General, Operations and Departmental Security Officer
  • III+/IV-: Director General, Human Resources Operations
  • IV: Assistant Deputy Minister, Small Business, Tourism and Marketplace Services
  • IV+: Assistant Deputy Minister, Real Property Services

The requirement for Planning, Organizing and Integrating Know-How is measured on the horizontal axis of the Guide Chart.

Communicating and Influencing

This final Know-How sub-factor measures the degree to which establishing and maintaining effective interpersonal relationships is central to the position achieving its end results.

The requirement for using communicating and influencing skills on the job is represented by three possible levels. For most executive positions, because of their size and/or nature, the achievement of objectives hinges on the establishment and maintenance of effective interpersonal relations.

In assessing each executive position, evaluators must weigh a variety of considerations in making their judgements, such as:

  • The degree to which leadership and engagement of others are both integral to achieving results and highly complex or difficult in nature.
  • The importance of service to clients and client contact (both internal and external) as integral elements of the job.
  • The nature of the client relationship(s).
Assessing the Frequency and Nature/Intensity of Contacts

In assessing the significance of client contact, evaluators should consider such factors as the frequency and nature or intensity of these contacts. There is a significant difference in the communicating and influencing skills required between contact established simply to gather or exchange information and contact established and maintained to influence decisions, processes or behaviors which are crucial to the organization successfully achieving its goals.

It is also important to relate the nature of the job’s contacts to its objectives. Evaluators should avoid being misled by statements in job descriptions which ascribe contacts to a job that are not in keeping with its objectives and accountabilities.

There are three levels of Communicating and Influencing:

Level 1: Common courtesy must be employed, and appropriate working relationships established and maintained with subordinates, colleagues, and superiors to accomplish the position’s objectives. However, there is no significant need to influence others in carrying out assignments. Interaction with others is generally for the purpose of a straightforward information exchange or seeking instruction or clarification. This degree of Communicating and Influencing skills is not consistent with EX Group work and should not be used at these levels.

Level 2: In dealing with subordinates, colleagues, and superiors, and in the course of some contact with clients inside and/or outside government, it is necessary to establish and maintain the kind of relationships that will facilitate the acceptance and utilization of the position’s conclusions, recommendations, and advice. In order to achieve desired results, positions must interact regularly with subordinates, colleagues and superiors and have some contact with clients. The nature of these contacts is such that tact and diplomacy beyond the demands of normal courtesy are required. This degree of Communicating and Influencing skills is extremely unlikely for EX Group work.

Level 3 Successful achievement of the position’s program delivery and/or service and/or advisory objectives hinges on the establishment and maintenance of appropriate interpersonal relationships in dealings with subordinates, colleagues, and superiors and in ensuring the provision of service through substantive contact with clients inside or outside government. Skills of persuasiveness or assertiveness as well as empathy and sensitivity to the other person's point of view are essential to ensuring the delivery of service. This involves understanding the other's point of view, determining whether a behavioral change is warranted and, most importantly, causing such a change to occur through the exercise of interpersonal skills.

The Communicating and Influencing Know-How sub-factor is measured, along with Planning, Organizing and Integrating Know-How, on the Guide Chart's horizontal axis.

Note: A modifier (pull-up or pull-down) cannot be applied to the Communicating and Influencing sub-factor.

Combining the Know-How Elements

To this point, three independent assessments regarding Know-How have been made – one for each sub-factor. For example:

Position     Practical/Technical/Specialized   Planning, Organizing, and Integrating   Communicating and Influencing
Position 1   F                                 II                                      3
Position 2   G                                 III                                     3
Position 3   G                                 II                                      2

The total weight of Know-How is derived from the combination of the three sub-factors. The values assigned to the sub-factors will lead the evaluator to a cell on the chart. This cell will contain three numbers, representing three step values.

Example 1: Contents of cell F II 3
  • 350
  • 400
  • 460

The final decision about which of these numbers to choose to represent the job's total Know-How requirement will be based on the degree of confidence in the validity of the cell selected. Normally, a solid fit on all three sub-factors would lead you to select the middle number in the cell (e.g., for F II 3, 400). If you believe one of the sub-factors might be approaching another degree definition, you can “pull up” (+) the evaluation by choosing the upper value in the cell or “pull down” (-) to the lower value. When contemplating the use of such modifiers, evaluators should always consider the relative strength of the position, as compared to Benchmark ratings, on the sub-factor being evaluated.

Example 2: Contents of cell F II 3; selecting F+ II 3 at 460 points
  • 350
  • 400
  • 460

Regardless of the number chosen, you should record any shadings in your evaluation (i.e., any “pulls” up or down). You can do this by using a + or – beside the sub-factor(s) as a modifier. In the example above, F+ II 3 illustrates a pull-up on the Practical, Technical, Specialized Know-How (degree F), resulting in the high value (460). Should there be two modifiers in the same direction, for example F+ II+ 3, the result would remain the highest value (460). Where there are two opposing modifiers, for example F+ II- 3, the modifiers cancel each other, resulting in the middle value (400 points).
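The modifier rules above can be sketched as a small function. This is an illustrative sketch only, not an official evaluation tool: it assumes the cell's three step values are known and that net pulls resolve exactly as described (a net pull-up selects the top value, a net pull-down the bottom value, and opposing modifiers cancel to the middle value).

```python
def resolve_cell_value(low, mid, high, modifiers):
    """Pick a step value from a Know-How cell given recorded modifiers.

    modifiers is a string of '+' and '-' marks recorded against the
    sub-factors, e.g. '+' for F+ II 3 or '+-' for F+ II- 3.
    """
    net = modifiers.count("+") - modifiers.count("-")
    if net > 0:   # net pull-up: choose the highest value in the cell
        return high
    if net < 0:   # net pull-down: choose the lowest value in the cell
        return low
    return mid    # solid fit, or opposing modifiers cancel

# Cell F II 3 contains 350 / 400 / 460:
resolve_cell_value(350, 400, 460, "")    # solid fit F II 3 -> 400
resolve_cell_value(350, 400, 460, "+")   # F+ II 3 -> 460
resolve_cell_value(350, 400, 460, "++")  # F+ II+ 3 -> still 460
resolve_cell_value(350, 400, 460, "+-")  # F+ II- 3 -> cancel -> 400
```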

Concept Application
Making numbering differentiations

The overlapping numbering system is designed to allow different kinds of jobs to receive equivalent points, if appropriate. The numbering system also permits the evaluator to show relative differences between jobs whose evaluations put them in the same cell. This is done by assigning a higher number from the cell to the stronger job.
The continuum of the cells

The cells on the Guide Chart represent steps along a continuum.  It is possible to carefully evaluate a job on each of the sub-factors and still be aware that the cell selected does not completely reflect your final opinion. In this case, you might choose the top or bottom number in the cell, depending on whether you thought there was a “pull” up or down on the evaluation.

The notion of “pull” reflects the fact that evaluation is not an exercise in precision but rather a judgmental process, with answers in shades of grey. Therefore, the differentiation between one level and another may not be clear. For instance, an evaluator could decide a job is G IV 3 (920) based on comparisons to Benchmarks, while also feeling that it is moving towards the H level of Practical/Technical/Specialized Know-How, with a pull towards III for Planning, Organizing, and Integrating. Because the opposing pulls cancel, this would still represent 920 points, expressed as G+ IV- 3 (920).

Since the numbering patterns overlap with adjacent cells, the essential task is selecting the correct cell and then determining whether the evaluation “pulls” in the direction of one of the sub-factors; the selection of the cell itself is key.

Double Check - Checking the Step Relationships of a Know-How Evaluation Relative to its Superior

There are some guidelines that can assist you in making/validating your judgments. It is important to bear in mind that these are not hard-and-fast rules. They should not be used as a substitute for thorough analysis of the job and interpretation of the Guide Charts.

As a rule, when you are considering a hierarchy of jobs in a job family, technical ladder or reporting structure, the number of steps in the Know-How score can give some insight into the vertical structure of the hierarchy:

No difference
e.g., 608 to 608

When a supervisor and subordinate are evaluated at the same Know-How level it means that the reporting relationship is administrative only. The supervisor will not be managing or evaluating the content of the subordinate’s work, only whether it was delivered on time / on budget. Such situations are extremely rare.

One-step difference
e.g., 460 to 528

A one-step difference between a supervisor and subordinate position generally indicates a point of compression in the structure, giving reason to question the need for the number of organizational layers found. These situations can be found where the superior is responsible for a portfolio of areas (e.g., Assistant Deputy Minister, Corporate Services) and the subordinates have the subject-matter expertise (e.g., Director General, IM/IT). In this type of organizational design, the superior would be responsible for bringing the organizational requirements to the subordinate and assessing whether a proposed solution satisfies the requirements (the “what”) but would not assess the methodology of the solution (the “how”).  

Two-step difference
e.g., 460 to 608

This is the typical or logical relationship/vertical distance in a reporting sequence, where the positions operate in the same or related domains. The supervisor provides subject-matter guidance.

Three-step difference
e.g., 460 to 700

Three steps between levels are characteristic of reporting relationships in organizations with relatively homogeneous, well-defined work. These situations also support a broad span of control, as the superior does not need to spend a lot of time with each subordinate. This pattern is often found when the subordinates deliver the services in the short term and the superior focuses on the medium- to long-term evolution of the services and the organization.

Four-step difference
e.g., 460 to 800

This represents a very significant difference, perhaps suggesting that a level may be missing in the organizational structure. Care should be taken to ensure that it is not the result of an evaluation error.
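The step relationships above can be estimated numerically. Since each step on the Know-How scale represents roughly a 15% increase in points (with published values rounded), the number of steps between two scores can be recovered from the ratio of the scores. This is an illustrative sketch under that assumption, not part of the standard itself:

```python
import math

def know_how_steps(subordinate_points, supervisor_points):
    """Approximate count of 15% steps between two Know-How scores.

    Assumes the underlying scale is geometric with a ~1.15 ratio per
    step; published step values are rounded, so the result is rounded
    to the nearest whole step.
    """
    ratio = supervisor_points / subordinate_points
    return round(math.log(ratio) / math.log(1.15))

know_how_steps(608, 608)  # 0 steps: administrative reporting only
know_how_steps(460, 528)  # 1 step:  possible compression point
know_how_steps(460, 608)  # 2 steps: the typical vertical distance
know_how_steps(460, 800)  # 4 steps: a level may be missing
```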

Measuring Problem Solving

Problem Solving is the requirement on the part of the position to put Know-How to use in original, self-starting thinking to deal with issues and solve problems on the job.

Measuring Problem Solving involves evaluating the intensity of the mental processes required by the position. Activities include employing Know-How to analyze, identify, define, evaluate, create, and to use judgement to draw conclusions about and resolve issues. To the extent that thinking is circumscribed by policies, centrally guided or referred to others, the Problem Solving requirement of the job is diminished.

Problem Solving is viewed as the mental manipulation of Know-How and is different from the straight application of skills measured by the Know-How factor. For this reason, not all of the Know-How required in a job will necessarily be applied in the Problem Solving elements of that job. Problem Solving is treated and measured as a percentage of Know-How, and the numbering pattern on the chart consists of a series of percentages rather than point values. This percentage can be thought of as the proportion of thinking that is above and beyond straightforward Know-How (e.g., self-starting and original thinking). For example, knowing all the control buttons and knobs on an airplane is Know-How, but landing the plane safely in an emergency is Problem Solving.

Lower-level jobs tend to follow standard procedures and policies most of the time, so Problem Solving is relatively low. High level jobs operate in grey areas with nebulous situations and incomplete / conflicting information, so their Know-How is “stretched” and Problem Solving is relatively high.

Problem Solving has two sub-factors:

  • Thinking Environment (vertical axis) – how much assistance is available to help the incumbent do the thinking required and the degree of ambiguity with respect to parameters that the thinking environment presents.
  • Thinking Challenge (horizontal axis) – the complexity of the problems encountered and the extent to which original thinking must be employed to arrive at solutions.
Concept Application
The difference between Thinking Environment and Thinking Challenge

Thinking Environment measures the context in which problem solving takes place. Its main consideration is the amount of help available in that context and the degree of ambiguity that a problem situation presents.

Thinking Challenge measures the inherent difficulty of the problems encountered. Its main consideration is the novelty of the solutions being considered. Does the issue require adaptation or innovation/creativity to address it?

Thinking Environment

Thinking environment is concerned with the freedom to think and/or the degree of guidance available in approaching problems.  It is measured by the presence and/or absence of assistance or constraints affecting how problems are addressed. Thinking may be limited by precedents, people, and service-wide, department-wide, or functional goals, policies, objectives, procedures, instructions, or practices. In general:

  • Goals, policies, standards, and objectives provide help by describing the “why” of an issue and establish boundaries.
  • Procedures detail the steps necessary to follow through on a policy (how, where, when, and by whom).
  • Instructions and practices outline the specific procedures or steps.

The degree to which help is available to job holders varies. For example: help from functional specialists and superiors may be less readily available to managers in geographically remote or organizationally isolated areas, or operations that run 24/7. The degree to which help is available is evaluated along the vertical axis of the chart. There are no hard-and-fast rules. However, here are some guidelines:

  • At the D level, what must be done is often defined and how things must be done is defined, with options to be selected. This is a less likely option for EX Group work.
  • At the E level, the what is clearly stated, but the how is determined by the incumbent’s own judgement and experience.
  • At the F level, thinking is more broadly about what must be done; naturally, how things are to be done is also not clearly defined.
  • At the G level thinking is more about why things should be done. What must be done is generally less defined, and how things must be done is not defined at all.
  • At the H level thinking is constrained only by general laws of nature, science, public morality, etc. and is about setting the strategic direction for the entire Government. The question is “Where should we be going?” H is extremely unlikely for EX Group work as these roles in the public service respond to the direction of the elected Government.

Note: Only a “pull-up” (+) to a higher Thinking Environment can be applied (not a “pull-down”).

The key concept to remember when evaluating the Thinking Environment is as follows:

Concept Application
The relationship between the Know-How level and the Thinking Environment level

Logically, jobs do not require the incumbent to think beyond the limit of the Know-How required for the job. Conversely, the Know-How required is dependent on the thinking that needs to be done. Therefore, the Thinking Environment level (as designated by its letter) should generally be no greater than the Practical, Technical, Specialized Know-How level/letter previously assigned.

(E.g., When Practical, Technical, Specialized Know-How is at the F level, the Thinking Environment will likely be E or F but not G).

Thinking Challenge

Thinking Challenge, the second sub-factor of Problem Solving, is concerned with the degree of novelty or original thought required to resolve problems. It assesses the complexity of the problem and the extent to which its solution lies within previous practices. The complexity of the problem faced depends on how clear cut the solution is. The more complex it becomes, the more the job holder must select from experience and adapt previous solutions to similar problems: “Is there a right or wrong answer?” “Is the solution clear cut or does it require more judgement?”.

Concept Application
The definition of problems

Problems, in this context, refer to the wide range of challenges confronting job holders. The concept is not restricted to things that have gone wrong, although such things must certainly be considered.

Here are some guidelines:

Roles at degree 3 encounter differing situations where there is a need to search for the appropriate solution: “Which of these possible answers is the most appropriate given the situation?”.

At degree 4, roles encounter new / unique / ambiguous situations, requiring adaptive thinking; this includes the search for new and better ways of doing things and even challenging "is this the right question". This degree is most likely for EX Group work.

At degree 5, situations encountered are uncharted or novel – both inside and outside the organization - and require the development of new concepts or imaginative solutions for which there are no precedents. 

Note: Only a “pull-up” ( + ) to a higher Thinking Challenge can be applied (not a “pull-down”).

The levels of Thinking Challenge appear across the top of the Problem Solving chart.

Combining the Problem Solving Sub-Factors

The result of making independent judgements for each of the two Problem Solving sub-factors is that the evaluation falls within a cell that contains two percentage step values.

Example 3: Contents of cell F 4
Contents of cell F 4. Text version below:
Example 3 - Text version
  • 50%
  • 57%

Your choice of which of the two percentage step values to use to represent the job's total Problem Solving requirements will be a judgment, based on your analysis of the strength or weakness of the job's fit in relation to the Guide Chart definitions of the two sub-factors and on comparison to relevant Benchmarks at those percentage levels.

Generally, a solid fit in relation to the definitions should result in choosing the lower (standard) number in the cell. A “pull” to a higher Thinking Environment and/or Thinking Challenge would change the choice to the higher percentage. A pull-down modifier cannot be applied to this factor.

Example 4: Problem Solving evaluation for cell F 4
  • F4: 50% solid fit choice
  • F+4, F4+, F+4+: 57% “pull” to G Thinking Environment and/or 5 Thinking Challenge

To determine Problem Solving points, you can use the chart illustrating the most likely combinations of Problem Solving and Know-How points (second page). Simply locate the Problem Solving percentage in the left or right columns and the Know-How points along the top or bottom. The resulting Problem Solving points are found at the intersection.
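The intersection lookup can also be approximated arithmetically. The sketch below assumes (this is an assumption about how the published chart appears to be constructed, not a statement from the standard) that Problem Solving points are the product of the Know-How points and the Problem Solving percentage, snapped to the nearest standard step value on the 15% scale; the official chart should always govern.

```python
def problem_solving_points(know_how_points, ps_percentage):
    """Estimate Problem Solving points as Know-How points x PS%.

    The raw product is snapped to the nearest standard step value.
    The step list below is the familiar 15% series; treat it as
    illustrative rather than authoritative.
    """
    step_values = [132, 152, 175, 200, 230, 264, 304, 350,
                   400, 460, 528, 608, 700, 800, 920, 1056]
    raw = know_how_points * ps_percentage / 100
    return min(step_values, key=lambda step: abs(step - raw))

problem_solving_points(608, 50)  # 304 exactly
problem_solving_points(700, 57)  # 399 snaps to the step value 400
```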

Double Check - Checking the Problem Solving Evaluation

Evaluators should take the time to review their Problem Solving evaluations. Since Problem Solving is the application of Know-How, experienced evaluators have found that the relationship between the two factors tends to fall into patterns. These patterns are shown by the shadings on the most likely, less likely, and unlikely combinations of Problem Solving and Know-How points on the chart. They will serve as a general guide for checking the Problem Solving evaluation:

  • Normally, an evaluation should fall in the Most Likely areas.
  • An evaluation can fall in the Less Likely areas if it can be rationalized.
  • If an evaluation falls in the unlikely shaded areas, the evaluation of both the Know-How and the Problem Solving factors should be re-checked. It is possible that the body of knowledge the incumbent is expected to have is insufficient for thinking at the level indicated by the Problem Solving evaluation, or that too much knowledge is expected of the position given the degree to which it will be put to use, as indicated by the Problem Solving evaluation.

The following outlines the types of roles / work found at specific Problem Solving (PS) percentages:

PS% Interpretation

76+

Enterprise Strategy Formation

Roles that envision the long-term strategic direction of the Government and/or the implications for the entire Department/Agency. This work is typically found at the Minister and Deputy Minister level and therefore unlikely within the EX Group.

66

Strategy Formation – Branch / Sector Leadership

These roles define the overall strategy of the Branch / Sector and determine new / future directions of the organization. This work is typically found at the Assistant Deputy Minister level or those roles which are delegated some significant part of the Deputy Head’s authority. (e.g., Assistant Deputy Minister, Policy)

57

Strategic Alignment

These roles are focused on the strategic direction of a major function or operation of the Department/Agency. The work involves setting the policy framework and objectives for others to ensure integration between function and sub-functions and are typically Assistant Deputy Ministers / Large Director General roles.  (e.g., Director General, Resource Management)

50

Strategic Implementation

These roles are associated with varying the application of policy locally. There is a requirement to re-shape policy to fit the specific environments for which the job holder is accountable (i.e., turning functional policy into reality). This is typically found within Director General / Director type roles. (e.g., Director, National Security Assessment and Analysis)

43

Tactical Implementation

Roles are concerned with the translation of policy into operating procedures, with contributions to policy made based on an understanding of local/specific applications. These are found typically at the Director / Manager levels.

Measuring Accountability

Accountability measures the degree to which a job is accountable for action and the consequences of that action. It is the measured effect of the job on the end results of the organization.

Up to this point, judgments have been made about the total Know-How required for fully competent job performance and the degree of novelty and ambiguity employed in Problem Solving. Now the task is to consider the job's ability to bring about, or assist in bringing about, specific end results. This includes considering how direct or indirect the job's impact is, as well as the scope of that impact on those results.

Accountability considers the following sub-factors in the following order of importance:

  • Freedom to Act: The freedom the incumbent has to make decisions and carry them out. It is the delegated decision-making authority and is the most important sub-factor.
  • Nature of Impact on end results: How direct the job's influence (direct or indirect) is on the end results of a unit, function, or program.
  • Area of Impact (Magnitude): The general size of the unit, function, program, or the element of society affected. This is the least important sub-factor.

Freedom to Act is a stand-alone sub-factor, while Nature and Magnitude are two elements of how Impact is measured. The latter two must always be evaluated in combination; in this sense, they are not stand-alone sub-factors, but together constitute the sub-factor of Impact. Freedom to Act is the most important of the three and carries the most quantitative weight in the job's total Accountability rating: each perceptible difference in Freedom to Act generates a 15% increase in points, with no overlap between levels (e.g., E+ and F- are different on the Accountability Guide Chart, but the same on the Know-How Guide Chart).

Freedom to Act

By examining the nature and extent of the controls – or the lack of the controls – that surround the job, this sub-factor directly addresses the question of the job's freedom to take action or implement decisions.

The controls placed on the position's Freedom to Act can be supervisory or procedural or both. In assessing Freedom to Act we focus on the core responsibilities of the position. A key difference to keep in mind when considering Freedom to Act is:

Concept Application
The difference between Freedom to Act and Thinking Environment

It is a common mistake to confuse the restraints placed on Freedom to Act with the help available in the Thinking Environment:

  • Freedom to Act is concerned with action or decisions about action.
  • Thinking Environment is concerned with mental manipulation.

Since controls tend to diminish as you rise in the organization, Freedom to Act increases with organizational rank. However, while it is true that no job can have as much Freedom to Act as its superior, the evaluator should be wary of automatic slotting according to organization level alone.

Note: Modifiers (“pull-up” or “pull-down”) can be applied to the Freedom to Act sub-factor.

Here are some broad guidelines that can help in assessing Freedom to Act:

  • At the E level, positions are relatively free to decide, within approved plans, what the objectives or end results will be. Review of work is typically on an annual/quarterly basis. Roles at this level determine “how” and “when” objectives are to be achieved/completed (e.g., Director, Legislative and Regulatory Affairs).

    Roles reporting at the third hierarchical level may be at E+ where they are delivering activities / operations within a well-defined departmental framework (e.g., Director, Program Evaluation at degree E+). 

  • At the F level, positions are relatively free to determine what the general end results are to be. Managerial direction received will be general in nature. Assessment of end results must be viewed over longer time spans (e.g., six months to a year or longer).

    Roles reporting at the third hierarchical level may be at F- when establishing how to implement government-wide frameworks within the departmental context (e.g., Director, Risk Management at degree F-).

    Positions at the second hierarchical level are typically at F or F+ (e.g., Regional Director, Health Services, Prairie Region at degree F, and BM 7-X-1 Director General, Centralized Operations at degree F+).

  • At the G level, the “what” is communicated only in very general terms. Positions are subject to guidance rather than direction or control. Any job evaluated here is subject only to broad policy and strategic objectives. This is typically the level applied to Assistant Deputy Minister level roles (e.g., Assistant Deputy Minister, Infrastructure and Environment).

Area of Impact – Magnitude

This sub-factor of Accountability measures the size of the area affected by a position. While it does give an indication of the weight to be assigned to the position, it is the least influential and sensitive of the three sub-factors of Accountability used to determine the overall Accountability evaluation. The evaluation of the Area of Impact (magnitude) must always be considered with the Nature of Impact.

To support a common way of measuring the size or area impacted in the context of a job’s accountability, a common quantifiable means or proxy for representing the diverse units, functions and programs that could be affected by the job was needed. Dollars have proven to be the most widely applicable proxy for measuring the Area of impact (Magnitude) of Accountability. The Area of Impact (Magnitude) scale is the least influential because the change or increase in dollar value to generate a 15% difference in point score is a 233% change (e.g., a budget increase from 300M to 400M is a 33% increase in dollar value but does not change the Magnitude rating).

However, to make a logical, rational determination of the Area of impact (Magnitude), the evaluator must remember that dollars are simply a proxy, not an absolute measure.

Concept Application
Dollars are only a proxy to represent Magnitude

Dollars are the most convenient measure for quantifying the size of the accountability affected by a job. However, this does not mean that jobs impact dollars. Jobs impact the results of functions, programs, or operations of organizational units. The use of dollars is simply a means of quantifying contributions in order to assess the position's impact on a scale.

A helpful way to think about this is the concept of return on investment. The government decides to invest a certain amount of money in a program / activity / service and executives are responsible for the stewardship of these investments – ensuring that the government gets the desired return. The magnitude is the amount of the investment being made and the nature of impact is the role that any particular executive plays in bringing about the desired returns.

The Area of Impact (Magnitude) continuum on the Guide Chart has seven degrees, from Very Small to Largest. These headings provide a rough idea of the appropriate Magnitude for the subject position. References to the appropriate Benchmarks will help refine this initial determination. In this way, evaluators can arrive at a reasonable determination of the Area of Impact (Magnitude) and avoid jumping immediately to a premature consideration of budget dollars.

Evaluators should use the following process for applying the proxy to establish the appropriate Area of Impact (Magnitude):

  • Determine and describe (in words) what part(s) and/or function(s) of the organization the job affects and the nature of the job’s effect on each of them.
  • Once the part(s) and/or function(s) most appropriate to the job have been identified, think about the relative size of the part(s) or function(s) under consideration and describe these in words (found in the Dimensions section of the job description, or obtained through supporting documents).
  • Once these relationships have been articulated, verify them and the “size” selected for the job against the dimensions of the Benchmark positions.
Concept Application
Applying modifiers to the Area of Impact (Magnitude) of the Proxy Selected

The scale reflects a geometric progression with overlap between levels (e.g., 3+ and 4- both generate the same points when Freedom to Act and Impact are the same for both).  A simple calculation can be used to determine how modifiers should be applied to Area of Impact (Magnitude):

Double the bottom and halve the top: Using a Magnitude of 4 with 10M to 100M gives you 20M and 50M. Thus, a (-) modifier would be applied to values between 10M and 20M, a solid 4 would be reflective of values between 20M and 50M, and above 50M, but less than 100M would have a (+) modifier applied.
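The "double the bottom and halve the top" rule can be sketched as a simple calculation. This is an illustrative sketch, not an official tool; in particular, how to treat a value that falls exactly on a threshold (exactly twice the bottom or exactly half the top) is an assumption here, and Benchmark comparison should settle borderline cases.

```python
def magnitude_modifier(dollars, level_bottom, level_top):
    """Apply the 'double the bottom and halve the top' rule.

    For Magnitude 4 (10M to 100M): below 20M is a pull-down (-),
    20M to 50M is a solid fit, and above 50M is a pull-up (+).
    """
    if not (level_bottom <= dollars < level_top):
        raise ValueError("value falls outside this Magnitude level")
    if dollars < 2 * level_bottom:
        return "-"          # lower third of the level: pull-down
    if dollars > level_top / 2:
        return "+"          # upper third of the level: pull-up
    return ""               # solid fit

magnitude_modifier(15_000_000, 10_000_000, 100_000_000)  # "-"
magnitude_modifier(30_000_000, 10_000_000, 100_000_000)  # "" (solid 4)
magnitude_modifier(60_000_000, 10_000_000, 100_000_000)  # "+"
```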

Use of the Accountability Magnitude Index to Adjust for Inflation

Unfortunately, the value of money does not remain constant over time. To compensate for this change, the dollar values used as proxies for the Area of Impact (Magnitude), reflected in the Benchmarks have been converted into constant dollars to ensure a continuous alignment with the monetary amounts identified in the Guide charts.

Therefore, to make comparisons between a subject job's proxy dollars (which are expressed in current dollars) and the constant dollars in the Benchmarks, it is necessary to convert the current dollars into constant dollars. The Accountability Magnitude Index (AMI) provides the factor used for this purpose. To convert a current dollar value into constant dollars, divide the current dollar value by the approved AMI.

The AMI is adjusted periodically by Korn Ferry to reflect the value of Canadian currency on a global stage. This adjustment is reviewed annually and periodically adopted by the Office of the Chief Human Resources Officer (OCHRO) in the Treasury Board of Canada Secretariat for application in evaluating EX Group jobs. Any AMI adjustments to be used in the Core Public Administration for EX Group job evaluation are announced by the OCHRO. The AMIs from financial year 1980/81 to the publication date of this guidance are as follows:

Accountability Magnitude Index 1980–2022
Year Magnitude Index
1980/81 2.45
1981/82 2.77
1982/83 3.06
1983/84 3.41
1984/85 3.61
1985/86 3.72
1986/87 3.83
1987/88 3.91
1988/89 4.03
1989/90 4.17
1990/91 4.37
1991/92 4.5
1992/93 4.6
1993/94 4.7
1994/95 4.8
1995/96 5
1996/97 5
1997/98 5
1998/99 5.2
1999/2000 5.4
Sept. 2000 6
Sept. 2002 6.5
Apr. 2006 7
Sept. 2010 8
Oct. 2022 9
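The conversion described above (divide current dollars by the approved AMI) can be sketched as follows. The function is generic; the example uses the October 2022 AMI of 9.0 from the table above.

```python
def to_constant_dollars(current_dollars, ami):
    """Convert a current-dollar proxy into constant dollars:
    constant = current / AMI (the approved Accountability
    Magnitude Index for the period)."""
    return current_dollars / ami

# With the October 2022 AMI of 9.0, a $45M current-dollar budget
# compares to Benchmark dimensions at $5M in constant dollars:
to_constant_dollars(45_000_000, 9.0)  # -> 5000000.0
```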

“Pass-Through Dollars”

Many positions may appear to have a very large Magnitude, but the dollars involved are “Pass-Through Dollars”, to which the position adds little or no value (transfer payments to individuals or other jurisdictions under social programs that are controlled largely by legislation, regulation or formula fall into this category). An example would be Canada Pension Plan payments. The key to handling Pass-Through Dollars is as follows:

Concept Application
Pass-Through Dollars are unlikely to be an appropriate Magnitude proxy

In cases of Pass-Through Dollars, the position deals with the process of payment. As such, the accountability is limited to ensuring the processing of a payment, not for the amount or type of payment.  These dollars cannot adequately represent the Magnitude of the position. A more appropriate proxy should be found.

While there is potential to consider an Indirect impact on the Pass-Through dollar proxy, a better measure could be a Primary impact on the salary, operating and management budget of the unit to reflect the work of the staff processing payments.

Nature of Impact on end-results

The Nature of Impact measures the directness of the position's effect on end results. The evaluation of the Nature of Impact must always be considered with the Area of Impact (Magnitude). The Nature of Impact levels are as follows:

I - Indirect: The position provides information or other supportive services for use by others. Activities are noticeably removed from final decision and end results. The position's contribution is modified by or merged with other support before the end result stage. For example: 

Director, Financial Accounting, accountable for the processing, accounting, and reporting of departmental financial transactions. The role provides financial transaction support services; however, it is noticeably removed from the end results achieved by the departmental budget, having an indirect influence on the overall departmental budget.

The use of + or – with an Indirect impact suggests that, for some jobs, the influence and contributions can be more closely tied to the end results, while for others, the link is more obscure.

C - Contributory: The position provides interpretative, advisory, or facilitating services for use by others or by a team in taking action. The position's advice and counsel are influential and closely related to actions or decisions made by others. The position’s contribution significantly influences decisions related to various units or programs. For example:

Director General, Integrity Risk Guidance, acts as the authority on risk management and integration, and provides advice and guidance to senior departmental officials to ensure the integrity of departmental policies, programs and service delivery strategies and frameworks. The position influences the design and delivery of departmental policies and programs, reflected as a contributory impact on the organization’s overall budget.

To the extent that Contributory pertains to only a portion of the investment represented by the Area of Impact (Magnitude), or the advice given relates more to remaining compliant with governing frameworks, a C- might apply. Conversely, if the advice is expected to materially impact the value delivered across the bulk of the investment represented, it might be a C+.

S - Shared: The position is jointly accountable with others for taking action and exercising a controlling Impact on end results. Positions with this type of Impact have noticeably more direct control over actions than positions evaluated at the Contributory level, but do not have total control over all the variables in determining the end result(s).

A basic rule is that Shared Impact does not exist vertically in an organization (e.g., between superior and subordinate). Shared Impact can exist between peer jobs within the same organization or with a position or positions from outside the organizational unit or outside of the Federal Government. Shared Impact suggests a degree of partnership in, or joint accountability for, the total result. In this way it differs from Contributory Impact, where the position is only accountable for an input to the end result. For example:

The departmental Project Manager could be considered to have a Shared Impact on all design and construction activities carried out by Public Services and Procurement Canada in the construction of a major facility.

The use of + or – with a Shared impact suggests “we are all equal, but some are more impactful than others”. It is a relative assessment of the impact of the different players on a file or issue where they are part of the decision-making team.

P - Primary: The position has controlling Impact on end results, and the accountability of others is subordinate or advisory. Such an Impact is commonly found in managerial positions which have line accountability for key end result areas, be they large or small. For example:

The Director of a research unit may have Primary Impact upon the research activities done by all sections of the unit. A subordinate Manager within the unit may be accountable for the research activities in a section of the unit. Both positions could be evaluated at the Primary level, but the Area of Impact (the size of the unit or function or activity) would vary.

Note: There are no + or - modifiers for this impact.

Concept Application
The relation between control and Primary Impact

The relative size of the unit is not an issue in deciding whether the position has Primary Impact on its results. The key is that:

  • The position controls the end results of the unit
  • That control is not shared with others

Choosing the Correct Area of Impact (Magnitude) and Nature of Impact Combination

An evaluation score may differ depending on the combination of Area and Nature of Impact used. It is likely that an Executive role can be evaluated with more than one reasonable combination. For example:

  • A function head (e.g., a Director General of Human Resources) may be seen to have a Contributory Impact on the investment in the workforce of the Department (e.g., Departmental salary budget) or a Primary Impact on the operations of the Human Resources Branch (e.g., Branch Salary and O&M budget).
  • A Director General, Procurement may be seen to have a Primary Impact on the effective operation of the branch (e.g., Branch Salary and O&M budget), or an impact on the value of the goods and services being procured for the organization where the nature of that Impact may vary from Indirect to Shared depending on the influence that procurement has on obtaining the optimum return on investment on the money spent.

Evaluators are encouraged to identify the different possible combinations for the position being evaluated before selecting which might be more suitable. Very often the point totals available in the alternative slots for the possible combinations of Area and Nature of Impact will be the same; in such cases it is advisable to select the combination that best expresses the core purpose of the role. Where they are not the same, it is advisable to use the higher score to properly reflect the full job size.

The key is to find the combination of Area and Nature of Impact that results in the highest legitimate evaluation. This is because it is vital to get the fullest, most complete measure of the position for these two sub-factors so as to properly reflect the job size. The table on the following pages provides some guidelines for evaluating certain types of expenditures when these are used as the proxy.

Examples of Impact for Various Dimensions

For each dimension listed below, the Impact to consider is described.

1. Salary, operation, and maintenance budget (O&M) as used to represent a divisional budget

A Primary Impact is selected when the main accountability for a unit or a program’s end results rests with the job/role being evaluated. 

A Contributory or Indirect Impact can be considered if the job plays an advisory or facilitating role either through direct action or indirectly through a modified contribution (e.g., Policy Advisor/Expert role advising on how to manage/organize/develop a unit.)

2. Capital budget (used to represent a capital program)

A Primary Impact is considered when the entire lifecycle of the capital program/project is controlled by the role.  This includes feasibility, design, construction, installation, and utilization. This is rare. Such a project would most likely have a multi-year time horizon and the total investment would be divided by the planned number of years to obtain an annual value.

A Primary Impact can also be selected when the role is accountable for maximizing the return on investment (e.g., a Regional Director accountable for delivering services that require the use of Government of Canada assets, such as vehicles). Note: the complete value of the assets is not considered, as assets have multi-year lifespans; the evaluator needs to use the annual value, which is reflected as the Capital budget.

More typically, the Impact is Shared or Contributory as responsibility for capital programs/projects lies with several managers, each responsible for major components of the program/project.

3. Full organizational budget (used to represent an organization’s entire annual budget)

Typically, no one position in the EX Group will be fully accountable for an entire organization’s budget. However, roles providing organizational-wide expert advice and direction can be considered to have a Contributory Impact on an organization’s overall budget. (e.g., Chief Financial Officer roles which provide oversight, recommendations and ensure financial stewardship of a department’s resources.)

4. Human resources costs (used to represent the human resources function)

A Contributory Impact is considered when the role plays a significant advisory function or is accountable for all (or a large portion) of an organization’s Human Resources program direction (e.g., Heads of Human Resources, Central Agency roles accountable for setting overall Public Service direction.)

A Primary Impact is not appropriate, as many jobs play a role with respect to human resources management within an organization. While a Head of Human Resources can set direction, define policies, etc., they cannot be directly accountable for the human resources-related decisions of delegated managers in their organizations (e.g., a Director of a unit will have direct (Primary) accountability for their unit’s salary and operating budget).

5. Purchased materials and equipment (used to represent the purchasing function)

The Impact considered would be Contributory for a normal supply and service role (prepares and issues Requests for Proposals) or Shared where the role is one of heavy involvement in determining specifications, in addition to the normal supply and service role. The impact could also be Indirect if the role simply orders supplies/services from pre-approved suppliers with pre-negotiated prices and terms and conditions of service.

Larger procurement roles are typically found in organizations whose mandate is to provide services/support for Public Service/Canadians (e.g., Public Services and Procurement Canada and Shared Services Canada).

6. Grants and Contributions (used to represent a program)

A Contributory or Indirect Impact is considered for roles with discretion in Grants and Contributions amounts, and/or control over the end results expected from the grant or contribution. The degree to which the role impacts the Grants and Contributions will determine whether Contributory (e.g., Director General, Communications) or Indirect (e.g., Director, Financial Policy) is most appropriate.

Note: In certain cases, Grants and Contributions may not be an applicable proxy because they are pass-through dollars (e.g., Transfer Payments). Grants and Contributions would only be Contributory if the role determined who is eligible and evaluated the use of the money.

7. Transfer Payments (used to represent a program)

For roles with accountability for Transfer Payments that are determined by a formula with no discretion, the Impact would likely be none (e.g., Transfer Payments are viewed as Pass-Through dollars where the role oversees the process of payment but has no influence on whether the payment should be made, or what the payment amount should be).

However, if the role has some discretion in determining amounts and/or use, the Impact would be Indirect because the position has some effect on the program.

8. Revolving Funds (represents payment received from clients for services rendered)

Such positions do not have sufficient impact on what is to be measured; the Impact is none.

Note: Payments received should not be double counted against corresponding expenditures, nor should they be used to reduce operating expenditures to a net figure.

9. Dimensions lying outside the Public Service such as value of the GDP

The relationship of Public Service positions to these dimensions is, in most cases, too remote for the evaluation of any Impact. Where influence can be clearly identified, the Impact of positions is normally Indirect and is typically exerted through legislative, regulatory or enforcement authorities. (e.g., Senior roles at Global Affairs Canada which significantly influence and shape trade relationships with other countries, or senior level roles in organizations with a broad regulatory role with impact on a sector/industry, such as Fisheries and Oceans.) 

Concept Application
Evaluating the Impact of large initiatives shared across several organizations

It is not uncommon to find jobs involved in initiatives shared across several government departments, where each department plays a key role in delivering the end result. It is reasonable to consider a Primary Impact on an individual role’s portion of the total annualized dollar value of the project or initiative, or a Shared Impact on the annualized aggregate value of the initiative.

Where one of the participating organizations is identified as the lead organization and has overall accountability for the total end results, a pull-up on the Freedom to Act should be considered. If this is judged to be appropriate, then a pull-down on the Freedom to Act for the roles in non-lead organizations should also be applied, reflecting the expectation that the lead organization would be vetting and endorsing the decisions taken by the other organizations. If the lead organization performs the role of coordinator (reporting on, but not accountable for, the totality of the program end results), then the coordinator role is unlikely to have a material impact on the evaluation.

Concept Application
Combining the impact on unrelated activities

All jobs have multiple responsibilities, and we aim to select a nature and area of Impact (Magnitude) that encompasses all responsibilities. Often this can be achieved through selecting Primary on the operating budget of the relevant organization, and then any impact that the organization has on the entire enterprise is covered. However, it is possible to have an Executive role that wears multiple, unrelated hats. Focusing on only one element of the role would fail to appreciate the full impact.

For example, consider the Chief Financial Officer (CFO) of Department X (Budget $3B) who also has the added responsibility for reviewing the finances of Crown Corporation Y (Budget $2B) within the portfolio of the same Minister. If the initial Accountability evaluation was on the $3B budget (G5C = 608), then is the new value on $5B (G5+C = 700)?

The first question is whether the additional responsibility is “unrelated” to the core of the initial role – i.e., is the area of impact of the additional responsibility different from the area of impact of the initial role? The second question is then, assuming a different area of impact, what is the nature of impact on this new area? In this example, assume the CFO has a Contributory (C) Impact on $3B plus a lesser Impact (say “C-”) on the $2B, which is an unrelated area of Impact. The answer is not “C” on $5B because the natures of Impact are not the same. We have to calibrate the “C” and “C-” Impacts and bring them to a common denominator.

Reviewing the Guide Charts, we can identify that a move from “C” to “S” increases the Accountability points by 2 steps if everything else is equal (F4C = 304 and F4S = 400). Similarly, a move from “4” to “5” Magnitude increases the point score by 2 steps (F4C = 304 and F5C = 400). So, each step on either Magnitude or nature of Impact has the same influence on the point score. We can use this fact to equilibrate and combine scores.

Note – A 2-step change in the Accountability points is associated with a 10x increase in the Magnitude value; therefore, a one-step change is associated with a Magnitude change of just over 3 (square root of 10 = 3.16).

In our example, we start with the Impact on $3B at G5C and add the Impact on $2B of G5C-. We adjust the Magnitude of the new area of impact to achieve an equivalent nature of Impact. Now the Impact on $2B of G5C- is equivalent to G4+C ($0.63B) as $2B divided by 3.16 is $633M. Thus, we have a combined magnitude of $3.63B all at “C” and the evaluation remains at G5C = 608 because the additional responsibility was not material when calibrated to an equivalent level of nature of impact.

Now consider a position with a direct budget of $40M (F3P = 400) and a G&C Program worth $1B (F5-C = 350). What is the effect here? If the employees who are paid for through the $40M budget are administering the G&C Program, then there is no additional impact. These are not unrelated, they are different ways of looking at the impact that the team has, so we would choose the larger option but not combine them.

However, if the Director performs the responsibility of advising on the G&C Program outside of the work of their team, then this could be viewed as an added responsibility and a calculation performed. To equilibrate at “P” equivalence we have to increase the G&C nature of Impact by 4 steps, so we have to reduce the magnitude by 4 steps. Reducing $1B by 4 steps gives us $10M. The combined assessment is now F3+P = 460, because $40M + $10M = $50M, which is “3+” rather than “3”. In this example, the combination does have a material impact – but only if the G&C Program is not already recognized in the work that the Director’s team undertakes. (e.g., Assistant Deputy Minister, Citizen Services)
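The calibration arithmetic in the examples above can be sketched in code. This is an illustrative sketch only, not part of the standard: it assumes that each one-step change in nature of Impact corresponds to dividing the Magnitude by the square root of 10 (about 3.16), and that adjacent base levels (I, C, S, P) are two steps apart, as described in the worked examples. The function name is invented for illustration.

```python
import math

# A two-step change in Accountability points is associated with a 10x
# change in Magnitude, so one step corresponds to a factor of
# sqrt(10) ~= 3.16 (assumption drawn from the worked examples above).
STEP_FACTOR = math.sqrt(10)

def equivalent_magnitude(magnitude, steps_up):
    """Convert a Magnitude held at a weaker nature of Impact into the
    equivalent (smaller) Magnitude at a level `steps_up` steps stronger.
    Adjacent base levels (I, C, S, P) are two steps apart; a + or -
    modifier is one step."""
    return magnitude / (STEP_FACTOR ** steps_up)

# CFO example: "C" on $3B plus "C-" on an unrelated $2B.
# C- to C is one step: $2B / 3.16 ~= $0.63B, so ~$3.63B combined at "C".
extra = equivalent_magnitude(2.0e9, 1)
combined = 3.0e9 + extra  # ~3.63e9: still within the G5 band

# Director example: "P" on $40M plus "C" on a $1B G&C program.
# C to P is four steps: $1B / 3.16**4 = $10M, so $50M combined at "P".
gc_equiv = equivalent_magnitude(1.0e9, 4)
total = 4.0e7 + gc_equiv  # ~5.0e7: "3+" rather than "3"
```

The sketch mirrors the two examples: in the first, the calibrated addition ($0.63B) is not material; in the second, it moves the Magnitude band.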

Combining the Accountability Sub-Factors

The result of evaluating the three sub-factors is that the evaluation falls within a cell with three possible point values, each representing one step up in size.

Example 5: A cell with three possible point values, each representing one step up in size

  • Low: 350
  • Medium: 400
  • High: 460

The number chosen from the cell will depend upon your assessment of the relative strength of the job's fit to sub-factor definitions. However, unlike the sub-factors of Know-How and Problem Solving (which have a more or less equal weight in determining the factor's score), in the case of Accountability, the fit of the Freedom to Act sub-factor is the most important one to consider. As such, it is possible for the evaluator to place more weight on the Freedom to Act rating as compared to the other sub-factor ratings when deciding on the point score.

Concept Application
Influence or weight of Freedom to Act compared to Area and Nature of Impact

Where appropriate, a double modifier can be used to illustrate the importance of the Freedom to Act sub-factor in the following instances:

  • Where there is a pull-up on the Freedom to Act and a pull-down on the Nature and/or Area of Impact, to illustrate greater influence of Freedom to Act, a double-plus (++) can be recorded to indicate the extra weight placed on Freedom to Act versus the Nature and Area of Impact (e.g., E+3-P = 264 points, E++3-P = 304 points), or 
  • Where there is a pull-down on the Freedom to Act and a pull-up on the Nature and/or Area of Impact, to illustrate the importance of Freedom to Act a double minus (- -) can be recorded to indicate more weight is placed on Freedom to Act over the Nature and Area of Impact (e.g., E-3+P = 264, E--3+P = 230). (e.g., Regional Director General, Maritimes)

In assessing whether a double modifier is appropriate, the resulting Job Profile (further explained below) should be considered as well as a comparison of the total Accountability points to Benchmark positions.

Double-Check: Checking the Complete Evaluation

Validating the evaluation

There is generally a logical relationship between the depth and range of Practical, Technical, Specialized knowledge required in a role and the constraints of its thinking environment, which in turn carries through to the degree of latitude or freedom to act present in the role. This quality assurance process is known as technical consistency or the waterfall test.

The Freedom to Act is generally lower or equal to the Thinking Environment, and the Thinking Environment is generally lower or equal to the Practical, Technical, Specialized Know-How. For example:

Likelihood KH (PTS) PS (TE) ACC (FA)
Most likely F+ F E+
Unlikely F+ E+ F
Unlikely F G F

If Freedom to Act were higher than Thinking Environment, this would suggest that the position makes decisions beyond what it is asked to think about, and if Thinking Environment is greater than Technical Know-How, this would suggest that the position is being asked to think about things that it is not capable of understanding.
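The waterfall relationships can be expressed as a simple check. The letter range and the rank function below are assumptions for illustration (the guide charts use letter ratings with + and – shadings); the point is simply that Freedom to Act should not outrank the Thinking Environment, which should not outrank the Practical, Technical, Specialized Know-How.

```python
# Ordered letter levels, lowest to highest. The full A-H range is an
# assumption for illustration; the guide charts use a subset of these
# letters with "+" and "-" shadings.
LETTERS = "ABCDEFGH"

def rank(level):
    """Map a rating such as 'F+', 'E-' or 'G' to a comparable number.
    Spacing letters by 3 keeps '-', bare and '+' shadings distinct."""
    letter, modifier = level[0], level[1:]
    return LETTERS.index(letter) * 3 + {"-": -1, "": 0, "+": 1}[modifier]

def waterfall_ok(pts_know_how, thinking_env, freedom_to_act):
    """Technical consistency (waterfall) test: Freedom to Act should not
    exceed the Thinking Environment, which should not exceed the
    Practical, Technical, Specialized Know-How."""
    return rank(freedom_to_act) <= rank(thinking_env) <= rank(pts_know_how)

# The rows of the table above:
waterfall_ok("F+", "F", "E+")   # most likely pattern -> True
waterfall_ok("F+", "E+", "F")   # FA above TE -> False (unlikely)
waterfall_ok("F", "G", "F")     # TE above KH -> False (unlikely)
```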

Job Profiles

The evaluation score attributed to a job gives an indication of its size relative to other jobs. It answers the question: How big is this job compared with other jobs? If we analyze the distribution of Know-How, Problem Solving and Accountability scores within the total score we discover something about the nature or shape of the job. This can be seen as answering the question: What sort of job is this? The distribution of the evaluation factors within the total score is independent of job size.

When evaluation committees become familiar with the profiling concept as an aid in understanding the shape of a job, it can be useful to estimate the job's profile before evaluating it in detail on the guide charts. The committee can compare their initial profile with the derived profile after evaluation. The main tool in this regard is the short profile.

Short profiles

An easy and convenient way of looking at the type of job and the balance between the criteria is to look just at the relationship between the Problem Solving score (the thinking element of the job) and the Accountability score (the results element of the job).

The short profile is determined by contrasting the Accountability score with the Problem Solving score and determining the number of step differences between the two (if any). Thus:

  • When the Accountability score is larger there is an A profile
  • When the Problem Solving score is larger there is a P profile
  • And when the two scores are the same, we have a level profile, known as L

An action-oriented job is primarily oriented toward generating end results. Problem Solving takes a secondary position in these roles. Therefore, the points given to Accountability will be higher than those for Problem Solving. This is known as an A Profile.

A thinking job exists to apply Know-How in the analysis, investigation, and identification of situations. The Problem Solving points will be greater than those for Accountability. This is known as a P Profile.

A balanced or level job is one in which the Accountability and Problem Solving points are the same. The position will be staff-oriented and have responsibility for managerial or supervisory functions. This is known as an L Profile.
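The short-profile determination can be sketched as a step count. This is a minimal sketch assuming the standard geometric point scale, in which each value is roughly 15% above the previous one; the slice below is consistent with the point values cited elsewhere in this guide (e.g., 304, 350, 400, 460).

```python
# An assumed slice of the point scale: a geometric series in which each
# step is about 15% above the previous value.
SCALE = [200, 230, 264, 304, 350, 400, 460, 528, 608, 700, 800, 920, 1056]

def short_profile(problem_solving, accountability):
    """Return the short profile ('A<n>', 'P<n>' or 'L') from the step
    difference between the Accountability and Problem Solving scores."""
    steps = SCALE.index(accountability) - SCALE.index(problem_solving)
    if steps > 0:
        return f"A{steps}"   # results-oriented: Accountability larger
    if steps < 0:
        return f"P{-steps}"  # thinking-oriented: Problem Solving larger
    return "L"               # level profile: the two scores are equal

short_profile(400, 460)  # 'A1' - e.g., a line or hybrid management job
short_profile(400, 350)  # 'P1' - e.g., a policy development job
short_profile(400, 400)  # 'L'  - e.g., a staff advisory job
```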

While there are no fixed rules, particular types of jobs do tend to have predictable profiles.

P Profiles

P4, P3 Problem Solving points exceed Accountability points by four or three steps, respectively. Jobs with these profiles will tend to be concerned with performing basic or pure research, with little orientation to, or regard for, development aspects. P4 Jobs will rarely be found outside a university and would therefore be very unlikely for EX Group work.

P2, P1 Problem Solving points exceed Accountability points by two steps or one step, respectively. Applied research or policy development jobs will tend to have these profiles. These profiles are less likely for Executive positions, as Executives are typically accountable for results.

L Profiles

L Problem Solving points equal Accountability points. Jobs with these profiles will tend to involve providing advisory or analytical services in staff functions or supervisory positions such as financial analysts or heads of functional specialties.

A Profiles

A1 Accountability points exceed Problem Solving points by one step. Jobs with A1 profiles are often hybrid jobs with significant people management responsibilities (such as human resources managers), line management positions, or jobs which receive a significant degree of direction from functional units, such as project managers or regional directors of administrative services.

A2 Accountability points exceed Problem Solving points by two steps. These profiles are found in line management jobs which have a clear and well-defined responsibility for achieving results, such as regional director for operations.

A3, A4 Accountability points exceed Problem Solving points by three or four steps, respectively. These profiles are unusual but can occur where the Accountability for results is very large and immediate, while the Problem Solving or Know-How content of the job is relatively low.

Profiling allows the validity of evaluations to be checked against typical job profiles. Discrepancies, if found, may indicate an incorrect evaluation. However, they might also indicate an inappropriately structured job. Therefore, it is important to avoid letting profiles drive the evaluation process. EX Group positions are most commonly found within P1, L, A1 and A2 job profiles.

Validating Against and Using the Benchmarks

The most important test of the validity of the evaluation is finding comparable reference evaluations in the standardized continuum of the Benchmarks, as outlined in Appendix C. While it is not realistic to illustrate all variations of EX Group work with individual Benchmarks, taken together the updated Benchmark jobs offer a comprehensive snapshot of modern EX Group work.

The Benchmark reference positions have two critical roles to play in the job evaluation process:

  • They provide the necessary discipline of a constant set of reference points.

The Benchmark evaluations have been thoroughly checked to ensure that the job evaluation method was applied consistently and appropriately. As a result, they provide a constant standard against which to evaluate positions, making them the key tool for ensuring consistency in the application of the Korn Ferry Hay Guide Chart methodology, over time, throughout the federal government.

  • They allow for flexibility within a disciplined framework.

Given the number and complexity of EX Group positions to be evaluated, it would be impossible to attempt to provide exact matches for every possible combination of Executive work. By providing a sufficient number and variety of constant Benchmark positions but allowing evaluators to use their common sense in using them, both flexibility and discipline can be built into the process.

Copies of the Benchmark reference positions are found in Appendix B.

Process for Selecting Suitable Benchmarks

1) How Benchmarks are arranged

The Benchmarks are arranged by level in Appendix B.

2) Information needed about the subject position to select a benchmark

To select suitable Benchmarks, the evaluator first needs to know:

  • Where the subject position fits in the organization (e.g., number of levels from the Deputy Head)
  • The job function (e.g., financial, operational, human resources)
  • The basic nature of the job (e.g., to think deeply, as in research positions, to think broadly, as in policy development, to direct activities, as in field operations, or to administer policies and practices, as in staff positions)
  • Whether the position is line or staff, regional or located at headquarters

All this information should be in the job description.

3) Selecting Suitable Benchmarks

Two or three Benchmark comparisons should be sufficient for testing the validity of an evaluation. A strong comparator is one in which the organizational context, the overall evaluation, and the evaluations of the three factors (Know-How, Problem Solving and Accountability) are all similar to those of the subject position. However, given the diversity of work in the EX Group, it is reasonable to use different Benchmarks to support each complete factor; one should not break up sub-factors to support an evaluation. For example, one Benchmark might provide a close fit on the Know-How factor but not work well on the Accountability factor. The best approach would be to find another, more suitable Benchmark to validate the Accountability rating, mindful of the relationship between evaluation factors (e.g., selecting Benchmarks which have the same profile as the job being evaluated).

Organization Check

A crucial test of the validity of the evaluation is whether it fits with the evaluations for other positions in the unit. This means that when you isolate each factor, the step differences between the subject job and the supervisor, peer and subordinate positions for each factor all make sense.

A common evaluation error is over-emphasizing the differences between peer positions and under-emphasizing the differences between superior and subordinate. Note, however, that there are no rules for determining the proper relationship between levels in an organization. Each case must be assessed on its own.

For example, the table below shows four organizational structures with very different superior/subordinate relationships. In each case, the step differences between the factors for the two levels change. However, these examples should not be taken as hard and fast rules. They simply serve to demonstrate:

  • a variety of superior/subordinate relationships that can make sense
  • the importance of looking at the reality of the actual departmental structure when testing the validity of a new evaluation

In the final analysis, as throughout the evaluation process, informed common sense should be the tool for making and checking all judgments.

Organization Check – Validity of Evaluation: Likely Step Differences

1. “Normal” Superior/Subordinate Relationship

Common characteristics:

  • Reasonable span of control
  • Clear channels of communication
  • Balance between subordinates
  • Management delegation

Likely step differences: Know-How 2; Problem Solving 1; Accountability 3

2. Lean Staff

Common characteristics:

  • Broad span of control
  • Communication processes fuzzy
  • Imbalance between subordinate positions
  • Management and decision-making centralized

Likely step differences: Know-How 3; Problem Solving 1 or 2; Accountability 3 to 5

3. Missing Level

Common characteristics:

  • Large span of control
  • Communication of tasks only
  • Large differentials in subordinate position
  • Management and decision-making highly centralized

Likely step differences: Know-How 4; Problem Solving 2; Accountability 5 to 7

4. One-over-one (Not typically supported in the Executive Group)

Common characteristics:

  • One subordinate
  • Superior and subordinate function as a team
  • Usually a temporary “grooming” position for subordinate prior to taking over the top position
  • Appropriate where criticality of top job dictates (i.e., CEO) or where there is a need for a split between external focus and internal focus

Likely step differences: Know-How 1; Problem Solving 1; Accountability 2

Classification Levels: Executive Group

The total points assigned through the evaluation process will determine the classification level for a newly evaluated job. Positions within the assigned point bands fall into compensation levels from EX-1 to EX-5. The bands are as follows:

Executive Group Classification Levels
Classification Level Minimum Points Maximum Points
EX-5 2448 N/A
EX-4 1868 2447
EX-3 1560 1867
EX-2 1262 1559
EX-1 920 1261
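As a sketch only, the point bands above can be expressed as a simple lookup (the function name is illustrative, not part of the standard):

```python
def classification_level(points):
    """Map a total evaluation score to an EX classification level using
    the point bands above. Returns None below the EX-1 minimum."""
    bands = [("EX-5", 2448), ("EX-4", 1868), ("EX-3", 1560),
             ("EX-2", 1262), ("EX-1", 920)]
    for level, minimum in bands:
        if points >= minimum:
            return level
    return None  # below the EX Group point bands

classification_level(1600)  # 'EX-3'
classification_level(2447)  # 'EX-4' (one point below the EX-5 minimum)
classification_level(919)   # None
```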

Appendix A – Guide Charts

Guide Chart for Evaluating Know-How

Definition

Know-How includes all relevant knowledge, skill and experience, however acquired, needed for acceptable performance in a job/role, in the following sub-factors:

Practical, Technical, Specialized Know-How

Varied applied skills, including those relating to human relations, knowledge of the position’s environment and clientele (e.g., the public, industry, special interest groups, other governments, etc.), practical procedures, specialized techniques and/or scientific/professional disciplines. This is used to recognize increasing specialization (depth) and/or the requirement for a greater breadth (scope) of knowledge.

Depth – Ranges from knowledge of simple work routines to unique and authoritative expertise.

Scope – Covers the variety of techniques, disciplines, processes, products, etc., from few to many, about which knowledge is required.

Planning, Organizing and Integrating (Managerial) Knowledge
The knowledge required for integrating and managing activities, functions and resources. It involves combining some or all of the elements of planning, organizing, coordinating, directing, executing and controlling over time. Managerial knowledge is related to the size of an organization, its functional and geographic diversity, and its time horizon. It may be exercised directly or in an advisory/consultative way.
Communicating and Influencing Skills
A measure of how crucial, critical and difficult the requirements of the job are for working with, and through, others to achieve end results both inside and outside the organization.
Measuring Practical, Technical, Specialized Know-How
This type of knowledge and skill may be characterized by breadth (variety), or depth (complexity), or both. Jobs may require some combination of various skills: some knowledge about many things, or a good deal of knowledge about a few things. Thus, to measure this kind of Know-How, the evaluator has to understand what skills are needed, how much knowledge is needed, about how many things, and how complex each of them is.
Function
A group of diverse activities which, because of common objectives, similar skill requirements, and strategic importance to an organization, are usually directed by a member of top management.
Subfunction
A major activity which is part of, and more homogeneous than, a function.
Element
A part of a subfunction; usually very specialized in nature and restricted in scope or impact.
Planning, Organizing and Integrating Knowledge
II. Operation of a unit with activities that are relatively similar in nature and objective, OR guidance of a sub-function(s) or several significant elements across several units where there is a requirement for coordination with associated functions.

III. Operation of a large unit with activities that are noticeably different in objective where integration is critical to the achievement of the end results/goals, OR guidance of a function(s) that affects all of the organization.

IV. Operation of a major unit with activities that are significantly different and divergent with respect to objectives and end results, OR guidance of a strategic function(s) that significantly affects the organization's planning and operations.

V. Overall strategic integration and leadership of the organization, OR total management of a major segment of a very large organization.
Communicating and Influencing Skills ratings (columns 1 to 3 under each Planning, Organizing and Integrating Knowledge level):

  1. Courtesy, tact and effectiveness must be employed, and an appropriate working relationship established and maintained with others in order to accomplish the position's objectives.
  2. Interaction with others requires understanding, influencing and supporting people, through applying technical knowledge or rational arguments, aimed at causing action or acceptance by others.
  3. Interaction with others, internal and external to the organization, is concerned with influencing, developing and motivating people and changing behavior. It involves inspiration and the creation of the right working climate. Successful achievement of the position's program/service/advisory objectives HINGES on the establishment and maintenance of appropriate interpersonal relationships.

Likelihood of the rating for an Executive Group job: * Most likely; ** Less likely; *** Unlikely.

Practical, Technical, Specialized Know-How levels:

E. A sound understanding of and skill in several activities which involve a variety of practices and precedents with respect to the organization's processes, operation and clientele, OR a grasp of a scientific or similar discipline's theory and principles, OR both.

F. Broad and/or deep knowledge in a field of expertise requiring a command of diverse practices and precedents and/or sophisticated concepts, principles and issues relating to the organization and its clients, OR command of complex scientific theory, principles and practices, OR both.

G. Mastery of theories, principles and techniques, or the cumulative equivalent command, of the interrelationships, variables and competing demands of the organization and its clients, and related programs and other issues, necessary to advise AND/OR implement programs at the executive management or executive policy levels of the organization.

H. Externally recognized mastery and expertise in a complex scientific field or other learned discipline. This level would normally be associated with ongoing ground-breaking work.

Table 1 – Know-How points. Each Practical, Technical, Specialized Know-How level (E to H) spans three rows of the chart; columns pair the Planning, Organizing and Integrating Knowledge level (II to V) with the Communicating and Influencing Skills rating (1 to 3). Markers: * Most likely; ** Less likely; *** Unlikely.

| Know-How | II-1 | II-2 | II-3 | III-1 | III-2 | III-3 | IV-1 | IV-2 | IV-3 | V-1 | V-2 | V-3 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| E | 200*** | 230*** | 264*** | 264*** | 304*** | 350*** | 350*** | 400*** | 460*** | 460*** | 528*** | 608*** |
| E | 230*** | 264*** | 304*** | 304*** | 350*** | 400*** | 400*** | 460*** | 528*** | 528*** | 608*** | 700*** |
| E | 264*** | 304*** | 350*** | 350*** | 400*** | 460*** | 460*** | 528*** | 608*** | 608*** | 700*** | 800*** |
| F | 264*** | 304*** | 350*** | 350*** | 400** | 460* | 460*** | 528*** | 608** | 608*** | 700*** | 800*** |
| F | 304*** | 350*** | 400** | 400*** | 460** | 528* | 528*** | 608*** | 700** | 700*** | 800*** | 920*** |
| F | 350*** | 400** | 460* | 460*** | 528** | 608* | 608*** | 700*** | 800** | 800*** | 920*** | 1056*** |
| G | 350*** | 400** | 460* | 460*** | 528** | 608* | 608*** | 700*** | 800* | 800*** | 920*** | 1056*** |
| G | 400*** | 460** | 528* | 528*** | 608** | 700* | 700*** | 800*** | 920* | 920*** | 1056*** | 1216*** |
| G | 460*** | 528** | 608* | 608*** | 700** | 800* | 800*** | 920*** | 1056* | 1056*** | 1216*** | 1400*** |
| H | 460*** | 528*** | 608** | 608*** | 700*** | 800** | 800*** | 920*** | 1056** | 1056*** | 1216*** | 1400*** |
| H | 528*** | 608*** | 700*** | 700*** | 800*** | 920*** | 920*** | 1056*** | 1216*** | 1216*** | 1400*** | 1600*** |
| H | 608*** | 700*** | 800*** | 800*** | 920*** | 1056*** | 1056*** | 1216*** | 1400*** | 1400*** | 1600*** | 1840*** |

Guide Chart for Evaluating Know-How (PDF Document – 245.1 KB)
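The point values in the guide chart above sit on a geometric point scale in which each step is roughly 15% higher than the one before. This pattern is visible in the published numbers themselves rather than stated in the standard's text; a quick Python check of the Know-How values, as a sketch:

```python
# Point values that appear in the Know-How guide chart, in ascending order.
scale = [200, 230, 264, 304, 350, 400, 460, 528, 608,
         700, 800, 920, 1056, 1216, 1400, 1600, 1840]

# Each step is roughly 15% above its predecessor (rounding varies slightly).
ratios = [b / a for a, b in zip(scale, scale[1:])]
assert all(1.14 <= r <= 1.16 for r in ratios)
```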

Guide Chart for Evaluating Problem Solving

Definition:

Problem Solving is the amount and nature of the thinking required in the job in the form of analyzing, reasoning, evaluating, creating, using judgment, forming hypotheses, drawing inferences and arriving at conclusions. Problem Solving deals with the intensity of the mental processes that use Know-How to identify and solve problems.

There are two sub-factors:

Thinking Environment – Freedom to Think:
The extent to which assistance or guidance is available from others or from past practice or precedents, and the degree to which the position is required to identify situations where direction or precedents are not applicable. How well defined or nebulous is the problem, issue, etc.?
Thinking Challenge:
The novelty and complexity of the problems encountered and the extent to which original thinking must be employed to arrive at solutions.
Measuring Problem Solving:
Problem Solving measures the intensity of the mental process which employs Know-How in analyzing, evaluating, creating, reasoning and arriving at conclusions. To the extent that Problem Solving is circumscribed by standards, covered by precedents or referred to others, the scope of the Problem Solving is diminished and the emphasis correspondingly shifts to Know-How.
N.B.
The evaluation of Problem Solving should be made without reference to the job's freedom to make decisions or take action; the scope and nature of the job's decisions are measured on the Accountability chart.
Thinking Challenge

3. Differing situations requiring the identification of issues, the application of judgment and the selection of solutions within the area of expertise and acquired knowledge. Appropriate action is selected based on experience; some use of judgment is required.

4. Situations constantly requiring adaptation or development of new solutions through analytical, interpretative, evaluative, creative and innovative thinking.

5. Novel and path-finding situations requiring the development of new concepts and imaginative solutions for which there are no precedents.
Notes

Likelihood of the rating for an Executive Group job: * Most likely; ** Less likely; *** Unlikely.

Thinking Environment levels:

D. Thinking within clear but substantially diversified procedures; precedents covering many situations, and/or access to assistance.

E. Thinking within a well defined frame of reference and towards specific objectives, in situations characterized by specific policies, practices and precedents.

F. Thinking within a general frame of reference toward objectives, in situations with some nebulous, intangible or unstructured aspects.

G. Thinking within general policies, principles and goals of the organization; many nebulous, intangible or unstructured aspects to the environment.

H. Thinking within the organization philosophy, general laws of nature and science, business philosophy and cultural values.

Table 2 – Problem Solving percentages. Each cell shows the two stacked values for the Thinking Environment level; columns are Thinking Challenge levels 3 to 5. Markers: * Most likely; ** Less likely; *** Unlikely.

| Thinking Environment | 3 | 4 | 5 |
|---|---|---|---|
| D | 29%*** / 33%*** | 38%*** / 43%*** | 50%*** / 57%*** |
| E | 33%*** / 38%*** | 43%* / 50%* | 57%** / 66%*** |
| F | 38%*** / 43%** | 50%* / 57%* | 66%** / 76%*** |
| G | 43%*** / 50%*** | 57%* / 66%* | 76%*** / 87%*** |
| H | 50%*** / 57%*** | 66%*** / 76%*** | 87%*** / – |

Guide Chart for Evaluating Problem Solving (PDF Document – 144.3 KB)

Problem Solving points are found at the intersection of the row for the Know-How score and the column for the Problem Solving percentage. Markers: * Most likely; ** Less likely; *** Unlikely.

| Know-How Points | 87% | 76% | 66% | 57% | 50% | 43% | 38% | 33% |
|---|---|---|---|---|---|---|---|---|
| 50 | 43*** | 38*** | 33*** | 29*** | 25*** | 22*** | 19*** | 16*** |
| 57 | 50*** | 43*** | 38*** | 33*** | 29*** | 25*** | 22*** | 19*** |
| 66 | 57*** | 50*** | 43*** | 38*** | 33*** | 29*** | 25*** | 22*** |
| 76 | 66*** | 57*** | 50*** | 43*** | 38*** | 33*** | 29*** | 25*** |
| 87 | 76*** | 66*** | 57*** | 50*** | 43*** | 38*** | 33*** | 29*** |
| 100 | 87*** | 76*** | 66*** | 57*** | 50*** | 43*** | 38*** | 33*** |
| 115 | 100*** | 87*** | 76*** | 66*** | 57*** | 50*** | 43*** | 38*** |
| 132 | 115*** | 100*** | 87*** | 76*** | 66*** | 57*** | 50*** | 43*** |
| 152 | 132*** | 115*** | 100*** | 87*** | 76*** | 66*** | 57*** | 50*** |
| 175 | 152*** | 132*** | 115*** | 100*** | 87*** | 76*** | 66*** | 57*** |
| 200 | 175*** | 152*** | 132*** | 115*** | 100*** | 87*** | 76*** | 66*** |
| 230 | 200*** | 175*** | 152*** | 132*** | 115*** | 100*** | 87*** | 76*** |
| 264 | 230*** | 200*** | 175*** | 152*** | 132*** | 115*** | 100*** | 87*** |
| 304 | 264*** | 230*** | 200*** | 175*** | 152*** | 132*** | 115** | 100** |
| 350 | 304*** | 264*** | 230*** | 200*** | 175*** | 152*** | 132* | 115* |
| 400 | 350*** | 304*** | 264*** | 230*** | 200*** | 175** | 152* | 132* |
| 460 | 400*** | 350*** | 304*** | 264*** | 230* | 200* | 175** | 152*** |
| 528 | 460*** | 400*** | 350*** | 304** | 264* | 230** | 200*** | 175*** |
| 608 | 528*** | 460*** | 400*** | 350* | 304* | 264*** | 230*** | 200*** |
| 700 | 608*** | 528*** | 460** | 400* | 350*** | 304*** | 264*** | 230*** |
| 800 | 700*** | 608*** | 528** | 460* | 400*** | 350*** | 304*** | 264*** |
| 920 | 800*** | 700*** | 608* | 528** | 460*** | 400*** | 350*** | 304*** |
| 1056 | 920*** | 800** | 700* | 608*** | 528*** | 460*** | 400*** | 350*** |
| 1216 | 1056* | 920* | 800** | 700*** | 608*** | 528*** | 460*** | 400*** |
| 1400 | 1216* | 1056* | 920** | 800*** | 700*** | 608*** | 528*** | 460*** |

Guide Chart for Evaluating Problem Solving Points (PDF Document – 135.9 KB)
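The Problem Solving points chart is mechanical: the Know-How score is multiplied by the Problem Solving percentage, and the product is snapped to the nearest value on the shared geometric point scale. A minimal sketch of that lookup (the function name and nearest-value rounding rule are illustrative assumptions, not part of the standard's text):

```python
# Shared geometric point scale used by the guide charts (ascending).
SCALE = [16, 19, 22, 25, 29, 33, 38, 43, 50, 57, 66, 76, 87, 100,
         115, 132, 152, 175, 200, 230, 264, 304, 350, 400, 460,
         528, 608, 700, 800, 920, 1056, 1216, 1400]

def problem_solving_points(know_how: int, percentage: int) -> int:
    """Multiply Know-How points by the PS percentage, then snap to the scale."""
    raw = know_how * percentage / 100
    return min(SCALE, key=lambda v: abs(v - raw))

# Spot-checks against the chart: 460 Know-How at 50% gives 230 PS points,
# 608 at 57% gives 350, and 700 at 66% gives 460.
assert problem_solving_points(460, 50) == 230
assert problem_solving_points(608, 57) == 350
assert problem_solving_points(700, 66) == 460
```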

Guide Chart for Evaluating Accountability

Accountability:

Measures the extent to which a job/role is answerable for actions and their consequences, and the effect of the job/role on end results. It measures the following sub-factors, in order of importance: Freedom to Act, Nature of Impact and Magnitude (Area of Impact).

Freedom to Act:
The extent to which the job/role, through delegation or empowerment, decides on the measures and actions to be taken to achieve the required results before seeking advice or direction.
Nature of Impact on end results:
The size of the influence (direct or indirect) the job has on end results. The Nature and Area of Impact (Magnitude) are always considered together, as defined below.
Area of Impact (Magnitude):
The area of the organization most clearly affected by decisions AND/OR recommendations of the job. The general size of the unit, function, program or the element of society affected. This is the least important sub-factor.
NB:
Magnitude and Impact must fit together; neither can be final or meaningful without being related to the other.
Nature of Impact on end results:
The degree to which the job affects or brings about the results expected of the unit or function being considered.
Indirect (I):
Supportive and ancillary services where activities are noticeably removed from final end results and assistance is modified or merged with other support before the end result stage.
Contributory (C):
Interpretive, advisory or facilitating services, for use by others in taking action, that are influential and closely related to actions or decisions taken by others, OR a measurable contribution, as a member of a team, in achieving results.
Shared (S):
Joint and significant control with one or more position(s) (except own subordinates and superior) over the activities and resources that produce the results, OR control of what are clearly many (but not all) of the significant variables in determining results.
Primary (P):
Controlling impact on end results – The position has effective control over the significant activities and resources that produce the end results and is the sole position (at its level of Freedom to Act) that must answer for the results.
Area of Impact (Magnitude):

1. Very Small (under $100K)
2. Small ($100K to $1 Million)
3. Medium ($1 to $10 Million)
4. Medium–Large ($10 to $100 Million)
5. Large ($100 Million to $1 Billion)
6. Very Large ($1 to $10 Billion)
7. Largest (over $10 Billion)

Within each Magnitude level, points are shown for each Nature of Impact: Indirect (I), Contributory (C), Shared (S) and Primary (P).
Notes

Likelihood of the rating for an Executive Group job: * Most likely; ** Less likely; *** Unlikely.

Freedom to Act levels:

D. Operating within practices and procedures covered by precedents or well-defined policies and review of end results, usually after the fact.

E. Operating within broad practices and procedures covered by functional precedents and policies and managerial direction, with well-defined objectives.

F. Operating within general direction and broadly defined policy objectives, with managerial direction of a general nature.

G. Operating only within overall general guidance from top-most management on broad organizational objectives and the orientation of strategic policy.

Table 4 – Accountability points. Each Freedom to Act level (D to G) spans three rows of the chart; each cell shows the points for Nature of Impact I / C / S / P at that Magnitude level. Markers: * Most likely; ** Less likely; *** Unlikely.

| Freedom to Act | Mag 1 | Mag 2 | Mag 3 | Mag 4 | Mag 5 | Mag 6 | Mag 7 |
|---|---|---|---|---|---|---|---|
| D | 38***/50***/66***/87*** | 50***/66***/87***/115*** | 66***/87***/115***/152*** | 87***/115***/152***/200*** | 115***/152***/200***/264*** | 152***/200***/264***/350*** | 200***/264***/350***/460*** |
| D | 43***/57***/76***/100*** | 57***/76***/100***/132*** | 76***/100***/132***/175*** | 100***/132***/175***/230*** | 132***/175***/230***/304*** | 175***/230***/304***/400*** | 230***/304***/400***/528*** |
| D | 50***/66***/87***/115*** | 66***/87***/115***/152*** | 87***/115***/152***/200*** | 115***/152***/200***/264*** | 152***/200***/264***/350*** | 200***/264***/350***/460*** | 264***/350***/460***/608*** |
| E | 57***/76***/100***/132*** | 76***/100***/132***/175*** | 100***/132***/175***/230* | 132***/175***/230*/304* | 175***/230*/304*/400* | 230*/304*/400*/528* | 304*/400*/528*/700* |
| E | 66***/87***/115***/152*** | 87***/115***/152***/200** | 115***/152***/200**/264* | 152***/200**/264*/350* | 200**/264*/350*/460* | 264*/350*/460*/608* | 350*/460*/608*/800* |
| E | 76***/100***/132***/175*** | 100***/132***/175***/230* | 132***/175***/230*/304* | 175***/230*/304*/400* | 230*/304*/400*/528* | 304*/400*/528*/700* | 400*/528*/700*/920* |
| F | 87***/115***/152***/200** | 115***/152***/200**/264* | 152***/200**/264*/350* | 200**/264*/350*/460* | 264*/350*/460*/608* | 350*/460*/608*/800* | 460*/608*/800*/1056** |
| F | 100***/132***/175***/230* | 132***/175***/230*/304* | 175***/230*/304*/400* | 230*/304*/400*/528* | 304*/400*/528*/700* | 400*/528*/700*/920* | 528*/700*/920*/1216*** |
| F | 115***/152***/200**/264* | 152***/200**/264*/350* | 200**/264*/350*/460* | 264*/350*/460*/608* | 350*/460*/608*/800* | 460*/608*/800*/1056** | 608*/800*/1056**/1400*** |
| G | 132***/175***/230*/304* | 175***/230*/304*/400* | 230*/304*/400*/528* | 304*/400*/528*/700* | 400*/528*/700*/920* | 528*/700*/920*/1216*** | 700*/920*/1216***/1600*** |
| G | 152***/200**/264*/350* | 200**/264*/350*/460* | 264*/350*/460*/608* | 350*/460*/608*/800* | 460*/608*/800*/1056** | 608*/800*/1056**/1400*** | 800*/1056**/1400***/1840*** |
| G | 175***/230*/304*/400* | 230*/304*/400*/528* | 304*/400*/528*/700* | 400*/528*/700*/920* | 528*/700*/920*/1216*** | 700*/920*/1216***/1600*** | 920*/1216***/1600***/2112*** |

Guide Chart for Evaluating Accountability (PDF Document – 232.8 KB)

Appendix B – Benchmark Index by Level

Notes:

+ denotes a pull-up.

- denotes a pull-down.

Benchmark Information Job Evaluation Record – each row below lists, in order:

BM # | Title | Know-How: PTS (Practical, Technical, Specialized Know-How) and modifier, POI (Planning, Organizing and Integrating Knowledge) and modifier, CIS (Communicating and Influencing Skills), Points | Problem Solving: TE (Thinking Environment) and modifier, TC (Thinking Challenge) and modifier, %, Points | Accountability: FTA (Freedom to Act) and modifier, Mag (Magnitude) and modifier, IMP (Nature of Impact) and modifier, Points | Total Points | Level | Profile
1 Director, Public Affairs F + II N/A 3 460 E + 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
2 Director, Indigenous Affairs F N/A II + 3 460 E + 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
3 Director, Program Evaluation F N/A III - 3 460 F N/A 4 N/A 50 230 E + 4 N/A C N/A 230 920 EX-01 L
4 Director, Legislative and Regulatory Affairs F N/A III - 3 460 F N/A 4 N/A 50 230 E N/A 5 N/A C - 230 920 EX-01 L
5 Director, Contracting, Materiel Management and Systems F + II N/A 3 460 E + 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
6 Director, MROs and Ministerial Services F N/A II + 3 460 E + 4 N/A 50 230 E N/A 2 + P N/A 230 920 EX-01 L
7 Director, Carbon Pricing System Division F + II N/A 3 460 E + 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
8 Director, Science and Technology F + II N/A 3 460 F N/A 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
9 Director, Operations CANMET Energy F N/A II + 3 460 F N/A 4 N/A 50 230 E N/A 2 + P N/A 230 920 EX-01 L
10 Director, Family Law and Assistance Services F + II N/A 3 460 E + 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
11 Director, Health Program Integrity and Control F N/A II + 3 460 F N/A 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
12 Director, Trade and International Affairs F N/A II + 3 460 F N/A 4 N/A 50 230 E + 2 N/A P N/A 230 920 EX-01 L
13 Director, National Security Assessment and Analysis F + II + 3 460 F N/A 4 N/A 50 230 E N/A 5 N/A I + 230 920 EX-01 L
14 Area Director, Citizen Services F N/A III - 3 460 E N/A 4 N/A 43 200 E + 3 - P N/A 264 924 EX-01 A2
15 Director, Financial Accounting F N/A II + 3 460 E + 4 N/A 50 230 E N/A 6 N/A I N/A 264 954 EX-01 A1
16 Director, Civilian Human Resources Services Centre, NCR F - III N/A 3 460 E + 4 N/A 50 230 E + 3 - P N/A 264 954 EX-01 A1
17 Director, Real Property Policy F + II + 3 460 E + 4 N/A 50 230 E + 6 - I N/A 264 954 EX-01 A1
18 Director, Office of Emergency Response Services F N/A II + 3 460 E + 4 N/A 50 230 E + 3 - P N/A 264 954 EX-01 A1
19 Assistant Deputy Chairperson, Immigration Division F + II N/A 3 460 F N/A 4 N/A 50 230 F N/A 2 - P N/A 264 954 EX-01 A1
20 Director, Centre for Special Business Projects F + II N/A 3 460 F N/A 4 N/A 50 230 F - 2 + P N/A 304 994 EX-01 A2
21 Director, Communications Analysis and Policy Development F N/A III N/A 3 528 F N/A 4 N/A 50 264 E + 2 N/A P N/A 230 1022 EX-01 P1
22 Director, Drug Policy F + III - 3 528 F N/A 4 N/A 50 264 E + 2 N/A P N/A 230 1022 EX-01 P1
23 Director, Financial Management F N/A III N/A 3 528 E + 4 N/A 50 264 E + 5 - C N/A 264 1056 EX-01 L
24 Director, Financial Policy F + III - 3 528 F N/A 4 N/A 50 264 E N/A 6 N/A I N/A 264 1056 EX-01 L
25 Director, Health Analysis G - II + 3 528 F N/A 4 N/A 50 264 F N/A 2 N/A P N/A 304 1096 EX-01 A1
26 Director, Risk Management F + III - 3 528 F N/A 4 N/A 50 264 F - 2 + P N/A 304 1096 EX-01 A1
27 Regional Director, Geological Survey of Canada (GSC) - Quebec G N/A II + 3 608 F N/A 4 N/A 50 304 F N/A 2 + P N/A 350 1262 EX-02 A1
28 Senior Director, Internal Partnerships and Service Relations F + III N/A 3 608 F N/A 4 N/A 50 304 F - 3 N/A P N/A 350 1262 EX-02 A1
29 Counsellor/Program Manager, Political and Economic F + III N/A 3 608 F N/A 4 N/A 50 304 F - 6 N/A I N/A 350 1262 EX-02 A1
30 Director General, Resource Management G - III - 3 608 F + 4 N/A 57 350 F + 2 + P N/A 350 1308 EX-02 L
31 Director General, Defence Major Projects Sector F + III N/A 3 608 F + 4 N/A 57 350 F N/A 5 + I + 350 1308 EX-02 L
32 Director General, Regional Civilian Human Resources Services F + III + 3 608 F N/A 4 + 57 350 F N/A 3 N/A P N/A 400 1358 EX-02 A1
Each benchmark below shows its Know-How (KH), Problem Solving (PS) and Accountability (Acc) ratings and points under the Hay methodology; a "+" or "-" fine-tunes the adjacent rating.

| No. | Benchmark position | KH depth | KH breadth | KH human relations | KH points | PS environment | PS challenge | PS % | PS points | Acc freedom to act | Acc magnitude | Acc impact | Acc points | Total points | Level | Profile |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 33 | Director General, Integrity Risk Guidance Branch | G- | III | 3 | 608 | F+ | 4 | 57 | 350 | F | 5 | C | 400 | 1358 | EX-02 | A1 |
| 34 | Director General, Archives | G- | III- | 3 | 608 | F+ | 4 | 57 | 350 | F+ | 3- | P | 400 | 1358 | EX-02 | A1 |
| 35 | Regional Director, Health Services (Prairie Region) | F | III+ | 3 | 608 | F | 4 | 50 | 304 | F | 3+ | P | 460 | 1372 | EX-02 | A3 |
| 36 | Director General, Financial Operations and Services | G- | III | 3 | 608 | F+ | 4 | 57 | 350 | F | 6 | I+ | 460 | 1418 | EX-02 | A2 |
| 37 | Director General, Operations and Departmental Security Officer | G- | III | 3 | 608 | F | 4+ | 57 | 350 | F+ | 3 | P | 460 | 1418 | EX-02 | A2 |
| 38 | Director General, Centralized Operations | G- | III | 3 | 608 | F+ | 4 | 57 | 350 | F+ | 3 | P | 460 | 1418 | EX-02 | A2 |
| 39 | Director General, Carbon Pricing Bureau | G | III | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 2 | P | 350 | 1450 | EX-02 | P1 |
| 40 | Ambassador, Guatemala | G- | III+ | 3 | 700 | G | 4 | 57 | 400 | G- | 2 | P | 400 | 1500 | EX-02 | L |
| 41 | Director General, Human Resources Operations | G- | III+ | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 3 | P | 460 | 1560 | EX-03 | A1 |
| 42 | Director General, CANMET Energy - Ottawa | G | III | 3 | 700 | F+ | 4+ | 57 | 400 | F | 7- | I | 460 | 1560 | EX-03 | A1 |
| 43 | Director General, Centre of Emergency Preparedness and Response | G- | III+ | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 3 | P | 460 | 1560 | EX-03 | A1 |
| 44 | Director General, Consumer Product Safety | G | III | 3 | 700 | F+ | 4+ | 57 | 400 | F+ | 3 | P | 460 | 1560 | EX-03 | A1 |
| 45 | Director General, Communications | G- | III+ | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 5 | C | 460 | 1560 | EX-03 | A1 |
| 46 | Chief Executive Officer, Canadian Forces Housing Agency | G- | III+ | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 4- | P | 528 | 1628 | EX-03 | A2 |
| 47 | Director General, Controlled Substances | G | III+ | 3 | 800 | F | 4+ | 57 | 460 | F | 3 | P | 400 | 1660 | EX-03 | P1 |
| 48 | Assistant Deputy Commissioner, Correctional Operations (Prairie Region) | G- | III+ | 3 | 700 | F+ | 4 | 57 | 400 | F+ | 4 | P | 608 | 1708 | EX-03 | A3 |
| 49 | Assistant Secretary to the Cabinet, Senior Personnel | G- | IV | 3 | 800 | F+ | 4 | 57 | 460 | F+ | 6 | C | 608 | 1868 | EX-04 | A2 |
| 50 | Regional Director General, Maritimes | G- | IV- | 3 | 800 | F+ | 4 | 57 | 460 | F++ | 4- | P | 608 | 1868 | EX-04 | A2 |
| 51 | Assistant Deputy Minister, Public Affairs | G | IV- | 3 | 800 | G | 4 | 57 | 460 | G- | 5+ | C | 608 | 1868 | EX-04 | A2 |
| 52 | Assistant Deputy Minister, Chief Digital Officer | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 4- | P | 700 | 2228 | EX-04 | A1 |
| 53 | Assistant Deputy Minister, Citizen Services | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 3+ | P | 700 | 2228 | EX-04 | A1 |
| 54 | Assistant Deputy Minister, Small Business, Tourism and Marketplace | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 4- | P | 700 | 2228 | EX-04 | A1 |
| 55 | Vice-President, Corporate Services and CFO | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 5 | C+ | 700 | 2228 | EX-04 | A1 |
| 56 | Assistant Chief Statistician, Analytical Studies, Methodology and Statistical Infrastructure | G | IV | 3 | 920 | G+ | 4 | 66 | 608 | G | 3+ | P | 700 | 2228 | EX-04 | A1 |
| 57 | Assistant Deputy Minister, Community Safety and Countering Crimes Branch | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 3+ | P | 700 | 2228 | EX-04 | A1 |
| 58 | Assistant Commissioner, Human Resources Management | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 3+ | P | 700 | 2228 | EX-04 | A1 |
| 59 | Vice-President, Health Security and Infrastructure | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 3+ | P | 700 | 2228 | EX-04 | A1 |
| 60 | Regional Deputy Commissioner | G | IV | 3 | 920 | G | 4+ | 66 | 608 | G | 4 | P | 800 | 2328 | EX-04 | A2 |
| 61 | Assistant Deputy Minister, Policy | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 3+ | P | 700 | 2456 | EX-05 | L |
| 62 | Assistant Deputy Minister, Healthy Environments, Consumer Safety and Controlled Substances | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 4- | P | 700 | 2456 | EX-05 | L |
| 63 | Assistant Deputy Minister, Americas | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 4- | P | 700 | 2456 | EX-05 | L |
| 64 | Senior Assistant Deputy Minister, National and Cyber Security Branch | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 5+ | C | 700 | 2456 | EX-05 | L |
| 65 | Assistant Secretary, Expenditure Management Sector | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G- | 7 | I+ | 800 | 2556 | EX-05 | A1 |
| 66 | Assistant Deputy Minister, Science and Technology | G+ | IV | 3 | 1056 | G | 4+ | 66 | 700 | G | 4 | P | 800 | 2556 | EX-05 | A1 |
| 67 | Assistant Deputy Minister, Real Property Services | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G+ | 4- | P | 800 | 2556 | EX-05 | A1 |
| 68 | Assistant Deputy Minister, Information Management | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G- | 5- | P | 920 | 2676 | EX-05 | A2 |
| 69 | Assistant Deputy Minister, Infrastructure and Environment | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 5- | P | 920 | 2676 | EX-05 | A2 |
| 70 | Assistant Deputy Minister, Operations | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 4+ | P | 920 | 2676 | EX-05 | A2 |
| 71 | Assistant Deputy Minister, International Trade and Finance | G+ | IV | 3 | 1056 | G+ | 4 | 66 | 700 | G | 7+ | I | 920 | 2676 | EX-05 | A2 |
| 72 | Head of Mission, Beijing | G | IV+ | 3 | 1056 | G | 4+ | 66 | 700 | G | 7- | C | 920 | 2676 | EX-05 | A2 |
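The arithmetic behind these rows can be cross-checked: under the Hay methodology, point values sit on a geometric scale of roughly 15% steps, Problem Solving points are the stated percentage of Know-How points rounded to the nearest step, and the total is the sum of the three factor scores. A minimal sketch of that check, assuming this convention (the `STEPS` list and function names are illustrative, not part of the standard):

```python
# Hay-style step values covering the range used by these benchmarks
# (assumed scale: a geometric series in ~15% increments).
STEPS = [200, 230, 264, 304, 350, 400, 460, 528,
         608, 700, 800, 920, 1056]

def nearest_step(value: float) -> int:
    """Round a raw point value to the closest step on the scale."""
    return min(STEPS, key=lambda s: abs(s - value))

def total_points(know_how: int, ps_percent: int, accountability: int) -> tuple[int, int]:
    """Return (problem_solving_points, total) for a benchmark row."""
    ps = nearest_step(know_how * ps_percent / 100)
    return ps, know_how + ps + accountability

# Benchmark 33 (Director General, Integrity Risk Guidance Branch):
# Know-How 608, Problem Solving 57%, Accountability 400.
ps, total = total_points(608, 57, 400)
print(ps, total)  # 350 1358
```

For instance, 57% of 608 is 346.56, which rounds to the 350 step, and 608 + 350 + 400 gives the listed total of 1358; benchmark 35 pairs Know-How 608 with 50%, giving exactly the 304 step.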
