Executive Group Position Evaluation Plan - September 2005
Table of Contents
- Table of Amendments
- Foreword
- Preface
- Introduction to Job Evaluation
- Guide Chart for Evaluating Know-How
- Measuring Know-How
- Guide Chart for Problem Solving Thinking
- Measuring Problem Solving / Thinking
- Government of Canada Executive Group—Guide Chart for Evaluating Accountability / Decision Making
- Measuring Accountability / Decision Making
- Using the Benchmarks
- Process for Selecting Suitable Benchmarks
- Appendix A - Executive (Ex) Group Definition
- Appendix B - Guide Charts
- Appendix C – Benchmark Index
Table of Amendments
Amendment Number | Date of Amendment | Nature of Amendment |
---|---|---|
1 | August 2012 | Addition of Table of Amendments, Update of Table of Contents, Update of Accountability Magnitude Index |
Foreword
Much has changed in the Canadian Public Service workplace, the economy and the world since the Executive Group Position Evaluation Plan (EGPEP) was last reviewed and updated in 1992. Internal and external pressures have re-shaped management values and the role of the Federal Government. Globalization and international security issues have given rise to many new and more powerful international organizations requiring greater government co-ordination. The delivery of many government services has been altered, shared or transferred to other government jurisdictions or the private sector.
Executive jobs have been highly impacted by these changes and the introduction of modern management frameworks that emphasize such Public Service values as providing citizen-centred program delivery, ensuring responsible spending, ensuring innovative and timely policy development, managing for results, operating within shared management accountabilities, behaving ethically, and applying effective people management practices.
The world of work for executives is constantly evolving to both anticipate and respond to citizen expectations of better service, employee expectations of visionary leadership and the impact of new technologies. Given the scope and magnitude of change in the way executive work is performed today, it has become imperative that the benchmarks appended to the Executive Group Position Evaluation Plan be modernized.
In assessing the nature of an executive position’s contribution to its organization, evaluators will need to be sensitive to the complexity, spirit and values implicit in the nature of this important work.
Preface
This manual has been prepared to assist and act as a guide for all Classification Specialists and members of the Executive Group involved in the classification of Executive Group positions.
The Executive Group Position Evaluation Plan (EGPEP) is a Hay-based position evaluation plan, which has been used to evaluate EX-level positions in the Federal Public Service since 1980. The Hay Guide Chart and Profile methodology, more commonly called the Hay Plan, is widely used by hundreds of organizations around the world in both public and private sector jurisdictions.
The EGPEP’s benchmark position descriptions have been reviewed and updated to reflect changes in management practices as well as the major operational and organizational shifts that have taken place since the Plan was last updated in 1992. These changes will ensure that the Plan remains current and relevant.
This manual is designed to:
- Clarify the basic concepts and principles underlying the job evaluation process.
- Provide guidelines that will foster consistency in the evaluation of Executive Group positions while retaining the flexibility required to properly reflect the diverse nature of these positions.
- Serve as an adjunct to the materials and experience received during basic job evaluation training or refresher courses.
To assist evaluators in developing a sound and comprehensive understanding of executive jobs, it is important that Executive Group position descriptions be up-to-date, so that they accurately reflect changes in organization structure, authorities, key responsibilities and management philosophy. Most importantly, they need to clearly demonstrate the position’s contribution to the organization.
This revised EGPEP manual supersedes the September 1992 edition.
Introduction to Job Evaluation
Job Evaluation Fundamentals
Purpose
Job evaluation provides senior management with a sound basis to:
- Establish the appropriate rank order of jobs
- Establish the relative distance between jobs within the ranking
- Provide a systematic measurement of job size relative to other positions, to make salary comparisons possible
- Provide a source of information on the work being done in a unit prior to making restructuring decisions
Fundamental Premises
The evaluation of Executive Group positions is based on the Hay Guide Chart Profile Method. The logic behind the Hay Method is:
- Every organization exists to produce identifiable end results
- An organization is created when more than one individual is required to accomplish the tasks to produce those end results
- Every viable job in an organization has been designed to make some contribution toward reaching those end results
- That contribution can be systematically measured
The Ranking / Validation Process
The Hay Method identifies the relative value (or weight) of positions within an organizational unit. The relationships are based on the relative degree to which any position, competently performed, contributes to what its unit has been created to accomplish.
Concept | Application |
---|---|
The notion of competent performance in job evaluation | Job evaluation measures the contribution made by a position, not the contribution an incumbent may or may not make to the position. Since jobs are designed on the assumption that they can and will be competently performed, the evaluator assumes that competent performance exists and makes no judgements about performance. |
The contribution the position makes to the unit is determined by measuring job content, as set out in the job description, using three measurement factors:
- Know-How
- Problem Solving / Thinking
- Accountability / Decision Making
The Hay Method uses these three factors and their sub-factors in a ranking process. The common measurement standard is the degree to which the three factors are found in one job relative to the degree to which they are found in another, with the Hay Guide Charts providing the technological tools for the ranking process. Thus, it is important to remember that there are no absolutes. It is simply a matter of determining how much more or less of each factor any job has relative to others around it. As a result, the two key activities in the evaluation process are:
- Ranking: Looking at jobs within their organizational context, not in isolation.
- Validation: Double-checking the accuracy of the relative weight given to the jobs for each of the three factors, through precise Benchmark comparisons.
Overview of the Evaluation Process
1. Understanding the Job
An accurate job description is an essential component of the job evaluation process. It provides the necessary information from which to construct an evaluation of the position. To do that, it must provide a clear and succinct description of:
- The job’s purpose and the end results for which it is accountable (found in the General and Specific Accountability statements)
- Where the job sits in organizational terms (found in the Organization Structure statement and the organization charts)
- The dimensions of the job (found in the Dimensions statement)
Two key concepts which govern the use of job descriptions in arriving at a valid evaluation are:
Concept | Application |
---|---|
The need for up-to-date job descriptions | The description should be up to date so that the job can be evaluated as it is, not as it was and not as it might be or could be. It should describe what is actually required of the job. Jobs change, and so it is important to have accurate, complete and current information. |
Avoiding title comparisons | The title of a position can provide a strong clue about where to look for appropriate Benchmark comparators. However, by themselves, title comparisons can distort valid evaluations, because what the job holder does and what occurs in another job with a similar title may not be the same at all. For this reason, titles are never adequate for making proper evaluations. |
2. Understanding the Job Context: Using the Organization Charts
It is vitally important to avoid viewing the job as though it exists in isolation. Organization Charts show two things:
- Where the position fits within the unit structure (its hierarchical level). This is very important information for identifying potential Benchmark comparators.
- The impact and influence of other jobs on the position. Organizational interrelationships, particularly where one job provides functional guidance to another, have a strong influence on job size. Organizational interrelationships can also indicate potential overlaps or duplications, which the job descriptions, taken in isolation, could mask.
The key concept for weighing the influence of organizational relationships is:
Concept | Application |
---|---|
The need to recognize both lateral and vertical relationships | Both vertical and lateral relationships affect job size. It is a common mistake to overlook the lateral relationships between peer positions and overemphasize the vertical ones between superior and subordinate. It is important to look at both equally critically. |
3. Evaluating the Position: Using the Three Factors
The three evaluation factors provide a common yardstick which makes it possible for actual job comparisons to be made. The three factors represented on the charts are:
- Know-How, which encompasses three scaled sub-factors:
- Depth and range of practical/technical/specialized Know-How
- Breadth of managerial and operational Know-How
- Criticality of human relations
- Problem Solving / Thinking, which encompasses two scaled sub-factors:
- Thinking environment
- Thinking challenge
- Accountability / Decision Making, which encompasses three scaled sub-factors:
- Freedom to act
- Area and type of impact
- Magnitude
Two key concepts which underlie these three evaluation factors are:
Concept | Application |
---|---|
Comparing jobs according to universal factors | It is possible to evaluate diverse jobs using the three factors of the Hay system because they incorporate the three fundamental characteristics that researchers have found are common to the nature of work and are therefore present to some degree in every job. These factors form a common “measuring stick” that can be appropriately applied to any job in order to evaluate the work done in the job. |
The need to focus on job content | The purpose of job evaluation is to establish, as objectively as possible, each job’s relationship to others in terms of content and requirements. This is particularly difficult if the current classification level, rating or historical relationship is referred to during evaluation. The evaluator must take pains to ignore the related assumptions that may go with knowing the suggested organizational level of the job, the incumbent, or the (likely) salary connected with the position. |
4. Using the Numbering Pattern of the Guide Charts
The numbering system on the Guide Charts is geometric, with values increasing in steps of approximately 15%. Since this numbering progression runs through all three Charts, evaluations always reflect step differences of 15%. For example: 100, 115, 132, 152, 175, 200 and so on, with the value doubling every five steps.
Concept | Application |
---|---|
Step differences: the building blocks of evaluation technology | The notion of step differences is critical because it provides a framework for consistent, quantified judgements to be made based on the minimum perceptible difference that well-informed and experienced evaluators can discern between jobs or elements of jobs. The minimum perceptible difference between factors or sub-factors has been shown to be 15%. This is why the numbering pattern used in the Charts is a progression of steps approximately 15% apart. |
The Charts used for Executive Group positions are a subset of the Hay Master Charts. As such, they have been “sized” to include only the relevant portions for evaluating Executive Group positions, plus a suitable floor and ceiling to provide the outer parameters for the evaluation context.
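The arithmetic of this numbering pattern can be illustrated with a short script. The following Python sketch is not part of the Plan; it simply takes the step values quoted in this manual and confirms that each value is roughly 15% above the previous one and that values double about every five steps.

```python
# Sketch: check the ~15% geometric numbering pattern used on the Guide Charts.
# The step values below are those quoted in this manual (100, 115, 132, ... 1400).

CHART_STEPS = [100, 115, 132, 152, 175, 200, 230, 264, 304, 350,
               400, 460, 528, 608, 700, 800, 920, 1056, 1216, 1400]

for lower, upper in zip(CHART_STEPS, CHART_STEPS[1:]):
    ratio = upper / lower
    assert 1.13 < ratio < 1.17, (lower, upper)   # each step is roughly 15% above the last

for i in range(len(CHART_STEPS) - 5):
    assert abs(CHART_STEPS[i + 5] / CHART_STEPS[i] - 2.0) < 0.02  # value doubles every 5 steps

print("Each step is ~15% above the previous one; values double every five steps.")
```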
5. Ranking the Position
Once points have been assigned to all three factors, it is a straightforward matter to make a preliminary ranking of the position based on the sum of those points. The minimum and maximum points for each of the levels in the Executive Group are provided in Appendix A.
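As a minimal illustration of this step, the sketch below simply sums the three factor scores and compares the total against point bands. The band boundaries shown are hypothetical placeholders; the actual minimum and maximum points for each EX level are set out in Appendix A.

```python
# Sketch: preliminary ranking by total points.
# The EX band boundaries below are HYPOTHETICAL placeholders for illustration only;
# the real minima and maxima are published in Appendix A of the Plan.

ILLUSTRATIVE_EX_BANDS = {          # level: (minimum points, maximum points)
    "EX-01": (400, 600),
    "EX-02": (601, 800),
    "EX-03": (801, 1050),
}

def preliminary_rank(know_how: int, problem_solving: int, accountability: int) -> str:
    total = know_how + problem_solving + accountability
    for level, (low, high) in ILLUSTRATIVE_EX_BANDS.items():
        if low <= total <= high:
            return f"{level} (total = {total})"
    return f"no illustrative band (total = {total})"

print(preliminary_rank(350, 152, 304))   # -> "EX-03 (total = 806)" with these placeholder bands
```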
6. Validation: Double Checking the Evaluation Logic
There are two aspects to ensuring the quality of Executive Group position evaluations. The first of these quality assurance measures involves double checking that the value assigned to each factor makes evaluation sense. Factor-specific validation checks are included in the explanation of each factor in this manual.
7. Validation: Reconciling the Evaluation with Others Around It
The other aspect of quality assurance involves ensuring that the evaluation makes sense within the continuum of Executive Group evaluations. This means double checking the evaluation against those for other positions around it in the unit (based on the organization charts), and also double checking the broader validity of the evaluation against the service-wide standard of the Benchmarks.
8. Documenting the Position
The final evaluation should be supported by a written rationale (as demonstrated in Appendix C of this manual).
Government of Canada Executive Group
Guide Chart for Evaluating Know-How
Know-How
Definition
Know-How is the sum total of EVERY kind of knowledge and skill, HOWEVER ACQUIRED, needed for COMPETENT JOB PERFORMANCE. Know-How has three components, the requirements for:
Note: Each of the components is marked with a certain number of dots. The number of dots corresponds to a section on the Guide Chart for Evaluating Know-How.
- Practical, Technical, Specialized Know-How
- Varied applied skills, including those relating to human relations, knowledge of the position’s environment and clientele (e.g., the public, industry, special interest groups, other governments, etc.), practical procedures, specialized techniques and/or scientific/professional disciplines.
- Managerial and Operational Know-How
- The Know-How and skill involved in guiding and integrating the resources associated with an organizational unit or function in order to produce the expected results. The knowledge and skills may be exercised executively ("acting as a manager") or consultatively ("thinking as a manager"). Involved is some combination of planning, organizing, integrating, coordinating, directing, motivating and developing human resources, controlling, evaluating, and checking. This Know-How may be required in providing service to the client/customer and/or advice to others, and becomes more critical as the conflicting demands and priorities of clients/customers increase.
- Criticality of Human Relations
- This is a measure of how relatively crucial, critical, and difficult are the various interpersonal relationships which positions must establish and maintain in order to achieve the objectives.
- Measuring Practical, Technical, Specialized Know-How
- This type of knowledge and skill may be characterized by breadth (variety), or depth (complexity), or both. Jobs may require some combination of: various skills; some knowledge about many things; a good deal of knowledge about a few things. Thus, to measure this kind of Know-How, the evaluator has to understand what skills are needed and how much knowledge is needed about how many things and how complex each of them is.
- Function
- A group of diverse activities which, because of common objectives, similar skill requirements, and strategic importance to an organization, are usually directed by a member of top management.
- Subfunction
- A major activity which is part of, and more homogeneous than, a function.
- Element
- A part of a subfunction; usually very specialized in nature and restricted in scope or impact.
Measuring Know-How
Know-How is the sum of every kind of knowledge and skill, however acquired, that is required for fully competent job performance. It can be thought of as “how much skill and/or knowledge about how many things and how complex each of them is.” It has three sub-factors:
- Depth and range of practical, technical or specialized Know-How
- Breadth of managerial and operational Know-How in planning, organizing, co-ordinating, directing, developing, controlling, evaluating and checking
- The criticality of interpersonal relationships in achieving objectives
Practical / Technical / Specialized Know-How
The depth and range of Practical, Technical or Specialized Know-How required in a position is measured on the vertical axis of the Chart.
There are three important concepts to grasp in order to apply the Practical/Technical/Specialized scale correctly:
Concept | Application |
---|---|
Equivalency of depth and breadth in Practical/Technical/Specialized Know-How | It is important to recognize that the demands for Practical Know-How in operational/service positions such as line management and human resources can be as great as the Technical/Specialized knowledge requirements of professional jobs such as engineering, science, law or education. |
The Know-How required to manage specialist positions | Managers do not necessarily need the same depth of subject-specific Technical or Specialized Know-How as those working below them. This is because managers are not required to do their subordinates’ jobs. However, they do require sufficient understanding of their subordinates’ areas of expertise to be able to manage their activities. Note: The management skill required is measured separately under Managerial and Operational Know-How. |
Equivalency of work experience and formal education | While it is true that some Know-How can only be gained formally (e.g., a PhD in physics), it is important to focus on the knowledge and skill required to do the work, not on how an incumbent might come to possess that knowledge, so as to avoid correlating Know-How level with educational level. |
Managerial and Operational Know-How
The requirement for Managerial and Operational Know-How is measured on the horizontal axis of the Chart.
Managers in Executive Group positions must know how to do such things as plan, organize, motivate, co-ordinate, direct, develop, control, evaluate or check the results of others’ work. This management skill can be required in direct activities (e.g., by line managers), through consultative activities which require thinking like a manager (e.g., by staff specialists), or both (as in positions which manage staff operations).
There are two key concepts to bear in mind when evaluating this sub-factor:
Concept | Application |
---|---|
The more complex the job, the broader the management skills required | Four elements affect the degree to which the need for management skills increases: |
Thinking like a manager | This can be summed up as the ability to look at the larger organizational picture in a situation - in effect, to put oneself in the shoes of one’s superior or another manager, even though one does not have the resources available to that position. For example: the head of a financial function may have to develop plans for the entire unit but may not control the resources needed to put the plan into effect. |
Criticality of Human Relations
Criticality of Human Relations is measured, along with Managerial and Operational Know-How, on the Guide Chart’s horizontal axis. This final Know-How sub-factor integrates the assessment of the practical requirement for using human relations skills into the measurement of job content, that is, the degree to which establishing and maintaining effective interpersonal relationships is central to the position achieving its objectives.
The requirement for using human relations skills on the job is represented by three possible levels. For most Executive positions, because of their size and/or nature, the achievement of objectives truly hinges on the establishment and maintenance of effective interpersonal relations. However, this is not the case for every Executive job.
Therefore, evaluators should not automatically assign level 3 in Criticality to every Executive job. In assessing each Executive position, evaluators must weigh a variety of considerations in making their judgements, such as:
- The degree to which the executive values of leadership and motivation are both integral to the job and highly complex or difficult in nature.
- The importance of “service” and client contact (both internal and external) as integral elements of the job.
- The nature of the client relationship(s).
In assessing the significance of client contact, evaluators should consider such factors as the frequency and nature or intensity of these contacts. There is a significant difference in the Criticality of Human Relations between a case where contact is established simply to gather or exchange information and/or opinions and a case where contact is established and maintained to influence decisions, processes or behaviours which are crucial to the organization successfully achieving its goals.
It is also important to relate the nature of the job’s contacts to its objectives. Evaluators should avoid being misled by statements in job descriptions which ascribe contacts to a job that are not in keeping with its objectives and accountabilities.
There are three levels of Criticality:
- Level 1
- Common courtesy must be employed and an appropriate working relationship established and maintained with subordinates, colleagues and superiors in order to accomplish the position’s objectives. However, there is no significant need to influence others in carrying out assignments. Interaction with others is generally for the purpose of a straightforward information exchange or seeking instruction or clarification.
- Level 2
- In dealing with subordinates, colleagues and superiors, and in the course of some contact with clients inside and/or outside government, it is necessary to establish and maintain the kind of relationships that will facilitate the acceptance and utilization of the position’s conclusions, recommendations and advice. In order to achieve desired results, positions have to interact regularly with subordinates, colleagues and superiors and have some contact with clients. The nature of these contacts is such that tact and diplomacy beyond the demands of normal courtesy are required.
- Level 3
- Successful achievement of the position’s program delivery and/or service and/or advisory objectives hinges on the establishment and maintenance of appropriate interpersonal relationships in dealings with subordinates, colleagues and superiors and in ensuring the provision of service through substantive contact with clients inside or outside government. Skills of persuasiveness or assertiveness as well as sensitivity to the other person’s point of view are essential to ensuring the delivery of service. This involves understanding the other’s point of view, determining whether a behavioural change is warranted and, most importantly, causing such a change to occur through the exercise of interpersonal skills.
The key concept to remember when evaluating the Criticality of Human Relations is:
Concept | Application |
---|---|
The difference between the need for good human relations skills and the need to know human relations theory | Most Executive Group positions require incumbents to interact with people. You measure the practical importance under Criticality of Human Relations. The focus is on putting skills into action. However, some highly specialized positions require that the incumbent have technical knowledge of human relations theory. Examples would be counselling positions. This knowledge of theory is measured under Practical/Technical/Specialized Know-How. However, the need to put this theory into action in the counselling process would be measured under Criticality of Human Relations. |
Combining the Know-How Elements
To this point, three independent decisions regarding Know-How have been made. For example:
 | Practical/Technical/Specialized | Managerial/Operational | Human Relations |
---|---|---|---|
Position 1 | F | II | 3 |
Position 2 | G | III | 3 |
Position 3 | G | II | 2 |
The total weight of Know-How is derived from the combination of the three sub-factors. The values assigned to the sub-factors will lead the evaluator to a “cell” on the Chart. This cell will contain three numbers, representing three step values.
For example: the F II 3 cell reads:
- 350
- 400
- 460
Normally, a solid fit on all three sub-factors would lead you to select the middle number in the cell. The final decision about which of these numbers to choose to represent the job’s total Know-How requirement will be based on the degree of confidence in the validity of the cell selected.
Regardless of the number chosen, you should record any shadings in your evaluation (i.e., any “pulls” up or down). You can do this by using an arrow up or down beside the sub-factor (such as F II 3 (↑) or F II 3 (↓)).
Concept | Application |
---|---|
Making numbering differentiations | The overlapping numbering system is designed to allow different jobs to receive equivalent points, if appropriate. The numbering system also permits the evaluator to show relative differences between jobs whose evaluations put them in the same cell. This is done by assigning a higher number from the cell to the stronger job. |
The continuum of the cells | The cells on the Chart represent stages along a continuum, not discrete steps. It is possible to carefully evaluate a position on each of the sub-factors and still be aware that the cell selected does not completely reflect your final opinion. In this case, you might choose the top or bottom number in the cell, depending on whether you thought there was a "pull" up or down on the evaluation. |
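A minimal sketch of this selection logic follows. Only the F II 3 cell values (350, 400, 460) are taken from the example above; the single-cell lookup table is an illustrative stand-in for the full Guide Chart, not a reproduction of it.

```python
# Sketch: choosing a Know-How point value from a Guide Chart cell.
# Only the F II 3 values (350, 400, 460) come from the example in this manual;
# a complete table would be populated from the actual Guide Chart.

KNOW_HOW_CELLS = {
    ("F", "II", 3): (350, 400, 460),   # (low, middle, high) step values in the cell
}

def know_how_points(depth: str, breadth: str, human_relations: int, pull: int = 0) -> int:
    """pull = -1 for a downward shading, 0 for a solid fit, +1 for an upward pull."""
    low, middle, high = KNOW_HOW_CELLS[(depth, breadth, human_relations)]
    return {-1: low, 0: middle, +1: high}[pull]

print(know_how_points("F", "II", 3))            # solid fit on all sub-factors -> 400
print(know_how_points("F", "II", 3, pull=+1))   # upward pull (F II 3 (↑))     -> 460
```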
Checking the Step Relationships of a Know-How Evaluation
There are some “rules of thumb” that can assist you in making/validating your judgments. It is important to bear in mind that these are “rules of thumb,” not hard-and-fast rules. They should not be used as a substitute for thorough analysis of the job and interpretation of the Guide Charts.
As a “rule of thumb,” when you are considering a hierarchy of jobs in a job family, technical ladder or reporting structure, the number of steps in the Know-How score can give some insight into the vertical structure of the hierarchy:
- One-step difference e.g., 460 to 528
- A “one-step” difference generally indicates a point of compression in the structure, giving reason to question the need for the number of organizational “layers” found. For example: one-over-one situations where the subordinate’s job is virtually a replica of the superior’s job or is “just perceptibly” different.
- Two-step difference e.g., 460 to 608
- This is the typical or logical relationship/vertical distance in a reporting sequence.
- Three-step difference e.g., 460 to 700
- Three steps between levels are characteristic of reporting relationships in organizations with a broad span of control.
- Four-step difference e.g., 460 to 800
- This represents a significant difference in terms of job content on the Know-How factor, suggesting that a level may be missing in the organizational structure. Care should be taken to ensure that the void exists in reality and that it is not the result of an evaluation error.
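Because the scale is geometric, the number of steps separating two Know-How scores can be estimated directly from their ratio (each step is about 15%). The short Python sketch below, not part of the Plan, applies that arithmetic to the examples above.

```python
import math

# Sketch: count the ~15% Guide Chart steps separating two Know-How scores,
# as used in the "rules of thumb" above.

def step_difference(lower_score: float, higher_score: float) -> int:
    """Number of Guide Chart steps between two scores (each step is ~ +15%)."""
    return round(math.log(higher_score / lower_score, 1.15))

print(step_difference(460, 528))  # -> 1: possible compression in the structure
print(step_difference(460, 608))  # -> 2: typical reporting relationship
print(step_difference(460, 700))  # -> 3: broad span of control
print(step_difference(460, 800))  # -> 4: a level may be missing (or an evaluation error)
```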
Validating Against the Benchmarks
The evaluation of the Know-How factor should always make sense within the continuum of Executive Group positions. There is a need to ensure that the logic of that continuum remains intact over time.
The method for validating against the continuum is to “prove” the evaluation by finding several comparable reference positions from the standardized Benchmark positions. This is the key test of the validity of an evaluation. Generally, the Benchmark validation step is done after the position has been evaluated against all three factors. The process is outlined in Appendix C.
Government of Canada Executive Group
Guide Chart for Evaluating Problem Solving / Thinking
- Problem Solving / Thinking Definition:
- Problem Solving/Thinking is the original, self-starting thinking required by the job to: (1) identify, (2) define, and (3) resolve a problem. "You think with what you know" - this is true of even the most creative work. The raw material of any thinking is knowledge of facts, principles and means. Ideas are put together from something already there. Therefore, Thinking is treated as a percentage utilization of Know-How. There are two components:
Note: Each of the two components is marked with a certain number of dots. The number of dots corresponds to a section on the Guide Chart for Evaluating Problem Solving / Thinking.
- The Thinking Environment:
- The extent to which assistance or guidance is available from others or from past practice or precedents and the degree to which the position is required to identify situations where direction or precedents are not applicable. How well/poorly defined is the problem, issue, etc.?
- The Thinking Challenge:
- The novelty and complexity of the thinking to be done and the time pressures within which the thinking must be done.
- Measuring Problem Solving / Thinking:
- Problem Solving/Thinking measures the intensity of the mental process which employs Know-How in analyzing, evaluating, creating, reasoning, arriving at and making conclusions. To the extent that Problem Solving/Thinking is circumscribed by standards, covered by precedents, or referred to others, the scope of the Problem Solving/Thinking is diminished, and the emphasis correspondingly is on Know-How.
- N.B.
- The evaluation of problem solving/thinking should be made without reference to the job’s freedom to make decisions or take action; the scope and nature of the job’s decisions are measured on the accountability/decision making chart.
Measuring Problem Solving / Thinking
Problem Solving / Thinking is the opportunity, need or requirement on the part of the position to put Know-How to use in original, self-starting thinking in order to deal with issues and solve problems on the job.
Measuring Problem Solving / Thinking involves evaluating the intensity of the mental processes required by the position. Activities include employing Know-How to analyze, identify, define, evaluate, draw conclusions about and resolve issues. To the extent that thinking is circumscribed by standards, covered by precedents or referred to others, the Problem Solving / Thinking requirement of the job is diminished.
The raw material of any Problem Solving / Thinking is knowledge of facts, principles and means. Ideas are put together from something already there: “You think with what you know.” This is true of even the most creative work.
However, this mental manipulation of Know-How is different from the straight application of skill measured by the Know-How factor. For this reason, not all of the Know-How required in a job will necessarily be utilized in the Problem Solving / Thinking elements of that job. Problem Solving / Thinking is therefore treated and measured as a percentage of Know-How, and the numbering pattern on the chart is comprised of a series of percentages rather than point values.
Problem Solving / Thinking has two dimensions:
- Thinking Environment (vertical axis) - how much assistance is available to help the incumbent do the thinking required.
- Thinking Challenge (horizontal axis) - the complexity and novelty of the thinking required and the time pressures within which the thinking must be done.
Thinking Environment
The first step in evaluating a job’s Problem Solving / Thinking element involves considering the amount of help available to the job holder. That help can come from precedents, people, and service-wide, department-wide or functional goals, policies, objectives, procedures, instructions or practices. In general:
- Goals, policies and objectives provide help by describing the “what” of a subject matter.
- Procedures detail the steps necessary to follow through on a policy (how, where, when, and by whom).
- Instructions and practices outline the specific how-to’s.
The degree to which help is available to job holders varies. For example: help from functional specialists and superiors may be less readily available to managers in geographically remote or organizationally isolated areas. The degree to which help is available is evaluated along the vertical axis of the Chart. There are no hard-and-fast rules. However, here are some guidelines:
- At the D level, what has to be done is often defined. How things have to be done is less defined.
- At the E and F levels, thinking is more about what has to be done. Naturally, how things are to be done is also not clearly defined.
- At the G and H levels, thinking is more about why things should be done. The what is generally less defined, and how things are done is not defined at all.
The key concept to remember when evaluating the Thinking Environment is as follows:
Concept | Application |
---|---|
The relationship between the Know-How level and the Problem Solving level | Logically, jobs do not require the incumbent to think beyond the limit of the Know-How required for the job. Therefore, the Thinking Environment level (as designated by its letter) should generally be no deeper than the depth of the Practical/Technical/Specialized Know-How level/letter previously assigned. (Example: When Practical/Technical/Specialized Know-How is at the F level, the Thinking Environment will probably be E or F - but not G). |
Thinking Challenge
Thinking Challenge, the second dimension of Problem Solving / Thinking, measures the complexity of the thought processes required of the job holder. It addresses the questions, “How tough are the things that come the job holder’s way in terms of the thinking to be done?” and, “How quickly must the thinking be done?” The various levels of “Thinking Challenge” appear across the top of the Problem Solving / Thinking chart.
There are three key concepts to bear in mind when evaluating Thinking Challenge:
Concept | Application |
---|---|
The definition of “problems” | “Problems” in this context refer to the wide range of challenges confronting job holders. The concept is not restricted to things that have gone wrong, although such things must certainly be considered. |
Equivalency of technical and management issues in measuring Thinking Challenge | It is a common mistake to pay too much attention to technical issues when considering Thinking Challenge and too little attention to management issues. Both are equally valid when considering this sub-factor. |
The difference between Thinking Environment and Thinking Challenge | Thinking Environment measures the context in which problem solving takes place, and its main constraint is the amount of help available in that context. Thinking Challenge measures the inherent difficulty of the thinking required, and its main constraint is the novelty of the issues being considered. |
Guide Chart to Select Problem Solving Points at the Intersection of the Column for the Know-How Score and the Row for the Problem Solving Percentage
Problem Solving % ↓ Know-How Points → | 50 | 57 | 66 | 76 | 87 | 100 | 115 | 132 | 152 | 175 | 200 | 230 | 264 | 304 | 350 | 400 | 460 | 528 | 608 | 700 | 800 | 920 | 1056 | 1216 | 1400
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
87% | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175c | 200c | 230c | 264c | 304c | 350c | 400c | 460c | 528c | 608c | 700c | 800c | 920c | 1056a | 1216a |
76% | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175c | 200c | 230c | 264c | 304c | 350c | 400c | 460c | 528c | 608c | 700c | 800b | 920a | 1056a |
66% | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175c | 200c | 230c | 264c | 304c | 350c | 400c | 460b | 528b | 608a | 700a | 800b | 920b |
57% | 29c | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175c | 200c | 230c | 264c | 304b | 350a | 400a | 460a | 528b | 608c | 700c | 800c |
50% | 25c | 29c | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175c | 200c | 230a | 264a | 304a | 350c | 400c | 460c | 528c | 608c | 700c |
43% | 22c | 25c | 29c | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115c | 132c | 152c | 175b | 200a | 230b | 264c | 304c | 350c | 400c | 460c | 528c | 608c |
38% | 19c | 22c | 25c | 29c | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100c | 115b | 132a | 152a | 175b | 200c | 230c | 264c | 304c | 350c | 400c | 460c | 528c |
33% | 16c | 19c | 22c | 25c | 29c | 33c | 38c | 43c | 50c | 57c | 66c | 76c | 87c | 100b | 115a | 132a | 152c | 175c | 200c | 230c | 264c | 304c | 350c | 400c | 460c |
Problem solving points are at the intersection of the column for the Know-How score and the row for the problem solving percentage
Combining the Problem Solving / Thinking Sub-Factors
The result of making independent judgements for each of the two Problem Solving / Thinking sub-factors is that the evaluation falls within a cell which contains two percentage step values.
Example: Two Percentage Step Values
- 50
- 57
Your choice of which specific Problem Solving / Thinking percentage to use to represent the job’s total Problem Solving / Thinking requirements will be a judgment, based on your “feel” for the strength or weakness of the job’s fit in relation to the Chart definitions of the two sub-factors. Generally, a “solid” fit in relation to the definitions should result in your choosing the lower number in the cell. A “pull” to a higher Thinking Environment or Thinking Challenge would change the choice to the higher percentage. For example:
Problem Solving / Thinking Evaluation = F4
- 50% solid fit choice
- 57% “pull” to G and/or 5
To determine Problem Solving / Thinking points, you can use the chart above. Simply locate the Problem Solving / Thinking percentage in the left column and the Know-How points along the top. The resulting Problem Solving / Thinking points are found at the intersection.
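Equivalently, the lookup can be approximated by multiplying the Know-How points by the Problem Solving / Thinking percentage and snapping the result to the nearest chart step value. The sketch below uses the step values from the table above; the snapping convention is an assumption that reproduces the published figures.

```python
# Sketch: derive Problem Solving / Thinking points from a Know-How score and a
# Problem Solving / Thinking percentage by snapping the product to the nearest
# Guide Chart step value (step values taken from the table above).

CHART_STEPS = [50, 57, 66, 76, 87, 100, 115, 132, 152, 175, 200, 230, 264, 304,
               350, 400, 460, 528, 608, 700, 800, 920, 1056, 1216, 1400]

def problem_solving_points(know_how_points: int, percentage: float) -> int:
    raw = know_how_points * percentage / 100.0
    return min(CHART_STEPS, key=lambda step: abs(step - raw))

print(problem_solving_points(400, 57))  # -> 230 (57% row, 400-point column of the table)
print(problem_solving_points(460, 87))  # -> 400 (87% row, 460-point column of the table)
```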
Checking the Problem Solving / Thinking Evaluation
Evaluators should take the time to review their Problem Solving / Thinking evaluations. Since Problem Solving / Thinking is the application of Know-How, experienced evaluators have found that the relationship between the two factors tends to fall into patterns. These patterns are shown by the legend a, b and c on the previous chart. They will serve as a general guide for checking the Problem Solving / Thinking evaluation:
- Normally, an evaluation should fall in the Most Likely (legend a) areas.
- An evaluation can fall in the Less Likely (legend b) areas as long as it can be supported by sound reasons.
- If an evaluation falls in the Unlikely (legend c) areas, the evaluation of both the Know-How and the Problem Solving / Thinking factors should be re-checked. It is possible that the body of knowledge the incumbent is expected to have is insufficient for thinking at the level indicated by the Problem Solving / Thinking evaluation, or that too much knowledge is expected of the position given the degree to which it will be put to use, as indicated by the Problem Solving / Thinking evaluation.
Validating Against the Benchmarks
The most important test of the validity of the evaluation is finding comparable reference evaluations in the standardized continuum of the Benchmarks, as outlined in Appendix C.
Government of Canada Executive Group—Guide Chart for Evaluating Accountability / Decision Making
- General
- Accountability/Decision Making is the measurement of the degree to which a job is responsible for achieving results and the importance of those results to the organization. There are three components, in the following order of importance:
- Freedom to Act
- The degree to which a job, through delegation or empowerment, acts independently to achieve end results before seeking advice or direction, as defined in the left-hand column of the Guide Chart.
- Impact
- As defined.
- Magnitude
- The size, relative to the whole Public Service, of the unit or function most clearly affected by the decisions AND/OR recommendations of the job. The process for determining Magnitude is described in Measuring Accountability / Decision Making.
- N.B.
- Magnitude and Impact must fit together; neither can be final or meaningful without being related to the other.
- Impact
- The degree to which the job affects or brings about the results expected of the unit or function being considered.
- Indirect (I)
- Supportive and ancillary services, where activities are noticeably removed from final decisions and assistance is modified or merged with other support before the end result stage.
- Contributory (C)
- Interpretative, advisory or facilitating services, for use by others in taking action, which are influential and closely related to action or decisions by others OR measurable contribution, as a member of a team, in achieving end results.
- Shared (S)
- Equal, joint, and significant control, with (usually only one) another position(s) (except own subordinates and superior), over the activities and resources which produce the results, OR control of what are clearly many (but not all) of the significant variables in determining results.
- Primary (P)
- Controlling Impact—The position has effective control over the significant activities and resources which produce the results, and is the sole position (at its level of Freedom to Act) which must answer for the results.
Measuring Accountability / Decision Making
Accountability / Decision Making measures the degree to which a job is responsible for action and the consequences of that action. It is the measured effect of the job on end results.
Up to this point, judgments have been made about the total Know-How required for fully competent job performance, and the degree of mental intensity employed in Problem Solving / Thinking. Now the task is to consider the job’s ability to bring about, or assist in bringing about, some specific end results. This includes considering the Magnitude of, and Impact on, those results.
The Accountability / Decision Making Guide Chart shows three sub-factors:
- Freedom to act: The freedom the incumbent has to make decisions and carry them out. This is the most important sub-factor.
- Impact on end results: How direct the job’s influence is on the end results of a unit, function or program.
- Magnitude (or result area impacted): The general size of the unit, function or program affected. This is the least important sub-factor.
Freedom to Act
Freedom to Act is the most important of the three Accountability / Decision Making sub-factors. By examining the nature and extent of the controls - or the lack of the controls - that surround the job, it directly addresses the question of the job’s freedom to take action or implement decisions. Because of its importance, this sub-factor carries the most quantitative weight in the evaluation of a position’s total Accountability / Decision Making.
The controls placed on the position’s Freedom to Act can be supervisory or procedural, or both. A key concept to keep in mind when considering Freedom to Act is:
Concept | Application |
---|---|
The difference between Freedom to Act and Thinking Environment | It is a common mistake to confuse the restraints placed on Freedom to Act with the help available in the Thinking Environment. The Thinking Environment measures the assistance available for the thinking the job requires, while Freedom to Act measures the controls placed on the position's decisions and actions. |
Since controls tend to diminish as you rise in the organization, Freedom to Act increases with organizational rank. However, while it is true that no job can have as much Freedom to Act as its superior, the evaluator should be wary of automatic slotting according to organization level alone.
Here are some broad guidelines that can help in assessing Freedom to Act:
- At the D level, positions are relatively free to decide how to achieve predetermined results under some direction from superior management. Positions at this level are subject to managerial approval of tactical objectives and periodic evaluation of results, generally on a quarterly or annual basis.
- At the E and F levels, positions are relatively free to determine what the general results are to be. Managerial direction will be general in nature. Assessment of end results must be viewed over longer time spans (e.g., six months to a year or longer).
- At the G level, the what is communicated only in very general terms. Positions become subject to guidance rather than direction or control. Any job evaluated here is subject only to broad policy.
Impact
While the explanation of how to evaluate Impact and Magnitude is presented sequentially, these two sub-factors should always be considered together.
The Impact sub-factor measures the directness of the position’s effect on end results. The Impact levels are as follows:
- I - Indirect:
- The position provides information, recording or other supportive services for use by others. Activities are noticeably removed from final decisions / end results. The position’s contribution is modified by or merged with other support before the end result stage.
- C - Contributory:
- The position provides interpretative, advisory or facilitating services for use by others or by a team in taking action. The position’s advice and counsel are influential and closely related to actions or decisions made by others. Such an Impact is commonly found in staff or support functions which significantly influence decisions related to various units or programs. For example:
- A senior labour relations specialist makes recommendations and administers policies and practices which affect the use of the unionized human resources of a unit. Note: The Magnitude of these resources might be represented by the human resources costs (payroll) of the unit.
- S - Shared:
- The position is jointly accountable with others (usually one other) for taking action and exercising a controlling Impact on end results. Positions with this type of Impact have noticeably more direct control over action than positions evaluated at the Contributory level, but do not have total control over all the variables in determining the end result. In addition, Shared Accountability can be used to indicate that a position makes an extremely strong contribution to end results (stronger than its peers) but does not have a Primary Impact.
- A basic rule is that Shared Impact does not exist vertically in an organization (i.e., between superior and subordinate). However, Shared Impact can exist between peer jobs within the same organization or with a position from outside the organizational unit. Shared Impact suggests a degree of partnership in, or joint Accountability for, the total result. In this way it differs from Contributory Impact, where the position is only accountable for a portion of the end result.
- The departmental Project Manager could be considered to have a Shared Impact on all design and construction activities carried out by Public Works and Government Services Canada in the construction of a major facility.
Note: There are few situations in the Public Service where true shared accountability exists.
- P - Primary:
- The position has controlling Impact on end results, and the accountability of others is subordinate. Such an Impact is commonly found in managerial positions which have line accountability for key end result areas, be they large or small. For example:
- The Director of a research unit may have Primary Impact upon the research activities done by all sections of the unit. A subordinate Manager within the unit may be accountable for the research activities in a section of the unit. Both positions could be evaluated at the Primary level, but the level of Magnitude (the size of the unit or function or activity) would vary.
Concept | Application |
---|---|
The relation between control and Primary Impact | The relative size of the unit is not an issue in deciding whether or not the position has Primary Impact on its results. The key is that the position has effective control over the significant activities and resources which produce the results and is the sole position, at its level of Freedom to Act, which must answer for them. |
Magnitude
Magnitude measures the size of the area affected by a position. While it does give an indication of the “weight” to be assigned to the position, it is the least important of the three sub-factors used to determine the overall Accountability / Decision Making evaluation.
For measuring Executive Group positions, a common, quantifiable means or “proxy” for representing the diverse units, functions and programs that could be affected by the position must be identified. Dollars have proven to be the most widely applicable “proxy” for measuring the Magnitude to be assigned to a given position.
However, to make a logical, rational determination of Magnitude, the evaluator must remember that dollars are simply a proxy, not an absolute measure. The key concept behind this is:
Concept | Application |
---|---|
Dollars are only a proxy to represent Magnitude | Dollars are the most convenient measure of the size of the unit or function affected by a job. However, this does not mean that jobs impact on dollars. Jobs impact on functions, programs or operations of organizational units. |
Dollar ranges on the Guide Chart correspond to varying alignments of Magnitude/Impact/Freedom to Act. The Benchmarks provide evaluators with a wide variety of examples of different levels of Magnitude to assist them in the determination of this sub-factor.
The Magnitude continuum has seven degrees, from “Very Small” to “Largest.” These headings provide a rough idea of the appropriate Magnitude for the subject position. References to the appropriate Benchmarks will help refine this initial determination. In this way, evaluators can arrive at a reasonable determination of Magnitude and avoid jumping immediately to a premature consideration of budget dollars.
Evaluators should use the following process for applying the proxy to establish the appropriate Magnitude:
- Determine and describe (in words) what part(s) and/or function(s) of the organization the job affects, and the nature of the job’s effect on each of them. For example: the position controls a Branch.
- Once the part(s) and/or function(s) most appropriate to the job have been identified, think about the relative size of the part(s) or function(s) under consideration and describe these in words. For example: the Branch is very small, small, large, very large, etc.
- Once these relationships have been articulated, verify them and the “size” selected for the job against the dimensions of the Benchmark positions.
Use of the Accountability Magnitude Index to Adjust for Inflation
To maintain consistency over time, the Magnitude proxies of the Benchmarks use constant dollars. To make comparisons between a subject job’s proxy dollars (which are expressed in current dollars) and the constant dollars in the Benchmarks, it is necessary to convert the current dollars into constant dollars. The annual Accountability/Magnitude Index (AMI) provides the factor used for this purpose.
The AMI is based on the implicit price deflator used by Statistics Canada to produce GNP data in constant dollars. To convert to constant dollars, divide current dollars by the current AMI. For example: if an operating and maintenance budget of $4 million were selected to represent the Magnitude of a position, you would divide this amount by the current AMI to arrive at constant dollars [$4 million / 6.50 = $615,385 (constant)].
The current AMI is updated periodically by the Public Service Human Resources Management Agency. The AMIs from financial year 1980/81 are as follows:
Year | Index |
---|---|
1980-1981 | 2.45 |
1981-1982 | 2.77 |
1982-1983 | 3.06 |
1983-1984 | 3.41 |
1984-1985 | 3.61 |
1985-1986 | 3.72 |
1986-1987 | 3.83 |
1987-1988 | 3.91 |
1988-1989 | 4.03 |
1989-1990 | 4.17 |
1990-1991 | 4.37 |
1991-1992 | 4.50 |
1992-1993 | 4.60 |
1993-1994 | 4.70 |
1994-1995 | 4.80 |
1995-1996 | 5.00 |
1996-1997 | 5.00 |
1997-1998 | 5.00 |
1998-1999 | 5.20 |
1999-2000 | 5.40 |
September 2000 | 6.00 |
September 2002 | 6.50 |
April 2006 | 7.00 |
September 2010 | 8.00 |
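A minimal sketch of this conversion follows, using index values from the table above; the worked figure matches the $4 million example given earlier.

```python
# Sketch: convert current dollars to constant dollars using the Accountability
# Magnitude Index (AMI). Index values below are taken from the table above.

AMI = {
    "1999-2000": 5.40,
    "September 2000": 6.00,
    "September 2002": 6.50,
    "April 2006": 7.00,
    "September 2010": 8.00,
}

def to_constant_dollars(current_dollars: float, period: str) -> float:
    """Divide current dollars by the AMI in effect for the given period."""
    return current_dollars / AMI[period]

# The example from the text: a $4 million O&M budget converted at an AMI of 6.50
print(round(to_constant_dollars(4_000_000, "September 2002")))  # -> 615385 (constant dollars)
```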
“Pass-Through Dollars”
Many positions may appear to have a very large Magnitude, but the dollars being used to measure the unit are “Pass-Through Dollars.” (Transfer payments to individuals or other jurisdictions under social programs which are controlled largely by legislation, regulation or formula fall into this category. An example would be Canada Pension Plan payments.) The key to handling Pass-Through Dollars is as follows:
Concept | Application |
---|---|
Pass-Through Dollars are not an appropriate Magnitude proxy | In cases of Pass-Through Dollars, the position deals with the process of payment but has practically no impact on determining whether payments should be made or what payments should be made. These dollars do not properly represent the Magnitude of the position. A more appropriate proxy should be found. |
Choosing the Correct Impact / Magnitude Combination
An evaluation score may differ depending on the combination of Impact and Magnitude used. For instance:
- A function head (e.g., a Director General of Human Resources) may be seen to have a Contributory Impact on the operations of the Department or a Primary Impact on the operations of the Human Resources Branch.
- Very often the point totals available in these two slots will be the same. Where they are not the same, it is advisable to use the higher score to properly reflect the full job size as long as you are confident that your reasoning is correct.
The key is to find the combination of Impact and Magnitude that results in the highest legitimate evaluation. This is because it is vital to get the fullest, most complete measure of the position for these two sub-factors so as to properly reflect the job size. Table 1 on the following pages provides some guidelines for evaluating certain types of expenditures when these are used as the proxy.
Dimensions | Impact of Operating Management |
Impact of Staff or Support Function |
Comments |
---|---|---|---|
1. Salary, operating and maintenance budget (used to represent an organizational unit) | Primary | Contributory or Indirect | The Impact of operating management is Primary because the main accountability for unit or program end results rests with operating management. The Impact of staff depends on the significance of the advisory and facilitating role (i.e., Policy Advisor could be C or I). |
2. Capital budget (used to represent a capital program) | Primary or Shared | Contributory, Indirect or None | The Impact of operating management is Primary when feasibility, design, construction, installation and utilization are controlled by a single manager (which is rarely the case). It is less (e.g., Shared) when a department project manager acts as a “knowledgeable” client for a specific project. The Impact of functional staff depends on the significance of the advisory and facilitating role. |
3. Human resources costs (used to represent the human resources function) | Contributory or Indirect | Contributory or Indirect | The number of positions which play a role in designing and/or implementing this function (e.g., central agencies, departmental managers, departmental Human Resources jobs) means that no one or two positions control the significant activities necessary for Primary or Shared. |
Combining the Accountability / Decision Making Sub-Factors
The result of evaluating the three sub-factors is that the evaluation falls within a cell with three possible point values, each representing one step up in size.
Example: A cell with three possible point values, each representing one step up in size
Low | 350 |
---|---|
Middle | 400 |
High | 460 |
The number chosen from the cell will depend upon your assessment of the relative strength of the job’s fit to sub-factor definitions. However, unlike the sub-factors of Know-How and Problem Solving/Thinking (which have a more or less equal weight in determining the factor’s score), in the case of Accountability / Decision Making, the fit of the Freedom to Act sub-factor is the most important one to consider.
There is a propensity for evaluators to forget this hierarchy of values and use Magnitude to drive the selection of the number in the cell. This is inappropriate. Considerations of Impact and Magnitude (the least important sub-factor) should be used to confirm the direction of the overall evaluation, as determined by considerations of Freedom to Act.
Dimensions | Impact of Operating Management | Impact of Staff or Support Function | Comments |
---|---|---|---|
4. Purchased materials and equipment (used to represent the purchasing function) | Shared, Contributory or Indirect | Shared, Contributory or Indirect | The Impact of Department of Public Works and Government Services people would be Contributory for the normal supply & service role, or Shared where their role is one of heavy involvement in determining specifications, in addition to the normal supply & service role. The extent to which departmental functional staff (e.g., head of informatics) act as the department’s purchasing agents will affect the Impact recognized in the evaluation. |
5. Grants and contributions (used to represent a program) | Contributory or Indirect | Indirect | The Impact could range from Contributory to Indirect depending on the degree of: |
6. Transfer payments (used to represent a program) | Indirect or None | None | When transfer payments are determined by a formula with no discretion, the Impact would likely be none. Where there is some discretion in determining amount and/or use, the Impact would likely be Indirect because the position has some effect on the program. |
Checking the Accountability / Decision Making Evaluation
Up, Down and Level Profiles
The evaluation score of a position gives an indication of its size, relative to other jobs. It answers the question, “How big is this job?” However, the relationship between the scores for the Problem Solving / Thinking and the Accountability / Decision Making factors is indicative of the shape of the job and answers the questions, “What sort of job is this?” “Is it characterized by thinking (Problem Solving) or action (Accountability), or is the balance about equal?”
- An action-oriented job is primarily oriented toward generating end results, and Problem Solving is secondary in such a job. Therefore, the points given to Accountability / Decision Making will be higher than those for Problem Solving / Thinking. This relationship is known as an Up, or “A,” Profile.
- A thinking job exists to apply Know-How in the analysis, investigation and identification of situations. The Problem Solving / Thinking points will be greater than those for Accountability / Decision Making. This is known as a Down, or “P,” Profile.
- A balanced or level job is one in which the Accountability / Decision Making and Problem Solving / Thinking points are the same. The position will be staff-oriented and have responsibility for managerial or supervisory functions. This is known as a Level, or “0,” Profile.
While there are no hard-and-fast rules, particular types of jobs do tend to have predictable profiles:
Down Profiles
- P4 P3
- Problem Solving / Thinking points exceed Accountability / Decision Making points by four or three steps, respectively. Jobs with these profiles will tend to be concerned with basic or pure research, with little orientation to, or regard for, development aspects. P4 Jobs will rarely be found outside a university.
- P2 P1
- Problem Solving / Thinking points exceed Accountability / Decision Making points by two steps or one step, respectively. Applied research or policy development jobs will tend to have these profiles.
Dimensions | Impact of Operating Management | Impact of Staff or Support Function | Comments |
---|---|---|---|
7. Revolving funds (represents payment received from clients for services rendered) | None | None | Individual positions do not have sufficient impact on what is to be measured. That is, the impact is less than Indirect. Payments received should not be double-counted against corresponding expenditures, nor should they be used to reduce operating expenditures to a net figure. |
8. Dimensions lying outside the Public Service such as value of the GNP | None or Contributory or Indirect | None or Indirect | The relationship of Public Service positions to these dimensions is, in most cases, too remote for the measurement of any Impact. Where influence can be clearly identified, the Impact of operating management positions is normally Indirect and is typically exerted through legislative, regulatory or enforcement authorities. Contributory Impact could apply to operating management positions in which the degree of control over end results is considerable. Indirect Impact could apply to staff or support functions when the degree of control or influence over end results is considerable. In all these instances, the incumbent must be identified in the job description and the performance review process as answerable for results. |
Level Profiles
- L
- Problem Solving / Thinking points equal Accountability / Decision Making points. Jobs with these profiles will tend to involve providing support services in staff functions or supervisory positions such as financial analysts or heads of functional specialties.
Up Profiles
- A1
- Accountability / Decision Making points exceed Problem Solving / Thinking points by one step. Jobs with A1 profiles are often hybrid jobs with significant people management responsibilities (such as human resources managers), line management positions, or jobs which receive a significant degree of direction from functional units, such as project managers or regional directors of administrative services.
- A2 A3
- These profiles are found in line management jobs which have a clear and well defined responsibility for achieving results, such as regional director for operations.
- A4
- Examples of this profile are unusual but can occur where the Accountability for results is high but the Problem Solving or Know-How content of the job is relatively low.
Up, Down and Level profiling allows the validity of evaluations to be checked against typical job profiles. Discrepancies, if found, may indicate an incorrect evaluation. However, they might also indicate an inappropriately structured job. Therefore, it is important to avoid letting profiles drive the evaluation process.
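For readers who want to check a profile mechanically, the step relationship described above can be expressed as a short calculation. The following Python sketch is illustrative only and is not part of the Plan; it assumes the standard point scale shown in the Guide Charts, where each listed value is one step above the previous one.

```python
# A minimal sketch, not part of the Plan: the Up/Down/Level profile expressed as
# the number of steps separating the two factor scores on the standard point
# scale used in the Guide Charts (each listed value is one step above the last).

POINT_SCALE = [200, 230, 264, 304, 350, 400, 460, 528, 608, 700,
               800, 920, 1056, 1216, 1400, 1600, 1840, 2112]

def profile(problem_solving: int, accountability: int) -> str:
    """Return 'A<n>' (Up), 'P<n>' (Down) or '0' (Level)."""
    steps = POINT_SCALE.index(accountability) - POINT_SCALE.index(problem_solving)
    if steps > 0:
        return f"A{steps}"   # action-oriented: Accountability / Decision Making leads
    if steps < 0:
        return f"P{-steps}"  # thinking-oriented: Problem Solving / Thinking leads
    return "0"               # Level profile

# Benchmark 7-A-2: Problem Solving 400, Accountability 528 -> 'A2'
# Benchmark 5-B-1: Problem Solving 264, Accountability 230 -> 'P1'
```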
Validating Against the Benchmarks
The most important test of the validity of the evaluation is finding comparable reference evaluations in the standardized continuum of the Benchmarks, as outlined in Appendix C.
Using the Benchmarks
The Benchmark reference positions have two critical roles to play in the job evaluation process:
- They provide the necessary discipline of a constant set of reference points.
The Benchmark evaluations have been thoroughly checked to ensure that the job evaluation method was applied consistently and appropriately. As a result, they provide a constant standard against which to evaluate positions, making them the key tool for ensuring consistency in the application of the Guide Chart methodology, over time, throughout the federal government.
- They allow for flexibility within a disciplined framework.
Given the number and complexity of Executive Group positions to be evaluated, it would be both impossible and counter-productive to attempt to provide hard-and-fast solutions for every possible situation. By providing a sufficient number of constant Benchmark positions, but allowing evaluators to use their common sense in using them, both flexibility and discipline can be built into the process.
Copies of the Benchmark reference positions are found in Appendix C.
Process for Selecting Suitable Benchmarks
1) How Benchmarks are arranged
The Benchmarks are sorted in two different ways:
- The Benchmark Job Descriptions are arranged into groups based on job function (Appendix C).
- There is a second listing of the positions by level, in descending order of total points (Appendix C).
2) Information needed about the subject position
In order to select suitable Benchmarks, the evaluator first needs to know:
- Where the subject position fits in the organization (e.g., number of levels from the Deputy Head)
- The job function (e.g., financial, operational, human resources)
- The basic nature of the job (e.g., to think deeply, as in research positions; to think broadly, as in policy development; to direct activities, as in field operations; or to administer policies and practices, as in staff positions)
- Whether the position is line or staff, regional or located at headquarters
All this information should be in the Job Description.
3) Selecting Suitable Benchmarks
Two or three Benchmark comparisons should be sufficient for testing the validity of an evaluation. A strong comparator is one in which the organizational context, the overall evaluation and the evaluations of the three factors are all similar to that of the subject position (i.e., fewer than three steps away on any one factor).
However, on occasion, it may be difficult to find a Benchmark that fits the subject position on all three factors. For example, one Benchmark might provide a close fit on the Know-How factor but not work well on the Accountability / Decision Making factor. The best thing to do would be to try to find other, more suitable, Benchmarks since the imbalance affects the profile fit of the two positions. However, if a good, overall fit cannot be found with any of the Benchmarks, the evaluator should look for an additional Benchmark position which provides a good fit for the missing factor (in the above example, Accountability / Decision Making).
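The “fewer than three steps away on any one factor” test lends itself to a simple illustration. The Python sketch below is not part of the Plan; it checks only the factor-step portion of the comparator test (organizational context and the overall evaluation must still be judged), and it assumes the standard point scale shown in the Guide Charts.

```python
# A minimal sketch, not part of the Plan, of the "fewer than three steps away on
# any one factor" portion of the comparator test. Organizational context and the
# overall evaluation must still be judged; the point scale is the one shown in
# the Guide Charts.

POINT_SCALE = [200, 230, 264, 304, 350, 400, 460, 528, 608, 700,
               800, 920, 1056, 1216, 1400, 1600, 1840, 2112]

def step_distance(points_a: int, points_b: int) -> int:
    """Number of steps separating two values on the point scale."""
    return abs(POINT_SCALE.index(points_a) - POINT_SCALE.index(points_b))

def factors_comparable(subject: dict, benchmark: dict) -> bool:
    """True when every factor is fewer than three steps from the subject position."""
    return all(step_distance(subject[factor], benchmark[factor]) < 3
               for factor in ("know_how", "problem_solving", "accountability"))

# Example A: the subject position (608, 304, 400) against Benchmark 6-A-2
# (608, 304, 350) -> True (step distances of 0, 0 and 1).
subject = {"know_how": 608, "problem_solving": 304, "accountability": 400}
benchmark_6_a_2 = {"know_how": 608, "problem_solving": 304, "accountability": 350}
print(factors_comparable(subject, benchmark_6_a_2))  # True
```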
General Accountability | |
---|---|
Subject Position | Is accountable for the proper and effective management and administration of all operations programs in the Region to ensure the provision of economic support, pension, medical examinations, social well-being and health care to qualified veterans and their dependants. Is accountable for the operation of the Saskatoon Veterans Home. |
Benchmark 7-A-2 | Is accountable for directing and managing regional operational functions essential to conducting the Agency’s statistical programs, and for developing the overall strategic framework and business plans for the regional operations to enhance the Agency’s capacity to provide Canadians with statistical information. |
Benchmark 6-A-2 | Is accountable for the efficient and effective direction of regional operations and programs designed to assist inmates and parolees with their reintegration into civil society. |
Benchmark 5-A-1 | Is accountable for providing a regional lens in the conception, development and delivery of national programs to promote good nutrition and informed use of drugs, food and natural health products, and to maximize the safety and efficacy of pharmaceutical drugs, food, natural health products, medical devices, biologics and related biotechnology products in the Canadian marketplace and health system to promote and protect the health of Canadians. |
Sample Validations Using Benchmarks
Below are two examples of the process for using Benchmarks. Descriptive statements from the job descriptions of the subject positions and the sample Benchmarks are shown with each example. These statements are not the only relevant differentiating considerations; they are presented simply to illustrate the thinking process involved in using Benchmarks.
Sample Validation: Example A
Subject Position: Regional Director General, Prairie Region
Evaluation:
Factors | Points | |
---|---|---|
Know-How | F III 3 | 608 |
Problem Solving / Thinking | F 4 (50%) | 304 |
Accountability / Decision Making | F 3 P | 400 |
Total | 1312 |
Benchmarks Selected:
Benchmark 7-A-2 (Director General, Regional Operations)
Factors | Points | 
---|---|---|
Know-How | G III 3 | 700 |
Problem Solving / Thinking | F 4 (57%) | 400 |
Accountability / Decision Making | F 4 P | 528 |
Total | 1628 |
Benchmark 6-A-2 (Assistant Deputy Commissioner, Operations)
Factors | Points | 
---|---|---|
Know-How | F III 3 | 608 |
Problem Solving / Thinking | F 4 (50%) | 304 |
Accountability / Decision Making | F 2 P | 350 |
Total | 1262 |
Benchmark 5-A-1 (Regional Director, Health Products and Food)
Factors | Points | 
---|---|---|
Know-How | F III 3 | 528 |
Problem Solving / Thinking | F 4 (50%) | 264 |
Accountability / Decision Making | F 2 P | 304 |
Total | 1096 |
Major challenge | |
---|---|
Subject Position | The position is expected to negotiate with provincial authorities to ensure an equitable distribution of contract beds for the priority use of veterans, and achieve an optimum care/cost ratio within the framework of provincial standards in Manitoba, Saskatchewan and Alberta. |
Benchmark 7-A-2 | The position must build national consensus for the implementation of policies and procedures that respond to the changing needs of clients. In meeting this challenge, the incumbent must balance increasing demands for quality information with public and political concern regarding invasion of privacy, response burden, voluntary participation and respondent refusal or mistrust. |
Benchmark 6-A-2 | The position leads the cooperative development of innovative programs with community support organizations or spiritual and cultural leaders to help offenders benefit from a broad range of programs and services, and assist them in successfully reintegrating into the community. |
Benchmark 5-A-1 | The position fosters a spirit of cooperation, collaboration, teamwork and partnership between the Department and the communities, through contacts with senior officials such as the Deputy Minister or Minister, Assistant Deputy Ministers, Presidents or Directors in provincial health, social services and education departments or agencies. |
Example A Explanation: The Know-How Factor
Subject Job | Director General, Regional Operations (7-A-2) | Assistant Deputy Commissioner, Operations (6-A-2) | Regional Director, Health Products and Food (5-A-1) |
---|---|---|---|
F III 3 608 | G III 3 700 | F III 3 608 | F III 3 528 |
- One consideration for relating the Know-How of the subject position to the Benchmarks is the management structure above each position. There are the same number of layers between the Deputy Minister and the position holders in all four cases.
- Another consideration is the diversity of the programs managed. While the program mixes of the subject position and the three Benchmark positions are relatively homogeneous in end results, the national role of Benchmark 7-A-2, the variety of its clientele and the ad-hoc nature of the demands placed upon it warrant a higher degree of specialized and practical skills and knowledge than the subject position and the other two Benchmark positions.
- A number of variables must be considered when evaluating Know-How: cultural diversity of the publics served, geographic dispersion of the territory managed, and size and type of staff managed. Benchmark 7-A-2 is accountable for the department-wide provision of products and services to a paying clientele with varied interests in and purposes for the products requested, and it manages a larger staff than the subject position and the other two Benchmark positions.
- The subject position and Benchmarks 6-A-2 and 5-A-1 are all responsible for managing the delivery of direct client services within a specific geographic area, whereas Benchmark 7-A-2 leads the management of service delivery nationally in all regions, which adds to the depth of procedures, techniques and disciplines required and the variety of environments and clienteles.
- The subject position is considered comparable to Benchmark 6-A-2 as both positions require a similar breadth and depth of knowledge to direct the delivery of the full range of departmental services at the regional level, whereas Benchmark 5-A-1 reports to a corporate ADM at National Headquarters and requires knowledge and skill related to one departmental business line.
Benchmark Comparisons Example A
Subject Position | Employees: | 317 |
---|---|---|
Operating budget: | $3.3 million | |
Grants and Contributions: | $15.0 million | |
Benchmark 7-A-2 | Employees (both Public Service and Statistics Act): | 3,012 (Non-census year) 49,645 (Census year) |
Operating budget: | $12.6 million (Non-census Year) $30.3 million (Census Year) |
Annual Sales: | $1.2 million | 
Benchmark 6-A-2 | Employees: | 52 |
Operating budget: | $438,500 | |
Benchmark 5-A-1 | Employees: | 32 |
Operating budget: | $310,000 |
Subject Job | Director General, Regional Operations (7-A-2) | Assistant Deputy Commissioner, Operations (6-A-2) | Regional Director, Health Products and Food (5-A-1) |
---|---|---|---|
F4 (50%) 304 | F 4 (57%) 400 | F 4 (50%) 304 | F 4 (50%) 264 |
- The geographic dispersion, the nation-wide leadership, the specific requirements of the clientele and the revenue-generating function create discernible differences between the complexity of programs managed by Benchmark 7-A-2 and those delivered by the subject position and the other two Benchmark positions.
- The difference in geographic areas serviced by the subject position and Benchmarks 6-A-2 and 5-A-1 is not considered to require significantly different innovative thinking.
Subject Job | Director General, Regional Operations (7-A-2) | Assistant Deputy Commissioner, Operations (6-A-2) | Regional Director, Health Products and Food (5-A-1) |
---|---|---|---|
F 3 P 400 | F 4 P 528 | F 2 P 350 | F 2 P 304 |
- The subject position and the Benchmark positions appear to have a similar degree of latitude and authority to act on behalf of the department within the scope of their delivery responsibilities in their respective areas.
- Benchmark 7-A-2 is viewed as making a noticeably more significant contribution to the achievement of the department’s end results than the subject position, hence a push to the middle number of the magnitude range of the Benchmark position. The other two Benchmark positions and the subject position are viewed as having a similar degree of contribution within their respective magnitude ranges.
General Accountability | |
---|---|
Subject Position | Is accountable for managing the development, implementation and maintenance of departmental financial and accounting policies, systems and procedures to meet the operational needs of management and to conform with the requirements of government acts, statutes and regulations, and for providing non-transfer payment accounting services and advice to responsibility centre managers across the Department. |
Benchmark 6-O-2 | Is accountable for establishing the Agency’s strategic and corporate guidelines in the sectors of modern comptrollership, human resources, financial resources and administrative services management, in order to achieve maximum effectiveness in the use of the Department’s human, financial and material resources. |
Benchmark 5-M-1 | As the senior full-time financial officer and comptroller, is accountable for the financial policies, systems, procedures, operations and accounting activities to ensure effective control and stewardship of the financial resources appropriated, and for ensuring the introduction and acceptance of modern accounting and financial management standards and practices. |
Benchmark 4-O-3 | Is responsible for the proper and effective management of the finance, materiel management, information systems, contract administration, and administration functions in the Department’s Atlantic Region. |
Sample Validation: Example B
Subject Position: Director, Financial Policies, Systems and Accounting
Evaluation:
Factors | Points | |
---|---|---|
Know-How | F III 3 | 460 |
Problem Solving / Thinking | E 4 (50%) | 230 |
Accountability / Decision Making | E 5 C | 264 |
Total | 954 |
Benchmarks Selected:
Benchmark 6-O-2 (Director General, Resource Management)
Factors | Points | 
---|---|---|
Know-How | G III 3 | 608 |
Problem Solving / Thinking | F 4 (57%) | 350 |
Accountability / Decision Making | F 2 P | 350 |
Total | 1308 |
Benchmark 5-M-1 (Director, Financial Management)
Factors | Points | 
---|---|---|
Know-How | F III 3 | 528 |
Problem Solving / Thinking | E 4 (50%) | 264 |
Accountability / Decision Making | E 5 C | 264 |
Total | 1056 |
Benchmark 4-O-3 (Regional Manager, Finance and Administration)
Factors | Points | 
---|---|---|
Know-How | F II 3 | 460 |
Problem Solving / Thinking | E 4 (43%) | 200 |
Accountability / Decision Making | E 2 P | 230 |
Total | 890 |
Major Challenge | |
---|---|
Subject Position | A major challenge for the Director is to provide management with an integrated accounting and financial systems operation that has effective accounting mechanisms and control, and at the same time meets the requirements of Parliament and Central Agencies with respect to departmental initiatives in the areas of Economic Development and Trust and Loan Fund Management. |
Benchmark 6-O-2 | The position is expected to provide corporate leadership in the development, dissemination and implementation of policies, systems, processes, services and control mechanisms for the seamless implementation of the management reform underpinning the modern comptrollership and modern management functions affecting the management of the Agency’s financial, material and human resources. |
Benchmark 5-M-1 | The position’s challenge is to ensure the continued integrity of the financial management data, systems and procedures while ensuring the successful implementation of significant changes in financial management practices, information systems and accounting procedures arising out of the Modern Comptrollership Initiative and other financial modernization initiatives. |
Benchmark 4-O-3 | The position guarantees the integrity and reliability of budgetary control and reporting systems in the Region to ensure that probity and prudence are duly exercised in the handling of regional resources. |
Subject Job | Director General, Resource Management (6-O-2) | Director, Financial Management (5-M-1) | Regional Manager, Fin. & Admin. (4-O-3) |
---|---|---|---|
F III 3 460 | G III 3 608 | F III 3 528 | F II 3 460 |
- While the subject position and Benchmark 5-M-1 are both rated F III 3, the Benchmark is a senior full-time financial officer and is thus seen as requiring higher degrees of professional and operational knowledge. Benchmark 6-O-2 reflects a degree of mastery required to provide executive direction at the first managerial level for the management of all the infrastructure support functions across the department. The subject position and Benchmark 4-O-3 reflect the depth of professional expertise required to provide executive level support in the delivery of the department’s programs.
- The managerial expertise required for the subject position is less than that required for Benchmarks 6-O-2 and 5-M-1, which have department-wide functional responsibilities. The department-wide role of the subject position and the focus of its end results, although given a higher rating (F III versus F II), are weighed as comparable with the regional focus of Benchmark 4-O-3, and the two positions receive the same numerical value.
Subject Job | Director General, Resource Management (6-O-2) | Director, Financial Management (5-M-1) | Regional Manager, Fin. & Admin. (4-O-3) |
---|---|---|---|
E 4 (50%) 230 | F 4 (57%) 350 | E 4 (50%) 264 | E 4 (43%) 200 |
- As the senior functional position in the department, Benchmark 6-O-2 is clearly involved in more demanding, extensive and varied issues than are the subject position or the other two Benchmark positions.
- The departmental functional roles of the subject position and of Benchmark 5-M-1 are viewed as requiring a higher degree of innovation, creativity and integration than Benchmark 4-O-3.
Benchmark Comparisons Example B
Subject Position | Division Employees: | 50 |
---|---|---|
Division Operating Budget: | $500,000 | |
Departmental Operating Budget | $885 million | |
Benchmark 6-O-2 | Branch Employees: | 30 |
Department Employees: | 380 | |
Branch Operating Budget: | $800,000 | |
Agency Operating Budget: | $6.7 million | |
Benchmark 5-M-1 | Division Employees: | 48 |
Division Operating Budget: | $484,000 | |
Department Operating Budget: | $177 million | |
Benchmark 4-O-3 | Region Employees: | 1,370 |
Division Employees: | 133 | |
Region Operating Budget: | $49 million | |
Division Salary Budget: | $866,000 |
Subject Job | Director General, Resource Management (6-O-2) | Director, Financial Management (5-M-1) | Regional Manager, Fin. & Admin. (4-O-3) |
---|---|---|---|
E 5 C 264 | F 2 P 350 | E 5 C 264 | E 2 P 230 |
- The ratings reflect the higher level of empowerment and contribution of Benchmark 6-O-2 in the achievement of expected end results and departmental objectives.
- Note that while the magnitude levels of the subject position and Benchmark 5-M-1 could warrant a difference in their respective numerical ratings, their contribution to the achievement of the department’s goals is viewed overall as being of equal value.
Likely Step Differences
Designation | Know-How | % Problem Solving | Accountability |
---|---|---|---|
“Normal” Superior / Subordinate Relationship | 2 | 1 | 3 |
Lean Staff | 3 | 1 or 2 | 3 to 5 |
Missing Level | 4 | 2 | 5 to 7 |
One Over One | 1 | 1 | 2 |
Organization Check
A crucial test of the validity of the evaluation is whether it fits with the evaluations for other positions in the unit. This means that, when you isolate each factor, the step differences between the subject job and the supervisor, peer and subordinate positions all make sense.
A common evaluation error is over-emphasizing the differences between peer positions and under-emphasizing the differences between superior and subordinate. Note, however, that there are no rules for determining the proper relationship between levels in an organization. Each case must be assessed on its own.
For example: the previous chart shows four organizational structures with very different superior/subordinate relationships. In each case, the step differences between the factors for the two levels change. However, these examples should not be taken as hard and fast rules. They simply serve to demonstrate:
- a variety of superior/subordinate relationships that can make sense
- the importance of looking at the reality of the actual departmental structure when testing the validity of a new evaluation
In the final analysis, as throughout the evaluation process, informed common sense should be the tool for making and checking all judgments.
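As an illustration only, the typical step differences in the “Likely Step Differences” chart above can be encoded and compared with the differences observed between a superior and a subordinate position. The structure names and the function below are hypothetical, not part of the Plan; a result outside the typical range is a prompt for informed judgment, not an automatic error.

```python
# An illustration only, not part of the Plan: the typical superior/subordinate
# step differences from the "Likely Step Differences" chart, compared with the
# differences observed for a subject job. The structure names and the function
# below are hypothetical.

TYPICAL_STEP_DIFFERENCES = {
    # structure: (Know-How, % Problem Solving, Accountability) as (min, max) steps
    "normal superior/subordinate": ((2, 2), (1, 1), (3, 3)),
    "lean staff":                  ((3, 3), (1, 2), (3, 5)),
    "missing level":               ((4, 4), (2, 2), (5, 7)),
    "one over one":                ((1, 1), (1, 1), (2, 2)),
}

def organization_check(structure, kh_steps, ps_steps, acc_steps):
    """Return the factors whose observed step difference falls outside the typical range."""
    labels = ("Know-How", "% Problem Solving", "Accountability")
    observed = (kh_steps, ps_steps, acc_steps)
    return [label
            for label, (low, high), value in zip(labels, TYPICAL_STEP_DIFFERENCES[structure], observed)
            if not low <= value <= high]

# organization_check("lean staff", kh_steps=3, ps_steps=1, acc_steps=6) -> ['Accountability']
```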
Appendix A - Executive (Ex) Group Definition
The Executive Group comprises positions located no more than three hierarchical levels below the Deputy or Associate Deputy level and that have significant executive managerial or executive policy roles and responsibilities or other significant influence on the direction of a department or agency. Positions in the Executive Group are responsible and accountable for exercising executive managerial authority or providing recommendations and advice on the exercise of that authority.
Inclusions
Notwithstanding the generality of the foregoing, it includes positions that have, as their primary purpose, responsibility for one or more of the following activities:
- Managing programs authorized by an Act of Parliament, or an Order-in-Council, or major or significant functions or elements of such programs;
- Managing substantial scientific or professional activities;
- Providing recommendations on the development of significant policies, programs or scientific, professional or technical activities; and
- Exercising a primary influence over the development of policies or programs for the use of human, financial or material resources in one or more major organizational units or program activities in the Public Service.
Exclusions
Positions excluded from the Executive Group are those whose primary purpose is included in the definition of any other group.
Classification Levels: Executive Group
The total points assigned through the evaluation process will determine the classification level for a newly evaluated position. Positions within the assigned point bands fall into compensation levels from EX-1 to EX-5. The bands are as follows:
Classification Level | Minimum Points | Maximum Points |
---|---|---|
EX-5 | 2448 | N/A |
EX-4 | 1868 | 2447 |
EX-3 | 1560 | 1867 |
EX-2 | 1262 | 1559 |
EX-1 | 920 | 1261 |
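As a simple illustration of the point bands above, the following Python sketch (not an official tool) maps a total evaluation score to its classification level.

```python
# A minimal sketch, not an official tool: mapping a total evaluation score to the
# Executive Group classification level using the point bands in the table above.

def classification_level(total_points: int) -> str:
    bands = [(2448, "EX-5"), (1868, "EX-4"), (1560, "EX-3"), (1262, "EX-2"), (920, "EX-1")]
    for minimum, level in bands:
        if total_points >= minimum:
            return level
    return "Below 920"   # below the EX-1 band, as shown for some benchmarks in Appendix C

# Example A subject position: 1312 total points -> 'EX-2'
print(classification_level(1312))
```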
Appendix B - Guide Charts
Government of Canada Executive Group
Guide Chart for Evaluating Know-How
- Definition
-
Know-How is the sum total of every kind of knowledge and skill, however acquired, needed for competent job performance.
Know-How has three components, the requirements for:
Note: Each of the components is marked with a certain number of dots. The number of dots corresponds to a section on the Guide Chart for Evaluating Know-How.
- Practical, Technical, Specialized Know-How
- Varied applied skills, including those relating to human relations, knowledge of the position’s environment and clientele (e.g., the public, industry, special interest groups, other governments, etc.), practical procedures, specialized techniques and/or scientific/professional disciplines.
- Managerial and Operational Know-How
- The Know-How and skill involved in guiding and integrating the resources associated with an organizational unit or function in order to produce the expected results. The knowledge and skills may be exercised executively ("acting as a manager") or consultatively ("thinking as a manager"). Involved is some combination of planning, organizing, integrating, coordinating, directing, motivating and developing human resources, controlling, evaluating, and checking. This Know-How may be required in providing service to the client/customer and/or advice to others, and becomes more critical as the conflicting demands and priorities of clients/customers increase.
- Criticality of Human Relations
- This is a measure of how crucial, critical and difficult the various interpersonal relationships are that positions must establish and maintain in order to achieve their objectives.
- Measuring Practical, Technical, Specialized Know-How
- This type of knowledge and skill may be characterized by breadth (variety), or depth (complexity), or both. Jobs may require some combination of: various skills; some knowledge about many things; a good deal of knowledge about a few things. Thus, to measure this kind of Know-How, the evaluator has to understand what skills are needed and how much knowledge is needed about how many things and how complex each of them is.
- Function
- A group of diverse activities which, because of common objectives, similar skill requirements, and strategic importance to an organization, are usually directed by a member of top management.
- Subfunction
- A major activity which is part of, and more homogeneous than, a function.
- Element
- A part of a subfunction; usually very specialized in nature and restricted in scope or impact.
Managerial and Operational Know-How | ||||||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
II. Operation of a unit with activities that are relatively similar in nature and objective, OR guidance of a sub-function(s) or several significant elements across several units. | III. Operation of a large unit with activities that are noticeably different in objectives and the nature of the end results, OR guidance of a function(s) which affects all of the organization. | IV. Operation of a major unit with activities which are significantly different and divergent with respect to objectives and end results, OR guidance of a strategic function(s) which significantly affects the organization’s planning and operation. | V. Management of all units and functions of a large organization, OR total management of the major segment of a very large organization. | |||||||||||
Human Relations | 1 | 2 | 3 | 1 | 2 | 3 | 1 | 2 | 3 | 1 | 2 | 3 | |
Practical, Technical, Specialized Know-How | E. A sound understanding of and skill in several activities which involve a variety of practices and precedents with respect to the organization’s processes, operation and clientele, OR a grasp of a scientific or similar discipline’s theory and principles, OR both. | Low | 200 | 230 | 264 | 264 | 304 | 350 | 350 | 400 | 460 | 460 | 528 | 608 |
Medium | 230 | 264 | 304 | 304 | 350 | 400 | 400 | 460 | 528 | 528 | 608 | 700 | ||
High | 264 | 304 | 350 | 350 | 400 | 460 | 460 | 528 | 608 | 608 | 700 | 800 | ||
F. Extensive knowledge and skill gained through a wide and/or deep exposure to the involved and/or diverse practices, processes, and issues relating to the organization and its clients, OR command of complex scientific theory, principles, and practices, OR both. | Low | 264 | 304 | 350 | 350 | 400 | 460 | 460 | 528 | 608 | 608 | 700 | 800 |
Medium | 304 | 350 | 400 | 400 | 460 | 528 | 528 | 608 | 700 | 700 | 800 | 920 |
High | 350 | 400 | 460 | 460 | 528 | 608 | 608 | 700 | 800 | 800 | 920 | 1056 |
G. Mastery of theories, principles, and techniques, or the cumulative equivalent command, of the interrelationships, variables, and competing demands of the organization and its clients, and related programmes and other issues necessary to advise AND/OR implement programmes at the executive management or executive policy levels of the organization. | Low | 350 | 400 | 460 | 460 | 528 | 608 | 608 | 700 | 800 | 800 | 920 | 1056 |
Medium | 400 | 460 | 528 | 528 | 608 | 700 | 700 | 800 | 920 | 920 | 1056 | 1216 |
High | 460 | 528 | 608 | 608 | 700 | 800 | 800 | 920 | 1056 | 1056 | 1216 | 1400 |
H. Externally recognized mastery and expertise in a complex scientific field or other learned discipline. | Low | 460 | 528 | 608 | 608 | 700 | 800 | 800 | 920 | 1056 | 1056 | 1216 | 1400 | |
Medium | 528 | 608 | 700 | 700 | 800 | 920 | 920 | 1056 | 1216 | 1216 | 1400 | 1600 | ||
High | 608 | 700 | 800 | 800 | 920 | 1056 | 1056 | 1216 | 1400 | 1400 | 1600 | 1840 |
Guide Chart for Evaluating Problem Solving / Thinking
- Definition:
-
Problem Solving/Thinking is the original, self-starting thinking required by the job to: (1) identify, (2) define, and (3) resolve a problem. "You think with what you know" - this is true of even the most creative work. The raw material of any thinking is knowledge of facts, principles and means. Ideas are put together from something already there. Therefore, Thinking is treated as a percentage utilization of Know-How.
There are two components :
Note: Each of the two components is marked with a certain number of dots. The number of dots corresponds to a section on the Guide Chart for Evaluating Problem Solving / Thinking.
- The Thinking Environment:
- The extent to which assistance or guidance is available from others or from past practice or precedents and the degree to which the position is required to identify situations where direction or precedents are not applicable. How well/poorly defined is the problem, issue, etc.?
- The Thinking Challenge:
- The novelty and complexity of the thinking to be done and the time pressures within which the thinking must be done.
- Measuring Problem Solving / Thinking:
- Problem Solving/Thinking measures the intensity of the mental process which employs Know-How in analyzing, evaluating, creating, reasoning, arriving at and making conclusions. To the extent that Problem Solving/Thinking is circumscribed by standards, covered by precedents, or referred to others, the scope of the Problem Solving/Thinking is diminished, and the emphasis correspondingly is on Know-How.
- N.B.
- The evaluation of PROBLEM SOLVING/THINKING should be made without reference to the job’s freedom to make decisions or take action; the scope and nature of the job’s decisions are measured on the ACCOUNTABILITY/DECISION MAKING Chart.
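The percentage treatment described above can be illustrated with a short calculation. The sketch below is not a stated rule of the Plan; the rounding convention is inferred from the worked evaluations in this manual, where the product of the Know-How points and the selected percentage appears to be taken to the nearest value on the standard point scale.

```python
# A minimal sketch of the percentage treatment described above. The rounding
# convention is inferred from the worked evaluations in this manual, not stated
# by the Plan: the product of the Know-How points and the selected percentage
# appears to be taken to the nearest value on the standard point scale.

POINT_SCALE = [200, 230, 264, 304, 350, 400, 460, 528, 608, 700,
               800, 920, 1056, 1216, 1400, 1600, 1840]

def problem_solving_points(know_how_points: int, percentage: int) -> int:
    raw = know_how_points * percentage / 100
    return min(POINT_SCALE, key=lambda value: abs(value - raw))

# Benchmark 7-A-2: Know-How 700 at 57% -> 400 (700 x 0.57 = 399)
# Benchmark 4-O-3: Know-How 460 at 43% -> 200 (460 x 0.43 = 197.8)
```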
Thinking Challenge | |||||
---|---|---|---|---|---|
3. Differing situations requiring search for solutions within the area of known things. Appropriate action selected based on experience. Some use of judgement required. | 4. Variable situations requiring analytical, interpretative, evaluative, and/or constructive thinking, often on short notice. | 5. Novel or non recurring, path finding situations in complex administrative or research situations requiring the development of new concepts and imaginative approaches, usually under some pressure. | |||
Thinking Environment | D. Thinking within clear but substantially diversified procedures; precedents covering many situations, and/or access to assistance. | Low | 29% | 38% | 50% |
High | 33% | 43% | 57% | ||
E. Thinking within a well defined frame of reference and towards specific objectives, in situations characterized by specific policies, practices, and precedents. | Low | 33% | 43% | 57% | |
High | 38% | 50% | 66% | ||
F. Thinking within a general frame of reference toward objectives, in situations with some nebulous, intangible, or unstructured aspects. | Low | 38% | 50% | 66% | |
High | 43% | 57% | 76% | ||
G. Thinking within concepts, principles, and broad guidelines toward the organization’s objectives or goals; many nebulous, intangible, or unstructured aspects to the environment. | Low | 43% | 57% | 76% | |
High | 50% | 66% | 87% | ||
H. Thinking within organization philosophy and/or natural laws and/or principles governing human affairs. | Low | 50% | 66% | 87% | |
High | 57% | 76% | 100% |
Guide Chart for Evaluating Accountability/Decision Making
- General
-
Accountability/Decision Making is the measurement of the degree to which a job is responsible for achieving results and the importance of those results to the organization.
There are three components in the following order of importance:
- Freedom to Act
- The degree to which a job, through delegation or empowerment, acts independently to achieve end results before seeking advice or direction, as defined in the left-hand column of the Guide Chart.
- Impact
- As defined.
- Magnitude
- The size, relative to the whole Public Service, of the unit or function most clearly affected by the decisions AND/OR recommendations of the job. The process for determining Magnitude is described in Measuring Accountability / Decision Making.
- N.B.
- Magnitude and Impact must fit together; neither can be final or meaningful without being related to the other.
- Impact
- The degree to which the job affects or brings about the results expected of the unit or function being considered.
- Indirect (I)
- Supportive and ancillary services, where activities are noticeably removed from final decisions and assistance is modified or merged with other support before the end result stage.
- Contributory (C)
- Interpretative, advisory or facilitating services, for use by others in taking action, which are influential and closely related to action or decisions by others OR measurable contribution, as a member of a team, in achieving end results.
- Shared (S)
- Equal, joint, and significant control, with (usually only one) another position(s) (except own subordinates and superior), over the activities and resources which produce the results, OR control of what are clearly many (but not all) of the significant variables in determining results.
- Primary (P)
- Controlling Impact—The position has effective control over the significant activities and resources which produce the results, and is the sole position (at its level of Freedom to Act) which must answer for the results.
Magnitude (Constant Dollars) | 1. Very Small (under $100K) | 2. Small ($100K to $1 Million) | 3. Medium ($1 to $10 Million) | 4. Medium–Large ($10 to $100 Million) | 5. Large ($100 Million to $1 Billion) | 6. Very Large ($1 to $10 Billion) | 7. Largest (over $10 Billion) |
---|---|---|---|---|---|---|---|
Impact | I | C | S | P | I | C | S | P | I | C | S | P | I | C | S | P | I | C | S | P | I | C | S | P | I | C | S | P | 
Freedom to Act | D. These jobs are subject, wholly or in part, to practices and procedures covered by precedents or well-defined policies; supervisory review, usually after the fact. | Low | 38 | 50 | 66 | 87 | 50 | 66 | 87 | 115 | 66 | 87 | 115 | 152 | 87 | 115 | 152 | 200 | 115 | 152 | 200 | 264 | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 |
Medium | 43 | 57 | 76 | 100 | 57 | 76 | 100 | 132 | 76 | 100 | 132 | 175 | 100 | 132 | 175 | 230 | 132 | 175 | 230 | 304 | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | ||
High | 50 | 66 | 87 | 115 | 66 | 87 | 115 | 152 | 87 | 115 | 152 | 200 | 115 | 152 | 200 | 264 | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 | 264 | 350 | 460 | 608 | ||
E. These jobs, by their nature or size, are subject to broad practices and procedures covered by functional precedents and policies; achievement of a circumscribed operational activity; direction from well-defined objectives. | Low | 57 | 76 | 100 | 132 | 76 | 100 | 132 | 175 | 100 | 132 | 175 | 230 | 132 | 175 | 230 | 304 | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | 304 | 400 | 528 | 700 | |
Medium | 66 | 87 | 115 | 152 | 87 | 115 | 152 | 200 | 115 | 152 | 200 | 264 | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 | 264 | 350 | 460 | 608 | 350 | 460 | 608 | 800 | ||
High | 76 | 100 | 132 | 175 | 100 | 132 | 175 | 230 | 132 | 175 | 230 | 304 | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | 304 | 400 | 528 | 700 | 400 | 528 | 700 | 920 | ||
F. These jobs, by their nature or size, are broadly subject to functional policies and goals; managerial direction of a general nature. | Low | 87 | 115 | 152 | 200 | 115 | 152 | 200 | 264 | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 | 264 | 350 | 460 | 608 | 350 | 460 | 608 | 800 | 460 | 608 | 800 | 1056 | |
Medium | 100 | 132 | 175 | 230 | 132 | 175 | 230 | 304 | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | 304 | 400 | 528 | 700 | 400 | 528 | 700 | 920 | 528 | 700 | 920 | 1216 | ||
High | 115 | 152 | 200 | 264 | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 | 264 | 350 | 460 | 608 | 350 | 460 | 608 | 800 | 460 | 608 | 800 | 1056 | 608 | 800 | 1056 | 1400 | ||
G. These jobs, by reason of their size, independent complexity and high degree of effect on department results, are subject only to general guidance from top-most management. | Low | 132 | 175 | 230 | 304 | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | 304 | 400 | 528 | 700 | 400 | 528 | 700 | 920 | 528 | 700 | 920 | 1216 | 700 | 920 | 1216 | 1600 | |
Medium | 152 | 200 | 264 | 350 | 200 | 264 | 350 | 460 | 264 | 350 | 460 | 608 | 350 | 460 | 608 | 800 | 460 | 608 | 800 | 1056 | 608 | 800 | 1056 | 1400 | 800 | 1056 | 1400 | 1840 | ||
High | 175 | 230 | 304 | 400 | 230 | 304 | 400 | 528 | 304 | 400 | 528 | 700 | 400 | 528 | 700 | 920 | 528 | 700 | 920 | 1216 | 700 | 920 | 1216 | 1600 | 920 | 1216 | 1600 | 2112 |
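As an illustration of the Magnitude bands in the chart above, the sketch below (not an official tool) maps a constant-dollar proxy to its Magnitude column. The chart's bands meet at their boundaries, so the treatment of exact boundary values here is an assumption.

```python
# A minimal sketch, not an official tool: mapping a constant-dollar proxy to its
# Magnitude column (1 to 7) using the bands in the chart above. The bands meet at
# their boundaries, so the treatment of exact boundary values is an assumption.

def magnitude_column(constant_dollars: float) -> int:
    thresholds = [100_000, 1_000_000, 10_000_000, 100_000_000,
                  1_000_000_000, 10_000_000_000]
    column = 1                     # 1. Very Small (under $100K)
    for threshold in thresholds:
        if constant_dollars >= threshold:
            column += 1            # one band higher per power of ten
    return column

# $3.3 million (the Example A subject position's operating budget) -> column 3
# (Medium, $1 to $10 Million).
print(magnitude_column(3_300_000))  # 3
```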
Appendix C - Benchmark Index
Bnumber | Benchmark Number | Level | Function | Position Title | Know-How | Problem Solving | Accountability | Total | Profile |
---|---|---|---|---|---|---|---|---|---|
10 | 10-A-1 | EX-05 | Program/Service Delivery to Canadians | Vice-President, Operations | GIV3 1056 |
G4(66) 700 |
G4P 800 |
2556 | A1 |
09 | 9-A-1 | EX-05 | Program/Service Delivery to Canadians | Assistant Deputy Minister / Regional Executive Head, Ontario | GIV3 920 |
G4(66) 608 |
G5P 920 |
2448 | A3 |
08 | 8-A-1 | EX-04 | Program/Service Delivery to Canadians | Deputy Commissioner, Prairies | GIV3 800 |
G4(57) 460 |
G4P 700 |
1960 | A3 |
07 | 7-A-1 | EX-03 | Program/Service Delivery to Canadians | Executive Director, Canada Business Service Centres | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
07 | 7-A-2 | EX-03 | Program/Service Delivery to Canadians | Director General, Regional Operations | GIII3 700 |
F4(57) 400 |
F4P 528 |
1628 | A2 |
06 | 6-A-1 | EX-02 | Program/Service Delivery to Canadians | District Director, Metropolitan Montreal | FIII3 608 |
F4(50) 304 |
F3P 350 |
1262 | A1 |
06 | 6-A-2 | EX-02 | Program/Service Delivery to Canadians | Assistant Deputy Commissioner, Operations | FIII3 608 |
F4(50) 304 |
F2P 350 |
1262 | A1 |
05 | 5-A-1 | EX-01 | Program/Service Delivery to Canadians | Regional Director, Health Products and Food | FIII3 528 |
F4(50) 264 |
F2P 304 |
1096 | A1 |
04 | 4-A-1 | EX-01 | Program/Service Delivery to Canadians | Director, Human Resources Centre Canada | FIII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
04 | 4-A-2 | EX-01 | Program/Service Delivery to Canadians | Director, Housing and Equipment | FIII3 460 |
F4(50) 230 |
E3P 264 |
954 | A1 |
10 | 10-B-1 | EX-05 | Corporate Leadership to Programs/Services | Assistant Deputy Minister, Claims and Indian Government | GIV3 1056 |
G4(66) 700 |
G6C 700 |
2456 | 0 |
09 | 9-B-1 | EX-04 | Corporate Leadership to Programs/Services | Assistant Deputy Minister, Operations | GIV3 920 |
G4(66) 608 |
G4P 800 |
2328 | A2 |
09 | 9-B-2 | EX-04 | Corporate Leadership to Programs/Services | Assistant Commissioner, Operations and Programs | GIV3 920 |
G4(66) 608 |
G5C 608 |
2136 | 0 |
08 | 8-B-1 | EX-04 | Corporate Leadership to Programs/Services | Assistant Deputy Minister, Oceans | GIII3 800 |
G4(57) 460 |
G3P 608 |
1868 | A2 |
07 | 7-B-1 | EX-03 | Corporate Leadership to Programs/Services | Director General, Primary Health Care and Public Health | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
06 | 6-B-1 | EX-02 | Corporate Leadership to Programs/Services | Director General, Interregional Interventions and Partnerships | FIII3 608 |
F4(57) 350 |
F2P 350 |
1308 | 0 |
05 | 5-B-1 | EX-01 | Corporate Leadership to Programs/Services | Director, Trade Integration | FIII3 528 |
F4(50) 264 |
F4I 230 |
1022 | P1 |
04 | 4-B-1 | EX-01 | Corporate Leadership to Programs/Services | Director, Operations and Regional Coordination | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
10 | 10-C-1 | EX-05 | Representing Canada’s Interests Abroad | Head of Mission | GIV3 1056 |
G4(66) 700 |
G7I 800 |
2556 | A1 |
09 | 9-C-1 | EX-04 | Representing Canada’s Interests Abroad | Vice-President, Asia | GIV3 920 |
G4(57) 528 |
G4P 700 |
2148 | A2 |
07 | 7-C-1 | EX-02 | Representing Canada’s Interests Abroad | Head of Mission / Ambassador | GIII3 700 |
F4(50) 350 |
F4C 350 |
1400 | 0 |
06 | 6-C-1 | EX-02 | Representing Canada’s Interests Abroad | Counsellor / Program Manager, Political and Economic | FIII3 608 |
F4(50) 304 |
F5I 350 |
1262 | A1 |
05 | 5-C-1 | EX-01 | Representing Canada’s Interests Abroad | Program Manager, Immigration | FIII3 528 |
F4(50) 264 |
F2P 304 |
1096 | A1 |
04 | 4-C-1 | EX-01 | Representing Canada’s Interests Abroad | Director, Circumpolar Affairs | FII3 460 |
F4(50) 230 |
F1P 230 |
920 | 0 |
10 | 10-D-1 | EX-05 | Intergovernmental | Assistant Deputy Minister, Federal-Provincial Relations and Social Policy | GIV3 1056 |
G4(66) 700 |
G6C 920 |
2676 | A2 |
07 | 7-D-1 | EX-03 | Intergovernmental | Director General, Intergovernmental Affairs | GIII3 700 |
F4(57) 400 |
F6C 460 |
1560 | A1 |
05 | 5-D-1 | EX-01 | Intergovernmental | Director, Federal/Provincial/Territorial Relations | FIII3 528 |
F4(50) 264 |
E6I 230 |
1022 | P1 |
04 | 4-D-1 | EX-01 | Intergovernmental | Chief Negotiator | FIII3 460 |
E4(50) 230 |
E4C 230 |
920 | 0 |
04 | 4-D-2 | EX-01 | Intergovernmental | Regional Director, Intergovernmental Affairs and Operational Policy | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
10 | 10-E-1 | EX-05 | Safety of Canadians | Senior Assistant Deputy Minister, National Security | GIV3 1056 |
G4(66) 700 |
G5C 700 |
2456 | 0 |
08 | 8-E-1 | EX-04 | Safety of Canadians | Executive Director, Pest Management Regulatory Agency | GIII3 800 |
G4(66) 528 |
G3P 608 |
1936 | A1 |
08 | 8-E-2 | EX-03 | Safety of Canadians | Director General, Civil Aviation | GIII3 800 |
F4(57) 460 |
F3P 460 |
1720 | 0 |
07 | 7-E-1 | EX-03 | Safety of Canadians | Director General, Food | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
06 | 6-E-1 | EX-02 | Safety of Canadians | Director, Environmental Assessment | FIII3 608 |
F4(50) 304 |
F5C 350 |
1262 | A1 |
06 | 6-E-2 | EX-02 | Safety of Canadians | Regional Director, Civil Aviation, Atlantic Region | FIII3 608 |
F4(50) 304 |
F2P 350 |
1262 | A1 |
04 | 4-E-1 | EX-01 | Safety of Canadians | Regional Director, National Crime Prevention Centre, B.C. Region | FII3 460 |
F4(50) 230 |
F3S 264 |
954 | A1 |
10 | 10-F-1 | EX-05 | Research | Assistant Deputy Minister, Research | GIV3 1056 |
G4(66) 700 |
G4P 800 |
2556 | A1 |
08 | 8-F-1 | EX-03 | Research | Director General, Bioproducts and Bioprocesses National Science Program | GIII3 800 |
F4(57) 460 |
F3P 400 |
1660 | P1 |
07 | 7-F-1 | EX-03 | Research | Director General, Northern Forestry Centre | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
06 | 6-F-1 | EX-02 | Research | Regional Director, Geological Survey of Canada (GSC) - Quebec | GII3 608 |
F4(50) 304 |
F2P 350 |
1262 | A1 |
04 | 4-F-1 | EX-01 | Research | Manager, St. Lawrence Centre | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
07 | 7-G-1 | EX-03 | Leading Projects | Executive Director, Modern Comptrollership Initiative | GIII3 700 |
F4(57) 400 |
F7I 460 |
1560 | A1 |
05 | 5-G-1 | EX-01 | Leading Projects | Director, Service Integration | FIII3 528 |
F4(50) 264 |
F2P 264 |
1056 | 0 |
04 | 4-G-1 | EX-01 | Leading Projects | Project Manager, Canadian Frigate Life Extension Project | FII3 460 |
F4(50) 230 |
E2P 230 |
920 | 0 |
04 | 4-G-2 | EX-01 | Leading Projects | Director, Seniors Cluster | FII3 460 |
F4(50) 230 |
F2P 264 |
954 | A1 |
10 | 10-H-1 | EX-05 | Public Service Direction and Services | Assistant Deputy Minister, Real Property | GIV3 1056 |
G4(66) 700 |
G4P 800 |
2556 | A1 |
10 | 10-H-2 | EX-05 | Public Service Direction and Services | Assistant Deputy Minister, International Trade and Finance | GIV3 1056 |
G4(66) 700 |
G7I 920 |
2676 | A2 |
09 | 9-H-1 | EX-04 | Public Service Direction and Services | Chief Executive Officer, Translation Bureau | GIV3 920 |
G4(66) 608 |
G4P 700 |
2228 | A1 |
08 | 8-H-1 | EX-04 | Public Service Direction and Services | Assistant Secretary, Senior Personnel and Special Projects | GIII3 800 |
F4(57) 460 |
F6C 608 |
1868 | A2 |
08 | 8-H-2 | EX-04 | Public Service Direction and Services | Assistant Secretary to the Cabinet (Machinery of Government) | GIV3 800 |
G4(66) 528 |
F7C 608 |
1936 | A1 |
07 | 7-H-1 | EX-03 | Public Service Direction and Services | Director General, Banking and Cash Management | GIII3 700 |
F4(57) 400 |
F4P 460 |
1560 | A1 |
06 | 6-H-1 | EX-02 | Public Service Direction and Services | Director, Research, Strategic Planning and Policy Development | GIII3 608 |
F4(57) 350 |
F4C 304 |
1262 | P1 |
05 | 5-H-1 | EX-01 | Public Service Direction and Services | Director, Internal Audit Policy and Special Reviews | FIII3 528 |
F4(50) 264 |
E6I 264 |
1056 | 0 |
04 | 4-H-1 | EX-01 | Public Service Direction and Services | Director, Seized Property Management | FII3 460 |
E4(50) 230 |
E3P 264 |
954 | A1 |
04 | 4-H-2 | EX-01 | Public Service Direction and Services | Director, Central and Public Accounting | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
10 | 10-I-1 | EX-05 | Policy and Planning | Assistant Deputy Minister, Policy | GIV3 1056 |
G4(66) 700 |
G3P 700 |
2456 | 0 |
09 | 9-I-1 | EX-04 | Policy and Planning | Assistant Deputy Minister, Policy | GIV3 920 |
G4(66) 608 |
G5C 608 |
2136 | 0 |
08 | 8-I-1 | EX-03 | Policy and Planning | Director General, Strategic Policy Planning and Intergovernmental Relations | GIII3 800 |
F4(57) 460 |
F5C 460 |
1720 | 0 |
06 | 6-I-1 | EX-02 | Policy and Planning | Director, Policy, Planning and Partnerships | FIII3 608 |
F4(50) 304 |
F4C 350 |
1262 | A1 |
06 | 6-I-2 | EX-02 | Policy and Planning | Director General, Policy and Planning | GIII3 608 |
F4(57) 350 |
F5C 400 |
1358 | A1 |
05 | 5-I-1 | EX-01 | Policy and Planning | Director, Policy and Initiatives | FIII3 528 |
F4(50) 264 |
F2P 264 |
1056 | 0 |
04 | 4-I-1 | EX-01 | Policy and Planning | Director, Heritage Policy | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
04 | 4-I-2 | EX-01 | Policy and Planning | Director, Strategic Operations Planning | FII3 460 |
F4(50) 230 |
F2P 264 |
954 | A1 |
04 | 4-I-3 | EX-01 | Policy and Planning | Director, Science Policy | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
04 | 4-I-4 | EX-01 | Policy and Planning | Director, Strategic and Operational Planning | FII3 460 |
F4(50) 230 |
F3C 264 |
954 | A1 |
08 | 8-J-1 | EX-03 | Audit and Evaluation | Director General, Audit and Ethics | GIII3 800 |
G4(57) 460 |
G5C 528 |
1788 | A1 |
07 | 7-J-1 | EX-02 | Audit and Evaluation | Director General, Audit and Evaluation | GIII3 700 |
F4(57) 400 |
F5C 400 |
1500 | 0 |
05 | 5-J-1 | EX-01 | Audit and Evaluation | Director, Evaluation | FIII3 528 |
F4(50) 264 |
F4C 304 |
1096 | A1 |
04 | 4-J-1 | EX-01 | Audit and Evaluation | Director, Audit and Evaluation | FIII3 460 |
E4(50) 230 |
E5C 230 |
920 | 0 |
04 | 4-J-2 | Below 920 | Audit and Evaluation | Director, Sector Operations Audit | FII3 460 |
E4(43) 200 |
E2P 200 |
860 | 0 |
10 | 10-K-1 | EX-05 | Knowledge Management | Assistant Deputy Minister, Information Management | GIV3 1056 |
G4(66) 700 |
G5P 920 |
2676 | A2 |
09 | 9-K-1 | EX-04 | Knowledge Management | Assistant Chief Statistician, Methodology and Informatics | GIV3 920 |
G4(66) 608 |
G4P 700 |
2228 | A1 |
09 | 9-K-2 | EX-04 | Knowledge Management | Executive Director, Departmental Information Products / Chief Information Officer | GIV3 920 |
G4(66) 608 |
G3P 700 |
2228 | A1 |
07 | 7-K-1 | EX-03 | Knowledge Management | Director General, Informatics | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
07 | 7-K-2 | EX-03 | Knowledge Management | Director General, Information Management and Technologies, and Chief Information Officer | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
06 | 6-K-1 | EX-02 | Knowledge Management | Director General, e-Government | FIII3 608 |
F4(57) 350 |
F5I 350 |
1308 | 0 |
05 | 5-K-1 | EX-01 | Knowledge Management | Director, Telecommunications and Spectrum Engineering and Support | FIII3 528 |
F4(50) 264 |
E4P 304 |
1096 | A1 |
04 | 4-K-1 | EX-01 | Knowledge Management | Director, Information Management | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
04 | 4-K-2 | EX-01 | Knowledge Management | Director, Business Systems | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
08 | 8-L-1 | EX-04 | Communications and Public Engagement | Assistant Deputy Minister, Communications and Consultations | GIV3 800 |
G4(57) 460 |
G3P 608 |
1868 | A2 |
07 | 7-L-1 | EX-03 | Communications and Public Engagement | Director General, Communications | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
05 | 5-L-1 | EX-01 | Communications and Public Engagement | Director, Communications Analysis and Policy Development | FIII3 528 |
F4(50) 264 |
E2P 230 |
1022 | P1 |
04 | 4-L-1 | EX-01 | Communications and Public Engagement | Director, Public Affairs | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
08 | 8-M-1 | EX-03 | Finance | Corporate Comptroller | GIII3 800 |
G4(57) 460 |
G5C 528 |
1788 | A1 |
07 | 7-M-1 | EX-03 | Finance | Director General, Finance | GIII3 700 |
F4(57) 400 |
F5C 460 |
1560 | A1 |
05 | 5-M-1 | EX-01 | Finance | Director, Financial Management | FIII3 528 |
E4(50) 264 |
E5C 264 |
1056 | 0 |
04 | 4-M-1 | EX-01 | Finance | Director, Financial Operations and Accounting Services | FII3 460 |
E4(50) 230 |
E2P 230 |
920 | 0 |
08 | 8-N-1 | EX-04 | Human Resources | Assistant Commissioner, Human Resources | GIII3 800 |
G4(57) 460 |
G3P 608 |
1868 | A2 |
07 | 7-N-1 | EX-03 | Human Resources | Director General, Human Resources | GIII3 700 |
F4(57) 400 |
F3P 460 |
1560 | A1 |
06 | 6-N-1 | EX-02 | Human Resources | Director General, Regional Civilian Human Resources Services | FIII3 608 |
F4(57) 350 |
F3P 400 |
1358 | A1 |
06 | 6-N-2 | EX-02 | Human Resources | Director General, Human and Corporate Services | GIII3 608 |
F4(57) 350 |
F2P 350 |
1308 | 0 |
04 | 4-N-1 | EX-01 | Human Resources | Director, Civilian Human Resources Service Centre, National Capital Region | FIII3 460 |
E4(50) 230 |
E3P 264 |
954 | A1 |
04 | 4-N-2 | Below 920 | Human Resources | Regional Manager, Human Resources | FII3 460 |
E4(43) 200 |
E2P 230 |
890 | A1 |
08 | 8-O-1 | EX-04 | Management and Secretariat Services | Assistant Commissioner, Corporate Services | GIV3 800 |
G4(57) 460 |
G5C 608 |
1868 | A2 |
06 | 6-O-1 | EX-02 | Management and Secretariat Services | Director General, Corporate Management and Review | FIII3 608 |
F4(57) 350 |
F5C 350 |
1308 | 0 |
06 | 6-O-2 | EX-02 | Management and Secretariat Services | Director General, Resource Management | GIII3 608 |
F4(57) 350 |
F2P 350 |
1308 | 0 |
04 | 4-O-1 | EX-01 | Management and Secretariat Services | Director, Corporate Secretariat | FII3 460 |
E4(50) 230 |
E5I 230 |
920 | 0 |
04 | 4-O-2 | EX-01 | Management and Secretariat Services | Regional Director, Management Services (Ontario) | FII3 460 |
E4(50) 230 |
E3P 264 |
954 | A1 |
04 | 4-O-3 | Below 920 | Management and Secretariat Services | Regional Manager, Finance and Administration | FII3 460 |
E4(43) 200 |
E2P 230 |
890 | A1 |
04 | 4-O-4 | Below 920 | Management and Secretariat Services | Corporate Secretary | FII3 460 |
E4(43) 200 |
E2P 200 |
860 | 0 |