Income Tax Audit Manual
Compliance Programs Branch (CPB)
This chapter was last updated November 2019.
Hyperlinks to external or unaffiliated websites are for information purposes only. The Canada Revenue Agency (CRA) is not responsible for the content or practices of such websites. While efforts are made to keep hyperlinks current, this is not guaranteed.
Chapter 8.0 Business Intelligence and Quality Assurance
Table of Contents
- 8.1.0 Business Intelligence
- 8.2.0 For future use
- 8.3.0 For future use
- 8.4.0 Audit Quality Review
The Compliance Programs Branch has established the Business Intelligence and Quality Assurance (BIQA) structure to focus on risk to improve audit selection and to improve the quality of audits. BIQA is a regional team, with workload development and audit quality review done at the regional level.
The goals of BIQA are to:
- strengthen audit and internal controls;
- segregate duties;
- enhance risk assessment so issues of greatest risk are identified, selected, and audited;
- perform risk assessment and audit activities in accordance with professional standards;
- enhance oversight of audit activities to facilitate conformity and consistency with legislation and national policies and procedures;
- accurately and adequately rationalize and document all decisions, from workload selection to final audit reassessment; and
- identify, in a timely manner, training needs, emerging compliance issues, and discrepancies in audit procedures and policies.
8.1.0 Business Intelligence
The Business Intelligence (BI) Section is responsible for providing ongoing guidance, leadership, support, and oversight to the regional BI teams, with respect to risk assessment, identification, and the selection of audit cases for the Office Audit, Small Business Audit, and Medium Business programs.
The BI Section receives Integras support from the Integras Migration Team, which leads the Integras Suite of Solutions for the Small and Medium Enterprises Directorate (SMED).
- Provide support to regional BI teams, which includes:
  - establishing policies, procedures, and training;
  - establishing national standards and program priorities;
  - providing training support for analysts and screeners; and
  - undertaking program monitoring.
- Work with regional programs to provide direction on the development of Small and Medium Enterprises (SME) program workloads to reflect the Canada Revenue Agency (CRA)'s business intelligence and compliance strategies for SMED.
- Refine and develop new tools for risk assessment in conjunction with other areas.
- Coordinate the development of projects for Small, Medium, and Office Audit.
- Create the BI learning path, including new webinar products.
The regional BI teams are responsible for:
- selecting files for Small, Medium, and Office Audit to fulfill program business plans;
- reviewing files of Headquarters-directed mandatory workload;
- identifying the right files for each of the applicable audit programs in the region by screening leads and referrals, and creating self-generated workload;
- gathering, analyzing, integrating, and applying business intelligence in determining file selection; and
- collecting regional intelligence and contributing to national risk identification.
This structure allows for a more systematic and coordinated approach to achieve the stated program objectives at the regional and national levels.
The creation of regionalized BI teams ensures that there is adequate critical mass in the BI Program. This ensures that sufficient expertise exists and knowledge transfer takes place. It also facilitates analysis and sharing of risk information and business intelligence. Regionalization also creates better segregation of duties between file selection and audit activities.
For information about BI, go to:
- Business Intelligence Division;
- Business Intelligence and Quality Assurance; and
- memorandum September 16, 2013, The Regional Business Intelligence (Income Tax) Function and its impact on TSO audit activities.
8.2.0 For future use
8.3.0 For future use
8.4.0 Audit Quality Review
Quality audits are necessary to effectively deliver the CRA compliance programs’ mandate of identifying, addressing, and deterring non-compliance and promoting voluntary compliance. For this reason, enhancing audit quality is a priority for the audit programs and the Audit Quality Review (AQR) Program.
Definition of a quality audit
A quality audit is defined as one where:
- the auditor has identified and addressed material risk;
- the audit adjustments are technically correct;
- the auditor has adhered to the Taxpayer Bill of Rights; and
- a third party would clearly understand how the auditor reached their conclusions based on the information in the file.
8.4.1 Audit Quality Standards
AQR officers review selected cases against predetermined quality standards. The standards have been developed in consultation with the audit programs and reflect the Income Tax Act, and the policies and procedures outlined in the Income Tax Audit Manual (ITAM), Audit communiqués, memoranda, and current learning products.
Every two years, the AQR Program consults with stakeholders, including auditors in the field, to ensure the standards continue to be relevant, and to identify changes needed due to new or obsolete policies.
In between these major reviews, the Audit Quality Standards are revised when necessary, due to emerging issues or new audit policies and procedures. The version of the Audit Quality Standards that applies to an audit case is based upon the standards in place when the AQR case is created.
While certain standards or measurement criteria may differ depending on the functional audit program, as much as possible, the AQR Program tries to ensure standards are aligned between the programs. The program also works closely with the GST/HST AQR Program to ensure the standards for income tax Small Business Audit and Medium Business, Office Audit, and GST/HST are closely aligned, with program-specific differences where required.
The May 2019 standards are the most recent; go to Quality Assurance Section.
The one-day learning product, HQ1163-002, Conducting Quality Income Tax Audits, provides information on how the Audit Quality Standards are applied and on improving audit quality. As well, the AQR Program and the audit program areas are developing a series of outreach products focusing on how to ensure audit quality in particular aspects of the audit process, such as working papers or risk assessment.
The Audit Quality Standards are grouped under five elements of a quality audit:
- Planning
- Conducting
- Application of law, policy, and procedures
- End products
- Professionalism
The first four elements follow the natural progression of an audit. As the name suggests, the Planning element includes the preliminary file review and preparing and continuously updating the Audit Plan. The Conducting element addresses the field work portion of the case. It begins with the initial interview and tour of the premises, and requesting the relevant information and books and records, followed by the review of revenues, expenses, and balance sheet items, as well as indirect verification of income (IVI) procedures, where applicable. The Application of law, policy, and procedures element considers the technical accuracy of the results, whether the case conforms to CRA policies, and whether the taxpayer relief and penalty provisions were properly considered. The End products element includes the content of the working papers, reports, correspondence, reassessment documents, and audit results coding.
The Professionalism element touches on each stage of the audit, and covers confidentiality, security, and service to the taxpayer.
Within each element there are one or more standards.
Individual standards are referred to using the format E#-S#. The number following the E refers to one of the five elements noted above, while the number following the S refers to a particular standard under that element. For example, E2-S3 would indicate the third standard, Revenue, under the second element, Conducting.
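The E#-S# convention can be sketched as follows. The element names are taken from this chapter; the function itself is hypothetical and is not part of any CRA system.

```python
# Illustrative sketch of the E#-S# naming convention described above.
# The element numbering follows the five elements listed in this chapter.

ELEMENTS = {
    1: "Planning",
    2: "Conducting",
    3: "Application of law, policy, and procedures",
    4: "End products",
    5: "Professionalism",
}

def describe_standard(code: str) -> str:
    """Expand an identifier such as 'E2-S3' into a readable description."""
    element_part, standard_part = code.split("-")
    element = int(element_part[1:])    # digits after 'E'
    standard = int(standard_part[1:])  # digits after 'S'
    return f"standard {standard} under the {ELEMENTS[element]} element"

print(describe_standard("E2-S3"))
# standard 3 under the Conducting element
```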
For each standard, there are a number of measurement criteria. The first measurement criterion for every standard is always “fully met,” signifying that every aspect of the standard has been completely satisfied.
This is followed by measurement criteria identifying one or more ways in which the standard may not have been met. Next are “needs improvement” criteria, which indicate ways in which the standard has been met, but where the audit work could have been better. The final measurement criterion is “N/A,” used where the standard does not apply to the audit under review.
In some cases, a “not met” rating results in a change required (CR).
CRs are significant errors or omissions in an audit case that:
- could jeopardize the assessment;
- could result in an incorrect assessment;
- could diminish the credibility of the CRA;
- represent a failure to properly record the results of an audit;
- represent a failure to address the risk associated with a case; or
- represent a failure to follow CRA policies and procedures.
When any standard has been rated as not met, CR in a live case, the case is returned to Audit for revision.
Sometimes an error that results in a not met, CR, rating for one standard causes a “flow-through error” that affects other standards.
For example, a material calculation error in a working paper will impact the Audit Report, correspondence, reassessment documents, and results coding. Even though these documents may have otherwise been correctly completed, the calculation error will need to be corrected in all of them.
Only the original calculation error would be rated as a not met, CR. The flow-through changes directly related to the initial error would be rated as an actionable item (AI).
An AI can never be the only rating for a standard. If a standard is fully met except for the flow-through error, the AQR officer will rate it as 1, fully met, and as an AI, with a comment explaining what changes are required under the AI.
An AI rating is not included in a region’s statistics for regional reporting purposes. In other words, only the original error is counted as a CR. The flow-through errors are not.
Depending on the situation, more than one measurement criterion could apply to a particular standard. For example, say there was a risk of unreported revenue in an audit, and the auditor neither reconciled revenue nor performed a second supporting IVI test. Standard E2-S6 would be rated with both measurement criteria:
3. Reported revenue was not reconciled to the books or the reconciliation was incomplete and/or inaccurate. (not met, CR)
5. The second supporting IVI test was not conducted or was incomplete, inappropriate or inadequate. (not met, CR)
Reference/Learning and Interpretation/Review considerations
For each standard, there are also references and Interpretation/Review considerations.
Go to the most recent standards at Quality Assurance Section for a list of sections of the ITAM, communiqués, memoranda, learning products, and other publications that apply to each standard.
The Interpretation/Review considerations define certain terms in the measurement criteria and spell out the circumstances that would warrant a specific rating.
8.4.2 AQR team structure
Each region has one AQR team, with the exception of Ontario, which has two, due to the number of audits the region completes.
AQR teams report to an AQR team leader, who in turn reports to the regional BIQA director, or in some regions, to a BIQA manager who reports to the BIQA director.
Program 18 audits at the AU-02 level and Program 17 audits are reviewed by the AQR team in the auditor’s own region. On the other hand, reviews of Program 18 audits at the AU-03 level (also referred to as Basic Files) and all Office Audit files are centralized, regardless of the auditor’s location.
AQR officers include SP-06s who review Office Audit files, AU-01s and AU-03s who review SME files, and an AU-04 who reviews Basic Files. Usually, AQR staff are former auditors and audit team leaders.
8.4.3 Random file selection
Integras is used to select a random sample of cases for AQR based on the following rates, applied at the regional level to ensure proper coverage in each region:
- Office Audit – 17%
- Small Business Audit – 25%
- Medium Business – 25%
- Basic Files – 10%
Once the audit team leader reviews and approves the first case in a case set that is sent for finalization, the system applies the sampling algorithm. If that case is selected for AQR, then all the other cases in the case set should also be selected.
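The selection logic above can be sketched in simplified form. The coverage rates come from this chapter, but the function and its behaviour are illustrative only; the actual Integras sampling algorithm is not described here.

```python
# Hypothetical sketch of random AQR selection at the coverage rates listed
# above, including the case-set rule: if the first approved case in a set
# is selected, every case in the set goes to AQR.
import random

COVERAGE_RATES = {
    "Office Audit": 0.17,
    "Small Business Audit": 0.25,
    "Medium Business": 0.25,
    "Basic Files": 0.10,
}

def select_case_set(program: str, case_set: list[str],
                    rng: random.Random) -> list[str]:
    """Apply the sampling rate when the first case is approved.

    Returns the cases selected for AQR: the whole set, or none.
    """
    if rng.random() < COVERAGE_RATES[program]:
        return case_set  # the entire case set is selected for AQR
    return []            # no case in the set is reviewed

# Example: a two-year Small Business Audit case set.
selected = select_case_set("Small Business Audit",
                           ["2021 T1", "2022 T1"], random.Random())
```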
When a case is selected for AQR, the audit team leader receives a pop-up message in Integras and also a notification in their “My Messages” inbox, advising that the case has been selected for AQR and that no reassessment documents or final correspondence should be sent to the taxpayer until the AQR team advises that they can be sent.
Where possible, audits are reviewed on a live basis, before the reassessment is processed.
The AQR Program has established a service standard of providing the initial feedback on a live case to Audit within eight business days of the date the audit team leader approved the Integras case.
If the AQR does not result in any Audit Quality Standards being rated as not met, CR, no revisions to the audit are required. The audit case is released from the AQR team and the audit team leader is advised to process the adjustments or close the case without changes, and issue the final correspondence.
If the AQR results in any Audit Quality Standards being rated as not met, CR, then changes to the audit are required. The audit team leader is notified this is the case, and the audit case is returned to Audit for revision. The auditor will have access to the AQR case in Integras in order to review the standards, ratings, and feedback provided.
For each standard which received a CR rating, the auditor must provide comments detailing how the CR was addressed in the audit case, along with references to working papers where the changes were made, if applicable.
Sometimes, the changes required result in a change to the audit adjustments and the auditor must issue a revised proposal letter to the taxpayer.
The revised proposal letter templates are in the Integras Template Library. They are also available in the CRA Electronic Library > Compliance Programs Branch Reference Material > Audit > Income Tax – Forms and Letters > Letters > Chapter 11. See:
- A-11.1.1, Post-AQR Proposal - With Penalty
- A-11.1.2, Post-AQR Proposal - No Penalty
- A-11.1.3, Post-AQR Proposal - Repeated Offence Penalty
Once the AQR team is satisfied that all issues have been appropriately addressed by the auditor, the audit case is released for final processing.
Depending on factors such as the statute-barred dates of cases in the case set or the volume of AQR case intake, the review of some selected cases may be delayed, allowing the finalized audit case to be processed and the correspondence to be sent before the AQR is completed.
Delayed cases are reviewed using the same standards as are live cases. These reviews offer feedback to Audit on the strengths of the audit and areas for future improvement.
Usually, even where a standard has been rated with a CR, delayed cases are not returned to Audit for changes. The exceptions are when:
- the audit case contains technical errors (calculation errors or incorrect application of the legislation, policy and procedures), resulting in the taxpayer being over-assessed; or
- errors were made in processing the audit results that were not noted during the review by the audit team leader.
In such cases, an amending audit case will be generated in Integras so that the adjustment to the taxpayer can be processed. This policy aligns with ITAM 12.7.1, Situations that require a follow-up income tax audit.
In the future, in addition to randomly selected cases, some AQR Program reviews will be focused reviews, intended to provide more in-depth information on particular quality topics of interest. The subjects for these reviews will be chosen jointly by the audit program areas and the regions. For example, no change audits could be a possible focused review topic.
Depending on the topic selected, the focused reviews could be conducted on either a live basis or a delayed basis, and could be subject to a more in-depth look at all or a subset of the existing standards and/or to additional standards.
For every case the AQR team reviews, regardless of whether the review results in any CRs, it provides an AQR results report to the audit team leader and manager within the AQR Integras case once the review is complete.
The report includes an overall comment on the quality of the audit as well as a rating and in most cases, detailed comments for each standard. If a standard has been fully met (that is, rated as a 1), the AQR officer will only provide comments if additional feedback is warranted. For example, the officer may want to highlight a best practice or provide positive feedback.
If the ratings include any CRs in a live file, the auditor must make the necessary revisions to the case, and provide a response in the report for each CR, explaining how it was addressed. Then the case is sent back to the AQR team for a second review.
AQR reports should not be used directly in performance management, but the findings can indicate potential areas for improvement in future audits as well as training needs.
There is a dispute resolution process in place to resolve situations where Audit does not agree with the AQR results. This process has three levels. Resolving disputes should be treated as priority workload by everyone involved.
- Level One: The audit team leader and the auditor review the AQR findings and disagree with the results. The audit team leader contacts the AQR team leader to try to resolve the disagreement. Usually, disputes can be resolved at this level.
- Level Two: If no agreement can be reached at level one, then the regional BIQA director (or BIQA manager, if one exists) and the audit manager or assistant director of Audit discuss the issue.
- Level Three: If no agreement can be reached at level two, the regional assistant commissioner (RAC) makes the final decision.
In rare circumstances, based on the situation at hand, the BIQA director or the RAC may decide that although the AQR results are correct, there will be no changes/amendments made to the audit case. Under such rare circumstances, the AQR team will modify the comments to reflect that although the AQR ratings are correct, the audit case will not be changed/amended.
AQR statistical reports are prepared for each functional audit program semi-annually.
They include cumulative results for the fiscal year at both the national and regional level, and also highlight those standards where each region is achieving its best results or needs to improve.
Where a case received multiple ratings for a single standard, only one rating is included in the statistical report. For example, if a standard was rated with both a not met, CR and a needs improvement, only the not met, CR would be counted for the final report.
Each region is responsible for preparing and implementing Audit improvement action plans which describe how it will improve the quality of its audits in relation to:
- its top three not met standards;
- its top three standards needing improvement; and
- any risk assessment and conducting standards where its percentage not met is greater than 10%.
As well, the Headquarters (HQ) audit programs develop action plans to address quality issues of a national nature that have been identified in the reports, including taking appropriate actions to revise learning products if necessary.
The Quality Assurance Section at HQ (National QA) is responsible for the functional program design and direction of the AQR Program in close consultation with the HQ audit programs. It provides the regional teams with policies and procedures, technical guidance related to AQR, and AQR training and learning products.
To ensure the quality and consistency of AQR Program reviews, National QA undertakes a number of initiatives:
- AQR Program monitoring to ensure the AQR teams are following policies and procedures;
- National QA advisors review a sample of AQR reviews completed by AQR officers and provide feedback on the quality of the review;
- Calibration exercises, where all AQR officers review the same audit case, the responses are compared and the correct ratings are discussed;
- Training for new AQR officers; and
- Providing ongoing support by responding to enquiries made through the HQ mailbox and facilitating regular monthly conference calls. This communication also ensures that emerging issues are addressed in a timely manner.
National QA and the HQ audit programs regularly communicate to address national trends and situations where policies and procedures need to be clarified. National QA also liaises with other stakeholders, including Legislative Policy and Regulatory Affairs Branch, Appeals Branch, and International and Large Business Directorate.