06. The Nature and Evaluation of Application Controls
06.03. Evaluating Application Controls Effectiveness Through Testing
Briefly reflect on the following before we begin:
- What are key considerations when assessing the design and implementation of application controls?
- What elements would be critical in developing an audit program for testing application controls?
- How can data analytics enhance the testing of application controls, and what unique insights might it offer?
- How can an auditor effectively detect control failures and weaknesses in application controls?
- Why is monitoring the performance of application controls over time necessary, and how can continuous improvement be integrated into this process?
The effectiveness of application controls is not just a matter of implementation; it is a continuous assessment, monitoring, and improvement journey. This section aims to provide an in-depth understanding of effectively evaluating and enhancing application controls within an organization. It thoroughly examines how these controls are structured and integrated into the business environment. The design of application controls should meet the technical requirements and align with the organization’s business processes and objectives. Moreover, ensuring that controls meet compliance standards is paramount in this era of stringent regulatory requirements.
Once the controls are in place, the focus shifts to monitoring their performance over time. This is where setting up clear performance metrics and regular auditing procedures comes into play. Monitoring is not a one-off task but a continuous process that involves real-time techniques and analyzing trends and patterns in control performance. This ongoing monitoring is crucial for maintaining the effectiveness of application controls in the dynamic landscape of technology and business operations. However, application controls may face challenges and failures even with robust design and monitoring. Detecting these failures and weaknesses is a critical aspect of control evaluation and involves being vigilant about the red flags that indicate control issues. Root cause analysis is an essential tool in this context, helping to identify the underlying reasons for control failures. Understanding the impact of these failures on business operations and compliance is also key, as it guides the response strategies to mitigate any adverse effects.
The final and most dynamic aspect of evaluating application control effectiveness is the continuous improvement of these controls. The business and technological environments are constantly evolving, and so should the application controls. This improvement process concerns technology upgrades, learning from past experiences, and incorporating stakeholder feedback. Regular training and awareness programs for staff play a significant role in enhancing the effectiveness of controls. Furthermore, benchmarking against industry standards and best practices provides valuable insights for continuous improvement.
Evaluating the Design Effectiveness of Application Controls
The evaluation of application controls begins with assessing the effectiveness of the design of those controls. Controls must be robust yet flexible, designed to cater to specific risks and business needs. They need to be scalable, adapting to the changing size and complexity of the business. Controls that are too rigid or overly complex can hinder business operations, while those that are too lenient may fail to protect against risks effectively. Moreover, controls should not exist in isolation; they must integrate seamlessly with business operations. This alignment ensures that controls are relevant and practical, contributing to the overall efficiency of the business process. For instance, a control designed for a financial transaction system must consider the flow of transactions, authorization levels, and reporting requirements. Misalignment can lead to operational bottlenecks or gaps in control effectiveness.
The strategy for implementing these controls is equally essential. It involves not just the technical deployment of controls but also considering the human and process aspects. Training and communication are crucial to ensure staff understand and adhere to these controls. The implementation strategy should also consider the potential impact on existing processes and workflows, aiming to minimize disruptions while enhancing security and efficiency. Compliance with regulatory standards must be considered. Given the current emphasis on data protection and privacy, controls must meet various regulatory requirements. This involves staying updated with the latest standards and ensuring that controls are designed to comply with these requirements. Non-compliance can lead to legal repercussions and damage to the organization’s reputation.
In evaluating application control design effectiveness, the IS Auditor must consider the entire control lifecycle, including its inception, deployment, and ongoing management. Regular reviews and updates ensure that controls remain effective over time. This lifecycle approach helps identify areas where controls may need refinement or updating. It is an ongoing process that requires vigilance and adaptability. This continuous, consistent, and dynamic approach ensures that application controls continue to provide effective protection and support for business operations.
Often, IS Auditors will find that most application controls are designed effectively, primarily due to their automated nature. As such, a risk assessment exercise should be performed at this stage to help prioritize which application controls should be tested first (or at all). This involves identifying potential risks in business processes and designing controls that specifically address these risks. The risk assessment should be comprehensive, considering both the likelihood of risks occurring and their potential impact. It forms the basis for designing controls that are both effective and proportionate to the risk.
Application Control Testing Risk Assessment
The assessment of control implementation is essential to align application controls with an organization’s broader goals and processes. In doing so, IS Auditors find it helpful to employ a top-down risk assessment to determine which applications to include in the control review and what tests must be performed. Similar to the risk assessment performed at the time of developing an annual or multi-year IS audit plan, the IS Auditor will perform the following steps as a part of this risk assessment approach:
- Define the universe of systems and supporting technology components (operating systems and databases) that use application controls.
- Identify the most relevant risks and corresponding (existing) controls using the risk and control matrix.
- Define the relevant risk factors associated with each application control, including (but not limited to) the number of critical processes supported, the frequency and complexity of changes to the systems, the impact from a financial and regulatory reporting perspective, and the effectiveness of the supporting ITGCs.
- Assign weights to the risk factors, giving more weight to those judged most significant based on the IS Auditor’s assessment of the organization’s control environment, understanding of prior audit findings, and similar considerations.
- Conduct the risk assessment, rank all risk areas, and evaluate risk assessment results.
- Prioritize the application controls with the highest risk scores and establish a plan to evaluate the operating effectiveness of those controls.
The output of this risk assessment exercise should identify:
- Application controls to be tested thoroughly for operating effectiveness (those with the highest risk scores).
- Application controls to be evaluated using a benchmarking approach, where only changes to the control or configuration are evaluated (those with medium risk scores).
- Application controls to be skipped for testing (based on low risk scores, or on the conclusion that they were poorly designed, in which case they should be reported as findings).
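The weighting, scoring, and classification steps above can be sketched as a simple scoring model. All factor names, weights, scores, and thresholds below are hypothetical illustrations for a single engagement, not prescribed values:

```python
# Hypothetical weighted risk-scoring model for prioritizing application
# controls to test. Factor names, weights, scores, and thresholds are
# illustrative only; real values are set by the audit team's judgment.

# Each risk factor is scored 1 (low risk) to 5 (high risk) per control.
WEIGHTS = {
    "critical_processes_supported": 0.30,
    "change_frequency_complexity": 0.25,
    "financial_reporting_impact": 0.30,
    "itgc_effectiveness_concerns": 0.15,
}

controls = {
    "Three-way match (AP)": {"critical_processes_supported": 5, "change_frequency_complexity": 3,
                             "financial_reporting_impact": 5, "itgc_effectiveness_concerns": 2},
    "Credit limit check (AR)": {"critical_processes_supported": 3, "change_frequency_complexity": 2,
                                "financial_reporting_impact": 3, "itgc_effectiveness_concerns": 2},
    "Duplicate vendor flag": {"critical_processes_supported": 2, "change_frequency_complexity": 1,
                              "financial_reporting_impact": 2, "itgc_effectiveness_concerns": 1},
}

def risk_score(scores):
    """Weighted sum of the factor scores for one control."""
    return sum(WEIGHTS[factor] * scores[factor] for factor in WEIGHTS)

def classify(score):
    """Map a risk score to a testing strategy (thresholds are judgment calls)."""
    if score >= 4.0:
        return "full testing"
    if score >= 2.5:
        return "benchmarking"
    return "skip / report design finding"

# Rank controls from highest to lowest risk and show the resulting plan.
for name, scores in sorted(controls.items(), key=lambda kv: risk_score(kv[1]), reverse=True):
    print(f"{name}: {risk_score(scores):.2f} -> {classify(risk_score(scores))}")
```

Ranking the scored controls reproduces the three output buckets listed above: the highest scores go to full testing, medium scores to benchmarking, and the rest are skipped or reported.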
Evaluating the Operating Effectiveness of Application Controls
Typically, IS Auditors apply a top-down review approach to evaluate the application controls. They start by assessing whether the application controls are operating effectively during the period in scope of the audit or if they were being circumvented by creative users or management override. IS Auditors also determine the effectiveness of ITGCs and consider if application-generated logs or audit trails need to be reviewed.
Once application controls are identified for testing (full-fledged testing or using a benchmarking approach), IS Auditors are expected to employ several methods based on the application control type. At a high level, IS Auditors typically apply the following types of tests in evaluating the operating effectiveness of application controls, depending on the nature, timing, and extent of testing:
- Inspection of system configurations.
- Inspection of user acceptance testing, if conducted in the current year.
- Inspection or re-performance of reconciliations with supporting details.
- Re-performance of the control activity using system data.
- Inspection of user access listings.
- Re-performance of the control activity in a test environment (using the same programmed procedures as production) with robust testing scripts.
To illustrate a system configuration test, consider examining the parameters of a three-way match within the system under review by tracing a single transaction through it. The auditor should also observe a rerun of the relevant query, allowing a comparison between the report generated by management and the replicated version. The audit may further include assessing edit checks on key fields, for example by stratifying or categorizing transactions based on the values in those fields. Specialized audit software is invaluable here, streamlining the recalculation and verification of computations executed by the system. Finally, auditors can perform reasonableness checks, evaluating the ranges of values that critical fields can take, to confirm the integrity and accuracy of the system’s outputs.
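As a rough sketch of how audit software might re-perform the three-way match described above, the snippet below compares purchase order, goods receipt, and invoice amounts per PO number. The amounts, field names, and tolerance are assumptions for illustration:

```python
# Hypothetical re-performance of a three-way match: for each PO number,
# the PO, goods receipt, and invoice amounts should agree within a tolerance.
TOLERANCE = 0.01  # assumed rounding tolerance

purchase_orders = {"PO-1001": 5000.00, "PO-1002": 1200.00}
goods_receipts  = {"PO-1001": 5000.00, "PO-1002": 1150.00}
invoices        = {"PO-1001": 5000.00, "PO-1002": 1200.00}

def three_way_match(po, gr, inv):
    """Return PO numbers whose documents are missing or disagree on amount."""
    exceptions = []
    for po_num, po_amt in po.items():
        gr_amt, inv_amt = gr.get(po_num), inv.get(po_num)
        if gr_amt is None or inv_amt is None:
            exceptions.append((po_num, "missing document"))
        elif abs(po_amt - gr_amt) > TOLERANCE or abs(po_amt - inv_amt) > TOLERANCE:
            exceptions.append((po_num, "amount mismatch"))
    return exceptions

print(three_way_match(purchase_orders, goods_receipts, invoices))
# -> [('PO-1002', 'amount mismatch')]
```

In this toy population, PO-1002 surfaces as an exception because the goods receipt amount differs from the PO and invoice amounts.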
More specifically, the table below illustrates examples of audit procedures for testing the operating effectiveness of the types of “Input Application Controls” identified in the previous section:
Control Domain | Brief Definition | Test of Controls Audit Procedure |
---|---|---|
Field Checks | Verifies the data type of an input. | Select a sample of transactions and verify that data types in specified fields match the required format. |
Form Checks | Confirms that data is entered in the correct format. | Review a sample of data entries to ensure they adhere to the predefined format, such as date fields. |
Range Checks | Ensures data falls within a predefined range. | Examine a sample of entries to check if values (e.g., age, price) are within the specified range. |
Limit Checks | Checks for data exceeding a specific limit. | Test a set of transactions to verify that values, such as budget constraints, do not exceed the established limits. |
Validity Checks | Verifies whether data is reasonable and logical. | Assess a sample of entries for logical consistency, such as zip codes matching known valid areas. |
Completeness Checks | Ensures all required data fields are entered. | Inspect several records to confirm that all mandatory fields are completed. |
Check Digits | Adds a digit to numbers to validate their authenticity. | Randomly select account numbers and validate the check digit for accuracy. |
Duplication Checks | Prevents entering the same information more than once. | Review system logs or records to identify duplicate entries, like repeated customer registrations. |
Sequence Checks | Verifies data is in a proper sequence. | Analyze a sequence of transactions (e.g., invoice numbers) to ensure they follow the correct order. |
Cross-Field Validation | Compares data entered in one field against another. | Cross-check a sample of transactions to verify that related fields (e.g., total cost and unit price) are consistent. |
Preformatted Screens | Guides data entry with a specific layout. | Observe the data entry process to ensure the preformatted screens are correctly used. |
Transactional Totals | Summarizes numerical data for verification. | Review a batch of transactions to confirm that the transactional totals are accurate and complete. |
Error Prompts | Alerts users to incorrect data entries immediately. | Test the system’s response to incorrect data entries to ensure error prompts function correctly. |
Input Authorization | Ensures only authorized personnel enter data. | Verify authorization logs or records to confirm that only authorized individuals are making specific data entries. |
Batch Controls | Manages data processing in groups for verification. | Examine batch processing records to ensure the number of items and total values match the expected counts and totals. |
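Several of the input checks in the table above can be re-performed directly in audit software. The sketch below applies a range check, a completeness check, and a Luhn-style check digit (a common check-digit scheme for card and account numbers); the field names are illustrative assumptions:

```python
# Illustrative re-performance of three input controls from the table:
# range check, completeness check, and check digit validation.

def range_check(value, low, high):
    """Range check: value must fall within a predefined range."""
    return low <= value <= high

def completeness_check(record, required_fields):
    """Completeness check: return the mandatory fields that are empty or missing."""
    return [f for f in required_fields if not record.get(f)]

def luhn_valid(number: str) -> bool:
    """Check digit validation using the Luhn algorithm."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# Example usage against a single (hypothetical) customer record:
record = {"name": "A. Patel", "age": 25, "account": "79927398713", "zip": ""}
print(range_check(record["age"], 18, 65))                        # True
print(completeness_check(record, ["name", "zip"]))               # ['zip']
print(luhn_valid(record["account"]))                             # True
```

Running checks like these over the full transaction population, rather than a small sample, is one of the efficiency gains discussed later under data analytics.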
More specifically, the table below illustrates examples of audit procedures for testing the operating effectiveness of the types of “Processing Application Controls” identified in the previous section:
Control Domain | Brief Definition | Test of Controls Audit Procedure |
---|---|---|
Automated Error Detection | Automatically identifies and flags errors in data processing. | Test the system with intentional mistakes to ensure it correctly identifies and flags these errors. |
Transaction Matching | Ensures related transactions are correctly matched. | Review a sample of transaction pairs (e.g., purchase orders and invoices) to verify accurate matching. |
Workflow Authorization | Requires specific approvals for certain processing steps. | Inspect authorization logs to confirm that transactions requiring approval have been appropriately authorized. |
Logical Access Controls | Restricts processing functions to authorized users. | Examine user access logs and compare them against approved user lists to ensure compliance with access policies. |
Data Integrity Checks | Verifies data remains unchanged during processing. | Compare a sample of original data inputs with processed data to verify consistency and lack of alteration. |
Audit Trail Maintenance | Tracks changes to data throughout the processing phase. | Review audit trail logs to ensure all changes made to data are recorded and traceable. |
Exception Reporting | Flags transactions that fall outside normal parameters. | Analyze exception reports to verify that anomalies are correctly identified and reported. |
Duplication Checks | Prevents processing the same transaction multiple times. | Test the system with duplicate entries to ensure the control flags and prevents identical processing. |
Reconciliation Procedures | Matches processed data with source documents. | Perform reconciliation of a sample of processed transactions against their source documents for accuracy. |
Automated Calculations Verification | Checks the accuracy of system calculations. | Manually recalculate a sample of transactions and compare it with system-generated calculations for accuracy. |
Sequence Control | Ensures transactions are processed in the correct order. | Review a sequence of transactions to ensure they are processed in the correct chronological order. |
Input/Output Control | Matches input data with output data to ensure accuracy. | Compare input data samples with corresponding output reports to check for accuracy and consistency. |
Processing Limits | Sets thresholds for transaction processing. | Test transactions that exceed processing limits to ensure the system enforces these limits correctly. |
Version Control | Manages updates to the software to ensure consistency. | Verify that all systems operate on the latest software version and check for consistent processing across versions. |
Integrity Controls | Ensures the integrity of processing operations and data. | Conduct regular system checks to validate the integrity and consistency of data throughout the processing stages. |
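Two of the processing controls above, duplication checks and sequence control, lend themselves to simple full-population tests. A minimal sketch, with field names and sample values assumed for illustration:

```python
# Illustrative full-population tests for two processing controls:
# duplication checks and sequence control.
from collections import Counter

def find_duplicates(transaction_ids):
    """Duplication check: return IDs that appear more than once."""
    counts = Counter(transaction_ids)
    return sorted(tid for tid, n in counts.items() if n > 1)

def find_sequence_gaps(invoice_numbers):
    """Sequence check: return missing numbers in a supposedly consecutive series."""
    present = set(invoice_numbers)
    return [n for n in range(min(present), max(present)) if n not in present]

print(find_duplicates(["T1", "T2", "T2", "T3"]))     # -> ['T2']
print(find_sequence_gaps([1001, 1002, 1004, 1005]))  # -> [1003]
```

A duplicate or a gap is not automatically a control failure; each exception still needs follow-up with management to determine whether the control flagged and handled it.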
More specifically, the table below illustrates examples of audit procedures for testing the operating effectiveness of the types of “Output Application Controls” identified in the previous section:
Control Domain | Brief Definition | Test of Controls Audit Procedure |
---|---|---|
Review & Reconciliation of Output Reports | Compares output data with source data for accuracy. | Perform a reconciliation of a sample of output reports with the source data to verify accuracy. |
Output Distribution Controls | Manages who receives output data. | Review access logs and distribution lists to ensure only authorized personnel receive sensitive output data. |
Output Encryption | Protects data integrity and confidentiality during transmission. | Verify that sensitive data transmitted via email or other methods is encrypted and check the encryption standards used. |
Error Reporting Mechanisms | Enables reporting of discrepancies in output data. | Test the error reporting system by submitting a known error and tracking the reporting and correction process. |
Audit Trails of Output Data | Tracks access to output data. | Examine audit trails to identify who accessed specific output data, ensuring all access was authorized. |
Printout Management | Secure handling and disposal of printed reports. | Inspect the procedures for handling and disposing of printed confidential reports to ensure secure practices are followed. |
EDI Controls | Ensures accuracy and security in EDI transactions. | Review a sample of EDI transactions for accuracy and verify that security measures are in place. |
User Access Logs for Output Retrieval | Tracks who retrieves output data. | Check user access logs to confirm that only authorized individuals have retrieved specific output data. |
Data Integrity Verifications Post-Output | Ensures data consistency after processing. | Compare a sample of post-output data with original output records to check for data integrity. |
Automated Output Alerts | Notifies relevant personnel of critical data outputs. | Test the alert system by triggering a condition to generate an alert and verify it reaches the appropriate personnel. |
Backup and Recovery Procedures | Ensures output data can be recovered in case of system failure. | Evaluate the backup logs and conduct a recovery test to ensure output data can be retrieved. |
Version Management | Keeps track of different versions of output reports. | Verify that all users access the most current version of output reports and that version control is effectively managed. |
Confidentiality Measures | Protects sensitive information in output documents. | Review a sample of output documents to ensure sensitive information is adequately masked or redacted. |
Output Formatting Controls | Ensures output data is presented in a consistent and understandable format. | Evaluate a selection of output reports to confirm that they are consistently formatted and easily interpretable. |
Timeliness Controls | Ensures output data is generated and distributed in a timely manner. | Assess the timeliness of output generation and distribution against predefined schedules or standards. |
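The review-and-reconciliation procedure in the first row of the table can be re-performed programmatically. A minimal sketch, assuming each source row carries an `amount` field and the output report states a total and a record count (both the field name and the tolerance are assumptions):

```python
# Illustrative reconciliation of an output report against its source data:
# the report's record count and total should agree with the source rows.

def reconcile_output(source_rows, report_total, report_count, tolerance=0.01):
    """Return a list of reconciliation issues (empty if the report agrees)."""
    src_total = round(sum(row["amount"] for row in source_rows), 2)
    src_count = len(source_rows)
    issues = []
    if src_count != report_count:
        issues.append(f"count mismatch: source {src_count} vs report {report_count}")
    if abs(src_total - report_total) > tolerance:
        issues.append(f"total mismatch: source {src_total} vs report {report_total}")
    return issues

# Example usage with a tiny hypothetical population:
rows = [{"amount": 100.00}, {"amount": 250.50}]
print(reconcile_output(rows, 350.50, 2))  # -> []
print(reconcile_output(rows, 400.00, 3))  # two issues: count and total
```

An empty result supports the conclusion that the output report is accurate and complete for the sample reconciled; any issue returned becomes an exception to investigate.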
The Role of Data Analytics in Application Control Testing
Data analytics can be a game-changer when it comes to application control testing. Integrating data analytics into application control testing enhances the precision and depth of our analysis and significantly increases efficiency. Traditional control testing methods may require extensive manual effort and can be limited in scope, often focusing on sample data. Data analytics, however, enables auditors to analyze entire datasets, providing a more comprehensive view of the application’s performance. This comprehensive analysis is critical in detecting subtle anomalies that indicate control weaknesses or failures.
Incorporating advanced analytics techniques such as predictive analytics and machine learning further deepens insight. These techniques can predict potential control failures by identifying patterns indicative of emerging risks. For instance, machine learning algorithms can analyze trends in user access data to predict unauthorized access attempts, allowing auditors to address security vulnerabilities proactively. Data visualization is another critical aspect of data analytics in application control testing. Complex data findings can be challenging to interpret and communicate effectively, and data visualization tools translate these findings into clear, understandable, and actionable visual formats. This approach not only aids auditors in pinpointing specific areas of concern but also facilitates communication with stakeholders who may not have a technical background.
Efficiency in control testing is significantly enhanced with the integration of data analytics. Automation of data analysis processes speeds up the testing cycle and reduces the likelihood of human error. This efficiency is particularly beneficial in large organizations with complex systems, where manual testing of every control can be impractical. After the initial audit phase, analytics tools can continuously monitor application controls. Continuous monitoring helps quickly identify and address any control issues that arise post-audit. Additionally, post-audit analysis using data analytics provides deeper insights into the effectiveness of the controls and guides future improvements.
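As a simple example of full-population analytics, the sketch below screens every transaction amount with a z-score instead of sampling. The three-standard-deviation threshold is an assumption chosen for illustration, not an audit standard:

```python
# Illustrative full-population anomaly screen: flag transaction amounts
# that deviate sharply from the population mean (simple z-score test).
import statistics

def flag_outliers(amounts, z_threshold=3.0):
    """Return indices of amounts more than z_threshold standard deviations
    from the mean. Operates on the entire population, not a sample."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)
    if stdev == 0:
        return []  # all amounts identical; nothing to flag
    return [i for i, a in enumerate(amounts)
            if abs(a - mean) / stdev > z_threshold]

# Example: fifty routine payments plus one unusually large one.
amounts = [100.0] * 50 + [10000.0]
print(flag_outliers(amounts))  # the large payment's index is flagged
```

Flagged items are not conclusions in themselves; they direct the auditor's attention to transactions worth tracing back through the relevant controls.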
Detecting Control Failures and Weaknesses
Detecting control failures and weaknesses is a process that requires vigilance and a deep understanding of how controls are supposed to function within an organization’s unique environment. It begins with identifying red flags or early warning signs such as anomalies in data, unexpected system behaviour, or frequent user complaints. Identifying these signs early is crucial as it allows for prompt investigation and resolution, minimizing potential damage.
Root cause analysis is essential in understanding the reasons behind control failures. It is not enough to address the symptoms when a failure is detected. Delving into the underlying causes is necessary to prevent recurrence. This analysis might reveal design flaws, implementation errors, or external factors affecting the control’s effectiveness. Assessing the impact of control failures is a significant part of the detection process. This assessment involves understanding how the failure affects the organization’s operations, compliance posture, and risk exposure. Understanding the impact helps prioritize response efforts and allocate resources where needed most.
Incident reporting and documentation are vital for maintaining a record of control failures and weaknesses. It provides a historical record, aids in analyzing trends over time, and ensures accountability. Effective incident reporting should be clear and concise and include details such as the nature of the failure, the affected areas, and the steps taken to resolve the issue. Leveraging technology for detection is increasingly important. Advanced technologies like artificial intelligence and machine learning can provide sophisticated monitoring capabilities. They can detect anomalies that might be difficult for human auditors to identify.
Human judgment and expertise remain critical in interpreting technological findings and understanding their implications in the business context. As IT auditors and professionals, our role is to detect and report these issues and provide insights and recommendations that help strengthen the control environment. This proactive approach to detection is essential in building resilient and efficient IS that support the organization’s objectives and mitigate risks.
Continuous Improvement of Application Controls
With technology advancing and business processes evolving perpetually, static control measures can quickly become obsolete. Controls that were effective yesterday may not suffice tomorrow. Continuous improvement of application controls starts with adapting to changing business environments. Businesses must regularly review and update their controls to ensure they remain effective in the face of new technologies, emerging risks, and evolving business models.
Incorporating lessons learned is a crucial aspect of this continuous improvement process. Every incident, audit finding, or compliance review provides valuable insights, and these lessons should be used to refine and enhance the controls. Stakeholder involvement is integral to the improvement process and includes the IT team and the end-users, management, and external auditors. Their feedback can provide diverse perspectives on the effectiveness of controls and areas that may need improvement. Engaging stakeholders also helps align the controls more closely with business needs and enhances user compliance and satisfaction.
Training and awareness programs are critical components in this continuous improvement journey. They ensure that all personnel know the importance of controls and how to implement them effectively. Regular training sessions help keep the staff updated on new risks, control techniques, and compliance requirements. They also serve as a platform for discussing potential improvements and encouraging a culture of security and compliance. Benchmarking against industry best practices is another vital element. Organizations should regularly compare their control measures with those adopted by peers and leaders in their industry. This benchmarking can reveal gaps in controls and provide ideas for improvement. Continuous improvement also involves looking beyond immediate technical fixes. It includes strategic considerations such as aligning controls with long-term business objectives, investing in scalable and flexible control solutions, and anticipating future trends and risks.
In the Spotlight
For additional context on evaluating application controls using a benchmarking approach, please read the article titled “Benchmarking IT Application Controls” [opens a new tab].
De Bruijin, R.J.C.H.M., & Op. het Veld, M.A.P. (2008). Benchmarking IT application controls. Compact. https://www.compact.nl/articles/benchmarking-it-application-controls/
Key Takeaways
Let’s recap the key concepts discussed in this section by watching this video.
Source: Mehta, A.M. (2023, December 6). AIS OER ch 06 topic 03 key takeaways [Video]. https://youtu.be/Lei6M35VCbE
Knowledge Check
Review Questions
- What are two key aspects to consider when assessing the design and implementation of application controls?
- What is a crucial first step in detecting control failures and weaknesses in application controls?
- What is a critical strategy for continuously improving application controls?
- What is a crucial factor when designing an audit program for application control testing?
- How does data analytics enhance the application control testing process?
Mini Case Study 1
XYZ Corporation, a large retail company, has recently implemented a new inventory management system. The system includes various application controls to ensure accurate inventory tracking and reporting. Six months after implementation, the internal audit team is tasked with evaluating the effectiveness of these application controls. During the evaluation, the audit team discovers the following:
- Inventory data entry errors are higher than expected.
- There is a delayed response in updating inventory levels after sales transactions.
- The monthly inventory reports have discrepancies when compared with physical inventory counts.
- Feedback from the system users indicates frustration with complex and time-consuming data entry processes.
Required:
- What aspects of control design and implementation should be assessed to address these issues?
- Suggest monitoring strategies to detect such issues in the future.
- Identify potential control failures or weaknesses indicated by these findings.
- Recommend actions for the continuous improvement of these application controls.
Mini Case Study 2
Acme Corporation, a medium-sized manufacturing company, has recently implemented a new procurement system to streamline its purchasing process. The system automates purchase order creation, approval, and supplier payment processes. The procurement system integrates with the company’s inventory and financial systems.
As a newly hired IT auditor at Acme Corporation, you are tasked with evaluating the effectiveness of the application controls within this new procurement system.
The procurement process involves the following steps:
- Creation of purchase orders based on inventory requirements.
- Approval of purchase orders by department heads.
- Automatic matching of purchase orders with supplier invoices.
- Payment processing to suppliers after invoice verification.
Required: Identify the critical controls in the procurement process and propose tests for these application controls.