I remember working for one audit function that established a testing/sampling population of 25 for everything. If the actual population was 25,000 records, we tested 25. If the population was 1,000,000 records, we tested 25. Although I was fairly young in my career when I experienced this, my instinct told me this was wrong. When I asked why, I was told:
(1) There was not enough money in the budget to purchase data analysis software.
(2) It would take too much time to construct a test population that represented the data set. And finally,
(3) The data does not lie.
And because of this, I witnessed client after client receive audit comments stating:
Twelve of 25 transactions (48%) (or whatever) were not (insert whatever here…processed timely, accurate, etc.).
The standard formula for writing issues was just that dull and flawed. Clients were unhappy because they believed it was not a fair representation of their environment. The audit staff was unhappy because it was not a fair representation of clients’ environments, and yet we were forced to test in this manner. Fortunately, a few things changed over the next several years:
(1) Internal auditors began to focus more heavily on controls
(2) Microsoft Excel added more features that supported Computer-Assisted Audit Techniques (CAATs)
(3) Data became more accessible
(4) I was no longer with that specific organization
These changes allow auditors to obtain and test larger volumes of data. This, in turn, provides a more accurate assessment of a client’s environment. Using the power of data in your audits is not too difficult. Here are a few things you will need to learn:
- The types of data available (e.g., flat files, text files, delimited files)
- How to obtain data (e.g., directly from the client, self-extraction)
- Preserving the original data
- Using Excel to analyze data
- Data presentation, or how to make the results visually appealing when communicating to stakeholders. Let’s face it: no one wants to see volumes of data that provide no context about the results.
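To make the workflow above concrete, here is a minimal sketch in Python of a full-population test. The file name, field names, and the one-day limit are all hypothetical stand-ins; the same logic can be built in Excel with formulas, but scripting it makes the "preserve the original, analyze a copy, test everything" steps explicit.

```python
import csv
import os
import shutil
import tempfile

# Build a small delimited file standing in for a client extract
# (hypothetical data so the sketch is self-contained).
rows = [("TXN-%03d" % i, i % 3) for i in range(1, 101)]  # (txn_id, days_late)
src = os.path.join(tempfile.gettempdir(), "extract.csv")
with open(src, "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["txn_id", "days_late"])
    writer.writerows(rows)

# Preserve the original data: analyze only a working copy.
work = src + ".working"
shutil.copyfile(src, work)

# Test the ENTIRE population, not a fixed sample of 25.
LIMIT = 1  # hypothetical acceptable limit, in days
with open(work, newline="") as f:
    population = list(csv.DictReader(f))
exceptions = [r for r in population if int(r["days_late"]) > LIMIT]
rate = len(exceptions) / len(population)

# A one-line summary is the seed of the presentation step.
print(f"{len(exceptions)} of {len(population)} "
      f"({rate:.0%}) exceeded the {LIMIT}-day limit")
```

The point of the sketch is the shape of the process, not the tool: every record is evaluated against the predetermined limit, the source extract is never modified, and the result is reduced to a statement stakeholders can act on.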
I must caution: using these techniques will help you cut through the clutter and get to the central issue (if one exists). In other words, they will provide a true depiction of the situation. While this is what everyone really wants, be warned that you may face extreme opposition from “Emotional Clients”.
Please note that I am not saying that all clients are emotional, but rather labeling those (hopefully only a few) who tend to ignore facts that can improve their operations and instead “shoot the messenger”.
So what do I mean by an “Emotional Client”? I remember one incident in which we used Excel to test an entire population of data. Excel determined (haha, Excel, not the auditors) that over 60% of the client’s activities deviated from their predetermined acceptable limit. Working with the client’s staff, we all determined the root cause of the issue. His staff actually came up with a viable solution. The solution carried a cost and therefore required the client’s approval. We (the auditors and the client’s staff) worked diligently on a presentation describing the issue and the proposed solution. I was in for a big surprise when we presented it to the client. What follows is a step-by-step account of what happened (notice the escalation):
- First he asked where the information came from
- We explained the technique used
- Then he asked if we were sure
- We further explained (or re-explained) that the technique considered the entire population and that the formulas were tested multiple times, not only by the auditors but also by his staff
- Next he commented that this is bad
- We agreed but reminded him that there is already a solution on the table
- Finally, he said that we “made the data look the way we wanted it to”. This made him look bad. Somehow we had taken the data, changed it, and presented it in a damaging format, all to derail his career.
While it is true that bad data can lead to bad decisions, that is definitely not what happened here. What was extremely insulting was the accusation of purposefully manipulating data to ruin him. None of us knew him personally, and we had no reason to do him harm. Someone in the group politely mentioned that “data doesn’t lie”. The client was invited to take the data, analyze it, and formulate his own conclusions. He respectfully declined. I suppose that might have been too much work. This particular client never addressed the problem. It is a shame, because his staff was excited about fixing the issue. They eventually did, but under new leadership.
Data does not lie. However, your data collection, analysis, and presentation techniques are critical in telling the story behind the data. You must pay careful attention to these items so that your data is not accused of telling tall tales.