I have been thoroughly enjoying watching the Olympics on television. I am not one to watch sport, but the Olympic coverage has been so captivating, and surprisingly accessible even to a non-sports buff like me. One theme of my surprise has been how medal winning comes down to so few moments, minutes and seconds. It does not matter whether it is the rowing at Eton and the seven minutes a race takes there, or the two minutes of swimming or the three seconds of diving; seemingly years of work comes down, in the final analysis, to a few moments of good or bad performance.
During my career I have been lucky (!) enough to undertake and supervise a significant amount of data audit. Now, I'm not keen on data audit; in fact it is my least preferred type of audit. Why? Well, the audit questions posed generally appear simple at first, but often prove complex once the data is reviewed. The amount of work it takes to answer even simple data questions is significant. Unless it is possible to sample, the testing is laborious, and even getting data sets into auditable formats is a difficult audit job in itself. By the time of the audit the source documentation has normally been scattered to the organisational four winds. Sample checking normally reveals issues, and it is then difficult to identify the population affected by the error. Finally, once an error is quantified, reporting it is normally not welcomed, either by the client or by the funder that commissioned the audit in the first place. The only winning answer is one that appears to do an audit but does not find any problems. For all of these reasons, data audit is not my favourite activity.
My point here is that a data audit, picking up on a position at a point in time, feels a lot like some of these Olympic sports. Yes, I can appreciate that years of hard work (many early morning swims, jogs, rows, matches and gym visits) go into the seven minutes, two minutes or three seconds of performance we see on our screens. Yet what if they had a cold on the day, or a sore muscle? How much is contingent on the moment?
That’s why, when I audit, I try to put the data in context. I try to think about the control environment: the quality of effort put into the database, the types of people involved and the work they have undertaken to prepare the data, the general quality of the audit trail for source documents, the evidence of data validation routines, the amount and nature of senior management interest and involvement in the data, and the general culture underpinning the data preparation. If these feel right, I am inclined to believe that errors in my samples are one-offs: performance issues on the day. I am more inclined to do extra audit work to quantify how small a particular error is within the population of data, and I will look harder for alternative audit trails and evidence to support the claim.
So perhaps the similarity here is that even if a data audit does not deliver a personal best on the day, it makes sense to appreciate and contextualise the sheer hard work the client has put in to get to the point of audit. This requires the internal auditor to think like a coach, not like a judge. As with most things in life, matters are rarely black or white; they are grey. Data audit is no different in my view.
So next time you are standing by the data ‘finish line’ with your audit stopwatch, do give a thought to the dedication and training it took to get to that point, because for me that is generally the point of a data audit: to get a macro sense of the ‘right’ answer, for both the client’s board and the client’s funder.