I’d like to build on my last blog post about systems and systems thinking. It’s clear that building and maintaining systems is hard work. As auditors, we can see when our clients run out of the energy and will to maintain and deliver systemic responses to risks or challenges. Often you see this in organisations that are large, bureaucratic and long-established, but I think it can happen in any organisation. There is a sense that overseeing and enforcing policies and standards is just too difficult, or requires too much effort. As internal auditors, I think it is our role to point out such complacency, push back against it, and energise our clients.
A good example is safety standards. I imagine that following some of the very detailed requirements for operating an aircraft can become tedious; it can feel bureaucratic. Keeping the exit row clear, making sure tray tables are up for take-off and landing, and so on. These things don’t feel necessary when all is going well. Yet they are essential when trying to empty a plane at pace after it catches fire, as in the recent example in the news at Tokyo airport. The same is true of financial and procurement controls, though of course the circumstances are different, and of food safety standards in restaurants. Perhaps it doesn’t matter if an employee skips washing their hands once, or if food is not thrown out in line with clear use-by rules a few times. Over time, however, the risk increases, and the one certainty about a risk is that, unmanaged, it eventually becomes an issue.
Just as systems can be effectively deployed to ensure things happen, they can equally be deployed to ensure things don’t happen. Think about fraud or misconduct prevention. We now have the very public and much-discussed example in the UK of the Post Office’s failed roll-out of its accounting IT system, Horizon. If you had asked me, as an oversight professional, how likely it was that vast numbers of staff, systems, data controls, oversight and governance would have allowed a scandal of such duration and scale to evolve, I would have said it was unlikely. Yet it happened. Collusion on that scale seems almost incredible.
This, for me, makes the point that frameworks of control (which are systems) in fact embody very different types of control. They comprise rational, legal and compliance controls. They comprise risk-based, principles-based controls. Most crucially, however, they embody cultural controls. Culture eats not just strategy for breakfast, but other control systems too. For if the culture is to turn a blind eye, to ignore certain types of risk, or to allow certain types of behaviour, then no amount of training, advice, risk management or governance will push that inclination back.
I have never worked at the UK Post Office, nor have I undertaken a detailed analysis of this tragic case. It seems to me, however, that culture played a key part in it. A while ago it was fashionable in internal audit circles to look at culture, largely, I think, on the back of the financial crash of 2008. Perhaps we, as a profession, should bring this back into vogue. Understanding control frameworks from the perspective of culture and cultural controls, as well as risk-based, principles-based and rational/legal controls, should be embedded into our work as a matter of course. If I were the executive manager of a department of an organisation, hearing an independent view of controls and culture would be very important to me. It would be better still to hear it in the normal course of ongoing oversight work, rather than in a lessons-learned report on the back of some corporate failure.
Yet this requires internal audit to be braver. Courage is one of the new elements of the revised IIA Standards for 2025 (the subject of my next post). It requires internal audit to own the space of subjectivity and opinion-forming. It requires internal audit to own its independence. It requires individual internal auditors and internal audit leaders to step into this space more, and to be courageous. In some cases it will require tenacity: getting to the bottom of something and not letting the initial signs and indications of a problem go. In the case of the UK Post Office, I wonder whether a brave, courageous CAE could literally have saved lives.
Without pre-empting my next blog post about the revised IIA Standards too much, I think the new Standards are a big step forward and contain much that will place the profession on a strong footing. We can debate whether making internal audit a creature of the board is necessarily realistic or smart, but the overt inclusion of courage is a very clever step.
So my conclusion from this post is that courage and cultural review are essential to good internal audit. Saying things ‘as they are’, rather than how people would like them to be, or how they would like them to be presented, is crucially important. Standard 1.1 requires professional courage – are you ready for it?