The gap your procedures cannot see

Work-as-imagined versus work-as-done — and why the distance between them is where risk lives
Every organisation has two versions of how work gets done. The first is documented, approved, and filed. It lives in procedure manuals, safety management systems, training materials, and audit checklists. The second version is what actually happens on a Tuesday afternoon when the schedule is tight, the system is down, the experienced colleague is on leave, and someone has to make a judgement call.
These two versions are never identical. The gap between them is not negligence. It is not a sign that your workforce is cutting corners or your managers are not paying attention. It is a fundamental feature of complex work in complex environments. Erik Hollnagel calls this the distinction between Work-as-Imagined (WAI) and Work-as-Done (WAD), and understanding it may be the single most important shift an organisation can make in how it thinks about risk.
Work-as-Imagined is the version that exists in the heads of those who design the system — the safety officers, the engineers, the policy writers, the regulators. It assumes predictable conditions, complete information, and workers who follow instructions to the letter. This is not a criticism of the people who write procedures. It reflects the genuine difficulty of designing systems for work you are not doing in real time.
The procedure is written in a controlled environment. The work is done in an uncontrolled one. That gap is where incidents are born.
Work-as-Done reflects the reality that people encounter: variability, competing demands, resource constraints, informal knowledge, time pressure, and the need to improvise when the manual does not quite cover the situation in front of them. Experienced workers develop workarounds. Teams develop norms. Shortcuts become standard. None of this is necessarily unsafe — much of it is the accumulated wisdom of people who understand the work far better than any procedure document does. The problem arises when those adaptations accumulate invisibly, when nobody knows the workaround has become routine, when the informal norm diverges from the formal requirement in ways that remain hidden until something goes wrong.
This is what Diane Vaughan, studying the Challenger disaster, called the normalisation of deviance. Decisions that once required conscious deliberation become standard practice through repetition. The gap between the written rule and the actual behaviour widens gradually, each increment too small to trigger alarm. The system appears to be working. People are confident. The risk is invisible — until it is not.
Organisations that rely solely on compliance as their safety strategy are, in effect, managing Work-as-Imagined while ignoring Work-as-Done. They audit against the procedure. They measure adherence to the training. They count the incidents that are reported. What they do not do is go and look at the work itself — not to catch people out, but to understand it. What actually happens when the pressure is on? Where do people rely on informal knowledge that has never been captured? What are the conditions that make a shortcut feel like the only reasonable option?
The organisations that get this right treat the gap between WAI and WAD as information, not as failure. They create conditions where workers can surface the discrepancy without fear — where saying "the procedure doesn't match reality" is treated as a valuable contribution rather than an admission of wrongdoing. They conduct walkthroughs and task observations not as audits but as learning exercises. They ask operational staff to describe how work actually gets done, and they listen carefully to the answer.
This shift requires a different relationship between management and the workforce. It requires leaders who are genuinely curious about operational reality rather than reassured by compliance metrics. It requires a culture where the person doing the job is seen as the expert on the job — because in every meaningful sense, they are.
The procedure is necessary. It encodes collective knowledge, legal requirements, and organisational intention. But it is a starting point, not an endpoint. The real work of safety is understanding what happens in the space between the document and the decision, and closing that gap deliberately, continuously, before circumstances close it for you.

