The investigation trap

Why most incident investigations find the person closest to the event rather than the conditions that made the event possible
Most incident investigations are exercises in reverse engineering. Starting from an unwanted outcome, they work backwards through the event sequence, identifying decisions and actions that, in hindsight, appear to have caused the result. The process feels analytical. It is, in practice, heavily shaped by cognitive biases that reliably lead it to the same conclusion: the person who was nearest to the event when it occurred made a mistake.
This conclusion is usually technically true and almost always incomplete. The person nearest the event did make a decision that contributed to the outcome. They also made that decision in a specific context — under particular pressures, with particular information, within particular norms and expectations — and understanding that context is the difference between an investigation that produces learning and one that produces a report.
Hindsight bias is the primary distorting mechanism. Once an outcome is known, the decisions that produced it appear more obviously wrong than they were at the time. The signal that should have been acted on seems clearly a signal, rather than one of hundreds of ambient data points. The shortcut that contributed to the failure seems clearly reckless, rather than a pragmatic adaptation that had worked reliably dozens of times before. Investigators — and the leaders who receive their reports — are not immune to this bias. They are, without specific training to counteract it, fully subject to it.
Hindsight makes reasonable decisions look unreasonable and normal conditions look exceptional. Good investigation requires undoing that distortion deliberately.
The Five Whys, widely taught as an investigation tool, is particularly susceptible to this problem. The technique asks investigators to repeatedly ask "why" until a root cause is identified. In practice, it tends to produce one of two outcomes: it stops at individual behaviour — the person did X because they made a poor choice — or it reaches the most abstract possible systemic conclusion — ultimately, management commitment was insufficient. Neither of these is wrong, and neither produces the specific, actionable understanding of what actually happened in the system that would allow the organisation to prevent a recurrence.
More sophisticated frameworks approach this differently. AcciMap maps the contributing factors across multiple system levels simultaneously — the operational environment, the organisation, the regulatory context — showing how decisions made at each level created the conditions experienced at the levels below. The Functional Resonance Analysis Method (FRAM) maps the functions that make up normal work and traces how variability in those functions combined to produce the unexpected outcome. The Systems-Theoretic Accident Model and Processes (STAMP) approaches the system as a control problem: how did the controls that should have prevented this outcome fail to do so?
What these frameworks share is a commitment to understanding the system rather than the individual. They do not ignore individual behaviour. They place it in context — the context of design decisions, organisational priorities, resource constraints, cultural norms, and management choices that shaped what the individual's behaviour looked like from the inside at the time of the decision.
Building better investigation capability requires investment in training, in frameworks, and in the organisational courage to accept findings that implicate systems and management rather than individuals. It also requires protecting the investigation process from the pressure — legal, reputational, or operational — to reach quick conclusions that minimise organisational responsibility. This pressure is real and significant in most organisations. Managing it is a leadership task, not an investigation task.
The trap is the comfort of the conclusion that finds the individual and closes the file. Escaping it requires asking what the individual's behaviour tells you about the system — and being willing to act honestly on the answer.

