The bad apple problem

April 14, 2026 · 3 min read

Fire the person, keep the system, and the next person inherits identical conditions. Blame produces closure, not change. Be better than yesterday by understanding what the system created — not just who was standing nearest.

Why blame is the least effective investigation outcome — and what happens when you fire the person but keep the system

When something goes wrong, the search for a cause almost always ends at a person. The worker who made the error. The manager who failed to supervise. The individual whose decision, in retrospect, looks obviously wrong. This is cognitively natural, legally convenient, and organisationally catastrophic.

The bad apple narrative is seductive because it provides everything an organisation needs from an incident: a clear cause, a definitive response, and the reassurance of exceptionalism. It was that person. It was that decision. It was that failure of individual judgement. We have addressed the individual. The system is fine. The message to everyone else is implicit but unmistakable: this would not happen to us, because we would not make that choice.

The problem is that this narrative is almost always wrong, and the research on it is unambiguous. When investigators go beyond the proximate cause — the last person to touch the system before it failed — they consistently find the same things: production pressure that made corners worth cutting, ambiguous procedures that left judgements to individuals who were not equipped to make them, training that covered the normal case but not the edge case, supervisory structures that created accountability without authority, and a reporting culture that had quietly taught people not to raise concerns.

When you fire the person and keep the system, the next person inherits exactly the same conditions. The only thing that has changed is who is at risk.

Sidney Dekker, whose work on human error has been influential across aviation, healthcare, and increasingly industrial settings, describes this as the difference between the old view and the new view of human error. The old view treats error as a cause — the explanation for what went wrong. The new view treats error as a symptom — an indicator that the system created conditions in which error was likely. The distinction sounds philosophical. Its practical implications are profound.

Under the old view, the response to an incident is to find and fix the individual: discipline, retraining, dismissal. Under the new view, the response is to understand the system that produced the individual's behaviour: what were the conditions, the pressures, the information available, the norms of the team, the design of the task? The old view produces clean closure. The new view produces useful learning.

This does not mean that individuals are never responsible for their actions, or that accountability has no place in incident response. It means that accountability placed only at the individual level, in the absence of systemic understanding, is accountability without change. The system that created the conditions for the error continues to operate. The next person who faces those conditions will face the same choices — and without the benefit of knowing what happened to their predecessor, they may make exactly the same decision.

The organisations that move beyond bad apple thinking do not abandon accountability. They extend it. They ask not just what the individual did, but what conditions the organisation created, what signals it missed, what pressures it applied, and what it is prepared to change. They treat the individual's behaviour as a data point about the system, not as the conclusion of the investigation.

This shift requires courage from leaders, because it means accepting that the organisation's own decisions contributed to the outcome. It also requires a different kind of investigation capability — one that can move from events to conditions, from individual action to systemic context, from blame to understanding. The tools for this exist: AcciMap, FRAM, the Systems-Theoretic Accident Model and Processes (STAMP), and others. They are not complicated to apply. They require only the willingness to look further than the last person standing.

The bad apple is rarely the problem. The barrel usually is.

Gareth Lock is the founder of The Human Diver and Human in the System — two organisations built on a single conviction: that most unwanted events in high-risk environments are system failures, not people failures. Through structured courses, immersive simulations, incident investigation, and keynote speaking, he brings frameworks from military aviation and academic human factors research into the practical reality of diving and high-risk industry. His work spans recreational and technical divers learning non-technical skills for the first time, through to senior safety leaders restructuring how their organisations investigate, debrief, and learn. Everything sits under one guiding principle: be better than yesterday.
