Sunday, Jun 7, 2009

Sidney Dekker's Thoughts about Accident Investigations

I attended a two-day seminar with Sidney Dekker (author of The Field Guide to Human Error Investigations, Ten Questions About Human Error, and Just Culture) 8-9 Apr in DC sponsored by the Forest Service, NTSB, and DOE. It was a great learning opportunity that also included Dr. Karlene Roberts, Dr. Najmedin Meshkati, and Hank Kashdan joining Dr. Dekker for an invigorating panel-audience discussion.

I used my LiveScribe Pulse Smart Pen to record the audio while simultaneously typing notes as fast as I could on my Asus Eee PC 1000HE netbook, and then transcribed what he and others said at the event. My notes (with numerous pictures, diagrams, and web links that give you a "you were there" feel), the flyer for the event, and several of Dr. Dekker's journal articles are available in the file download area of this site.

Most of what you think you know about investigating problems (critiques and fact-findings) and human factors, what Dekker calls "the Old View," is probably wrong, will not lead to the outcomes most people seek, or, worse yet, actually makes effective progress toward safety harder to achieve. This is not because of any evil intent, but because of poor understanding (very few people get formal training in causal analysis) and ignorance of the effects of what people have been taught through on-the-job training. There is no escaping that the hopes we have placed in traditional post-event "remedies" such as adding automation or procedures, reprimanding miscreants, retraining, and adding supervision (only temporarily, until we are sure they "get it") are bankrupt. There is little long-term effect if the basic conditions people work under are left unchanged.


Key Ideas in Sidney Dekker's Writing

  • Errors do not exist "out there"; actions are labeled errors after the fact, usually by people who know how the situation turned out (and thus have more information than the people had at the time). In almost every situation, what people were doing prior to "the problem" made sense to them at the time. If you cannot see that, you have been blinded by hindsight.
  • The leverage for learning how to make systems safer lies in the space between what people do and what you want them to do. There is always a gap between what people on the front line actually do and what you think they are doing (or are supposed to be doing). What you do about that gap when you learn of it has profound consequences for how readily people will share safety information with you (the kind that only they can know).
  • Every action you take to make the system safer has consequences that can work against safety.
  • For progress on safety, organizations must monitor and understand the reasons behind the gap between procedures and practice. Additionally, organizations must develop ways that support people’s skill at judging when and how to adapt.
  • Blame-free is not the same as accountability-free. While punishing, prosecuting, or jailing people for making errors satisfies a deep human need to make sense of bad outcomes, it has profound consequences for future safety (it determines what you will learn through self-disclosure).
  • There is no perfect balance in building a just culture. Each organization and society has to figure it out for itself.

The point of a human error investigation is to understand why people did what they did, not to judge them for what they did not do. Learning from failure is the ultimate goal of an investigation: failures represent opportunities for learning, opportunities that can fall by the wayside primarily because of the overconfidence people feel in post-event reconstructions of reality. The mystery in an investigation is not how people could have been so unmotivated or stupid as to miss the things that you, in hindsight, can decide were critical. The mystery is to find out what was important to them, and why.


All safety-critical work is ultimately channeled through relationships between human beings (such as in medicine), or through the direct contact of some people with the risky technology. At this sharp end, there is almost always a discretionary space into which no system improvement can completely reach.
