11 May 2012

Rethink Safety

Related Article: Hindsight Bias and Fundamental Attribution Error

Today was the last class of "16.863/ESD.863 System Safety", taught by Professor Nancy Leveson, Professor of Aeronautics and Astronautics and Professor of Engineering Systems at MIT. She is widely recognized for her work in system and software safety. More of her work can be found here.

In the aftermath of accidents, reports and legal proceedings tend to focus on assigning blame to individuals whose actions led to the mishaps. Assigning blame is easy, but it does very little to prevent future accidents. My view of safety has changed over the course of the class: it shifted from focusing on individual responsibility and errors to viewing safety as a control problem. The class also reaffirmed my view that quantifying safety with probabilistic numbers is not meaningful. A probability of 10^-9 for software coding errors? Where did that number come from?

STAMP (System-Theoretic Accident Modeling and Processes) and STPA (System-Theoretic Process Analysis) are powerful techniques developed by Professor Leveson for accident analysis and hazard analysis. The techniques (Leveson, 2012) are not perfect, but they are certainly better than unrealistic probabilistic estimates. They can be applied to sociotechnical systems across a wide variety of industries, including aerospace, nuclear, financial, chemical, and oil and gas. More details, examples, and applications of STAMP/STPA can be found in the book "Engineering a Safer World" and here.
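To give a flavor of the control-problem view, here is a minimal sketch in Python of one STPA step: taking a single control action and generating the four standard guide questions for identifying unsafe control actions. The TCAS "Resolution Advisory" example and all the names in the code are illustrative assumptions, not part of any official STPA tooling.

```python
# Sketch of STPA's unsafe-control-action (UCA) guide questions.
# The four UCA categories come from STPA; the TCAS example is hypothetical.
from dataclasses import dataclass

# STPA asks, for each control action, whether a hazard can result when it is:
UCA_TYPES = (
    "not provided when needed",
    "provided when it causes a hazard",
    "provided with wrong timing or order",
    "stopped too soon or applied too long",
)

@dataclass
class ControlAction:
    controller: str          # the controlling component
    action: str              # the command it can issue
    controlled_process: str  # the component it controls

def uca_prompts(ca: ControlAction) -> list[str]:
    """Expand one control action into the four STPA guide questions."""
    return [
        f"'{ca.action}' ({ca.controller} -> {ca.controlled_process}): {t}"
        for t in UCA_TYPES
    ]

# Hypothetical control action from a TCAS safety control structure.
ra = ControlAction("TCAS", "issue Resolution Advisory", "flight crew")
for prompt in uca_prompts(ra):
    print("-", prompt)
```

Each prompt is then examined against the system's hazards; the point of the sketch is simply that STPA analyzes control actions and feedback, not component failure probabilities.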

The just culture movement (Dekker, 2008) promotes learning from mistakes rather than blaming and punishing people. Being blame-free does not equate to being accountability-free. On the contrary, just culture promotes the active involvement of people in building better and safer systems to work in or with. The key is to see accountability as forward-looking instead of backward-looking. Backward-looking accountability means blaming and punishing, which does little to reduce the recurrence of similar accidents. Forward-looking accountability concentrates on improving the system so that similar accidents do not happen again.

Actions of improvement speak louder than words of blame. Punishment serves as a deterrent but does little to prevent future accidents if the system is inherently deficient. It is easier to design safer systems than to change people. Better safe than sorry!

As part of my class project, I came up with a safety control structure for the Traffic Collision Avoidance System (TCAS). Professor Leveson liked it and I'm sharing it here.

Thanks very much for reading.


Works Cited
Leveson, N. G. (2012). Engineering a Safer World. MA: The MIT Press.
Dekker, S. (2008). Just culture: who gets to draw the line? Cogn Tech Work.
Teo, K. S. (2012). 16.863/ESD.863 System Safety Class Project: STPA for TCAS. MIT, ESD.

3 comments:

  1. I have updated the link to Professor Leveson's book. The previous link was a draft. The current link is the actual published book.

  2. Blaming one another
    and trying to shift responsibility
    is only human nature.
    Putting mechanisms in place
    to prevent problems before they occur
    is the better strategy.

    Well done! Bravo!

  3. CML, you captured the essence of my thoughts in so few Chinese characters. The beauty of Chinese characters is unsurpassed. Bravo!
