"Why did you do this? You should have done that." Most of us (including me) are guilty of similar thoughts or words.
This is a classic example of hindsight bias (knew-it-all-along effect).
Accident investigators already know the outcome and the sequence of events. As a result, they tend to fixate on the data points that, in retrospect, clearly led to the accident. Instead of blaming individual actions, it is more meaningful to understand why individuals took, or failed to take, appropriate actions given the information and context available to them at the time. Recommendations should focus on the changes and improvements required to help individuals make better decisions in the future and prevent accidents. Of course, the underlying assumption is that the humans involved had no malicious intent.
Another related concept is the fundamental attribution error: the observer overvalues his own point of view and undervalues the actual situational context faced by the actor being observed. If the roles of observer and actor were swapped, the same fundamental attribution error would likely occur.
It is almost impossible to eliminate hindsight bias and the fundamental attribution error entirely. The takeaway is to focus on systems rather than individuals. Improving the design of systems (technical, social, and sociotechnical) is far more productive than assigning blame. For instance, performance measurement systems shape and drive the behaviors of employees in enterprises. When undesired outcomes occur, who is really to blame?
Again, even if the system in place were flawless (is there one?), human ingenuity, the very trait that differentiates humans from other living things on Earth, is more often than not the same force that brings down the systems we build. It seems that "People" are a stickier problem than "Technology". Leadership may be the best solution available.
Thanks very much for reading.