embracing ambiguity
when it comes to determining the truth,
which is more reliable: ambiguity or unanimity?
strangely enough, sometimes the closer you get to total agreement, the less trustworthy a result becomes
source:
5:10
How do you know what is true? - Sheila Marie Orfano
https://www.youtube.com/watch?v=xg5y6Ao7VE4
TED-Ed
Jun 10, 2021
____________________________________
Sidney Dekker, The field guide to human error investigations, 2002
p.54 (pdf page: 57/154)
By referring to procedures, physically available data or standards of good practice, investigators can micro-match controversial fragments of behavior with standards that seem applicable from their after-the-fact position. Referent worlds are constructed from outside the accident sequence, based on data investigators now have access to, based on the facts they now know to be true. The problem is that these after-the-fact worlds may have very little relevance to the circumstances of the accident sequence. They do not explain the observed behavior. The investigator has substituted his own world for the one that surrounded the people in question.
p.56 (pdf page: 59/154)
PUT DATA IN CONTEXT
Taking data out of context, either by:
• micro-matching them with a world you now know to be true, or by
• lumping selected bits together under one condition identified in hindsight
p.57 (pdf page: 60/154)
robs data of its original meaning. And these data out of context are simultaneously given a new meaning, imposed from the outside and from hindsight.
p.57 (pdf page: 60/154)
You impose this new meaning when you look at the data in a context you now know to be true. Or you impose meaning by tagging an outside label on a loose collection of seemingly similar fragments.
p.57 (pdf page: 60/154)
To understand the actual meaning that data had at the time and place it was produced, you need to step into the past yourself. When left or relocated in the context that produced and surrounded it, human behavior is inherently meaningful.
p.57 (pdf page: 60/154)
Historian Barbara Tuchman put it this way: “Every scripture is entitled to be read in the light of the circumstances that brought it forth. To understand the choices open to people in another time, one must limit oneself to what they knew; see the past in its own clothes, as it were, not in ours.”4
4 Tuchman, B. (1981). Practicing history: Selected essays. New York: Norton, page 75.
source: The field guide to human error investigations, by Sidney Dekker,
Cranfield University Press
filename: DekkersFieldGuide.pdf
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>
Sidney Dekker, The field guide to human error investigations, 2002
p.62 (pdf page: 63/154)
• Safety is never the only goal in the systems that people operate. Multiple interacting pressures and goals are always at work. There are economic pressures; pressures that have to do with schedules, competition, customer service, public image.
• Trade-offs between safety and other goals often have to be made under uncertainty and ambiguity. Goals other than safety are easy to measure (How much fuel will we save? Will we get to our destination?). However, how much people borrow from safety to achieve those goals is very difficult to measure.
• Systems are not basically safe. People in them have to create safety by tying together the patchwork of technologies, adapting under pressure and acting under uncertainty.
Trade-offs between safety and other goals enter, recognizably or not, into thousands of little and larger decisions and considerations that practitioners make every day. Will we depart or won't we? Will we push on or won't we? Will we accept the directive or won't we? Will we accept this display or alarm as indication of trouble or won't we? These trade-offs need to be made under much uncertainty and often under time pressure.
p.63 (pdf page: 64/154)
****************************************
* *
* HUMAN ERRORS ARE SYMPTOMS OF *
* DEEPER TROUBLE *
* *
****************************************
Human error is the starting point of an investigation. The investigation is interested in what the error points to. What are the sources of people's difficulties? Investigations target what lies behind the error: the organizational trade-offs pushed down into the individual operating units; the effects of new technology; the complexity buried in the circumstances surrounding human performance; the nature of the mental work that went on in difficult situations; the way in which people coordinated or communicated to get their jobs done; the uncertainty of the evidence around them.
Why are investigations in the new view interested in these things? Because this is where the action is.
source: The field guide to human error investigations, by Sidney Dekker,
Cranfield University Press
filename: DekkersFieldGuide.pdf
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>
Sidney Dekker, The field guide to human error investigations, 2002
p.111 (pdf page: 109/154)
People generally interpret cues about the world on the basis of what they have told their automated systems to do, rather than on the basis of what their automated systems are actually doing. In fact, people do not act on the basis of reality, they act on the basis of their perception of reality. Once they have programmed their ship to steer to Boston in NAV mode, they may interpret cues about the world as if the ship is doing just that. Evidence about a mismatch has to be very compelling for people to break out of the misconstruction of mindset. They have no expectation of a mismatch (the system has behaved reliably in the past), and such feedback as there is (a tiny mode annunciation) is not compelling when viewed from inside the situation.
p.114 (pdf page: 112/154)
The pattern is typical because people in dynamic worlds always face a trade-off between changing their assessments and actions with every little change (or possible indication of change) in the world, versus providing some stability in interpretation to better manage and oversee an unfolding situation; creating a framework in which to place newly incoming information. There are errors of judgment on both ends. […] On the other, people can get fixated: they do not revise their assessment in the face of cues that (in hindsight) suggested it could be good to do so.
source: The field guide to human error investigations, by Sidney Dekker,
Cranfield University Press
filename: DekkersFieldGuide.pdf
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>
Sidney Dekker, The field guide to human error investigations, 2002
p.116 (pdf page: 114/154)
• Find out what organizational history or pressures exist behind these routine departures from the routine; what other goals help shape the new norms for what is acceptable risk and behavior.
• Understand that the rewards of departures from the routine are probably immediate and tangible: happy customers, happy bosses, money made, and so forth. The potential risks (how much did people borrow from safety to achieve those goals?) are unclear, unquantifiable or even unknown.
• Realize that continued absence of adverse consequences may confirm people in their beliefs (in their eyes justified!) that their behavior was safe, while also achieving other important system goals.
p.116 (pdf page: 114/154)
Borrowing from safety
With rewards constant and tangible, departures from the routine may become routine across an entire operation or organization.
****************************************
* *
* DEVIATIONS FROM THE NORM CAN *
* THEMSELVES BECOME THE NORM *
* *
****************************************
Without realizing it, people start to borrow from safety, and achieve other system goals because of it: production, economics, customer service, political satisfaction. Behavior shifts over time because other parts of the system send messages, in subtle ways or not, about the importance of these goals. In fact, organizations reward and punish operational people in daily trade-offs (“We are an ON-TIME operation!”), focusing them on goals other than safety. The lack of adverse consequences with each trade-off that bends to goals other than safety strengthens people's tacit belief that it is safe to borrow from safety.
source: The field guide to human error investigations, by Sidney Dekker,
Cranfield University Press
filename: DekkersFieldGuide.pdf
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>
Acknowledgements
I want to thank those who alerted me to the need for this book and who inspired me to write it, in particular Air Safety Investigator Maurice Peters and Captain Örjan Goteman. It was written on a grant from the Swedish Flight Safety Directorate and Arne Axelsson, its director. Kip Smith and Captain Robert van Gelder and his colleagues were invaluable for their comments and suggestions during the writing of earlier drafts.
S.D.
Linköping, Sweden
Summer 2001
<------------------------------------------------------------------------>
Sidney Dekker, The field guide to human error investigations, 2002
p.4 (pdf page: 8/154)
Investigators intend to find the systemic vulnerabilities behind individual errors. They want to address the error-producing conditions that, if left in place, will repeat the same basic pattern of failure.
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>
Sidney Dekker, The field guide to human error investigations, 2002
p.20 (pdf page: 23/154)
Focusing on people at the sharp end
Reactions to failure focus firstly and predominantly on those people who were closest to producing and to potentially avoiding the mishap. It is easy to see these people as the engine of action. If it were not for them, the trouble would not have occurred.
p.20 (pdf page: 23/154)
Blunt end and sharp end
In order to understand error, you have to examine the larger system in which these people worked. You can divide an operational system into a sharp end and a blunt end:
• At the sharp end (for example the train cab, the cockpit, the surgical operating table), people are in direct contact with the safety-critical process;
• The blunt end is the organization or set of organizations that supports and drives and shapes activities at the sharp end (for example the airline or hospital; equipment vendors and regulators).
pp.20-21 (pdf page: 23-24/154)
The blunt end gives the sharp end resources (for example equipment, training, colleagues) to accomplish what it needs to accomplish. But at the same time it puts on constraints and pressures (“don't be late, don't cost us any unnecessary money, keep the customers happy”). Thus the blunt end shapes, creates, and can even encourage opportunities for errors at the sharp end. Figure 2.3 shows this flow of causes through a system. From blunt to sharp end; from upstream to downstream; from distal to proximal. It also shows where the focus of our reactions to failure is trained: on the proximal.
p.21 (pdf page: 24/154)
Figure 2.3: Failures can only be understood by looking at the whole system in which they took place. But in our reactions to failure, we often focus on the sharp end, where people were closest to causing or potentially preventing the mishap.
p.22 (pdf page: 25/154)
Why do people focus on the proximal?
Looking for sources of failure far away from people at the sharp end is counter-intuitive. And it can be difficult. If you find that sources of failure lie really at the blunt end, this may call into question beliefs about the safety of the entire system. It challenges previous views. Perhaps things are not as well-organized or well-designed as people had hoped. Perhaps this could have happened any time. Or worse, perhaps it could happen again.
source: The field guide to human error investigations, by Sidney Dekker,
Cranfield University Press
filename: DekkersFieldGuide.pdf
(Sidney Dekker, The field guide to human error investigations, 2002)
<------------------------------------------------------------------------>