Hiljaista Pohdintaa

Sunday, September 25, 2011

The Shortcomings of Comprehension and Logic

http://socialpathology.blogspot.com/2011/09/logic-of-failure.html

http://photonplaza.blogspot.com/2003_12_28_photonplaza_archive.html#107275628844222087

http://www.amazon.com/Logic-Failure-Recognizing-Avoiding-Situations/dp/0201479486

Another book related to the same subject:

http://www.amazon.com/Normal-Accidents-Living-High-Risk-Technologies/dp/0691004129/ref=sr_1_1?s=books&ie=UTF8&qid=1316998292&sr=1-1

A heavier-duty treatment of the same subject area:

http://www.amazon.com/Blackwell-Handbook-Handbooks-Experimental-Psychology/dp/1405157593/ref=sr_1_6?s=books&ie=UTF8&qid=1306162642&sr=1-6

Two Amazon reviews of The Logic of Failure:

"Dietrich Dörner is an authority on cognitive behavior and a psychology professor at the University of Bamberg, Germany. His research shows that our habits as problem solvers are typically counterproductive.

Probably our main shortcoming is that we like to oversimplify problems. Dörner offers a long list of self-defeating behaviors, but common to all of them is our reluctance to see that any problem is part of a whole system of interacting factors. Any problem is much more complex than we like to believe. And failure doesn't have to come from incompetence. The operators of the Chernobyl reactor, as Dörner points out, were "experts." And as experts, they ignored safety standards because they "knew what they were doing."

Dörner identifies four habits of mind and characteristics of thought that account for the frequency of our failures:
1. The slowness of our thinking: we streamline the process of problem solving to save time and energy.
2. Our wish to feel confident and competent in our problem-solving abilities: we try to repeat past successes.
3. Our inability to absorb quickly and retain large amounts of information: we prefer static mental models, which cannot capture a dynamic, ever-changing process.
4. Our tendency to focus on immediately pressing problems: we ignore the problems our solutions will create.

Successful problem solving is so complex that there are no hard-and-fast rules that work all the time. The best take-away from the book (and this is my favorite quote): "An individual's reality model can be right or wrong, complete or incomplete. As a rule it will be both incomplete and wrong, and one would do well to keep that probability in mind." The book is 199 easy-to-read pages, and Dörner gives lots of interesting examples from lab tests illustrating people's actual behavior in problem-solving situations.

It's a thought-provoking book for anyone whose job is to tackle complex problems. In one way or another that includes anyone in just about any profession."

"Napoleon said "On s'engage et puis on voit!" Loosely translated, that means "One jumps into the fray, then figures out what to do next," a common human approach to planning. This discussion (page 161) takes on the adaptability of thought and cautions decision makers about the risks of overplanning in a dynamic, multivariate system. Using examples from Napoleon as well as more concrete ones, such as the quotation about soccer strategy (also on page 161), Dietrich Dörner, the brilliant German behavioral psychologist (University of Bamberg), has created a masterwork on decision-making skills in complex systems; I find it highly complementary to Perrow's work, and I also highly recommend Perrow's equally brilliant "Normal Accidents."

A strength of this work is that Dörner takes examples from so many areas, including his own computer simulations, which show the near-universal applicability of his concepts. One of Dörner's main themes is the failure to think in temporal configurations (page 198): humans are good at dealing with the problems they currently have, but avoid and tend to ignore the problems they don't yet have (page 189), so the potential outcomes of decisions go unforeseen, sometimes with tragic consequences. In one computer simulation (page 18) Dörner had a group of hypereducated academics attempt to manage farmland in Africa: they failed miserably. In this experiment Dörner observed that the decision makers had "acted without prior analysis of the situation; failed to anticipate side effects and long-term repercussions; assumed the absence of immediately negative effects meant that correct measures had been taken; and let overinvolvement in 'projects' blind them to emerging needs and changes in the situation." (How many governmental bodies the world over does this remind you of?)

I am a safety professional, and am especially interested in time-critical decision-making skills. Dörner's treatment of the Chernobyl accident is the most insightful summation I have seen. He makes the point that the entire accident was due to human failings, and points out the lack of risk analysis (and the managerial pressure) and the fundamental lack of appreciation of the reactor's reactivity instability at low power levels (and, more importantly, how grossly the operators underestimated the danger posed by changes in production levels; page 30). Dörner's grasp here meshes the psychology and engineering disciplines (engineers like stasis; any change in reactivity increases hazards). Another vital point Dörner makes is that the Chernobyl operators knowingly violated safety regulations, but that such violations are normally positively reinforced (i.e., you normally "get away with it"; page 31). The discussion of operating techniques on pages 33 and 34 is insightful: the operators were running Chernobyl's Unit 4 reactor intuitively rather than analytically. While there is room for experiential decision making in complex systems, analysis of future potential problems is vital.

In most complex situations the problems are by nature intransparent (page 37): not all the information we would like to see is available. Dörner's explanation of the interactions between complexity, intransparence, internal dynamics (and developmental tendencies), and incomplete (or incorrect) understanding of the system involved exposes many potential pitfalls in dynamic decision making. One of the most important decision-making criteria Dörner discusses is the importance of setting well-defined goals. He is especially critical of negative goal setting (the intention to avoid something) and has chosen a perfect illustrative quote from Georg Christoph Lichtenberg on page 50: "Whether things will be better if they are different I do not know, but that they will have to be different if they are to become better, that I do know." A bigger problem regarding goals occurs when "we don't even know that we don't understand," a situation that is alarmingly common in upper management charged with supervising technical matters (page 60).

Fortunately, Dörner does offer some practical solutions to these problems, most of them in chapter six, "Planning." One of the basics (page 154) is the three-step model behind any planning decision (condition element, action element, and result element) and how these fit into large, dynamic systems. This is extremely well formulated and should be required reading for every politician and engineer. These concepts are discussed in conjunction with "reverse planning" (page 155), in which plans are constructed backwards from the goal. I have always found this a very useful method of planning or design, but Dörner finds that it is rare. Dörner argues that in extremely complex systems (Apollo 13 is a perfect example) intermediate goals are sometimes required because the decision trees are enormous. This sometimes relies on history and analogies (what has happened in similar situations before), but it may be necessary to stabilize a situation in order to enable further critical actions. This leads back to the quote that titles this review: 'adaptability of thought' (my term) is vital to actions taken in extremely complex situations. Rigid operating procedures and historical analogies may not always work: a full understanding of the choices being made is vital, although no one person is likely to have this understanding; for this reason Dörner recommends a "redundancy of potential command" (page 161), which is to say a group of highly trained leaders able to carry out leadership tasks within their areas of specialty (again, NASA during Apollo 13), reporting within a clear leadership structure that values their input. Dörner then points out that nonexperts may hold key answers (page 168), though he notes that experts should be in charge because they best understand the thought processes applicable in a given scenario (pages 190-193). This ultimately argues for more oversight by technicians and less by politicians: I believe (and I am guessing Dörner would concur) that we need more inter- and intra-industry safety monitoring, and fewer congressional investigations and grandstanding.

This is a superb book; I recommend it highly to any safety professional as mandatory reading, and to the general public for an interesting discussion of decision making skills."
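The second review's point about temporal configurations is easy to make concrete. Below is a minimal, hypothetical Python sketch; it is not the African-farmland simulation described in the review, and all the numbers are invented. A manager who expands a herd whenever this year's forage looks ample does fine for a few years and then crashes the system, because the damage done by overgrazing only shows up in later years' regrowth.

# Hypothetical toy model of the "temporal configuration" trap: a decision rule
# that reacts only to the current state looks fine at first, then collapses,
# because the side effects of earlier decisions arrive with a delay.

def run_pasture(years=15):
    grass = 100.0   # available forage
    herd = 10.0     # livestock kept by the decision maker
    for year in range(1, years + 1):
        # Decision rule: "no immediate problem, so grow the herd."
        if grass > herd * 3:        # this year's forage looks ample
            herd *= 1.3             # expand aggressively
        else:
            herd *= 0.9             # cut back only once trouble is visible
        grass -= min(grass, herd * 3)   # grazing this year
        # Regrowth depends on how much grass survived, so overgrazing now
        # only hurts in later years -- and a fully grazed pasture never recovers.
        grass = min(grass * 1.5, 200.0)
        print(f"year {year:2d}: herd {herd:6.1f}, grass {grass:6.1f}")

run_pasture()

For the first three years the herd grows and nothing looks wrong; by year four the pasture is gone for good, which is exactly the pattern the review describes: the absence of immediately negative effects was taken as proof that the measures were correct.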
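The three-element planning model and the "reverse planning" the reviewer summarizes (pages 154-155) also lend themselves to a small sketch. The Python toy below is my own illustration, not code or an example from the book: actions are (condition, action, result) triples, and the planner chains backwards from the goal until it reaches the starting condition. The Apollo 13-flavoured action names are made up.

# Illustrative backward-chaining planner over (condition, action, result)
# triples -- a toy rendering of the "reverse planning" idea, not Dörner's method.

ACTIONS = [
    # (required condition, action name, resulting condition)
    ("power restored",    "run CO2 scrubber",    "air breathable"),
    ("batteries charged", "restore power",       "power restored"),
    ("solar panel fixed", "charge batteries",    "batteries charged"),
    ("start",             "repair solar panel",  "solar panel fixed"),
]

def plan_backwards(goal, start="start"):
    """Work back from the goal: find an action whose result is the current
    target, then make that action's condition the new target."""
    steps = []
    target = goal
    while target != start:
        match = next((a for a in ACTIONS if a[2] == target), None)
        if match is None:
            return None                  # no known action produces this condition
        condition, action, _result = match
        steps.append(action)
        target = condition               # plan the preceding step next
    return list(reversed(steps))         # forward order of execution

print(plan_backwards("air breathable"))
# ['repair solar panel', 'charge batteries', 'restore power', 'run CO2 scrubber']

The point of the sketch is the direction of the search: instead of enumerating everything that could be done from the current state, the planner asks what must already be true for the goal, and works backwards until it reaches conditions that already hold.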
