I pulled Normal Accidents: Living with High-Risk Technologies out of the library this week to see what a book written in 1984 might have to say about the Deepwater Horizon explosion in the Gulf of Mexico. Charles Perrow, then and now a professor of sociology at Yale, chronicled accidents across decades and industries. Earlier technologies mostly involved a linear production sequence: if a machine failed, the process was interrupted until the machine was fixed, and the consequences of the failure were obvious and limited. Perrow’s conclusion was that humans are bad at analyzing systems with complex interactions, e.g., where one component serves multiple functions and, if it fails, there are multiple consequences. He kicks off the book with a discussion of the Three Mile Island nuclear power plant meltdown.
Perrow says that failures in complex systems are invariably blamed on human error: “lack of attention to safety features; lack of operating experience; inadequately trained personnel; failure to use the most advanced technology”. People don’t look at the characteristics of the systems themselves and ask “Could the potential for accidents have been reduced by eliminating complex interactions?”
Perrow does not use offshore oil drilling as an example, but he has a chapter on petrochemical plants. The plants are “tightly coupled and [have] many complexly interactive components. … it illustrates the presence of system accidents in a mature, well-run industry that has a substantial economic incentive to prevent accidents.”
In Perrow’s chapter on aviation, the emphasis is on the autopilot: “workload has become more ‘bunched’, with long periods of inactivity and short bursts of intense activity. Both of these are error-inducing modes of operation.”
Perrow suggests that the nuclear power industry be shut down because it can never be made sufficiently safe. A nuclear power plant is simple proven technology compared to what BP and Transocean were up to with Deepwater Horizon.
Computers have gotten cheaper and faster since 1984, but it is unclear that humans have learned anything. If we are intent on intensifying our use of the Earth, both by increasing the population and by having the average human consume more resources, Perrow would tell us to expect periodic, unforeseeable, human-caused disasters.
[Perrow noted, back in the early 1980s, that “One enormous risk which the industrialized nations may be facing is not considered in this book on normal accidents… This is the problem of carbon dioxide produced from deforestation primarily, but also from burning fossil fuels such as coal, oil, and wood. This threatens to create a greenhouse effect, warming the temperature of the planet, melting the ice caps, and probably causing an incredible number of other changes, most of them disastrous.”]
I think Tufte exposes this problem in analyzing the space shuttle disaster(s). Individual contributors tend to understand their own incremental changes to a system, but have a poor overall view. Each flight of the shuttle experiences several minor problems that may be addressed by the subject matter expert. Catastrophic system weaknesses are not addressed as no individual can comprehend the complex interactions.
One might think (erroneously) that a hierarchical project strategy would involve worker bees who focus on the minutiae while some superhuman manager oversees the whole project “vision” thing. In reality, these are just people who specialize in PowerPoint and politics.
The professor does have a good point about management of complex systems. Humans do have a hard time monitoring complex processes over long periods of time.
I suspect that in ten years or so, after the ass-kicking and finger-pointing have subsided, some group of adults will do a rational study of the disaster. We’ll find that the root human causes of the Deepwater Horizon disaster were divided chains of command and differing operating and safety procedures among BP, Transocean, equipment suppliers, and regulators.
BTW, it is not logical for a person concerned about human-caused global warming to oppose nuclear power. It’s certainly not true that nuclear reactors ‘can never be made sufficiently safe’. Chalk that remark right up there with ‘if man had been meant to fly, God would have given him wings’.
Jim: I don’t think that it is inconsistent for someone who admits the possibility of melting down the planet with CO2 to oppose nuclear power. Remember first that when the book was published, nuclear power was being produced by regulated monopolies. They had no financial incentive to control costs or avoid costly accidents, since any spending was passed on to consumers. (Perrow points this out in his book.)
Now that power generation has been deregulated to some extent, someone could still logically advocate simultaneously against nuclear power and increased carbon emissions. Our cars and buildings are ridiculously inefficient (I rented a car on Martha’s Vineyard 1.5 weeks ago and Hertz gave me an SUV. I drove up and down one side of the tiny island twice. The SUV consumed more fuel for going these handful of miles than the four-seat 180 mph airplane consumed getting from Boston to the Vineyard.). A near-infinite supply of solar, geothermal, and wind energy is available to anyone clever enough to tap it.
Human beings aren’t good at understanding complex chains of causality generally. Most political disputes are fueled by competing causal explanations of roughly the same evidence. (And of course there are lots where the evidence itself is disputed.) We have a strong desire for simple narratives in which Action A produces Result B, but biological, ecological, economic and political systems all involve so many moving parts with chains of influence flowing in all directions that such narratives are almost always seriously oversimplified and frequently misleading.
We’re now experiencing a financial/economic crisis in which the amount and direction of fiscal and monetary response is known fairly uncontroversially (at least to those who take the time to read FOMC reports or summaries of the Federal budget), yet the likely consequences of these policies are the subject of radically discordant interpretive commentary.
Normal accidents indeed 😉
The human factor in these accidents could be more of a problem than the complexity of the systems involved. It’s not necessarily the complexity of the system that matters, but the ability to recognize the chain of events that leads to the accident. This may be a generalization, but it always seems these accidents happen when a non-technical person (e.g., a manager) who is not directly involved with the plant, airplane, etc., overrides a technical person (e.g., an engineer or scientist) who is very familiar with it.
Examples:
Chernobyl – A manager from Moscow overrides the local engineers at the nuclear reactor, pushing an experimental test way beyond the reactor’s design limits while all the local engineers tell him this is a very bad idea. Reactor goes boom.
Challenger – Managers override engineers, space shuttle goes boom.
Deepwater Horizon – The rumors are that there was a big argument between a local technical person and somebody whose only concern was how much money it cost to run the platform before the Deepwater Horizon went boom.
It would also have helped if they had tested methods for capping deepwater wells before drilling, but BP management probably considered that an unnecessary expense that would cut into their profits.
Pavel, I think the point you made pretty much summed up the idea of the article.
“Perrow’s conclusion was that humans are bad at analyzing systems with complex interactions, e.g., where one component serves multiple functions and, if it fails, there are multiple consequences.”
There was no mention of specific job titles because that’s irrelevant; every project is going to have someone overseeing it. In many cases that person has no understanding of the procedures leading up to the finished product.