The Deepwater Horizon accident, predicted in 1984
I pulled Normal Accidents: Living with High-Risk Technologies out of the library this week to see what a book written in 1984 might have to say about the Deepwater Horizon explosion in the Gulf of Mexico. Charles Perrow, then and now a professor of sociology at Yale, chronicled accidents over the decades in different industries. Earlier technologies mostly involved a linear production sequence: if a machine failed, the process was interrupted until the machine was fixed, and the consequences of that failure were obvious and limited. Perrow’s conclusion was that humans are bad at analyzing systems with complex interactions, e.g., systems in which one component serves multiple functions and its failure therefore has multiple consequences. He kicks off the book with a discussion of the Three Mile Island nuclear power plant meltdown.
Perrow says that failures in complex systems are invariably blamed on human error: “lack of attention to safety features; lack of operating experience; inadequately trained personnel; failure to use the most advanced technology”. People don’t look at the characteristics of the systems themselves and ask “Could the potential for accidents have been reduced by eliminating complex interactions?”
Perrow does not use offshore oil drilling as an example, but he has a chapter on petrochemical plants. The plants are “tightly coupled and [have] many complexly interactive components. … it illustrates the presence of system accidents in a mature, well-run industry that has a substantial economic incentive to prevent accidents.”
In Perrow’s chapter on aviation, the emphasis is on the autopilot: “workload has become more ‘bunched’, with long periods of inactivity and short bursts of intense activity. Both of these are error-inducing modes of operation.”
Perrow suggests that the nuclear power industry be shut down because it can never be made sufficiently safe. Yet a nuclear power plant is simple, proven technology compared to what BP and Transocean were up to with Deepwater Horizon.
Computers have gotten cheaper and faster since 1984, but it is unclear that humans have learned anything. If we are intent on intensifying our usage of the Earth, both by increasing the population and by having the average human consume more resources, Perrow would tell us to expect periodic unforeseeable human-caused disasters.
[Perrow noted, back in the early 1980s, that “One enormous risk which the industrialized nations may be facing is not considered in this book on normal accidents… This is the problem of carbon dioxide produced from deforestation primarily, but also from burning fossil fuels such as coal, oil, and wood. This threatens to create a greenhouse effect, warming the temperature of the planet, melting the ice caps, and probably causing an incredible number of other changes, most of them disastrous.”]