"Set Phasers on Stun: And Other True Tales of Design, Technology and Human Error" by Steven Casey is a collection of true case studies on the topic. I am working from the First Edition text.
A review of this book must come with the warning that it contains some graphic descriptions of the outcomes of accidents. This is neither a light nor a humorous look at how some accidents happen.
Any look at the environment around us reveals that much thought and design has gone into the everyday things which surround us. We do not expect that when we accidentally mis-dial and re-dial a setting on a dishwasher it will jump out and kill us. Yet accidents do continue to happen, some by happenstance and others due to poor construction of the interface between humans and their devices. This latter category is known as the Man-Machine Interface (MMI) and, specialized for computers, as the Human-Computer Interface (HCI). Rigorous engineering can be applied to both areas in an attempt to forestall obvious accidents through misuse of the things we create. But when one of our devices acts up even though we have used it properly, then we start to hit the topics of this book.
Steven Casey has supplied us with quite a few true stories of the proper use of something producing horrific results. As an environmental design specialist he has spent much time understanding the background of the stories he presents, and he tries to give us some insight into why these accidents happened the way they did. The prose can be a bit dry, and there is a small amount of fictionalization to protect the innocent and to reconstruct events. Even so, this book is very compelling because it shows that in many important areas humans have not taken simple precautions to safeguard ourselves and those around us.
This book starts with a review of a device used in radiation treatment for cancer, the Therac-25. The description of being strapped down to a treatment table while the device is centered on the area to be irradiated is vivid in its sparseness. The fact that, on this occasion, the internal surveillance camera monitoring the room was broken increases the feeling of dread. And the final description of the machine's ability to deliver either x-rays or an electron beam is chilling in its clinical exactness. So when the nurse in this case accidentally types an 'x' instead of an 'e', you assume the worst... but she uses the edit function to change the 'x' back to 'e'. All is not well, however, as no one who designed the software running this machine ever anticipated this correction! So instead of a low-power, therapeutic dose from the electron beam, the patient is delivered a jolt of x-rays 250,000 times the expected treatment dosage. And the nurse does this three times before the fact that something serious has gone wrong is brought to her attention. The patient has experienced a high level of ionizing radiation multiple times and has seen particle physics first hand, witnessing the blue glow of Cherenkov radiation given off by charged particles traveling faster than light can travel through the surrounding tissue. This patient would die a slow and horrible death from radiation sickness. Further investigation would reveal that this same problem had happened at nine other institutions in the United States and Canada.
This is definitely a case of poor error trapping, of no safety devices to keep the machine out of an unexpected mode of operation, and of a lack of design oversight and user testing that would have revealed the problem before shipping. I come to those conclusions personally, after having read much in the way of MMI and HCI work and with an understanding of some basic engineering. And that makes this first story even more horrific.
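The "error trapping" failure can be made concrete. Below is a minimal, hypothetical sketch (in Python, with invented names; this resembles nothing of the Therac-25's actual code) of the kind of software interlock the machine lacked: any edit to the mode selection invalidates the previous hardware setup, so the beam cannot fire until the hardware has been re-verified against the currently selected mode.

```python
# Hypothetical software interlock for a dual-mode beam machine.
# All names are invented for illustration; this is NOT the Therac-25's design.

class BeamController:
    MODES = {"e": "electron", "x": "xray"}

    def __init__(self):
        self.mode = None
        # Which mode the physical hardware is actually set up for.
        self.hardware_configured_for = None

    def set_mode(self, mode):
        if mode not in self.MODES:
            raise ValueError(f"unknown mode {mode!r}")
        self.mode = mode
        # Editing the mode ALWAYS invalidates the previous hardware setup.
        self.hardware_configured_for = None

    def configure_hardware(self):
        # Stand-in for physically moving targets/filters into place.
        self.hardware_configured_for = self.mode

    def fire(self):
        # Interlock: refuse to fire unless the hardware setup matches
        # the currently selected mode.
        if self.mode is None or self.hardware_configured_for != self.mode:
            raise RuntimeError("interlock: hardware not verified for current mode")
        return f"delivering {self.MODES[self.mode]} beam"

ctl = BeamController()
ctl.set_mode("x")
ctl.configure_hardware()
ctl.set_mode("e")            # the operator edits 'x' back to 'e'
try:
    ctl.fire()               # blocked: hardware is still set up for x-rays
except RuntimeError as err:
    print(err)
ctl.configure_hardware()     # hardware re-verified for electron mode
print(ctl.fire())
```

The essential property is that the edit path and the initial-entry path converge on the same validation, so a corrected entry can never leave the hardware and the software in disagreement.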
Another, less horrific, story covers the case of a USAAF pilot in the Pacific theater during WWII. A surprise attack by the Japanese forced the pilots to scramble to their P-47 Thunderbolts, and one pilot found himself heading to a new plane that had just been delivered. While the base was under attack and his compatriots were taking to the air, this pilot realized that he was faced with a totally different instrument panel. Once he finally figured out how to start the plane, it was all he could do to maneuver it on the ground to survive the strafing attacks. Some bright person at the factory had decided to re-design the entire instrument panel for the P-47 in the middle of a war!
One story not dealing with an interface as such concerns a migrant farm worker in California who had been electrocuted while stacking irrigation pipe. An investigator went to the site, saw the power lines and the length of the pipe, and realized how the accident had happened. But the 'how' of it did not interest him as much as the 'why', and so he examined the pipes. Easy to pick up at the middle, a pipe would never need to be raised vertically for stacking. He decided to try getting the pipe a bit more vertical and walked himself under it, moving his hands down the pipe as he went. A friend stopped him just a yard short of repeating the accident and quipped that it would sure make a great headline: 'Accident investigator killed while recreating accident.' He put the pipe down and decided to talk to some other farm workers nearby. When asked why they would ever put a pipe up, one of them walked to some piping, did the hand-over-hand walk and got the pipe vertical. A couple of quick shakes and a rabbit dropped out of the pipe. 'Great fun,' said the worker.
In this case it was just boredom in the field leading to a bit of fun, with no thought given to how long the pipe was or that it was made of aluminum, that led to the accident. And yet what sort of person would leave people in conditions where something like that could happen? Long metal pipes and low-hanging power lines should not remain close to each other.
This book contains quite a few such stories, including some famous ones like the Bhopal tragedy in India, a ferry capsizing in the Baltic and a nuclear reactor meltdown in a test reactor in Idaho. In each case the causes of the disaster are examined and the reactions to it are likewise reviewed. How people react to disasters large and small will often determine how bad the final problem becomes.
An overworked supervisor of the New York City electrical grid, working off poor information and asking for useless information during a powerful thunderstorm, is an accident waiting to happen. Likewise, cutting costs and maintenance crews at Bhopal led to inexperienced personnel performing hazardous cleaning jobs in an environment where most of the fail-safe devices had been taken off-line due to poor maintenance. Conditions and how we react to them guide us every day of our lives on a course that hopefully does not lead to disaster. Sometimes automated safety systems must be shut down, but when that is done one should have some knowledge of how the overall system works without them. Here even experts can be in deep danger, as how they think a system will work and how it really does work may be at extreme variance.
In some cases a simple change of plug design would be enough to ensure that someone getting an EKG doesn't get plugged into a power cable by accident. Sometimes it is such simple things as adding a smell to a caustic cleaning detergent that could prevent it from being decanted into another container and mistaken for a drink mixer, with horrific results. Often the simplest thing to do is just to ensure that hazardous materials are kept out of the hands of those who don't understand the hazard (as with seed wheat that had a mercury-based fungicide applied to it and was dyed red as a warning, yet still ended up being eaten by the starving, as has happened many times in the world).
I recommend this book to anyone who does any sort of engineering and is trying to understand why design safety features of various sorts (from software to civil engineering) are implemented the way they are. It also builds an understanding that a warning device that is too sensitive, and thus produces a high rate of false positives, can be just as dangerous as, or more dangerous than, one that is not sensitive enough, because people become inured to the system's warnings (like the motion alarm at a prison that one prisoner used to make good her escape).
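The point about over-sensitive alarms can be put in numbers. Here is a toy base-rate calculation with assumed figures (they are not from the book): when real events are rare and false alarms are common, almost every alarm the staff hears is spurious, which is exactly what teaches them to ignore it.

```python
# Toy base-rate illustration; all probabilities below are assumed
# for the sake of the example, not taken from the book.

def prob_alarm_is_real(p_event, p_detect, p_false_alarm):
    """P(real event | alarm sounds), via Bayes' rule."""
    p_alarm = p_event * p_detect + (1 - p_event) * p_false_alarm
    return (p_event * p_detect) / p_alarm

# Suppose a real intrusion occurs on 1 night in 1000, the sensor
# catches 99% of them, but it also false-alarms on 5% of quiet nights.
print(round(prob_alarm_is_real(0.001, 0.99, 0.05), 3))  # prints 0.019
```

Under these assumptions, fewer than 2 alarms in 100 signal a real intrusion, so a guard who shrugs them all off is "right" 98% of the time, right up until the night that matters.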
There aren't many books like this for everyday sorts of disasters, but there are some for the other types. "It Seemed Like a Good Idea..." edited by William R. Forstchen and Bill Fawcett has a round-up of disasters, fiascoes and plain bad decision making. Geoffrey Regan gives us some great looks at military disasters in "Someone Had Blundered..."
Two wonderful sites for HCI are http://www.useit.com/ by Jakob Nielsen, an expert in the field, and Bruce Tognazzini's site (associated with the Nielsen Norman Group) at http://www.asktog.com/; with a little bit of digging you can find many examples of poor interface design at both. And, finally, there is the Computer Interface Hall of Shame at http://www.iarchitect.com/mshame.htm, one of the best time-wasting educational sites I have ever run across (even if it hasn't been updated in two years).