
Chapter 15: It's a Million to One Chance [1]

Chapter 15 of Turn Signals Are the Facial Expressions of Automobiles

How come some people always know just how they got sick? And how come they always expect me to know how I got my cough or runny nose?

Most of the time, there is no way of knowing. The reasons that people tend to give are examples of what we call "folk psychology" or "folk medicine." They have no scientific validity, even though they are a part of popular culture.

Getting wet, or even cold, does not lead to a cold. Yes, you can get cold that way, but that refers to comfort, not to illness. The illness we call a "cold" is not related to how appropriately you dress for the outside temperature.

Research with animals has shown that they tend to associate stomach ailments with the last novel food eaten, even if the food was eaten several days prior to the illness. This is probably a good evolutionary rule-of-thumb - if you feel ill, avoid the last novel food you tried - but it isn't good science. The result is that many of us have learned to dislike perfectly good foods simply because of the accidental correlation of nausea and eating the food for the first time. On the other hand, the opposite rule - the last novel food eaten before feeling better - is thought by some scientists to be responsible for the belief in certain cultures that when ill, you should eat chicken soup, or drink mint tea, or any of the other standard folk remedies.

Most colds are minor and run their course in a couple of days. If, however, we are given several big bowls of chicken soup only when ill, then we are apt to associate getting better with eating the soup. The fact that we would have gotten better anyway, or, for that matter, that we would have gotten ill in the first place despite the exposure to the cold, the damp, or that funny-tasting food at the party, doesn't stop us from trying to find a reason for the getting better or the falling ill.

When I get sick, I haven't the slightest idea why. I'm just sick, that's all. I don't even care why I am sick: I simply want to get better. But people keep asking: "Where did you get it?" How would I know? I even get asked about other people: "How come George got sick? Do you think it was something at the office?"

I have finally learned how to respond. I now shake my head wisely and say, "There's a lot of that going around lately."

What a wonderful statement. On the one hand it has no meaning whatsoever. On the other hand people always manage to interpret it so that it is always true. It always satisfies the person to whom I say it: always.

How can "that" always be going around? It isn't. For that matter, what is "that"? But there are always enough cases of colds or flu or other ailments to make the sentence appear meaningful. When I say "there is a lot of that going around lately," you try to think of someone else who has been sick. I guarantee you will almost always think of someone. "Ah yes," you say to yourself, "I just thought of another one." So, "yeah," you say aloud, "oh yeah, there certainly is."

How likely are you to know someone who is sick? Very likely. A recent newsletter stated that the average American gets a cold three times a year. Suppose we count as "knowing someone who is sick" any time you can think of a person -- including yourself -- who has been sick in the past week.

If you know fifty people, even casually, the chance that just random sickness will cause at least one of them to have been sick within the past week is 95%: those are odds of twenty-to-one that someone you know will be sick. And in fact, if you include the fact that the circle of people you know or that others will tell you about is larger than 50, the chance that you will know of someone else who is sick is awfully close to 100%. In fact, if you know 100 people, the odds are over 300 to 1 that at least one of them will be sick.

So when I say "there's a lot of that going around," if you can think of a single other person, you will nod. And if several names come to mind (I'll help out, after all), you will walk away saying, "yes, there certainly is a lot of that going around." Without the slightest idea of what "that" is, or why it should be going around.

Chapter Note: "The average American gets a cold three times a year": The number comes from an article in the newsletter Bottom Line Personal for March 30, 1991, attributed to David Fairbanks, MD, quoted in Working Mother (no date given). A "fact" from a relatively unknown newsletter quoting an unknown physician who is reported to have said something in an unknown publication is not particularly reliable, but the number itself seems reasonable.



Chapter Note: "If you know 100 people, the odds are over 300 to 1 that at least one of them will be sick": If the 3 weeks out of the 52 in the year that the average person is sick are randomly distributed across the year, then the chance of being sick in any particular week is 3/52. The chance of being well in any particular week is, therefore, 49/52. If you know 50 people, the chance that all 50 have been well is just 49/52 multiplied by itself 50 times, or (49/52)^50 = 0.05, 5 chances in 100. That's only a 5% chance that all were well, or if you like, a 95% chance that at least one was ill. The odds are 20 to 1 that one of your 50 acquaintances was ill.

If you know 100 people, the chance that all of them were well this week is (49/52)^100, or only 0.003. This means the chance that at least one of them was sick this week is 99.7%: odds of over 300 to 1.
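The arithmetic in this note is easy to verify. The sketch below (Python; it assumes, as the note does, that the average person's three sick weeks are spread randomly across the 52 weeks of the year) computes the chance that at least one of n acquaintances was sick this week:

```python
# Chance of being sick in any given week: 3 weeks out of 52, randomly distributed.
P_SICK = 3 / 52
P_WELL = 1 - P_SICK  # 49/52

def p_at_least_one_sick(n):
    """Probability that at least one of n acquaintances was sick this week."""
    # All n well: (49/52)^n; at least one sick is the complement.
    return 1 - P_WELL ** n

for n in (50, 100):
    p = p_at_least_one_sick(n)
    print(f"{n} acquaintances: P(at least one sick) = {p:.3f}, "
          f"odds about {p / (1 - p):.0f} to 1")
```

For 50 acquaintances this gives roughly 95%, and for 100 acquaintances about 99.7%, matching the note's figures (the "20 to 1" and "over 300 to 1" odds are the author's round numbers).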

People are forever underestimating the power of probability, as when they are surprised that two people they know might be sick in the same week (and if so, "my, there is a lot of that going around lately"). But people also make just the opposite mistake: people tend to be very bad at realizing that infrequent events do happen.

Psychologists are fond of showing how badly people estimate the likelihood of rare events. Most people believe airplane travel is far more dangerous than automobile travel even though statistics show the opposite. In fact, both are very safe, and the likelihood that you will be injured on any particular ride by either means is very low.

We see a lot of aircraft accidents reported on television and in the newspapers, but that is precisely because they are so rare. Automobile accidents occur all the time. Most people have been in an accident, or witnessed one. Roughly 50,000 people die each year in the United States from automobile accidents and hundreds of thousands are injured. There are hundreds of accidents every day. As a result, an automobile accident is commonplace, so it isn't likely to be reported in the newspapers.

Aviation, on the other hand, is very safe. This means that when one happens, it is important news, so the newspapers and television make a fuss. In 1990 there were only 25 accidents with fatal injuries for scheduled, commercial aviation in the entire world. Twenty-five for the entire world. But twenty-five per year is about once every two weeks. Infrequent enough that they make the news, frequent enough that there always seems to be one. Of course, airplane, train, bus, and ship accidents tend to be more costly in terms of lives, money, and ecological impact than the much smaller accidents involving automobiles. Even so, the number of deaths through automobile accidents each month far exceeds the number of deaths from these other forms of better publicized accidents.

Maybe our perception of airplane crashes is something like our perception of colds: all you need is for someone to say "there certainly seems to be a lot of accidents lately" and if you think about it, why sure enough, you can remember another one. So, "yeah," you say aloud, "oh yeah, there certainly are."

Colds and airplane crashes are relatively rare events that most people believe to be more frequent than they really are. Some events have just the opposite problem: they are thought to be less likely than they are.

Years ago I read an article on the safety of elevators. "Why," said the reporter, "modern elevators are so safe that the chance of being stuck in one is less than one chance in ten thousand."

Yikes. One chance in ten thousand? At the time I was working on the twelfth floor of William James Hall, the home of the psychology department at Harvard University. I took the elevator about ten times a day. Up to my office when I arrived, down again when I left. Down to the library, up to colleagues on a higher floor. Down for lunch. Up back to my office. Down to go teach a class. Back up afterwards. Five round trips a day was not unusual. Even if I had only worked on weekdays that would be about 250 days a year, 2,500 elevator trips a year.

If that reporter was correct, I would be stuck in elevators at least once every five years. The interesting thing to me was that the reporter thought the number to be so small that it indicated the great safety of elevators, whereas I, doing some calculation, thought it indicated a danger. What the reporter failed to take into account was that even the most unlikely events will happen occasionally if the number of opportunities is high enough.
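The same back-of-the-envelope reckoning can be reproduced directly. A small sketch (Python; the one-in-ten-thousand per-ride figure and the ten rides a day over 250 working days are taken from the text):

```python
# Reporter's figure: chance of getting stuck on any single elevator ride.
P_STUCK = 1 / 10_000

# Five round trips a day = 10 rides, 250 working days a year.
trips_per_year = 10 * 250  # 2,500 rides a year

# Expected number of stickings per year.
expected_per_year = trips_per_year * P_STUCK
print(f"Expected stickings per year: {expected_per_year}")

# Chance of being stuck at least once over five years of such commuting.
p_five_years = 1 - (1 - P_STUCK) ** (5 * trips_per_year)
print(f"P(stuck at least once in 5 years) = {p_five_years:.2f}")
```

The expected rate comes out to a quarter of a sticking per year, and the chance of being stuck at least once in five years is about 70%: the reporter's "safe" number, multiplied by enough opportunities, becomes the likely outcome.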

Nobody is immune to the difficulty of estimating the likelihoods of rare events. Even pilots of aircraft make the mistake about their own airplanes. Pilots are apt to be somewhat like the reporter, believing that if something has a very small chance of happening, it is impossible.

Perhaps my favorite example comes from an incident where an unlikely sequence of events did happen, but with a happy ending: the plane landed safely. The full story is complex, but worth telling. To me, a student of human behavior and aviation safety, it is a wonderful tale of how accidents seldom have a single cause: there is almost always an unlikely sequence of events.

The airplane was a Lockheed L1011, a commercial airliner that looks something like a DC-10 or MD-11: it is about the same size and has three engines, one on each wing and one in the rear of the plane on the tail structure. One morning, as it flew its scheduled route from Miami, Florida, to Nassau in the Bahamas, the instruments reported that one engine was low on oil pressure. The plane was then over the Atlantic Ocean just east of Miami. The pilot dutifully turned off the engine and turned the plane around to return to Miami. As the plane flew along on the remaining two engines, they, too, began to lose oil pressure. The thought of turning off all three engines while over the ocean understandably upset the flight crew.

Chapter Note: The aviation incident is described in detail in the National Transportation Safety Board's report of the incident (NTSB, 1984).

An excellent review of the scientific literature on these topics is provided by Gilovich: How we know what isn't so: The fallibility of human reason in everyday life (1991).

The L1011 has three engines of two very different types. The two engines on the wings are different from the one in the tail section. Moreover, they usually are serviced by different people (which was true for this airplane as well). There is no obvious malfunction that could cause all three engines to lose oil pressure at the same time. In fact, the most obvious cause would be that the oil pressure is fine, but that the electrical system that powers the indicators is faulty.

The pilot therefore said, "It's one chance in a million," and continued to fly with the other two engines.

One chance in a million. How many commercial, scheduled flights are there each year in the United States alone? Around eight million. A one-chance-in-a-million event will happen around eight times a year. In fact, in the commercial safety business, one chance in a million is considered an unsafe value.
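The point is pure multiplication: a tiny per-flight probability times a huge number of flights. A sketch (Python; the eight-million-departures figure comes from the text), with a Poisson estimate of the chance that a whole year passes without any such event:

```python
import math

P_EVENT = 1 / 1_000_000   # "one chance in a million" per flight
DEPARTURES = 8_000_000    # scheduled U.S. departures per year (from the text)

# Expected number of one-in-a-million events per year.
expected = P_EVENT * DEPARTURES
print(f"Expected one-in-a-million events per year: {expected:.0f}")

# Poisson approximation: chance that a year passes with no such event at all.
p_none = math.exp(-expected)
print(f"P(no such event all year) = {p_none:.5f}")
```

With eight expected occurrences a year, an event-free year is itself the rarity, with a chance of well under one in a thousand.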

About the incident: The pilot at first decided the problem probably was a faulty electrical system, but that was quickly ruled out by the tests the flight crew performed. So the pilot had no choice but to keep going, hoping to make it back to land. The remaining two engines failed while they were still over water, and the crew prepared for a water landing. However, just in time, they managed to restart the first engine, which they had turned off before it was damaged. It lasted only long enough to land them on the runway, and then failed, leaving them stranded, but safe.

The pilot's actions were prudent. Although he could not believe that he had truly lost oil pressure on all three engines, he couldn't have done anything differently even had he known. The underlying causes of the incident make a classic, textbook case for those interested in the study of human error and how multiple actions and multiple failures lead to unexpected accidents.

Every evening the plane was checked over by mechanics. In the check, they would remove a magnetic plug from the engine and look to see how many metal particles were sticking to the magnet. This is a good way to check on the amount of wear, indicating when the oil should be changed or the engine overhauled. After the plug is removed, there is a hole that allows oil to leak out, so of course it has to be filled with a new plug, and the plug itself sealed with a rubber "O-ring."

Well, the plane was checked, serviced, and returned to duty. Unfortunately, they didn't put O-rings on the plugs, so they leaked. All three of them. This despite the fact that two different mechanics serviced the three engines, despite the fact that after servicing, they turned on each engine to check for leaks. Why weren't the O-rings put in place? The mechanics had even signed a check-list that they had installed the O-rings.

The problem was, for the many years they had been working at the job, they had never ever installed O-rings. Their supervisor always did it for them. The supervisor tried to be helpful. For years, he would get the plugs and the O-rings from the warehouse, put the O-rings on the plugs, and then store the assembled parts in his desk drawer. This made it easier for the mechanics to do their job, so easy in fact, that these mechanics had never ever had to get the parts themselves from the warehouse or install the rings on the plugs.

The supervisor was doing a useful favor, one that did help the mechanics. Unfortunately, this one night the drawer was empty. The mechanics went to the warehouse to get new plugs, but neglected the O-rings. But why did they sign the checklist? They always did, yet they had never before put on the O-rings.

Who is at fault here? The mechanics? Maybe, for after all, they signed the sheet even though they didn't really do the task. The supervisor? Maybe, for after all, he wasn't supposed to have been so helpful. The warehouse? Maybe, because they had no business handing out the plugs without the O-rings. The system? Yes. A safe system would have made sure that plugs were always packaged with O-rings. It would have permitted the supervisor to help, but the supervisor should then have been the one to sign the checklist, so that the absence of the supervisor's signature would have alerted the mechanics that something was wrong. The mechanics should not have allowed themselves to be helped so much: they should have always checked the plugs and rings for themselves.

Was the system at fault for not developing a checking procedure? Yes, the mechanics started the engines to look for leaks, but not for long enough. It took a long time for the leak to develop: the people who developed the checking procedure should have actually tried it to make sure it would catch errors of this sort, errors that would have been visible at night, in the dark (the mechanics had to use searchlights to see what they were doing).

What about the pilots who couldn't believe that all three engines had failed? Were they at fault? Maybe, although they didn't have too much choice in the matter: whether they believed or not, they still took the appropriate actions.

The real point is that human error strikes in unlikely ways, in unlikely cases. The rule in major accidents is that they always occur because of an extremely unlikely combination of extremely unlikely events. However, these unlikely events do happen: and therefore so do accidents.

One chance in a million seems awfully unlikely, so why should we worry about such a rare event? Well, because there are several billion people on earth. In a country of 250 million people and a world of several billion people, infrequent events happen several times a day.

In the United States alone there are about eight million aircraft departures every year. Departures: That word alone should raise the eyebrows. Airplane flights are measured in departures, not complete flights and not in arrivals because, alas, there are fewer arrivals than departures. In fact, if the chance of a particular accident is one in a million, there will be roughly eight of them a year. One in a million is a very rare event. Eight accidents a year are eight too many.


FOOTNOTE

1. Copyright © 1992 by Donald A. Norman. All rights reserved. Originally published by Addison-Wesley. Now out of print.