
Coffee Cups in the Cockpit

CHAPTER 16 OF TURN SIGNALS ARE THE FACIAL EXPRESSIONS OF AUTOMOBILES 1

Chapter Note: Please do not think that, because most of my examples are from aviation, air travel is unsafe or that the same things don't happen elsewhere. Aviation is a very safe activity precisely because of these careful investigations of each accident. They are done with great care and thoroughness, and the results are taken very seriously by the aviation community. The voluntary aviation safety reporting system is a major source of safety lessons. It is essential that this service be maintained, and I recommend it to other industries.

Other industrial areas don't receive the same care and analysis as aviation does, so I can't turn to them as readily for examples. But the problems are there, I assure you, and they are often worse. Just look at all those railroad accidents, industrial manufacturing plant explosions, toxic chemical leaks, nuclear power disasters, ship leaks and collisions: the same problems recur all over the place. In fact, aviation is extremely safe compared to some of these other areas, whose problems are fundamental to the industry: I refer you to Perrow's study Normal Accidents (1984). An excellent review of aviation issues is in Human Factors in Aviation (Wiener & Nagel, 1988).

In 1979, a commuter aircraft crashed while landing at an airport on Cape Cod, Massachusetts (USA) (NTSB report, 1980). The captain (the pilot) died and the first officer (the co-pilot) and six passengers were seriously injured. As the plane was landing, the first officer noted that they seemed to be too low, and he told the captain. However, the captain did not respond. The captain, who was also president of the airline and who had just hired the first officer, hardly ever responded: he was the strong, silent type. He was in charge, and that was that. United States airline regulations require pilots to respond to one another, but what was the co-pilot to do? He was new to the company and the captain was his boss. Moreover, the captain often flew low. There were obvious social pressures upon the first officer.

What the first officer failed to notice was that the captain was "incapacitated." That's technical jargon. What it means is that the captain was unconscious and probably dead from a heart attack. After their investigation of the resulting accident, the U.S. National Transportation Safety Board (NTSB) rather dryly described the incident this way:

"The first officer testified that he made all the required callouts except the ‘no contact’ call and that the captain did not acknowledge any of his calls. Because the captain rarely acknowledged calls, even calls such as one dot low (about 50 ft below the 3° glide slope) this lack of response probably would not have alerted the first officer to any physiologic incapacitation of the captain. However, the first officer should have been concerned by the aircraft's steep glidepath, the excessive descent rate, and the high airspeed."

Seems strange, doesn't it: there they are, flying along, and the captain dies. You'd think the co-pilot would notice. Nope, it isn't as obvious as you might think. After all, reconsider. During landing, the two pilots are sitting side by side in a noisy airplane with lots of things to do. They hardly ever look at each other. Assuming the captain dies a quiet, polite death, what is there to attract the co-pilot's attention? Nothing. In fact, United Airlines had tried it out in their simulators: they told the captain to make believe he had died, but to do it very quietly. Then they watched to see how long it took for anyone else to notice. The NTSB reviewed that study in their typical dry fashion:

"In the United simulator study, when the captain feigned subtle incapacitation while flying the aircraft during an approach, 25 percent of the aircraft hit the ‘ground.’ The study also showed a significant reluctance of the first officer to take control of the aircraft. It required between 30 sec and 4 min for the other crewmember to recognize that the captain was incapacitated and to correct the situation."

So there: the pilot dies and it takes some people as long as four minutes to notice. Even the quick ones took 30 seconds. And a quarter of them crashed, or as the NTSB put it, "hit the 'ground'" ("ground" is in quotes because, fortunately, this was a simulator: embarrassment yes, but no injury).

Commercial aviation is a strange and wondrous place where the perceived image is somewhat in conflict with the reality. The image is of a heroic, skilled adventurer, successfully navigating a crippled aircraft through storms, fires, and unexpected assaults. Images of Lindbergh and Earhart flying alone over the ocean, or World War I fighter pilots in open cockpits with helmet, goggles, and scarf, still come to mind.

The reality is that the commercial aviation pilot of today is a manager and supervisor, not the daredevil pilot of yore. Today's flight crew must be well schooled in the rules, regulations, and procedures of modern aviation. They are not permitted to deviate from assigned boundaries, and on the whole, if they do their job properly, they will lead a routine and uneventful life. The flight crew is in charge of a large, expensive vehicle carrying hundreds of passengers. The modern flight deck is heavily automated, and multiple color computer screens show maps, instrument readings, and even checklists of the tasks they are to do. The flight crew must act as a team, coordinating their actions with each other, with the air traffic control system, and in accordance with company and federal policies. Pilots spend much of their time studying the vast array of regulations and procedures and being tested and observed in the classroom, in the simulator, and in actual flight. Economics and reliability dominate.

The old-fashioned image of the cockpit crew is more like a military hierarchy: the captain in charge, the first and second officers serving subsidiary roles. As a result of numerous studies of aircraft crews performed by scientists from the National Aeronautics and Space Administration (NASA) and universities, we have learned a lot about the need for cooperative work and interaction. The lessons actually apply to almost any work situation. It is unwise to rely on a single authoritative figure. Individual people can become overloaded and fail to notice critical events. People also have a tendency to focus on a single explanation for events, thereby overlooking other possibilities. It really helps to have several people looking over things, especially if they feel that their contributions are appreciated and encouraged. In fact, it isn't a bad idea to consider alternative courses of action. This has to be done with some sensitivity. The goal is not to disobey the person in charge, but rather to explore possible alternatives and the implications of the actions being taken.

Once upon a time it used to be assumed that when an airplane got into trouble, it was the captain's responsibility to fix things. No longer. Another dramatic example of why this philosophy fails comes from an accident over the Everglades, a swampy, jungle-like region in southern Florida. This particular accident has become famous in the eyes of students of aviation safety and human error. As the plane was coming in for a landing in Miami, Florida, the crew lowered the lever that lowers the landing gear. However, the lights that indicate a fully lowered and locked gear did not come on. So the crew received permission to circle for a while over the Everglades while they figured out what the problem was.

Chapter Note: NTSB report (1973). It turned out that the landing gear had been lowered properly, but the light bulb that would have informed the cockpit crew was burned out.

Now imagine a crowded cockpit with everyone examining the lights, reading through the "abnormal procedures" manual, trying to analyze the situation. Maybe the gear was down and the lights weren't working? How can you tell? Simple, look at the gear. Alas, you can't normally see the landing gear from inside the plane, and anyway, it was night. But airplane manufacturers have thought of a solution: you can remove a panel from the floor and lower a little periscope, complete with light, so you can look. Now imagine the entire crew trying to peek. Someone is on hands and knees taking off the plate. Someone is trying to lower the periscope. Someone is reading from the manual. Oops, the light in the periscope broke. How to fix that? Everyone was so busy solving the various problems that arose that nobody was flying the airplane.

Nobody flying the airplane? That's not quite as bad as it sounds. It was being flown by the automatic flight controls, automatically flying in a circle, keeping constant airspeed and altitude. Except that someone, probably the captain, must have bumped against the control wheel (which looks something like an automobile's steering wheel) while leaning over to watch the other people in the cockpit. Moving the control wheel disconnected the altitude control, the part of the automatic pilot that keeps the airplane at a constant height above the ground. When the altitude control disconnected, it made a beep to notify the crew. You can hear the beep on the tape recording that was automatically made of all sounds in the cockpit. The recorder heard the beep. The accident investigators who listened to the tape heard the beep. Evidently, nobody in the cockpit did.

"Controlled flight into terrain." That's the official jargon for what happens when a plane is flying along with everyone thinking things are perfectly normal, and then "boom," suddenly they have crashed into the ground. No obvious mechanical problem. No diving through the skies, just one second flying along peacefully, the next second dead. That's what happened to the flight over the Everglades: controlled flight into terrain. As nobody watched, the plane slowly, relentlessly, got lower and lower until it flew into the ground. There was a time when this was one of the most common causes of accidents. Today, I am pleased to report, such cases are rare.

Today pilots are taught that when there is trouble, the first thing to do is to fly the plane. I have watched pilots in the simulator distribute the workload effectively. One example I observed provides an excellent demonstration of how you are supposed to do it. In this case, there was a three-person crew flying NASA's Boeing 727 simulator. An electrical generator failed (one of many problems that were about to happen to that flight). The flight engineer (the second officer) noticed the problem and told the captain, who at that moment was flying the airplane. The captain turned to the first officer, explained the situation, reviewed the flight plan, and then turned over the flying task. Then, and only then, did the captain turn to face the second officer to review the problem and take appropriate action. Newer aircraft no longer have a flight engineer, so there are only two people in the cockpit, but the philosophy of how trouble should be handled is the same: one person is assigned primary responsibility for flying the aircraft, the other primary responsibility for fixing the problem. It is a cooperative effort, but the priorities are set so that someone -- usually the captain -- is making sure the most important things are being taken care of.

Pilots now are trained to think of themselves as a team of equals. It is everyone's job to review and question what is going on. And it is the captain's job to make sure that everyone takes this seriously. Proper crew resource management probably would have saved the crew and passengers in both the Everglades crash and that 1979 crash on Cape Cod. And it has been credited with several successes in more recent years.

Chapter Note: "Pilots now are trained to think of themselves as a team of equals": See Foushee and Helmreich (1988), Group interaction and flight crew performance.

The Strong Silent Type Recurs in Automation

Chapter Note: "The Strong Silent Type Recurs in Automation": Some of the social issues of that affect the manner by which automation interacts with workers and changes the nature of jobs is discussed in Zuboff’s In the age of the smart machine: The future of work and power (1988). Her distinction between "automating" and "informating" is particularly relevant.

Alas, the lessons about crew resource management have not been fully learned. There is a new breed of strong, silent types now flying our airplanes. For that matter, they are taking over in other industries as well. They are at work in chemical plants, ships, nuclear power plants, factories, even in the family automobile. Strong silent types that take over control and then never tell you what is happening until, sometimes, it is too late. In this case, however, I am not referring to people, I am referring to machines.

You would think that the 1979 crash and a number of similar ones would have taught designers a lesson. Nope. Those crashes were blamed on "human error." After all, it was the pilots who were responsible for the crashes, right? The lesson was not attended to by the folks who designed the mechanical and electronic equipment. They thought it had nothing to do with them.

The real lesson of crew resource management is that it is very important for a group of people doing a task together to communicate effectively and to see themselves as a team. As soon as any one of them feels superior to the others and takes over control, especially if that one doesn't bother to talk to the others and explain what is happening, then there is apt to be trouble when unexpected events arise.

The Case of the Loss of Engine Power

Chapter Note: Incident reported in NTSB report, 1986. Also Wiener (1988).

In 1985, a China Airlines 747 suffered a slow loss of power from its outer right engine. When an engine on a wing produces less power than the others, the plane starts to turn, in this case to the right (technically, this kind of turn is called "yawing"). But the plane -- like most commercial flights -- was being controlled by its automatic equipment, in this case the autopilot, which efficiently compensated for the turn. The autopilot had no way of knowing that there was a problem with the engine, but it did note the tendency to turn to the right, so it simply kept the plane pointed straight ahead. Negative feedback (remember Chapter 15?). Eventually, however, the autopilot reached the limit of how much it could control the turn, so it could no longer keep the plane stable. So what did it do? Basically, it gave up.

Imagine the flight crew. Here they were, flying along quietly and peacefully. They noticed a problem with that right engine, but while they were taking the first preliminary steps to identify the problem and cope with it, suddenly the autopilot gave up on them. They didn't have enough time to determine the cause of the problem and to take action: the plane rolled and went into a vertical dive of 31,500 feet before it could be recovered. That's quite a dive: almost six miles, nearly ten kilometers. And in a 747! The pilots managed to save the plane, but it was severely damaged. The recovery was much in doubt.

In my opinion, the blame is on the automation: why was it so quiet? Why couldn't the autopilot indicate that something was wrong? It wouldn't have had to know what, but just say, "Hey folks, this plane keeps wanting to yaw right more than normal." A nice informal, casual comment that something seems to be happening, but it may not be serious. Then, as the problem persisted, why didn't the autopilot say that it was reaching the end of its limit, thus giving the crew time to prepare? This time perhaps in a more formal manner because, after all, the problem is getting more serious: "Bing-bong," "Captain, sir, I am nearing the limit of my control authority. I will soon be unable to compensate anymore for the increasing tendency to yaw to the right."
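To make the idea concrete, here is a minimal sketch, in Python, of what such graduated announcements might look like. This is not any real autopilot's logic: the authority limit and the two thresholds are invented for illustration. The point is simply that a control loop already knows how much of its authority it is consuming, so announcing it costs almost nothing.

    # A minimal sketch (not any real autopilot's logic): a control loop that
    # announces how much of its authority it is using, instead of silently
    # compensating until it saturates and gives up. All numbers are invented.

    MAX_AUTHORITY = 1.0   # full deflection the autopilot may command (illustrative)
    ADVISORY_AT = 0.5     # casual "something seems off" threshold (illustrative)
    WARNING_AT = 0.9      # formal "nearly out of authority" threshold (illustrative)

    def compensate(required_correction: float) -> float:
        """Apply as much opposing control as allowed, and report how much
        of the available authority that correction is consuming."""
        commanded = max(-MAX_AUTHORITY, min(MAX_AUTHORITY, required_correction))
        used = abs(commanded) / MAX_AUTHORITY
        if used >= WARNING_AT:
            print(f"WARNING: {used:.0%} of control authority used; "
                  "I will soon be unable to compensate.")
        elif used >= ADVISORY_AT:
            print(f"Advisory: compensating more than normal ({used:.0%}).")
        return commanded

    # A slowly failing engine looks like a steadily growing yaw to correct:
    for required in (0.2, 0.4, 0.6, 0.8, 1.0, 1.2):
        compensate(required)

Run against a steadily growing disturbance, a loop like this produces the casual advisory first and the formal warning well before it saturates -- which is exactly the warning time the crew in this incident never got.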

The Case of the Fuel Leak

Let me tell you another story, again one in which trouble was developing yet the automatic equipment said nothing. This story comes from a report filed with the NASA Aviation Safety Reporting System. These are voluntary reports, filed by people in the aviation community whenever an incident occurs that is potentially a safety problem but that doesn't actually turn into an accident (otherwise there would never have been any record of the event). These are wonderful reports, for they allow safety researchers to get at the precursors to accidents, thereby correcting problems before they occur. In fact, let me tell you about them for a minute.

One accident researcher, James Reason of the University of Manchester, calls these reports and other similar signs of possible problems the early signs of "resident pathogens." The term comes by analogy from medicine. That is, there is some disease-causing agent in the body, a pathogen, and if you can discover it and kill it before it causes the disease, you are far ahead in your aim of keeping the patient healthy. "All man-made systems," says Reason, "contain potentially destructive agencies, like the pathogens within the human body." They are mostly tolerated, says Reason, kept in check by various defense mechanisms. But every so often external circumstances arise to combine with the pathogens "in subtle and often unlikely ways to thwart the system's defenses and to bring about its catastrophic breakdown" (p. 197).

Chapter Note: The term "resident pathogens" comes from the book Human Error, by James Reason (1990). A symposium on these issues for "High Risk Industries" was presented in 1990 at the Royal Society, London (I presented a very early version of this essay: see Broadbent, Baddeley & Reason, Human Factors in Hazardous Situations, 1990).

This is where those voluntary aviation safety reports come in. They give hints about those pathogens before they turn into full-fledged accidents. If we can discover the "resident pathogens" before they cause trouble, we are far ahead in our business of keeping the world a safe place. You would be surprised how difficult this is, however. Most industries would never allow voluntary reports of difficulties or errors by their employees. Why, it might look bad. As for fixing the pathogens that we bring to their attention? "What are you talking about?" I can hear them say. "We've been doing things this way for years. Never caused a problem." "Damn," I can imagine them mumbling as we leave, "those damn university scientists. That's a million to one chance of a problem." Right, I reply: a million to one isn't good enough.

The aircraft in the incident we are about to discuss is a three-engined commercial plane with a crew of three: captain, first officer (who was the person actually flying at the time the incident was discovered), and second officer (the flight engineer). The airplane has three fuel tanks: tank number 1 in the left wing, tank 2 in the aircraft body, and tank 3 in the right wing. It is very important that fuel be used equally from tanks 1 and 3 so that the aircraft stays in balance. Here is the excerpt from the aviation safety report:

"Shortly after level off at 35,000 ft. the second officer brought to my attention that he was feeding fuel to all 3 engines from the number 2 tank, but was showing a drop in the number 3 tank. I sent the second officer to the cabin to check that side from the window. While he was gone, I noticed that the wheel was cocked to the right and told the first officer who was flying the plane to take the autopilot off and check. When the autopilot was disengaged, the aircraft showed a roll tendency confirming that we actually had an out of balance condition. The second officer returned and said we were losing a large amount of fuel with a swirl pattern of fuel running about mid-wing to the tip, as well as a vapor pattern covering the entire portion of the wing from mid-wing to the fuselage. At this point we were about 2000 lbs. out of balance."

Chapter Note: "Shortly after level off at 35,000 ft.": The voluntary reporting incident was "Data Report 64441, dated Feb, 1987." (The records are anonymous, so except for the date and the fact that this was a "large" commercial aircraft, no other information is available. Properly, in my opinion, for the anonymity is essential to maintain the cooperation of the aviation community.)

In this example, the second officer provided the valuable feedback that something seemed wrong with the fuel balance, but not until they were quite far out of balance. The automatic pilot had quietly and efficiently compensated for the resulting weight imbalance, and had the second officer not noted the fuel discrepancy, the situation would not have been discovered until much later, perhaps too late.

The problem was very serious, by the way. The plane became very difficult to control. The captain reported that "the aircraft was flying so badly at this time that I actually felt that we might lose it." They had to dump fuel overboard from tank 1 to keep the plane balanced and the airplane made an emergency landing at the closest airport. As the captain said at the end of his long, detailed report, "We were very fortunate that all three fuel gauges were functioning and this unbalance condition was caught early. It is likely that extreme out-of-balance would result in loss of the aircraft. On extended over-water flight, you would most certainly end up with … a possible ditching."

Why didn't the autopilot signal the crew that it was compensating more than usual? Technically, this information was available to the crew, because the autopilot flies the airplane by physically moving the real instruments and controls (in this situation, by rotating the control wheel to keep the plane level). In theory, you could see this. In practice, it's not so easy. I know because I tried it.

We replicated the flight in the NASA 727 simulator. I was sitting in the observer's "jump seat," located behind the pilots. The second officer was the NASA scientist who had set up the experiment, so he deliberately did not notify the pilots when he saw the fuel leak develop. I watched the autopilot compensate by turning the wheel more and more to the right. Neither pilot noticed. This was particularly interesting because the second officer had clipped a flight chart to the wheel, so as the wheel tipped to the right, he had to turn his head so that he could read the chart. But the cockpit was its usual noisy, vibrating self. The flight was simulating turbulence ("light chop" -- as described elsewhere in the incident report) so the control wheel kept turning small amounts to the left and right to compensate for the chop. The slow drift of the wheel to the right was visible to me, the observer, but I was looking for it and had nothing else to do. It was too subtle for the pilots who were busy doing the normal flight activities.

Chapter Note: "We replicated the flight in the NASA 727 simulator": The simulated flights that I refer to were done at the Boeing 727 simulator facilities at NASA-Ames in studies conducted by Everett Palmer of NASA, who is also the person who is the project manager for our research grant.

Why didn't the autopilot act like the second officer? Look, suppose that there wasn't any autopilot. The first officer would be flying, controlling the wheel by hand. The first officer would note that something seemed wrong and would probably mention it to the captain. The first officer wouldn't really know what was happening, so it would be a casual comment, perhaps something like this: "Hey, this is peculiar. We seem to be banking more and more to the left, and I have to keep turning the wheel to the right to compensate." The captain would probably take a look around to see if there were some obvious source of trouble. Maybe it is nothing, just the changing winds. But maybe it is a weight imbalance, maybe a leaky fuel tank. The second officer would also have heard the comment and so would certainly have been alerted to scan all the instruments to make sure they were all normal. The weight problem would have been caught much earlier.
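The machine equivalent of that casual comment is easy to imagine. Here is a small sketch, again purely illustrative with made-up numbers, of how a sustained wheel deflection could be separated from turbulence jitter with a simple low-pass filter, triggering just the kind of remark the first officer would have volunteered:

    # Purely illustrative sketch with invented numbers. The turbulence
    # "chop" moves the wheel rapidly left and right; a fuel imbalance
    # shows up as a slow, sustained offset. A low-pass filter (here an
    # exponential moving average) separates the two.

    import random

    ALPHA = 0.02       # smoothing factor: small value = long time constant
    TRIM_BAND = 0.10   # sustained deflection considered normal (illustrative)

    average = 0.0      # low-pass-filtered wheel position
    drift = 0.0        # slowly growing offset from the simulated fuel leak
    alerted = False

    for second in range(600):                # ten simulated minutes
        drift += 0.0005                      # the leak slowly unbalances the plane
        chop = random.uniform(-0.3, 0.3)     # turbulence jitter masks the drift
        wheel = drift + chop                 # the deflection an observer sees

        average = ALPHA * wheel + (1 - ALPHA) * average
        if not alerted and abs(average) > TRIM_BAND:
            print(f"t={second}s: holding {average:+.2f} wheel on average; "
                  "this is more than normal -- please check the fuel balance.")
            alerted = True

The jitter averages out; the slow drift does not. The filtered value crosses the normal trim band minutes before any eye could pick the drift out of the chop -- the same observation I could make in the simulator only because I was watching for it and had nothing else to do.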

Communication makes a big difference: you shouldn't have to wait for the full emergency to occur. This is just as true when a machine is working with a person as when two people are working together. When people perform actions, feedback is essential for the appropriate monitoring of those actions, to allow for the detection and correction of errors, and to keep alert. This is hardly a novel point. All automatic controlling equipment has lots of internal feedback to itself. And we all know how important feedback is when we talk to one another. But adequate feedback between machines and people is absent far more often than it is present, whether the system is a computer operating system, an autopilot, or a telephone system. In fact, it is rather amazing that such an essential source of information could be omitted. Without appropriate feedback, people may not know if their requests have been received, if the actions are being performed properly, or if problems are occurring. Feedback is also essential for learning, both of the task itself and of the way the system responds to the wide variety of situations it will encounter. The technical jargon for what happens when people are not given proper feedback is that they are "out of the loop."

If the automatic equipment were people, I would say that it was way past time they learned some social skills. I would advise sending them to class to learn those skills, to learn crew resource management. Obviously it wouldn't do any good to send the equipment, but we certainly ought to send the designers.

FOOTNOTE

1. Copyright © 1992 by Donald A. Norman. All rights reserved. Originally published by Addison-Wesley. Now out of print.

 
