The Human Side of Decision-Making

by Bill Belanger
Reprinted with permission of FAA Aviation News

Mechanical failures account for only a small percentage of aviation accidents. The great majority of accidents can be traced to the pilot as the cause. Some of these are classified as "loss of control," others as "continued VFR flight into IMC," and still others as "buzzing the neighbor's barn." But if you look carefully, virtually every pilot-caused accident can be traced back to a bad decision at some point. For example, a pilot who loses control landing in a crosswind made a decision to attempt that landing. A pilot who flies into IMC made a decision to fly that day, and another decision not to turn back (or failed to make a decision to turn back) when the weather started to go bad. Even the pilot who can't handle a situation for lack of basic flying skills made a decision not to get additional training at some point.

Much of our flying safety depends on the quality of the decisions we make. In this article, we'll examine decisions from a very broad perspective and look at the human side of decision-making. In particular, we'll examine the human characteristics, built into all of us, that lead to bad decisions. In the process, we'll also examine the officially documented human causes of two major non-aviation accidents and see how the lessons learned from them can be applied to aviation safety.

There are many human traits that interfere with decision-making. Any pilot who has done any reading on the subject is familiar with "Macho," "Anti-authority," "Invulnerable," "Resigned," "Impulsive," etc. If you look carefully, you'll see that some of these traits, "resigned" and "impulsive," for example, are really opposite extremes of the same mental attitude. Carrying this a little farther, if we look at all the extremes, we get the chart below. This is really just a reorganization of the "Impediments to Decision Making" with a couple more extremes thrown in.

Note that the extremes of each trait are shown in red with green in the middle. As usual, red is used to symbolize danger. Here's an example of how this chart works. It's fairly obvious how an anti-authority attitude can get you into trouble. But what about unquestioning? Well, we've all heard about the problems that arise when the captain makes a mistake and the second in command is too timid to speak up. The resigned pilot may not act to handle an emergency, but the impulsive pilot may cause one. The pilot with an invulnerable attitude may get into trouble by defying common sense, but the fearful pilot may be too timid to do what's necessary to stay out of trouble. The macho pilot may see flying dangerously as a test of his (her?) masculinity, but the delicate pilot may not apply sufficient control input to compensate for an upset. The point is that it is the extremes that are bad and the middle is good.

Random Risks

The more often we do something dangerous and get away with it, the less we perceive the danger. Eventually, our mind fools us into thinking there's no danger. A risky activity done many times begins to feel safe. This is a human trait that is in all of us.

Now this perception of safety may be real if there is a real increase in skill with repetition. A novice skier would be a fool to take on the expert slope (like I did once, and only once). But as skill is gained, the expert slope can be safely mastered (more or less). In the case where there is a real increase in skill, the perception of increased safety is real.

Now let's look at a situation where the risk is purely random and skill improvement through practice is not a factor. If you'll forgive a bit of math, we can look at the probabilities that determine the likely outcome of this kind of situation. Consider the roll of a standard (not loaded) six-sided die. There is an equal probability of the die landing with any of its six sides up. For this exercise, we'll invent a little game. The object will be not to roll a "six." If you roll a six, you lose. If you don't roll a six by the end of the game, you win. After any roll, if you don't roll a six, you get to roll again, or you can quit the game. For each roll you have a one in six chance of losing. Thus you have a five in six chance of winning on any given roll. Pretty good odds?

But let's see what happens when we roll the die many times. Here, we lose if any roll comes up a "six," so we have to calculate the probability of not losing on many successive rolls. The probability of not losing is five sixths or a little over 83 percent on any given roll. Each time we roll the die, we must multiply the probability of not losing by five sixths. Thus for two rolls, the probability of not losing is about 69 percent (five sixths times five sixths).

For 10 rolls, the probability of not losing is only about 16 percent (five sixths multiplied by itself ten times). For 20 rolls, the probability of not losing on any of the rolls is only about two and a half percent. In other words, the probability of rolling a six somewhere in 20 rolls is over 97 percent. Even though the probability of losing on any given roll of the die is not bad (one in six), when you put a lot of rolls together, it's almost certain you will lose! This is how the casinos stay in business.
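
If you'd like to check these numbers yourself, here is a minimal sketch in Python (my own illustration, not part of the original article) that computes the chance of avoiding a six over a given number of rolls, which is simply five sixths raised to the number of rolls:

    # Chance of never rolling a six in n rolls of a fair die: (5/6) ** n
    def chance_of_no_six(rolls: int) -> float:
        return (5 / 6) ** rolls

    for rolls in (1, 2, 10, 20):
        survive = chance_of_no_six(rolls)
        print(f"{rolls:2d} rolls: {survive:6.1%} chance of no six, "
              f"{1 - survive:6.1%} chance of losing")

Running it reproduces the figures above: roughly 83 percent for one roll, 69 percent for two, about 16 percent for ten, and under 3 percent for twenty, at which point losing is nearly certain.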

Let's translate this to a flying situation. Suppose a pilot decides it's not really necessary to pre-flight the airplane, check the weather, or perform some other task where random chance might come into play. Every time he or she makes a successful flight, skipping it seems safer and safer. This is the natural human reaction to this kind of situation. Our mental programming tells us that something we do over and over without incident must be a safe thing to do. We even have a common phrase for the situation: the pilot is being "lulled into a false sense of security."

And while each "success" adds to the subjective feeling of security, each repetition actually adds to the cumulative probability of, shall we call it, a "non-success." Our perception of the safety of the situation moves in exactly the opposite direction from the true risk. In a situation where the risk is random, our natural instincts work against us: when we do something risky over and over, we feel safer with each repetition, even though the chance of continued success keeps shrinking. This is a mental trap that is built into each of us. How many times have you heard someone say, "I've done that (pick a risky activity) over and over and it hasn't hurt me yet"? The only way to avoid this trap is to understand it. It takes some clear thinking not to let random chance bait us into that false sense of security.

For a really good example of how this trap can affect even the best of us, read the report on the Space Shuttle Challenger disaster. Pay particular attention to the rubber "O" rings. They were supposed to form a gas-tight seal with no leakage, but after a number of shuttle launches it became obvious that hot gases occasionally leaked between the sections of the solid rocket booster. Still, the launches went off without a problem.

Soon it became a matter of routine that there would be some leakage. There was greater and greater confidence that the leakage was not a problem as the number of successful launches increased. But this was just a subjective impression based on previous successes. In addition, the "O" rings were not designed to fly in sub-freezing temperatures. On that tragic day, the launch temperature was below freezing, but the classic "get-there-itis" came into play. This further eroded the safety margin.

This demonstrates that even the best professional managers can fall prey to the same impediments to decision making that plague every pilot. No one is immune, so every pilot needs to be on guard with every decision to be sure not to fall into one of these traps.

Too Little or Too Much Information

Sometimes decisions become impaired because there is too little information. Sometimes it is because information is not presented in a way that can be understood. And sometimes it is because there is too much information. Let's look at these three situations as they apply to a single accident: the one at Three Mile Island in Pennsylvania, which combined all three problems. Again, for details the reader is referred to the many reports that have been written on the accident, but stick with the official ones. There's a huge amount of "junk science" out there on this event. I will give only a brief summary here.

The initiating event was a pressure relief valve that was stuck in the "open" position. This caused steam to escape from the reactor system. In the wee hours of the morning, the operators saw that the water level was dropping when the emergency feed water pumps came on automatically. No one knew why the water level had dropped. Looking at the situation and the available readouts on the console, the operators did not have enough information on why the pumps came on. It could have been because of leakage in the reactor system, or the pumps might have been activated by the computer in error. There was not enough information to make a decision.

The operators looked at the available information from the console and decided that the pumps had come on in error. They thought that the pressurizer (just a big expansion tank) was filled with water instead of the steam it should have contained. If the pumps were allowed to remain in operation, they might over-pressurize the reactor vessel, or so the operators thought. It turns out that the pressurizer was actually full of steam, which was leaking out the top through the relief valve. But the information was not presented to the operators in a way that could be readily understood. So the operators shut off the pumps.

Very shortly afterward, the console was alight with warnings. There were so many warning messages that the computer that ran the plant was saturated, not to mention the operators. There was so much information being thrown at them that it was not humanly possible to sift through all of it to see what was really happening. It took many hours to sort things out, and by that time the damage was irreversible.

So the Three Mile Island accident had all three components: too much information, too little information, and information not presented in an understandable manner. The findings of this accident are the basis of much of the improved human factors workspace design in many venues, not just nuclear power plants.

This is another classic example of decision-making errors that happen because we are human. There is no question the operators were trying to keep the plant under control as best they could. It's just that the information available to them was not designed well for people. We need good, solid information if we are to make a good decision. There can't be too little or too much of it, and it has to be presented in a form human beings can understand.

The examples I chose were major events that were not related to routine aviation operations. But they are classics in the sense that they hold lessons on decision making that carry over to aviation.

Relating this to aviation, there are many cases where the available information may be too much or too little or just not understandable. Take something as simple as a change in the weather. There may be a forecast of good visibility and a high ceiling for the entire route. Certainly easy enough to understand. Now suppose the ceiling begins to lower and the visibility gets worse than forecast, but still okay for VFR flight.

If this is all the available information, it may not be enough to make a decision. You don't know whether it's a temporary or local change or whether it's a widespread change that will impact the flight. There's not enough information available in the cockpit to know. So what to do?

The first thing is to recognize that something has changed from the original plan. The second is to recognize that there's not enough information available. Once you know there's a decision to be made and not enough information to make it, the course of action becomes clear: get more information or take action to account for what you don't know. In this example, either call the Flight Service Station (FSS) or begin to develop options for a precautionary landing in case things get worse. But the important thing is to recognize that a decision needs to be made and that you need more or better information.

Now let's look at the other extreme. You're arriving IFR at a reliever airport near a very busy airport with Class B airspace. There's a lot of VFR traffic in the area and also at several other reliever airports. The approach controller is firing off instructions like a well-oiled machine gun. You got the ATIS on the number two com about 10 minutes out, and you now have the number two com on the CTAF. Local VFR traffic is chattering away on that frequency. Meantime, the approach involves a complicated series of fixes and intersections. You're just about saturated keeping up with the situation.

Suddenly the controller re-routes you for traffic separation. The re-routing is no more complex than your expected approach, but it means writing down a new clearance and thumbing through the approach plates for a different approach. Now what would happen if there was a mechanical problem right at this time? Let's make it a minor problem like a popped breaker on the flaps. Ordinarily it would not cause much of a problem.

Again, the first thing to recognize is that a decision has to be made: in this case, what to do about the flaps. The second is to recognize that you're now overloaded with information. It's time to clear the board. This might mean any one of a number of things depending on the situation. It means at least informing ATC of the problem and telling them what you need to do to handle it. Or, if you feel it's a really unmanageable situation, perhaps declaring an emergency (that clears the board really fast). The last thing you'd want to do is let the extra workload prevent you from handling the decision that has to be made. Remember, fly the airplane first, but making a good decision comes in a close second.

The Setting

Let's look at where this fits into pilot decision making. First, let's set aside the people with obvious incurable impediments (macho, anti-authority, etc.). We know there are a whole lot of bad decisions in that group. Thank goodness it's all the OTHER people. Now what do we have left? We have a population of pilots who are fully capable of making good decisions, but who sometimes don't. Present any of them with a decision-making problem in a classroom setting (maybe a Safety Seminar) and they will make the right decision every time. But put them in the air and the outcome is occasionally different. So why the difference between the classroom and the airplane?

In a classroom, we tend to think analytically. We are presented with a decision to make. The instructor says "make this decision." So the student applies his or her knowledge and reasoning power and out comes a really good decision.

In the airplane, the setting is different. Instead of a bland classroom, we are surrounded with the sights, sounds, and physical sensations of flight. There is also a strong emotional component. We find flying intensely enjoyable. That's why we put all that time and money into getting the certificate in the first place. Flying tends to load up your senses. The cockpit is not a place to sit back and consider a decision as you would in the classroom. It's almost a left-brain right-brain thing. Analytical thinking is hard to do in the cockpit.

In a conversation with a CFI friend a year or so back, he suggested that perhaps one problem was that "Some of them don't realize they're making a decision!" In other words, the pilot is reacting to a situation, not making a conscious decision on a course of action. The pilot is fully capable of making the decision correctly, but abdicates because he/she doesn't see that this is a situation that requires a decision. If this happens, the decision just doesn't get made!

It's only after you settle down into a decision-making mind-set that you can clearly think through the decision in a logical manner as you would in a classroom setting. Only then can you ask yourself, "Am I acting as if I'm invulnerable? Have I got get-home-itis?" Classroom exercises show that most everyone makes a sound decision when they stop to think about it carefully. Accident statistics show they don't always do as well in the air.

The Decision Points

One solution might be to establish "decision points" during each flight: specific points where the pilot stops and makes a conscious decision, plus the ability to recognize situations where another decision point needs to be inserted. There would be seven formally established decision points during every flight. At each of them, the pilot would stop and think, "I'm going to make a decision now, and my life might depend on getting it right." This would establish the frame of mind you need to counteract those human tendencies that keep us from making the right decision. Here are my suggested decision points:

1. The preflight decision before even going to the airport (weather okay, personal minimums, IMSAFE, etc.)

2. Preflight inspection (is the airplane okay? Is there any reason not to launch?)

3. Before leaving the airport vicinity (One last check: Weather as forecast? Everything as planned? Everything I might need aboard and accessible?)

4. En route whenever anything is not as expected (time not right at a check point, weather changed, river not where the map says it's supposed to be, engine rough, etc.)

5. Prior to pattern entry as part of the pre-landing checklist or ATIS check (which runway? airport safe? crosswind acceptable?)

6. Prior to landing as part of the GUMP check (any reason not to put this thing on the ground? Landing is optional as long as you've got an engine and fuel.)

7. Insert a decision point any time anything is not as expected, or anytime there's a temptation to buzz the neighbor's barn.

We're all capable of making good decisions when we think about it. We just occasionally forget we're making a decision. By formally recognizing the points in a flight where a decision needs to be made, we give ourselves a chance to apply what we know about decision making. If we do this, we might be able to put a real dent in those accident statistics where decisions are involved.

What do you think? It's your decision whether or not to make use of this idea. Or perhaps you might decide to come up with a system of your own. Any personal system you devise would be better than failing to make a decision when one is needed.

Bill Belanger is a Health Physicist in the Environmental Protection Agency's Regional Radiation Program and an FAA Aviation Safety Counselor for the Philadelphia FSDO.
