Just a Bit Biased

How to See and Avoid Dangerous Assumptions

By Sabrina Woods, Ph.D.

Source: FAA Safety Briefing, July/August 2020

Hello again, readers! I enjoy every edition of the magazine immensely, but I am particularly excited about this edition of FAA Safety Briefing because it highlights what some of my human factors contemporaries are doing across the FAA to ensure safe aviation everywhere. The field of human factors is broken into many different sub-disciplines, and within the FAA we have people who are subject matter experts in engineering/airworthiness, ergonomics, and design; flight deck design and testing; human physiological response; human-in-the-loop and user interfaces; and my own specialty: perception, cognition, and decision-making, to name a few. The work human factors engineers, practitioners, and scientists do is essential to attaining and maintaining a high level of human performance and to keeping the National Airspace System (NAS) safe from the risks associated with human error. But I might be just a bit biased.

Bias Defined

If you have ever read up on human factors and human error, you have likely heard a story or two of how a pilot was affected by a cognitive bias. Cognitive biases are mental adaptations that occur when a person’s prior knowledge, or his or her expected outcome, influences the resulting perceptions, interpretations, and decisions. But not all bias is bad. Biases result from cognitive heuristics, the mental “shortcuts” we take to aid in the decision-making process.

Think about it this way: when you have to make a judgment or decision about the world around you, there might literally be a thousand things to consider in that environment. A perfectly rational decision would involve weighing ALL of those factors individually, potential costs versus possible benefits and negatives versus positives, before coming to a conclusion. In the irreverent words of a certain viral meme out there: ain’t nobody got time for that! Our attention is a limited resource, so we have to be a bit selective in what we pay attention to and use for our decision-making. Cognitive biases are often a result of your brain’s attempt to simplify that information processing and speed up problem solving.

Sometimes, though, these biases trip us up, leading to poor decisions and bad judgments. There are many different types of bias, but all have similar causes, risks, and mitigations. The most important tool for countering biases is knowing that they exist and understanding when you might be most susceptible.

When Bias Goes Bad

Working in the FAA’s Office of Accident Investigation and Prevention, I get to see firsthand how bias can get the better of us. While heuristics are often a good thing and help us navigate life quickly, they become biases gone bad when they lead to perceptual distortion, such as misidentifying the designated runway or taxiway; inaccurate judgment, such as over- or underestimating how close convective weather is; distraction, such as being too busy chatting with a passenger to effectively monitor flight progress; fixation, such as looking only at the instrument panel rather than out the window; and complacency, such as assuming a controller will advise you of traffic instead of actively looking for it.

Some of the more common biases that affect pilots are expectation bias, confirmation bias, plan continuation error, automation bias, and automaticity.

It’s To Be Expected

Expectation bias occurs when we have a strong belief or mind-set toward something we expect to see or hear, and we act according to that belief. For example, a pilot contacts the tower and indicates he or she is ready for an intersection takeoff. The controller clears the pilot to depart from runway 10; however, the pilot departs from runway 28 because that is the runway he or she was expecting and has typically departed from in the past.

Other things, such as routine or familiarity, play into the expectation. We humans tend to be creatures of habit. If a person is used to doing things a certain way (such as taking off from runway 28) and doesn’t expect the runway change due to winds, that person might simply proceed as though nothing has changed, even after acknowledging the actual assigned runway from ATC. Sound familiar? Perhaps it has even happened to you or someone you know?

Looking for Confirmation in all the Wrong Places

Next is confirmation bias. This is when we only look for, listen to, or acknowledge information that confirms our own preconceptions, and tend not to seek out or pay attention to evidence that could disconfirm the belief. I see this a lot in VFR-into-IMC incidents and accidents, where there is often evidence of the pilot’s natural inclination to look for positive information that will allow him or her to complete the flight, even while ignoring or downplaying information that could lead away from that goal.

Confirmation bias is a perfect segue into plan continuation error, more colloquially known as get-there-itis, and the two often go hand-in-hand. Plan continuation is the unconscious cognitive bias to continue with the original plan in spite of changing conditions. The closer to the destination, the worse the bias gets. Our decisions are further influenced by the emotional, financial, and time investments that have already been made in accomplishing the goal. This is all despite the fact that the hazard is constant and the risk of a worst-case scenario (crashing and dying) is more profound.

In one example, a non-instrument-rated private pilot departed from a remote lake in Alaska with his son, anxious to get home. The airplane’s flight path led over mountainous terrain in an area that frequently had low ceilings and reduced visibility due to rain, fog, and mist. The pilot queried Flight Service, which advised of bad weather moving into the area. Rather than staying put, the pilot risked taking off in an attempt to outrace the weather and get home. Several weeks later, the wreckage was finally found in a heavily wooded area several miles from the lake. There were no survivors. The National Transportation Safety Board determined that the pilot’s self-induced pressure to complete the flight contributed to the accident.

What the Heck is it DOING?


Technological advancements in the cockpit are a wonderful thing, but their use can lead to unintended consequences. Automation bias is when we over-rely on automated aids and decision support systems, or become complacent in assuming the technology is always correct, and subsequently fail to monitor or question the technology sufficiently. As a result, manual flying time dramatically decreases, we can get distracted, and the automated information replaces personal vigilance, which leads to complacency. The problem arises when the human is left “out of the loop” and no longer knows just what the heck the automation is actually doing when it comes time to step in.

Air France 447 serves as a tragic air carrier example of what happens when an entire crew is outside the automation loop and unable to determine “what it is doing” in time to prevent the accident. The aircraft entered a stall and fell from the sky into the Atlantic Ocean, taking all 228 souls on board with her.

Yes, Dear

Last, there is automaticity, or what I call the “yes, dear” response. I am sure anyone who has a significant other, children, parents, or close friends can relate to what I mean by this.

Automaticity occurs when routine tasks lead to an automatic response without any real consideration of what is being said or done. We just kind of mentally tap out, and a rote memory response takes over for a moment. We register that something is being said and automatically respond, but we don’t actually process what is being asked of us. In physical automaticity, the brain is no longer “conscious” enough to stop the automatic response of well-trained muscles, aka “muscle memory.”

One example of this is responding “roger” to a hold short, runway, or altitude change without actually processing what the new instructions were. Acknowledgement of an instruction does not always ensure understanding. A physical example would be shutting down the engines instead of setting the parking brake when stopping in accordance with a line-up-and-wait instruction (true story!).


I Get Bias (With a Little Help from My Friends)

One of my favorite quotes of all time is from Douglas Wiegmann and Scott Shappell, two preeminent human factors (HF) researchers in industry and academia. They said that human beings by their very nature make mistakes; therefore, it is unreasonable to expect error-free human performance. It is a concept that my fellow HF contemporaries and I have taken to heart. For us, understanding how bias is likely to affect pilots helps us see the bigger picture and create mitigations that prevent the inevitable human error from becoming an accident.

For you, mitigating cognitive bias can come down to exercising just a few best practices:

1. Perform a Self-Assessment: Understand that you ARE human and therefore CAN become biased. Know thyself, and understand what stressors or fatigue may increase your own potential to be biased.

2. Make a Backup Plan: Don’t expect the standard strategy to always work. Have alternative plans. Think about what would happen if the “go to” option is no longer available.

3. Practice Active Listening: Especially when communicating with ATC! Active listening is a way of listening and responding to another person that ensures mutual understanding. It requires each person in the conversation to fully concentrate on and understand the intent of what is being said.

4. Look for Disconfirming Evidence: Test out your assumptions by trying to actively disprove them.

One of the most important things we can do in aviation is work to ensure that our own human nature doesn’t get the better of us. But again, I might be just a bit biased.

Dr. Sabrina Woods is a guest writer for the FAA Safety Briefing. She is a human factors analyst for the Office of Accident Investigation and Prevention. She spent 12 years as an aircraft maintenance officer and an aviation mishap investigator in the Air Force.
