Cogito, Ergo Sum (I Think, Therefore I Am)

“There’s a reacting side of the brain and a thinking side of the brain,” insists Clinton Anderson in his training DVDs. Anderson is a “horse whisperer” who demonstrates his Downunder Horsemanship on the RFD cable channel and in his series of DVDs. I’ve been using Anderson’s training techniques to work with Miss Biscuit, my three-year-old quarter horse, and it’s obvious she behaves better – and I’m safer – when she’s thinking rather than reacting.

But I hadn’t made the connection to how it applies to pilots until I was reading an article in the Flight Safety Foundation’s Aviation Safety World (since renamed AeroSafety World). The article, “Pressing the Approach,” written by Benjamin A. Berman, a senior research associate at NASA’s Ames Research Center and a pilot with a major carrier, and R. Key Dismukes, chief scientist for aerospace human factors in the Human Factors Research and Technology Division at NASA Ames, details their investigation into 19 accidents. The researchers found that two of the most common themes in the 19 accidents were what they called “snowballing workload” and “plan continuation bias.”

It was in the discussion of “snowballing workload” that the authors seemed to agree with Anderson’s premise of the “two-sided” brain. “A particularly insidious manifestation of snowballing workloads,” they wrote, “is that it pushes crews into a reactive, rather than proactive stance. [Reacting rather than thinking?] Overloaded crews often abandon efforts to think ahead of the situation strategically, instead simply responding to events as they occur and failing to ask, ‘Is this going to work out?'”

In his On Top column last month, Richard Collins suggested that a factor in a number of accidents may be impairment of the pilot because of confusion. It’s not a great leap to see that snowballing workloads can add to confusion. Trying to figure out how to get a recalcitrant GPS to provide the needed information, while ice is building on the airframe and the controller asks for a change of frequency, can have a pilot using the reacting side of his brain rather than the thinking side. A recipe for a bad end.

“I have always wished that pilots would embrace a ‘time-out’ procedure to battle confusion,” Collins said. “The best one is with the autopilot on and set to fly the airplane on a safe heading and altitude.” In other words, using the autopilot can give a pilot the necessary time to take a deep breath, to start thinking and to stop reacting inappropriately.

Things can get out of control quickly when a horse uses the reacting side of its brain, and it’s really no different with pilots. There are a number of examples of the bad things that happen when pilots react rather than think. A pilot of a multiengine airplane who reacts without thinking when an engine fails would have been better off at the controls of a single. Too frequently, pilots have mistakenly feathered the one engine that was still cooperating. As a result, multiengine students are taught the phrase, “Dead leg, dead engine,” to help them think about what’s happening so they can correctly identify which engine has failed before they do something precipitous. Even then, they’re taught not to rush to judgment, but to confirm that they have, in fact, selected the correct engine before attempting to feather it. If, when they pull back the throttle, the engine noise changes or the airplane yaws toward the engine they pulled, they’ve made the wrong choice and are about to kill the engine that’s been pulling extra duty while its stablemate was loafing. Only after they’ve thought through the situation (with the thinking side of their brain) and made sure they’ve identified the malingering engine should they go ahead and feather it.
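That confirmation logic is worth spelling out. Here’s a minimal sketch in Python – the function names and inputs are hypothetical, and it illustrates only the order of the reasoning, not an actual checklist:

```python
# "Dead leg, dead engine": the leg that isn't pushing rudder points at
# the failed engine. Then confirm before feathering anything.
def identify_suspect_engine(dead_leg: str) -> str:
    """The dead (idle) leg is on the same side as the dead engine."""
    return dead_leg  # "left" or "right"

def safe_to_feather(noise_changed: bool, yawed_toward_suspect: bool) -> bool:
    """Retard the suspect engine's throttle. If nothing changes, the
    suspect really is dead. If the noise changes or the airplane yaws
    toward the engine you pulled, you grabbed the good engine."""
    return not (noise_changed or yawed_toward_suspect)

suspect = identify_suspect_engine(dead_leg="right")
if safe_to_feather(noise_changed=False, yawed_toward_suspect=False):
    print(f"Confirmed: feather the {suspect} engine.")
else:
    print("Wrong engine selected; re-identify before feathering.")
```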

Then there’s the reacting rather than thinking that too often takes place when a pilot experiences an engine failure on takeoff. We frequently read about accidents in which a pilot – without thinking – tries to turn back to the runway without sufficient altitude, uses the rudder to keep from steepening the bank and ends up entering a cross-controlled stall and spinning in with too little room to recover.

We’re taught to use the thinking side of our brain and to note, during the climb out from the runway, the point at which we’ve reached an altitude from which experimentation has demonstrated we can successfully execute a return to the runway. We also have to consider, based on any crosswind, which way to turn so we aren’t blown away from the runway during the turn. A pilot who hasn’t reached that predetermined altitude knows that the accepted – and safe – procedure is to look for a suitable landing site straight ahead or off to either side.
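That pre-takeoff brief reduces to a simple decision gate. As an illustration only – the 800-foot figure below is a made-up placeholder, and the real number can only come from experimentation in your own airplane at a safe altitude:

```python
# A sketch of the engine-failure-on-climb-out decision gate.
TURNBACK_ALTITUDE_AGL = 800  # feet AGL; hypothetical, not a recommendation

def engine_failure_action(altitude_agl: float, crosswind_from: str) -> str:
    """Return the briefed action for an engine failure after takeoff."""
    if altitude_agl < TURNBACK_ALTITUDE_AGL:
        # Below the gate: the accepted, safe procedure.
        return "land straight ahead or slightly off to either side"
    # Above the gate: turn into the crosswind so the wind drifts the
    # airplane toward the runway, not away from it, during the turn.
    return f"turn back, beginning the turn into the {crosswind_from} crosswind"

print(engine_failure_action(500, "left"))
print(engine_failure_action(1000, "left"))
```

The point of settling the number on the ground is the article’s theme in miniature: the thinking happens before takeoff, so that only the briefed response remains when there’s no time to think.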

Ah, you say, what about stall recoveries? Don’t we have to use the reacting side of our brain when we inadvertently enter a stall? There’s nothing wrong with reacting as long as the reaction is a response that we’ve been carefully taught. During what seems to many students an uncomfortable number of practice stalls, our “thinking” is changed, so we no longer act instinctively by hauling back on the control wheel when the nose of the airplane ducks under and aims toward the ground. We’re taught that the proper procedure for recovering from a stall is to reduce the angle of attack by easing the back pressure on the controls. The ground school explanation of the aerodynamics of stalls and the inflight practice of stall recoveries are designed to reprogram the reacting side of our brain with what the thinking side has learned. Someone who hasn’t learned to “think” about how to recover from a stall will intuitively haul back on the wheel in an effort to pull the airplane’s nose up. But that reaction accomplishes just the opposite.

Perhaps the most serious condition in which the reacting side of the brain takes precedence over the thinking side is when a non-instrument-rated pilot loses visual reference to the horizon and has to rely on his instruments. Again, the reacting side of the brain is not going to stand him in good stead. He has to use the thinking half to recognize that his sensory inputs, like little devils sitting on his shoulder, are whispering – or shouting – in his ears to mislead him into ignoring what the instruments are telling him.

The other phenomenon that the authors of the “Pressing the Approach” article isolated from the accident records was what they called “plan continuation bias.” Most of us know it as “gethomeitis,” and it can rear its ugly head at any stage of a flight, from preflight to the landing approach.

It’s not a stretch to consider that “plan continuation bias” is often a manifestation of using the reacting rather than the thinking side of the brain. When you read accident reports, you have to wonder what a pilot was thinking when he took off with a rough-running engine, pressed on into weather he wasn’t qualified to confront, passed up airports when he was running dangerously low on fuel, tried to land with crosswinds that exceeded his or his airplane’s capability, didn’t clean the snow or frost off his airplane before launching or didn’t look for a way out as soon as inflight ice began to build on his airplane. The only response when reading far too many accident reports is, “What in the world was he thinking?” It’s likely he wasn’t thinking at all. A pilot on a cross-country flight that he’s made many times might assume that he has sufficient fuel to press on to his planned destination. But what if his assumption is wrong? How will he know? Will he know in time? These questions, the authors report, are the basis for forming realistic backup plans and implementing them in time, but they must be asked before a snowballing workload limits the pilot’s ability to think ahead.

In thinking about a response to a potential critical fuel situation, how will the pilot know his assumption was incorrect? Obviously, if the engine quits, that’s a pretty powerful sign. But short of that, if fuel is a concern, the pilot should consider the elapsed time en route, calculate the time remaining to the destination, account for stronger-than-forecast headwinds and cross-check the readings on the fuel gauges. The prudent plan if low fuel is a concern is obvious: land and refuel.
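To make that arithmetic concrete, here’s a minimal sketch in Python. Every number is a made-up placeholder – real burn rates, groundspeeds and reserves come from the POH, the gauges and actual conditions, not from this example:

```python
# A back-of-the-envelope fuel check. All values are hypothetical.
usable_fuel_at_takeoff = 48.0  # gallons
burn_rate = 10.0               # gallons per hour
elapsed_time = 2.5             # hours en route so far
distance_remaining = 180.0     # nautical miles to the destination
groundspeed = 110.0            # knots, with stronger-than-forecast headwinds
reserve = 1.0                  # hours of fuel to keep untouched

fuel_remaining = usable_fuel_at_takeoff - burn_rate * elapsed_time
endurance_remaining = fuel_remaining / burn_rate        # hours
time_to_destination = distance_remaining / groundspeed  # hours

if time_to_destination > endurance_remaining - reserve:
    print("Assumption wrong: land short of the destination and refuel.")
else:
    print("Plan still holds; keep cross-checking the gauges en route.")
```

Run with these numbers, the check fails – about 1.6 hours to go against 1.3 hours of fuel above reserve – which is exactly the kind of answer a pilot needs before the workload snowballs, not after.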

Unfortunately, plan continuation bias is often reinforced by negative training. Berman and Dismukes found that as pilots “amass experience in successfully deviating from procedures, they unconsciously recalibrate their assessment of risk toward taking greater chances.” A pilot’s tendency to recalibrate is “abetted by a general tendency of individuals to risk a severe negative outcome of very low probability – such as the very small risk of an accident – to avoid the certainty of a much less serious negative outcome – such as the inconvenience and the loss of time and expense associated with a go-around,” or spending the night away from home, having to rent a car, or missing a birthday, wedding or funeral.

Each time a pilot successfully completes a flight during which he encountered some threat – low ceilings, thunderstorms, crosswinds, icing – he may attribute the success of the encounter to his superior flying skills rather than dumb luck and increase the level of risk he’s willing to assume on future flights. Eventually, Lady Luck won’t be riding along as copilot. Will Durant said, “The trouble with most people [read pilots] is that they think with their hopes or fears or wishes rather than with their minds.”

Cirrus and Avidyne recently introduced a function on the multifunction display in the Cirrus airplanes to encourage pilots to think about what they’re getting ready to do. When the power is turned on, the pilot is presented with several pages of “checklists” that prompt him to consider all the factors that could affect the safety of the planned flight. A reminder, in a sense, to look before you leap.

René Descartes is credited with coining the phrase, “Cogito, ergo sum” (I think, therefore I am), but for us, the operative idea is more appropriately: “I think before I react, therefore I am safer.”

R. Key Dismukes and Benjamin A. Berman, along with Loukia D. Loukopoulos, have written The Limits of Expertise: Rethinking Pilot Error and the Causes of Airline Accidents. The book reports on their study of 19 major U.S. airline accidents from 1991 to 2001 in which the NTSB found crew error to be a causal factor. The 364-page book is available from Ashgate Publishing for $39.95. For more information, call 800/535-9544 or visit www.ashgate.com.
