
Pilot’s Discretion: Fifty Shades of Flying

Decisions are rarely black and white in the cockpit.

I’ve seen enough over the past few years to reach the conclusion that self-driving cars going mainstream is now a matter of when rather than if. The advanced technology in today’s production vehicles is helping to pave the way. While I haven’t had the chance to use the autopilot feature in a Tesla electric vehicle, I did recently take a new Ford SUV on a road trip and used its driver-assist technology over several hundred miles. This particular model had an advanced cruise-control system that would accelerate and brake automatically to maintain a user-defined distance from the car in front of you. It would also engage the steering system to guide you back into the middle of the lane if you started to drift.

This system still requires the driver to pay full attention to the task at hand, but autonomous cars aren’t far away. As we droned along, I started thinking more about how a computer “thinks” in contrast with a human, and about the inherent limitations of self-driving (and self-flying) technology. As if Mother Nature chose to underscore that point, we drove through a wall of rain, and the automated features shut themselves off because the rain blocked the various sensors. The pseudo-self-driving car recognized its limitations and raised the white flag.

Computers and automated systems routinely make a series of binary yes-no decisions, which are only as good as the finite data they can access. Humans, on the other hand, draw on a vastly larger store of knowledge and awareness of our surroundings (at least for now) to make better-informed decisions. We can also weigh past experience in ways a computer may not be able to, accounting for outcomes such as passenger comfort, safety and even moral obligations.

As automation becomes more common in just about all modes of transportation, I’ve often found myself considering how a computer might handle a given situation if it were in complete control. It has given me a fresh look at decision-making across every phase of flight, from preflight planning to handling inflight emergencies, and at the reality that there’s rarely a single answer to any given task or problem.

As a flight instructor and check airman, I routinely see student pilots take the “if this, then that” approach when presented with an inflight dilemma, and that’s OK. When something abnormal happens, the immediate reaction should be to consult the checklist. Say, for example, you experience a flap-motor failure while maneuvering to enter the traffic pattern. The checklist procedure requires little interpretation and could be accomplished just as easily by a computer-controlled pilot as by a human. Other than verifying that the destination runway still offers the additional landing distance required, there isn’t much left to consider.
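To see how little interpretation a procedure like that demands, here’s a rough sketch of the “if this, then that” logic in Python. The function, the 1.4 no-flap landing-distance factor and the sample numbers are made up for illustration; they don’t come from any particular airplane’s manual.

    # Illustrative only: a toy "if this, then that" response to a flap-motor failure.
    # The 1.4 no-flap factor and the distances are assumed figures, not from any POH.
    def handle_flap_failure(flaps_available: bool, runway_length_ft: float,
                            landing_distance_ft: float, no_flap_factor: float = 1.4) -> str:
        if flaps_available:
            return "land normally"
        required_ft = landing_distance_ft * no_flap_factor  # no-flap landings roll longer
        if runway_length_ft >= required_ft:
            return "continue to the planned runway"
        return "divert to a longer runway"

    print(handle_flap_failure(False, runway_length_ft=5000, landing_distance_ft=3000))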

It can be easy to develop the mindset that flying and decision-making are black and white. I unknowingly fell into it during my first few hundred hours, taking a robotic approach to flying and following the same “if this, then that” mentality. That is, until one seemingly simple event changed my perspective.

At the time, I was doing some contract flying in the right seat of a Piper Cheyenne twin turboprop. My first flight was at night, departing from a busy Class B airport, and as we were cleared for takeoff, I called out the checklist items. Crossing the hold short line, I turned on the landing light just as I had done hundreds of times before in other airplanes in preparation for takeoff. The captain immediately smacked my hand on the switch, turned the light back off, and asked, “What are you doing?” I was obviously caught off guard. What I had failed to notice was the Delta 757 sitting across the runway on the opposing taxiway, which I had momentarily blinded with our landing lights.

It was one of those eye-opening moments that taught me to look beyond black or white, on or off, yes or no, even for the most basic checklist items. No matter how simple and routine a task may seem, there’s probably a situation or circumstance that calls for a slightly different approach, as the landing light illustrated. While it might have been procedurally correct to turn it on when entering the active runway, I lacked the situational awareness to see the bigger picture, and the checklist doesn’t account for courtesy or the effect a bright light has on nearby pilots.

More dire consequences can result when you examine how to approach an abnormal or emergency situation in the cockpit. Consider the low-voltage light in your airplane, which is either on or off. In the modern Cessna 172, it remains off when the 28-volt electrical system is healthy, but the instant the voltage drops below 24.5 volts, the light is illuminated to make you aware that something isn’t working properly.
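Reduced to logic, the annunciator itself really is binary. Here’s a minimal sketch using the 24.5-volt trip point mentioned above; the function name and the sample readings are assumptions for illustration only.

    # Minimal sketch of the binary low-voltage annunciator described above.
    # The 24.5-volt threshold comes from the text; everything else is illustrative.
    LOW_VOLTS_THRESHOLD = 24.5

    def low_voltage_light_on(bus_volts: float) -> bool:
        # The light only knows on or off; deciding what to do next is the pilot's job.
        return bus_volts < LOW_VOLTS_THRESHOLD

    print(low_voltage_light_on(27.8))  # healthy 28-volt system -> False (light off)
    print(low_voltage_light_on(23.9))  # failed alternator -> True (light on)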

It’s up to you to decide what happens next. The checklist will have you accomplish a few tasks in an attempt to reset the alternator, but perhaps that’s not an option in the case of a broken belt. It’s now decision time.

Your course of action will vary greatly depending on the circumstances. For the VFR pilot about to enter the airport traffic pattern on a blue-sky day, this is a nonevent. But for the IFR pilot planning an instrument approach 100 miles from the destination, this is a true emergency. That pilot has to begin load-shedding to get the most duration from the battery’s remaining power, evaluate the utility of backup communication and navigation resources, and determine whether there is enough fuel on board to divert to VFR weather.

How would a computer handle this scenario? Activate the airplane parachute? I would assume the hypothetical computer-controlled airplane 20 years down the road would be much better equipped, with a backup for the backup so that a low-voltage situation could never happen. But again, there is not a one-size-fits-all approach to decision-making. We have to consider hundreds of variables, rely on our training, and think about the outcomes from past experiences to navigate these gray areas.

The good news today is that we have fewer unknowns when it comes time to make these decisions, thanks to the automation found in modern GA airplanes and portable technology like iPads, GPS and ADS-B receivers. Collectively, this provides the best of both worlds: the system-monitoring and analysis tools of a computer combined with the real-world experience and judgment of human pilots who naturally seek a comfortable, safe flight.
