On the other hand, once the airplane was fully stalled and mushing downward, how many crews, however well trained, could have figured out what to do about it, since the situation lay outside the boundaries of any training or, for that matter, even any flight test scenario? If, during several minutes, the crew, including a captain with experience in a wide range of aircraft types, could not figure out what was going on, was attitude and flight path information being presented in the most usable way? A profile view of the airplane, nose up, with a flight path arrow angled steeply downward would have made everything perfectly clear in an instant. But in the normal course of events, what would be the use of such a display?
The English version of the BEA report frequently used the phrase “startle effect” to translate the rather less specific French surprise. The English expression seems intended to confer a sort of scientific prestige upon the common experience of alarm and confusion following a sudden, unexpected (and generally unwelcome) event. But many pilots have learned from experience that the combination of urgency and fear can produce a sort of cognitive paralysis, and the BEA report noted that this element is generally missing from airline pilots’ recurrent simulator training.
The BEA also noted that crews receive minimal simulator training in hand-flying at high altitude, and none at all in high-altitude stall recovery, even though cruising angles of attack in the upper flight levels are quite close to the stall.
But, while some sort of startle-induced reflex or muscular miscue might help to explain the pilot’s initial and disastrous pitch-up command, the subsequent confusion of the crew scarcely requires explanation. Even though the airplane had gotten into its predicament quite easily, it was now in the realm of the unknown. Simulators did not visit angles of attack of 40 degrees, in part because no one knew for sure how transports would behave there. Wind-tunnel investigations of high-altitude “upsets” produced confusing results and were unreliable because of scale effects.
Furthermore, the Airbus design philosophy makes a point of hiding “unnecessary” information from the pilots. Redundant cues are avoided. For example, the Airbus sidesticks do not communicate with one another in such a way that one pilot can tell, by the motion of his stick, what the other pilot is doing. They also lack proportionate resistance or “feel,” which might have alerted the pilot to his presumably unintended pitch command. Similarly, when the autothrottle is operating, the throttle levers do not move, even though power is changing. Finally, Airbus pilots are scarcely aware of pitch trim, which automatically, continually and silently operates to zero out elevator actuation forces. In this case, however, trim was important, because the pilot’s continually holding the stick back had run the autotrim to its nose-up stop. If the crew had managed to understand that they needed to push over into a 35-degree dive to recover, they would probably have had to retrim manually.
The problems were not confined to cockpit ergonomics. The BEA criticized shortcomings in training as well. “The combination of the ergonomic design of the stall warning, the conditions in which airline pilots are trained and exposed to stalls during their professional training, and the process of recurrent training does not generate the expected behaviors with acceptable reliability.”
An A330 pilot once wrote to me that although “the systems design and presentation [are] superb ... safely flying the 320-, 330- and 340-series Airbus requires something of a nonpilot mindset.” The advice he gives new pilots is to treat the flight “as a video game.” Boeing applied a somewhat more classical, pilot-centric philosophy, and a richer array of secondary cues, to the design of its fly-by-wire airplanes (777 and 787), and pilots have for years argued passionately over the merits of the two approaches.
At this point it has become obvious, from this event and plenty of others, that the transition from computer-mediated “protected” flight to manual “direct law” or anything close to it is fraught with difficulties. In fact, this was a well-known problem with ordinary autopilots long before fully digital fly-by-wire control systems came into use. “Out of the loop” of the handling of the airplane for long periods, human crews falter when they are thrust unexpectedly back into it. They don’t know where they are, what is real, what is spurious. Startled, frequently fixating on an incorrect interpretation of the situation, they may do more harm than good.
A very simple backup autopilot, without reliance on airspeed, could have kept the wings level and the pitch attitude at five degrees while the crew got things sorted out. But the titanically complex and carefully reasoned Airbus flight system made no effort to ensure a smooth transition from digital to human control.
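The kind of backup law imagined here really is simple. A hypothetical sketch, in Python, of what such a fallback might look like: a proportional-derivative attitude hold that uses only inertial data (pitch, roll, and their rates — no airspeed) to drive the controls toward wings level and five degrees nose up. All gains, names, and targets are illustrative assumptions, not anything drawn from an actual Airbus or certification document.

```python
# Hypothetical sketch of an airspeed-independent attitude-hold fallback.
# Inputs come from inertial sensors only; gains and limits are invented.

TARGET_PITCH_DEG = 5.0   # gentle nose-up attitude, per the text's example
TARGET_ROLL_DEG = 0.0    # wings level

KP_PITCH, KD_PITCH = 0.8, 0.3   # assumed PD gains (illustrative)
KP_ROLL, KD_ROLL = 0.6, 0.2


def clamp(x):
    """Limit a command to the normalized control range [-1, 1]."""
    return max(-1.0, min(1.0, x))


def fallback_commands(pitch_deg, pitch_rate, roll_deg, roll_rate):
    """Return (elevator, aileron) commands from attitude data alone."""
    elevator = KP_PITCH * (TARGET_PITCH_DEG - pitch_deg) - KD_PITCH * pitch_rate
    aileron = KP_ROLL * (TARGET_ROLL_DEG - roll_deg) - KD_ROLL * roll_rate
    return clamp(elevator), clamp(aileron)


# Nose ten degrees down, right wing fifteen degrees low: the law commands
# nose-up elevator (positive) and left aileron (negative), both saturated.
elev, ail = fallback_commands(pitch_deg=-10.0, pitch_rate=0.0,
                              roll_deg=15.0, roll_rate=0.0)
print(elev, ail)
```

The point of the sketch is only that no air data is required: such a law cannot fly the airplane well, but it can hold a survivable attitude while the crew, in the article's phrase, gets things sorted out.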
The abrupt “Your airplane!” approach is particularly strange, because the edifice of digital fly-by-wire stands upon the premise that airplanes need to be protected from mistakes that human crews will make. In their zeal for protecting the airplane, Airbus programmers seem to have forgotten that human crews need a little protecting as well. Did they really think that what happened on Air France 447 was inconceivable? Do they still think so?