(October 2011)
We are all familiar with the adage "To err is human." It means that, being human, you should accept that from time to time you will make errors. Left unchecked, many of these errors remain trivial and their outcomes insignificant. But in the operation of an aircraft, the smallest, most trivial errors can link together quite quickly to form the well-known error chain. Abnormal situations can be a breeding ground for errors because they place pilots in rare or unusual circumstances that may be compounded by stress, anxiety and distractions.
That is exactly what happened to me a number of years ago when, as a Part 135 Learjet check airman, I was giving a Part 135 proficiency check ride to a captain. The captain was in the left seat, I was in the right seat, and another company pilot and an FAA inspector were riding in the back as observers. The weather was good VFR that day, and after some airwork we set up for a no-flap landing. There was a lot of chatter between the captain and me, as well as back-seat conversation and input from the FAA inspector. All of this was occurring below 10,000 feet, when the cockpit should have been sterile.
The before-landing checklist was accomplished in what we thought was its entirety, but one item was omitted: the extension of the landing gear. No one caught the error. We were about to make an unintentional gear-up landing with an FAA inspector on board. That would have been a big problem, at many levels, for the occupants of this Learjet. The omission was directly attributable to distractions, high workload and the stresses of a check ride (for the captain and for me).
You may be wondering why the gear warning horn did not sound to give us an aural clue of the misconfiguration. It normally would have, except that the horn will not sound unless the flaps are extended beyond 25 degrees. Our flaps were at 0 degrees, so our last line of defense against a gear-up landing (an aural warning) was not active.
At about 100 feet above touchdown, with the before-landing checklist “completed,” I did something that dramatically changed the outcome of this event. I looked at the landing gear indicator lights one final time “just to be sure.” What I saw were three gear indicator lights that were not green. I immediately called for a go-around, and the captain complied. No further problems occurred, and the rest of the check ride continued smoothly and successfully.
There were four qualified Learjet pilots in the aircraft during this event, and I was the only one who caught and trapped the error at the last minute. The reason I caught it is simple: When it comes to critical flight items, even after the checklist has been completed, I do one final visual confirmation “just to be sure.” It also helps that I do not allow myself the mindset that an aural warning system will dutifully protect me from impending danger. It may not. It did not in this case, nor in the cases of Northwest Flight 255 and Delta Flight 1141. Those accidents occurred because, among other things, the flaps were not set for takeoff, and in both cases the takeoff configuration warning systems (TCWS) were inoperative. The pilots did not conduct the appropriate before-takeoff checklists (necessary and highly effective error-trapping systems in their own right), and their overdependence on the TCWS to alert them to a misconfiguration was a normal, albeit complacent, mindset. In both cases the active errors (the flap settings) went unmitigated, and the outcomes were far graver than in my own event.
These examples demonstrate that, even with technology that should warn us of impending danger, errors still have opportunities to continue unchecked toward an accident or incident. Checklists should always be used, but keep in mind that distractions and interruptions can seriously undermine their effectiveness, particularly in situations that aren’t normal. The same holds true for standard operating procedures (SOPs).
Complacency can be a contributing factor in poor error management. It can create the “see what you want to see” syndrome: “I put the landing gear handle down, so the landing gear must be down and locked.” It can also create the “hear what you want to hear” syndrome: “We are always assigned an initial altitude of 4,000 feet right after takeoff, but today we were assigned 3,000 feet, heard what we expected, and were tagged for an altitude bust when we leveled at 4,000 feet.” Both examples are representative of the complacency aviators can develop after performing a task hundreds of times with the same successful outcome.
The bottom line is that we can do a better job of identifying and trapping errors before they become consequential. In many cases it is simply a matter of conducting one final visual confirmation or check. In others it means maintaining a high degree of vigilance and situation awareness for the entire flight. During cruise, when the workload may be very low for an extended period, I like to use the time to do a “cockpit inventory.” The aircraft may be on autopilot, and the pilots should have few distractions in this phase of flight. What better opportunity to look at the big picture by scanning the flight instruments, navigation systems (flight plan entries and waypoints), engine parameters, circuit breakers and the overall “behavior” of the aircraft? It is also an ideal time to brief the weather and, to the extent possible, the descent and approach (including missed approach) procedures for the destination airport.
Threat and error management is not just the latest aviation buzzword. It is a genuinely proactive approach to improving safety in flight operations. A good pilot will be able to trap, mitigate or even eliminate the common, inevitable errors that are ubiquitous in the flight environment. You can do this by using your written procedures (e.g., checklists, SOPs, the Aircraft Operation Manual) as well as your nonwritten tools (e.g., vigilance, situation awareness, decision-making skills). Combined, they are highly effective error management tools for all aviators, from recreational to airline transport pilots.