
Aviation Human Factors Assessment

Adam Webster | February 3, 2017


One of the fundamental problems of the insurance industry is that it is purely quantitative. An actuary would say, "But numbers are all we have!" … and they'd be right. Yet any underwriter would tell you that a better way to predict where an accident might happen would interest them a great deal.

The good news is that such predictive powers are actually here today in the form of aviation human factors assessment. But to get to that, it's important to understand how we got to this problem in the first place.

The Audit Culture

In a culture that preaches "best practices," the sad truth is that many of us don't practice them. Aviation psychology gets handled as human factors banter tucked into a manual. We're bored, tired, and uninspired, and the act of flying the airplane often reflects that. The accidents scream about it. FlightSafety International even trademarked the phrase: "The best safety device in an aircraft is a well-trained crew."

Healthy aviation psychology is a prerequisite for a well-trained crew. If your attitude or mental state is in the way, no manual is going to keep you safe. You can train anyone ad nauseam, but in the end, if they are miserable, uninspired, and incurious, you've got blockage. At the risk of going a step further, I would say that if you have flown charter or work in business aviation, you know this is more often than not the case. In fact, to put a number on it, I'd guess that more than half of the time, you are not employing "the best safety device" as advertised.

And, since we aren't the military or the airlines … we have fewer resources. We don't have the best and latest screening tools, and we end up with some strange stories.

Faking It in an Audit Culture

The challenge with auditors and certifications is that the pilot is being led by the nose to say what he or she is supposed to say. Consider the following exchange.

Auditor: "We score this better if you do it this way. Do you do it this way?"

Chief pilot: "Absolutely! Always have … always will!"

In 2014 in Bedford, Massachusetts, N121JM ran off the end of the runway at Hanscom Field, killing everyone aboard. The pilots were high-time veterans, fresh from training, audited to the highest standard possible in business aviation. And yet, the accident was the direct result of a series of all-too-obvious conditions that plagued the crew. It wasn't the airplane, the training, or the audits they had potentially faked their way through. The fact was that these guys were asleep at the switch. Without being an alarmist, I'd like to make the case that many pilots are.

James Albright referred to the accident as "involuntary manslaughter." (More on his dissection of the accident can be found here.) While these are strong words, there may be no other way to describe a crew who went off script when they shouldn't have. To debate whether a control lock is engaged (while running out of runway) is the highest form of treason. "Treason" sends the right message since the one thing we train—time and again—is to be categorical about whether to fly or not.

Pilot Wiring

After being absent from the cockpit for over 10 years, I had the opportunity to study my own species. I was able to look at my own risk-prone (and risk-averse) behavior and take an inventory of how lucky I was. As a 20-something, I worked in Africa, Maine, and Labrador, where doing stupid things meant near-certain fatal punishment. I spent most days wondering what neophyte move I might make and how to self-educate as quickly as possible. I survived a few close calls but mostly watched in awe as much smarter, older, and better pilots than I met their fate.

As a 30-something, I operated a small Part 135 company in the Northeastern United States. I did this in twin-engine piston aircraft—the type that makes up "most" of all the fatal accidents. I was responsible for pilots, training, and compliance. Self-study was the only way to sort out what mattered from what was fluff.

Now, as a 40-something, I'm fortunate to be flying turbine aircraft and learning how safe and amazing life can be. I'd like to underscore the word "can." The human and aviator species are special in that if you give us an extra margin of safety, we'll gobble it up in complacency.

Looking back, after years of operating aircraft, it struck me how little we knew about safety. In my case, it seemed random luck that I knew to study this thing or that—to gain an edge over the elements and the machine, but most importantly, myself. Later, as a charter broker, I developed a risk mitigation tool for third-party charter flights.

This effort was born, in large part, due to the lack of trust I had in my fellow Part 135 operators. This lack of trust stemmed from a lack of faith—faith that their audit standard (displayed as a badge of legitimacy) meant anything. Were they into window dressing? Or actually learning how to manage risk?

The accident in Bedford was the most prominent example of what the accident record shows generally about professional aviators. The system itself won't save you; you actually have to adopt a mindset. This can be called a "growth mindset." If you have it, there's a good chance you operate at peak capability. If you don't, you'll probably be OK (after all, technology is looking after you at every corner). But you open the door to risk when you can't embrace change or growth.

Covering Every Safety Angle

What is needed is a culture where nonpunitive reporting is actually nonpunitive. When you underpay, threaten, or intimidate crew members, you might as well toss the manual, the audit, and any mission statement you have out the window. Years of consulting, advising, and finding holes in the current Federal Aviation Administration and auditor system led me to one place: psychology and mental health. More broadly, it's termed aviation psychology.

If aviators don't have the tools or ability to talk about human factors, then your foundation is weak. Any audit, representation, or standard sold to a client lacks teeth if human factors aren't central.

Birth of the Aviation Human Factors Assessment (AHFA)

The AHFA was developed out of frustration with present models of safety. It doesn't replace, interrupt, or seek to minimize the importance of current tools. Rather, it enhances them by making aviation psychology the central focus of the entire effort.

Flight risk assessment begins with making good decisions. And good decisions assume a healthy mind: a mind free of stress and of the rigidity that prevents learning from new information as it comes in. This holds true both in flight and on the ground.

Elements and Expectation

An AHFA-monitored company looks at the following four critical elements (a brief illustrative sketch follows the list).

  • Structural—Risk mitigation requires a framework. Without a manual and procedures, there's no spine to the system. Without solid accountability and methods to show the how and why of operations, there is no starting point for a discussion.
  • Integral—The biggest failure of firms that have good structure is completely ignoring it. We call this the "window dressing versus core value problem." Accident data contains plenty of audited firms that simply didn't apply what they had in the books.
  • Corporal—Perhaps a bit too touchy-feely for some, this element is simple. Corporal elements uncover the link between diet, alcohol use, physical stamina, and body mass index on the one hand and safety on the other. For good measure, you might also look at hobbies, interests, and a predisposition toward ongoing learning. Anxiety and depression inhibit safety. Yet, individual aviators in many different roles suffer from both. Being anxious about the future or depressed about the past leads to one thing: a pilot who is not able to be "in the present." The inability to appreciate present moments is the single largest drain on the human psyche.
  • Social—Finally, the elusive "culture of safety." What does it mean? And what are the interpersonal dynamics like inside a corporation? Does it foster "question, never defend," or do underlings fear asking? A company's culture frequently has nothing to do with its mission statement. Its turnover, sick days, absentee rates, and other parameters can give insight into window dressing versus authenticity.
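To make the framework a bit more concrete, here is a minimal, purely illustrative sketch (in Python) of how an assessor might record the four elements and flag the weakest one for follow-up. The field names, the 1-to-5 scale, and the weakest-link rule are assumptions made for illustration only; they are not the actual AHFA instrument.

```python
# Illustrative sketch only; not the actual AHFA scoring instrument.
from dataclasses import dataclass

@dataclass
class ElementScore:
    name: str        # "Structural", "Integral", "Corporal", or "Social"
    score: int       # assumed scale: 1 (weak) to 5 (strong)
    notes: str = ""  # assessor observations

def weakest_element(scores):
    """Return the element most in need of follow-up (weakest-link rule)."""
    return min(scores, key=lambda e: e.score)

if __name__ == "__main__":
    assessment = [
        ElementScore("Structural", 4, "current manual, clear accountability"),
        ElementScore("Integral", 2, "procedures exist but are not applied"),
        ElementScore("Corporal", 3, "fatigue and diet not tracked"),
        ElementScore("Social", 3, "crews hesitant to question management"),
    ]
    flagged = weakest_element(assessment)
    print(f"Follow up first on: {flagged.name} ({flagged.notes})")
```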

Employing the best elements of aviation psychology, a successful adoption of an AHFA delivers the following results.

Your Gain                       | You Lose
Retention                       | Problem pilots
Morale                          | A culture of scarcity/criticism
Better boundaries               | Barriers between/with management
Acceptance/resiliency           | Stubborn and brittle personalities
Technologically adaptive pilots | Fear-based approach to change
Better insurance rates          | Surcharges/surprises

Conclusion

Pay attention to both best practices and human factors. Learn about and implement leading-edge audit and risk mitigation tools for insurance and financial underwriters, as well as flight departments of all sizes.


Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI. Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion. If such advice is needed, consult with your attorney, accountant, or other qualified adviser.