It is a fairly unpopular notion to suggest to pilots that their days are numbered as meaningful participants in the actual flying of the machine. Even my own pilot ego gets a little confused at the notion that there isn't much need for me. But the stark reality is that pilots are increasingly monitors of systems; they are no longer professionals required to do a tremendous amount of analysis, adjust fussy components, and decipher which information source is telling the truth.
There are impressive machines that do that for us, handling everything from navigation to ensuring the pressurization is managed seamlessly from start to finish. But, not that long ago, pilots were turning big rings to get a bearing on a radio signal and then charging off into the night to hit a fix within an acceptable degree of accuracy. Prior to that (and not that much prior), there was a bubble in the top of the airplane where the navigator would climb up with his sextant to fix the plane's position by the stars.
Yes, this happened. Those skills are not only gone, but today's generation of pilots may not even be able to explain what a sextant is. The risk manager's job is to analyze what the evolution of technology means and how to insulate against the unseen pitfalls that the comfort of automation brings us.
An Avoidable Tragedy
The most staggering modern-day example is the Air France Flight 447 (AF447) tragedy on June 1, 2009. The bulk of the investigation (and the resultant training profile modifications) amazed veterans with the basic lesson learned: so long as we have all that automation up there, we need to remember to teach pilots, at the most basic level, how to fly an airplane when it wants to stop flying. Somehow, training had come to rely on the sheer power of the modern turbine engine and sophisticated flight deck instrumentation to fly the airplane out of a stall, without hammering home the importance of reducing the nose-high element of the stall. Simply put, pushing the nose down and gaining some airspeed would have saved everyone.
The story is just that—an entirely avoidable accident if the crew could have simply recognized the condition they were in (wing stalled), lowered the nose, built the airspeed back up, and gotten the airplane flying again. Instead, they flew in a nose-high attitude all the way to the ocean, convinced that simply applying power and following their (flawed) procedures would save them. Captain Chesley Burnett "Sully" Sullenberger III, famous for the Hudson River accident, has offered the simple opinion that this accident was seminal.
Now, let's be fair: I'm glossing over a great deal regarding the air data computers, what happened, the setup, how they got into the situation, control inputs not being properly understood, and so on. But the takeaway that leaves many of us in awe is that the most basic element wasn't recognized and acted upon—wings stop working beyond a certain "angle of attack," or the angle they face to the relative wind. It is one of the first things taught in all primary flight instruction—bring the aircraft to a stalled condition and recover without hitting the ground. Do that, and a few other things in a respectable fashion, and you earn your private pilot's license.
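The logic drilled into primary students really is this simple: past the critical angle of attack, the wing stops lifting no matter how much power is applied, so the nose must come down first. A minimal sketch of that decision in Python follows; the 15-degree critical angle of attack is a generic illustrative figure, not a number for any particular airframe.

```python
# Illustrative sketch only (not avionics code): stall recovery depends on
# angle of attack (AoA), not on engine power.

CRITICAL_AOA_DEG = 15.0  # assumed generic critical AoA for illustration

def stall_recovery_action(aoa_deg: float) -> str:
    """Return the primary recovery action for a given angle of attack."""
    if aoa_deg > CRITICAL_AOA_DEG:
        # Power alone cannot un-stall the wing; the nose must come down
        # to reduce AoA below critical and rebuild airspeed.
        return "push nose down to reduce AoA, then add power"
    return "wing is flying; maintain coordinated flight"

print(stall_recovery_action(25.0))  # stalled condition
print(stall_recovery_action(5.0))   # normal flight
```

The point of the sketch is the ordering of the two branches: recognition of the stalled condition comes before any use of power, which is exactly the step the AF447 crew missed.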
Managing the Risk
The risk manager can then take this recent bit of history (and trend in our culture in general) and use these basic questions to understand the future.
Where Have We Been?
With respect to any aircraft and crew that you are working with, what is their experience, and what is their history of going from the analog to the digital world? So long as we are stuck in both, it is really nice to have some generalists onboard who can fly the aircraft when everything goes offline, with the exception of the windows and basic controls.
Where Are We Now?
What type of machine are they flying in today? And what type of training have they had to milk the maximum performance out of the automated stuff while also being cognizant of where it will bite them? (This is perhaps the biggest trap with automation. It lulls all of us into a sense of complacency. I have said as much previously on my distrust of my iPad.)
Where Are We Going?
This question is broad in that it asks how you will train, monitor, and work with your crew to ensure that they are able to leverage the latest technology (to the extent you need or want to pay for it) while also understanding that, so long as there are humans up front, they should have a broad aeronautical background and a detailed understanding of the ship they are in.
Here is the most important takeaway—things are only going to become more automated in the future, and passengers, as well as pilots, will be asked to give up even more control and authority. This will continue until the day when pilots really are redundant, cumbersome, and only adding complexity to a safe transportation system. The role of risk management will evolve into examining the design of systems that prevent tragedies by their ability to predict, monitor, and formulate alternate plans when safety is compromised. Nothing is more important, however, than recognizing that, as a species, we will always push the envelope and make new stuff.
The increased layers of safety (provided by automation) actually erode the margin when you mix humans in. The prime example of this dynamic is the Freakonomics tale of two sets of taxi drivers in Frankfurt in the 1980s. One group had antilock braking systems; the other didn't. Guess who had more accidents? The drivers with improved braking ability took bigger chances and had more of them.
The risk manager's attention should be squarely focused on this dynamic—when a new feature increases the safety margin, the well-meaning human will erode all of it and usually take more. When the sales pitch is made for pilotless vehicles, this may yet be the most solid point made. The military is already there, and FedEx is actively looking to move in this direction.
But, for the time being, so long as there are humans up front, make sure that they see themselves as risk managers, too. Jumping into the air is inherently full of risk, so be sure the crew see themselves as risk-averse people whose entire job is to evaluate, mitigate, and, to the extent possible, eliminate risk by knowing where they've been, where they are, and where they are going.
Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI. Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion. If such advice is needed, consult with your attorney, accountant, or other qualified adviser.