The other morning, I was driving to the airport and realized
that I had forgotten my cell phone. I
panicked. How would I stay in touch with my wife for the next four days?
How would I be able to keep track of my schedule? How would I board the plane
without my digital boarding pass? What important events would I miss on social
media? How would I manage my hotel reservation? I was in the middle of rush
hour traffic and knew that if I turned around and went back home to get my
phone I might not make my flight. I turned around anyway.
Since the first primitive human picked up a rock and
fashioned it into a tool to change his or her environment, we humans have set
ourselves apart from other species by developing and using tools to advance our
way of life. Nowhere has the use of complex tools shaped our lives more than in
transportation and communication. We have developed increasingly complex tools
that both improve and dictate our daily lives. Clearly these innovations have
made our lives easier, connected us more with each other, and made the world a
smaller place.
We live in an increasingly automated world. Digital
assistants, self-driving cars, cleaning robots, package-delivering drones, and
AI customer service agents are just a few of the automated features in our
modern society. We interface with some form of semi-automated or intelligent
machine almost from the time we wake up in our smart houses until the time we
set our alarms on our smart phones before going to bed. We are never far from a
host of other wireless devices designed to make our lives easier, better, and
more connected. But have the tools themselves transitioned from servant to
master? What happens when these intelligent machines become smarter than we
are?
The recent deadly accidents involving two Boeing 737 MAX8 airplanes highlight once again that the epic tug of war between the human and the machine continues. At one end of the rope is the creative, adaptable, and
machine continues. At one end of the rope is the creative, adaptable, and
intelligent human. At the other end is the increasingly cognitive and almost
sentient machine. It is a battle that will continue as long as humans employ
complex machines to do their bidding. The full investigation of these two accidents will yield pertinent information about the design, training, and employment of the aircraft, but we can already glean several important lessons that can be applied anywhere we employ intelligent machines to improve human life.
Before I continue, let me state categorically that I love
the Boeing 737! It has been the most successful airliner in the history of
aviation. I have operated it safely for well over 12,000 hours across this
great continent and to Hawaii and back. I have flown it in low-visibility weather, onto short runways, and through heavy rain. I have flown old versions ready for the boneyard and new ones fresh from the factory floor. I have also deadheaded (ridden as a passenger in the back while being transported to my next assignment) and jumpseated (ridden on the fold-out seat in the cockpit) for thousands of hours. I have flown multiple versions: the 200, 300, 500, 700, 800, and yes, the MAX8. The
Boeing 737 has always brought me safely home.
Not only has the Boeing 737 always brought me safely home,
it has been the workhorse that has afforded me a comfortable lifestyle as a
professional pilot. It has put food on my table, put diapers on my young
children, and put my young adults through college. It has allowed me to take my
family on good vacations, drive nice cars, and pay for braces for all my
children. The Boeing 737 has been the economic engine that put a nice roof over
my head and funded my 401(k). That machine has afforded me a lifestyle that
otherwise may not have been possible. This discussion is not intended to cast
stones but to highlight lessons we can all use as we develop and employ the
intelligent machines in our lives.
In our modern world, we shouldn’t be asking which is better, human or machine. Instead we should be asking, “How can we develop machines that humans can employ to advance our way of life?” It is not a question of human or machine but a question of human and machine. Our focus then moves productively from why we should employ these machines to how we should employ them. How do we interface with smart machines in a way that retains our humanity and advances our way of life?
The world of aviation has struggled with this question for
many years, particularly as the automation tools in the flight deck have become
ubiquitous and dominant. The proper employment of automation has helped make
aviation the safest form of travel on the planet, statistically speaking. Pilots
have learned to use automation as a tool, without allowing it to become the
master. In the process of improving and employing automation in commercial
aviation, pilots developed automation philosophies, policies, and procedures
that ensure the proper employment of, and interface with, the aircraft
automation. These rules have not been developed in a vacuum or in some
ivy-covered tower of academia, but in the real world where the cost of the
lesson is paid in the blood of pilots and their passengers along with the
capital of crumpled aluminum and broken glass.
Essentially, three basic principles provide a framework for
managing the tug of war between the human and the machine.
· The human should possess the same basic skills performed by the machine.
· The human and the machine should be able to interface easily.
· The human should always have the power (and knowledge) to override the machine.
The human should possess the same basic skills performed by the machine.
Smart machines are wonderful tools that accomplish complex, repetitive, or mundane tasks efficiently. They unburden the human from complex tasks such as solving in-depth mathematical equations. They reduce the workload of
repetitive tasks such as routing incoming phone calls to the right department.
They liberate us from mundane tasks like washing dirty dishes. However, what
happens when the machine malfunctions?
Because of the more powerful and more efficient engines on the Boeing MAX8, an adjustment had to be made to the flight controls to inhibit the aircraft from entering a stall (a condition where the wings are no longer producing lift and the aircraft begins to fall out of the sky) in certain unusual situations. Designers of the new airplane relied on intelligent technology operating in the background (without the knowledge of the pilot) to keep the aircraft safely within the flight envelope. The engineers and designers called it the Maneuvering Characteristics Augmentation System (MCAS). If the thrust of the new, powerful engines forced the nose of the aircraft past a predetermined angle, the MCAS adjusted the pitch by moving the large horizontal stabilizer, pushing the nose of the aircraft downward to prevent a stall. In and of itself, this design feature is robust and almost transparent to the pilot, until something goes wrong.
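To make the paragraph above concrete, here is a deliberately simplified sketch of how a background pitch-protection loop of this kind might be structured. It is illustrative pseudologic only, not Boeing’s implementation: the flaps-up and autopilot-off gating has been widely reported for MCAS, but the threshold, the trim step, and every name below are invented for the example.

    # A minimal, hypothetical sketch of a background pitch-protection loop.
    # All names and numbers are invented; this is NOT the actual MCAS logic.

    AOA_LIMIT_DEG = 14.0   # hypothetical angle-of-attack threshold
    TRIM_STEP = 0.6        # hypothetical nose-down trim units per cycle

    def pitch_protection_step(aoa_deg, flaps_up, autopilot_off):
        """Run one cycle of a simplified stall-protection check."""
        # Note: the loop trusts a single angle-of-attack reading. That
        # single point of failure is exactly the weakness discussed next.
        if flaps_up and autopilot_off and aoa_deg > AOA_LIMIT_DEG:
            return -TRIM_STEP   # command nose-down stabilizer trim
        return 0.0              # otherwise, stay out of the way

The striking thing about such a loop is how little it takes for it to act: one sensor reading crossing one threshold.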
Computers, no matter how intelligent, are only as good as the sensory information they can gather. Garbage in. Garbage out. In the case of both the Lion Air and the Ethiopian Airlines accidents, it appears that the angle of attack sensor (the vane that measures the angle between the wing and the oncoming air, and warns of an approaching stall) was giving the flight computer erroneous information. With no other sources of information to crosscheck, the computer activated the MCAS and began forcing the nose downward to prevent a stall that was not occurring. Additionally, other false warnings sounded and displayed in the cockpit. The pilots became disoriented and did not analyze the problem correctly or apply the designated corrective action. Like the computer, the pilots were receiving sensory information that did not correlate with their knowledge, and they were unable to save the airplane. Garbage in. Garbage out. Tragically, everyone onboard the aircraft paid for the breakdown of both machine and human with their lives.
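A common defense against this single-sensor failure mode is redundancy with cross-checking: compare independent sensors and refuse to act when they disagree. The following sketch is a minimal illustration of that idea, assuming two angle-of-attack vanes and an invented five-degree disagreement limit; it is not drawn from any certified avionics design.

    # Hypothetical cross-check of two redundant angle-of-attack sensors.
    # The disagreement limit is invented for illustration.

    DISAGREE_LIMIT_DEG = 5.0

    def validated_aoa(left_deg, right_deg):
        """Return a usable angle-of-attack value only when the sensors agree."""
        if abs(left_deg - right_deg) > DISAGREE_LIMIT_DEG:
            return None                      # disagreement: trust neither
        return (left_deg + right_deg) / 2.0  # agreement: use the average

With a check like this, a single bad vane causes the protection system to stand down and alert the crew rather than act on suspect data.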
When the machine malfunctioned, the
human in the equation did not apply the basic skills to continue safe operation
of the machine. Why they were unable to do so is not clear yet. The training
and ability of the pilots will come under scrutiny, as it always does in an
accident. Likewise, the distractions caused by the malfunction and the design
features of the pilot interface with the aircraft will be analyzed. Changes in
the aircraft design will not be enough to ensure safety. After the investigation,
professional pilots will study the report. Every pilot of the Boeing 737 MAX8
will receive training to handle the malfunction if it occurs again. They
understand that their lives, and the lives of their passengers, depend on the
basic skills of the human operator.
What happens when we no longer know how to solve simple mathematical equations, drive a car, or fly an airplane? If we rely entirely on increasingly capable and intelligent machines to perform complex and dangerous tasks, we may find that the human ability to perform those same tasks will atrophy and perhaps disappear entirely. We must ensure that the human in the equation possesses the basic skills to perform the tasks performed by the machine, or we will not survive when the machines fail.
The human and the machine should be able to interface easily.
When the automation does something unexpected, pilots will
often jokingly say, “What’s it doing to me now?” It’s a sign that the pilots either lack knowledge, or that the system does not allow for a proper interface, or both. Either way, the result is technology in charge of the process and pilots who have become passengers.
When the ill-fated Lion Air 610 took off, the stick shaker
(a small vibrating motor attached to the base of the pilot’s yoke) immediately
activated on the captain’s side. The annoying motor is a warning to the pilot
that the aircraft is dangerously close to stalling and that a recovery is
needed. In addition, the pilot’s screens indicated “IAS DISAGREE,” a warning that the right and left side airspeed indicators (indicators that let the pilot know how fast the airplane is flying through the air) did not agree. Most
likely, the captain’s instruments also displayed other unusual and erroneous
information. The noise of the stick shaker and the unusual display information
are powerful distractors, but professional pilots train for these situations
and should be able to follow emergency procedures and safely land the aircraft.
But what if you add one more thing?
In the case of the Lion Air accident, the one extra thing
was the moving horizontal stabilizer trim. The horizontal stabilizer on the
tail of the aircraft is controlled by an electric motor that can be operated
manually by the pilot or automatically by the autopilot. The stabilizer is a larger aerodynamic control surface than the elevator controlled by the pilot’s yoke, and because it is larger, it can override the elevator. In other words, if the nose of the aircraft is pushed down by the horizontal stabilizer, the pilot can pull back on the yoke as hard as he or she wants and the aircraft will still enter a dive. This is what
happened when the MCAS, being driven by erroneous information from a bad
sensor, pushed the aircraft into a dive using the horizontal stabilizer. The
pilots had the power to turn it off (and should have), but because of poor
interface and the distractions caused by the malfunction, they did not. (Note:
A jumpseating pilot on the same airplane the day before recognized the
malfunction and instructed the crew on how to alleviate the problem. This is
most likely because of the vantage point he or she had when the malfunction
occurred.) (https://reports.aviation-safety.net/2018/20181029-0_B38M_PK-LQP_PRELIMINARY.pdf)
The more that automated
systems confuse or distract the human, the more dangerous and ineffective those
systems become. Ease of interface with the machine is essential.
The human should always have the power (and knowledge) to override the machine.
There’s an old joke among pilots. The flight deck of the
future will have one pilot and a dog. The pilot will be there to monitor the
aircraft automation and ensure that it performs correctly. The dog will be
there to bite the pilot if he tries to turn off the automation and actually fly
the airplane. Humorous as this sounds, the human should always have the power,
and knowledge, to override the machine.
Somewhere during the certification process of the Boeing MAX8, the engineers had to decide how much information to give the pilots who would be flying the airplane. Too much information and the FAA might require
lengthy and unnecessary training. Not enough information and the pilots would
not understand what the aircraft was doing when an automated system took
control. It is reported that Boeing “…decided against disclosing more details
to cockpit crews due to concerns about inundating average pilots with too much
information—and significantly more technical data—than they needed or could
digest.” (https://theaircurrent.com/aviation-safety/what-is-the-boeing-737-max-maneuvering-characteristics-augmentation-system-mcas-jt610/)
It was assumed that a malfunctioning MCAS would present like a runaway horizontal stabilizer, a similar malfunction for which all pilots are trained. Boeing walked a dangerous tightrope between education and information overload.
In Boeing’s defense, the decision to keep updating the B-737 rather than redesign it entirely gave pilots experienced in previous models a base of knowledge and experience from which to learn the new iterations. Although this philosophy carries the inherent danger of significant changes going unexplained in newer generations of the aircraft, it is countered by the improved safety of familiarity. In other words, pilots’ familiarity with past versions of the aircraft increases the chances of safe operation of future versions, as long as the differences are properly explained and trained.
According to the accident report, the pilots of Lion Air 610
counteracted the MCAS over twenty times before it caused the aircraft to crash.
(https://spectrum.ieee.org/riskfactor/aerospace/aviation/indonesias-safety-committee-releases-preliminary-report-into-lion-air-crash)
Each time the MCAS forced the nose of the aircraft downward by using the
horizontal stabilizer trim motor. Each time, either the captain or the first
officer used the manual trim switch on the yoke to override it. Boeing procedures dictate that when the stabilizer runs away, the motors controlling it should be turned off using two small switches on the throttle quadrant. In effect, the pilots revert to manually controlling the large horizontal control surface and don’t allow the automated system to make any further inputs. Because these switches are critical during a runaway of the horizontal stabilizer, this cutout feature is tested during every initial preflight check.
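The procedure works because of a simple priority chain: the cutout switches remove electric power from the trim motors entirely, the pilot’s yoke switch preempts any automated command, and only when neither is active does the automated input reach the stabilizer. The sketch below is my own hypothetical model of that ordering, not actual flight-control software.

    # Hypothetical priority chain for stabilizer trim commands.
    # Models the override hierarchy only; NOT real flight software.

    def stabilizer_command(cutout_off, pilot_trim, auto_trim):
        """Resolve competing trim commands in priority order."""
        if cutout_off:
            return 0.0         # motors unpowered: manual trim wheel only
        if pilot_trim != 0.0:
            return pilot_trim  # the pilot's yoke switch preempts automation
        return auto_trim       # otherwise the automated command stands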
It appears that the pilots were overwhelmed by the other
distracting events in the cockpit and did not recognize that the stabilizer was
moving without their input. As the flight progressed, for some reason they stopped countering the automated response of the MCAS with the manual switch on the yoke, and the aircraft became uncontrollable. They had the power to override the automation, but, perhaps because of insufficient training, confusion, and the distraction caused by other warnings, they did not use it.
Automated systems should never have ultimate control of the
machines we operate. Humans should always have the power, and knowledge, to
override the machine.
In spite of these recent high-profile accidents and the
tragic loss of life, air travel is by far the safest form of transportation.
According to Ian Savage, a professor at Northwestern University, between 2000
and 2009 air travel accounted for 0.07 deaths per billion miles of travel. Cars
accounted for 7.28 deaths per billion miles. (http://www.cityam.com/215834/one-chart-showing-safest-ways-travel)
By those figures, you are roughly one hundred times more likely to die driving to and from the airport than you are while flying on your flight. This safety record didn’t happen by accident. It is the combined effort of aircraft manufacturers,
government regulators, airlines, and the pilots that fly the airplanes. We will
take lessons from these tragedies as well to ensure that they don’t happen
again.
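For anyone who wants the arithmetic behind “one hundred times,” it falls straight out of Savage’s figures:

    # Ratio of driving to flying fatality rates, both in deaths per
    # billion miles, using the Ian Savage figures cited above.
    print(7.28 / 0.07)   # roughly 104, i.e. about one hundred times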
Advances in automation have helped make aviation the safest
form of travel today. It is the combination of human and machine working
together in harmony that allows for that safety. The tug of war between man and
intelligent machine will continue, but in the end if we don’t follow some of
the basic principles learned through these tragic accidents, we will lose that
tug of war and humans will pay for it with more loss of life. Even worse, we
may lose our own human autonomy to intelligent machines.
The best way to move forward in this age of increasing technology is a combination of human and machine in which the machine serves the human, not the opposite. We must be able to employ increasingly complex and capable tools without fear that they will one day become our masters. Like the first human who fashioned a rock into a tool, we must maintain basic skills, learn to properly interface with our tools, and, when necessary, abandon the tool that no longer serves our best interest.
I hurried home that morning and got my cell phone because it
has become an integral tool in my everyday life. I made my flight only because I employed another new technology: I drove my Tesla in the HOV lane. The
combination of human intellect and intelligent machine saved the day.