
Will We Have Completely Autonomous Airliners? January 2, 2020

Posted by Peter Varhol in aviation, Machine Learning, Technology and Culture.

This has been the long-term trend, and two recent stories have added to the debate. First, the new FAA appropriations bill includes a directive to study single-pilot airliners for cargo operations. Second is this story in the Wall Street Journal (paywall), discussing how the Boeing 737 MAX crashes have caused the company to advocate even more strongly for fully autonomous airliners.

I have issues with that. First, Boeing’s reasoning is fallacious. The 737 MAX crashes were not pilot error, but rather design and implementation errors, compounded by inadequate documentation and training. Boeing as a culture apparently still refuses to acknowledge that.

Second, as I have said many times before, automation is great when used in normal operations.  When something goes wrong, automation more often than not does the opposite of the right thing, attempting to continue normal operations in an abnormal situation.

As for a single pilot: when things go wrong, a single pilot is likely to be too focused on the most immediate problem, rather than dividing the labor of flying and diagnosis. In an emergency, two experienced heads are better than one. And there are instances, albeit rare, where a pilot becomes incapacitated and a second person is needed.

Boeing claims that AI will provide the equivalent of a competent second pilot. That’s not what AI does. Despite its ability to learn, a machine learning system must have seen the circumstances of a failure before, and have a solution, or at least an approximation of one, as part of its training. This is not black magic, as Boeing seems to think; it is a straightforward process of data and training.
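To make this concrete, here is a toy sketch of the problem, using an invented nearest-neighbor “pilot assistant” trained only on normal-operations data. All of the sensor values, labels, and the scenario itself are hypothetical, made up for illustration; the point is only that a learned system presented with a situation outside its training data can do nothing better than map it to the nearest situation it has seen.

```python
# Hypothetical illustration: a toy 1-nearest-neighbor classifier trained only
# on normal-operations data. All readings and labels are invented.

def nearest_label(training, query):
    """Return the label of the training example closest (squared
    Euclidean distance) to the query reading."""
    best = min(training,
               key=lambda ex: sum((a - b) ** 2 for a, b in zip(ex[0], query)))
    return best[1]

# Training data: (airspeed_knots, pitch_degrees) pairs seen during
# normal operations, each labeled with its flight regime.
normal_ops = [
    ((250, 2), "cruise"),
    ((160, 8), "climb"),
    ((140, 3), "approach"),
]

# A reading the system has never seen: low airspeed with an extreme
# nose-up pitch (an approaching stall). With no abnormal examples in
# training, the system can only return the nearest *normal* regime --
# it has no concept of "stall" at all.
print(nearest_label(normal_ops, (120, 25)))  # prints "approach"
```

A real system would be vastly more sophisticated, but the structural limitation is the same: it can only interpolate among situations represented in its training data.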

AI does only what it is trained to do. Boeing says that pilot error is the leading cause of airliner incidents. That is correct, but it’s not as simple as that. “Pilot error” is a catch-all that covers a number of different things: wrong decisions, poor information, and inadequate training, among others. While these can easily be traced back to the pilot, they stem from several distinct underlying causes of errors and omissions.

So I have my doubts as to whether full automation is possible, or even desirable. The same applies to a single pilot. Under normal operations, it might be a workable approach. But life is full of surprises.