If you’re inclined to trust the boastful announcements, they’re already here: Autonomous cars capable of getting you from A to B through dense traffic as if guided by an invisible hand – safe, comfortable and energy-efficient. Thanks to software. Meanwhile, another innovation has certainly arrived: Planes that crash as if dragged down by an invisible hand. Thanks to software.
In our Suspicious Computing series, we present real-world cases demonstrating the need for perspicuous computing.
The aviation industry is subject to strict regulation and works diligently to produce what are likely some of the safest products that humans directly interact with, as faults in these products cost lives – many lives. Still, within six months, two new Boeing 737 MAX 8 planes crashed in very similar ways. This runs afoul of a strict principle that normally prevails in the aviation industry: Each fault must be analyzed immediately, preventing the same problem from occurring again. Apparently, the manufacturer failed to do so after the tragic crash of Lion Air flight 610 near Jakarta at the end of October. The circumstances of that crash closely resemble what we already know about the crash of flight ET 302 in Ethiopia on the 10th of March 2019: Both cases feature a 737 MAX 8 airplane that was barely a few weeks old, problems with flight stability, a failed attempt to return to the departure airport, and finally an uncontrolled descent and crash.
The center of attention is now a piece of software called the Maneuvering Characteristics Augmentation System (MCAS), designed to stabilize the plane in critical situations. However, it is apparently programmed such that it can occasionally force the plane into a crash, without the pilots being able to intervene. Boeing has since promised a software update.
Conceptually, MCAS is a secondary autopilot system, added to compensate for some peculiarities of the MAX 8 design. The structural form of the 737 series has remained largely unchanged for over 50 years, while its turbines have changed in shape, size, power and fuel consumption, in part giving rise to critical aerodynamic properties. In fact, Boeing concealed the existence of the MCAS software from pilots of the new MAX 8 planes until the incident near Jakarta, which means that pilots had no way of understanding or countering its behavior. During that flight, the digital invisible hand pushed down the nose of the plane 26 times during the initial ascent. On the 27th time, it finally won out over the pilots. Here, the Ethiopian incident differs: We must assume that the pilots of ET 302 were aware of the existence of the invisible hand, but this apparently did not help them. A software update, initially announced for mid-April, promises to salvage both Boeing itself and the hundreds of grounded planes. A new digital invisible hand is poised to take over.
Digital invisible hands form the core of many innovations. They clean our clothes better with less detergent and water, they prepare our espresso like a barista, and they clean the exhaust fumes of our cars. They often contain artificial intelligence, machine learning and similar modern forms of magic. Some invisible hands are awesome, some are awful. The latter encompass programs that can harm, or even kill, humans. The autopilot of an autonomous car is potentially deadly – for driver and pedestrian alike. If autonomous driving ultimately becomes reality, many potentially deadly invisible hands will act in unison. In case of a disaster, identifying which hand did what, and why, will become a Herculean task.
The automotive industry hasn’t become the target of regulation to the same extent as the aviation industry. Here, the industry isn’t governed by the principle of immediately learning from mistakes to prevent software issues from striking twice. The most disturbing facet of the ET 302 crash is that it shows that even the otherwise exemplary aviation industry is no longer able to fully control the impact of its software. Other industries, and their customers, should take this as bad news.
However, not every piece of software acts as an invisible hand. There is software that is so simple, open and understandable that it is possible to rigorously prove that it always acts as specified, without causing damage. Such pieces can then be combined to form complex systems that deserve our trust. Right now, however, this is not sufficient to, for example, bring reliable, understandable and provably optimal emission control systems into our diesel cars. In addition to more scrupulous regulation of automotive software, this also requires scientific advances and expertise within the regulatory bodies. Unfortunately, developing software to these standards is still laborious and therefore expensive. While this may not be appropriate for the software in our espresso machines, it is crucial for software that, if programmed badly, can harm lives. The alternative is a dystopia in which we are surrounded by systems that usually support us comfortably and efficiently, but fail randomly, without any apparent reason we can learn from.