Between 2002 and 2009, Toyota vehicles suffered from a fault known as ‘unintended acceleration’: some cars would suddenly speed up uncontrollably and crash. In several of those cases, people died.
Several explanations were initially proposed: one was that the gas pedals were sticky; another was that the vehicles’ floor mats were pressing down on the accelerator. To this day Toyota has never found the root cause and is still not entirely sure what lay behind the issue. This is just one of many cases that propelled complexity scientist Samuel Arbesman to write a book entitled Overcomplicated: Technology at the Limits of Comprehension.
The book deals with a looming new threat rooted in the inherent complexity of today’s technologies. Every day we use products such as the iPhone or iPad, or even the white goods now ubiquitous in our households, that have enormous potential to cause disruption. Their software is written by engineers, but even the engineers who write this code sometimes lose track of just how layered and complex it all is.
As Samuel explains, “Technology has become so pertinent to our lives yet so overwhelmingly complex that even its own engineers that create it don’t understand it, and it’s having catastrophic consequences as you can see. We are living in this world of increasing incomprehensibility, and so therefore it is incumbent on each and every one of us to find out how we want to live in this world, and how we want to try and approach these complex technologies.”
The idea that Toyota never found the glitch because of the massive complexity behind the software is a frightening prospect. There is even the case of the New York Stock Exchange halting trading for several hours in 2015, with no immediately apparent cause. So who are we to trust in this scenario? Where are we to turn? And what can we safely switch on? Samuel Arbesman is not even entirely sure how bad the problem is, explaining, “Example after example of technology is just becoming more and more complex but even where the people who are actually building these technologies don’t fully understand them…in the process of building it, we end up with systems that we don’t fully understand.”
These systems are becoming so interwoven that they form a vastly different landscape from the one we saw decades ago. Even at our best, humans are unable to hold the full complexity of these interfaces and software in mind.
Think of a common algorithm and the data that flows into it. In a typical computer program, many different algorithms work together to produce a result. However, as Arbesman makes clear, “When we build something it might be really complex and messy, and therefore it will act in mysterious ways, it might have bugs or glitches, because there is a certain amount of failure of understanding.” As we enter this new realm, even the smartest engineers building these programs and hardware will need to retain at least some understanding of what they are creating.
As Arbesman puts it into perspective, “We are increasingly bumping up against the very finite limits of our brains – the number of different things we can hold in our head, the number of different interacting components that we can understand how they all operate. But this is not terribly surprising.”
Take the cutting edge: machine learning and AI. We may understand the algorithms by which these systems learn, but the resulting trained system and how it works are often not fully understandable, even by the people who built them. “We don’t understand the underlying logic within the trained system where there might be millions of parameters that have been set. Imagine exponential change, plus huge amounts of feedback, plus millions of parameters all interacting, and things that have been built maybe decades ago interacting with things that are brand new, suddenly our brains are going to cower in the face of all that complexity.”
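The scale Arbesman describes can be made concrete. The sketch below is a hypothetical illustration (the layer sizes are invented, not drawn from the book): it simply counts the individual learned numbers in a modest fully connected neural network, each one a parameter whose contribution to the system’s behavior is opaque.

```python
# Illustrative sketch only: counting the learnable parameters in a
# small fully connected network. The layer sizes are hypothetical,
# chosen to show how quickly parameter counts grow.

def parameter_count(layer_sizes):
    """Total weights plus biases for a fully connected network."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out  # weight matrix between the two layers
        total += n_out         # bias vector for the output layer
    return total

# A modest image classifier: 224x224 grayscale input, two hidden
# layers of 512 units, 10 output classes.
sizes = [224 * 224, 512, 512, 10]
print(parameter_count(sizes))  # 25958410 -- about 26 million numbers
```

Even this toy network holds roughly 26 million individually tuned values; inspecting any one of them tells you almost nothing about why the system behaves as it does.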
Sometimes the term that best describes these dilapidated systems is ‘kluge’, a nerd word for a system built not elegantly or beautifully, but just well enough to get by. As the years pass, these technologies gather dust, becoming more layered and complex, and nothing is ever done to untangle them or usher the complexity out. Take the Internal Revenue Service in the US: it is still using systems built during the Kennedy administration in the early 1960s, with layers of new software built on top of them.
Samuel suggests the first step is for people to recognize the incredible amount of complexity we deal with every day. That won’t fix every bug, but it is still important to look under the hood, especially in an age when people carry iPads, devices that are incredibly powerful yet whose users mostly have no idea how files are stored on them. “Whether or not we are going to fully understand these systems is an entirely separate thing, but I think we need the possibility of confronting the true complexities of the systems that are all around us.”