“Software runs the world,” according to agile guru Robert Martin.
If that’s not quite true today, it will be tomorrow. Continued advances in robotics, big data and machine learning, coupled with ever cheaper, more compact processors and memory, have extended the reach of software systems into every aspect of people’s lives. Complex algorithms and learning systems are taking over critical decisions that used to be made by people.
Some software is just fun or harmlessly useful. Other systems help us develop and test software. Software runs our dishwashers and ovens, our home heating systems and our clothes washing machines.
The potential benefits of all this software are enormous. But so are the risks. With increasing complexity come systems whose behavior is difficult to understand, predict, and manage. Safety regulations and standards have not kept pace with the technological advances and sheer ubiquity of software systems. And some very complex and unregulated systems—such as those operating cars—are safety-critical.
Your car may have 100 million lines of software code conveying messages from your feet on the pedals, your hands on the wheel—and its own sensors—to the actual mechanical systems that (just barely) run the car. Such software is not always trouble-free. In one case, investigators of a popular car model found “spaghetti code” with software errors that could cause sudden unintended and uncontrollable acceleration. In another, hackers remotely took control of a car’s brakes and steering via its internet-enabled entertainment system.
Even regulated medical devices such as heart pacemakers can be hacked. And some experts are concerned that the vast connected electricity grid in North America could be at risk because of the multiplicity of contributing utilities, some of them small and not as security-conscious as they could be.
In 2012, safety systems expert Nancy Leveson wrote, “We are attempting to build systems that are beyond our ability to intellectually manage.” We have built many more since then.
Where will testers be in a world run by software?
Our roles are changing, but some things haven’t changed. Testers still know that there is no bug-free software. We know that a simple software error like a typo in code or a misinterpreted requirement can sometimes result in disaster. We have the mindset to imagine what could go wrong before a catastrophe happens and help to prevent it.
Let’s talk about what we can do to test for a safer world, and what we need to do to get there.