The joke goes that only two industries refer to their customers as “users.” But here’s the real punch line: Drug users and software users are about equally likely to recover damages for whatever harms those wares cause them.
Let’s face it. Dazzled by what software makes possible—the highs—we have embedded into our lives a technological medium capable of bringing society to its knees, but from which we demand virtually no quality assurance. The $150 billion U.S. software industry has built itself on a mantra that has become the natural order: user beware.
Unfortunately, software vulnerabilities don’t just cost end-users billions annually in antivirus products. The problem is bigger than that. In 2011, the U.S. government warned critical-infrastructure operators about an exploit that was targeting a stack overflow vulnerability in software deployed in utilities and manufacturing plants around the world. In 2012, a researcher found almost two dozen vulnerabilities in industrial control systems (ICS) software used in power plants, airports, and manufacturing facilities. In its 2013 threat update, Symantec, the world’s largest security software corporation, surprised no one when it announced that criminals were finding and exploiting new vulnerabilities faster than software vendors could release patches. Cybersecurity is a very big set of problems, and bad software is a big part of the mess.
Mobile phones aren’t the only products to benefit from nifty touch screen displays. A whole range of medical devices now sport them too, as any trip to your local emergency department (or dentist’s office) will reveal. Unfortunately, many of those devices are just as balky and bug-ridden as your average mobile phone, despite the fact that patients’ lives can depend on them.
Two examples of the ethical imperative for good design:
The first has to do with the crash of Air France Flight 447, in which an apparent software feedback error led to the death of 228 passengers and crew. From the article:
A feature designed to make things better for pilots has unintentionally made it harder for them to monitor colleagues in stressful situations.
The second involves the design of control systems for the US Air Force’s remotely piloted aircraft. There are a number of ethics concerns surrounding the use of drone targeting systems, but this article focuses on the need for user-appropriate design.