Articles
- An Investigation of the Therac-25 Accidents by Nancy Leveson and Clark Turner
- After Stroke Scans, Patients Face Serious Health Risks by Walt Bogdanich
- Motor Vehicles Increasingly Vulnerable to Remote Exploits, a PSA from the Federal Bureau of Investigation, the Department of Transportation, and the National Highway Traffic Safety Administration
- The Role of Software in Spacecraft Accidents by Nancy Leveson
- Who Killed the Virtual Case File? by Harry Goldstein
- FBI Sentinel project is over budget and behind schedule, say IG auditors by Jeff Stein
- Years Late and Millions Over Budget, FBI's Sentinel Finally On Line by Damon Poeter
- FBI’s Sentinel System Still Not In Total Shape to Surveil by Robert N. Charette
- Textbook readings (Software Engineering by Ian Sommerville) chapters 11 through 14
Reflections
Each of the above readings revolves around detrimental software failures, or potential failures, caused by a multitude of improper software engineering practices. Some of the incidents, such as the accidents described in The Role of Software in Spacecraft Accidents, were financially costly, whereas others, such as the Therac-25 accidents, cost patients their lives. To me, it is obvious that ethical software systems should never cause financial or physical harm to any person or environment. Yet such harm seems to be far too common. As I read these articles, I wondered how these issues slipped through the cracks during development and how they could have been prevented.
Leveson described the most common reasons for security and safety failures best: tight budgets, poor communication, unnecessary software complexity, reused code assumed to be safe, poor or missing specifications, poor engineering standards and practices, and so on. One of the most upsetting things to read as a software engineer — and most frightening as a consumer — is that when budgets are slim, “the first things to be cut are often system safety, system engineering, mission assurance, and operations, which are assigned a low priority and assumed to be the least critical parts of the project” (Leveson 5). What’s worse is that once a system has proven itself successful, the engineers of its descendant systems tend to assume that the new systems will function safely and securely as well. For example, the Mars Climate Orbiter used a navigation approach that had worked well for over 30 years, so the focus of the project shifted to meeting budgets and deadlines instead of managing mission risk. Likewise, the Therac-25 radiation therapy machine failed on multiple occasions because its software was assumed to be safe on the strength of the Therac-20’s success. That assumption was misguided, however: the same software flaw was present in the Therac-20, but the Therac-20 never harmed any patients because it featured hardware safety interlocks that were wrongly deemed unnecessary in the Therac-25. To avoid issues like these in the future, systems must be checked for safety regardless of how much of the system was repurposed from an older one. In fact, for safety-critical and security-critical systems, it may be safer and more secure to write new software from scratch than to reuse old code and grow overconfident in its safety.
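Leveson and Turner attribute the Therac-25 overdoses in part to a check-then-act race: the operator could edit treatment parameters after the software’s consistency check had already passed, and nothing re-verified the setup before the beam fired. The following is a minimal sketch of that general failure pattern, not the actual Therac-25 code; the class and method names are hypothetical and the logic is deliberately simplified to a single stale flag.

```python
# Toy sketch of a check-then-act flaw (hypothetical names, NOT Therac-25 code):
# the safety check reads the mode once and caches the result, but the
# operator can still edit the mode before the beam fires, so the cached
# software check no longer reflects the machine's real configuration.

class TreatmentController:
    def __init__(self):
        self.mode = "electron"   # low-power mode; "x-ray" is high-power
        self.beam_ready = False  # cached result of the software check

    def verify_setup(self):
        # software-only safety check: passes while the low-power mode is set
        self.beam_ready = (self.mode == "electron")

    def operator_edit(self, new_mode):
        # a late edit silently invalidates the earlier check;
        # nothing forces verify_setup() to run again
        self.mode = new_mode

    def fire_beam(self):
        # trusts the stale cached check, fires in whatever mode is current
        if self.beam_ready:
            return f"beam fired in {self.mode} mode"
        return "beam blocked"

controller = TreatmentController()
controller.verify_setup()          # check passes while the mode is safe
controller.operator_edit("x-ray")  # edit arrives after the check
print(controller.fire_beam())      # fires in the unchecked high-power mode
```

A hardware interlock of the kind the Therac-20 retained would verify the physical configuration independently at the moment of firing, catching exactly the mismatch that the stale software check misses.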
Another point that caught my interest was the following statement from our textbook regarding security challenges on page 368: “When considering safety, you can assume that the environment in which the system is installed is not hostile. No one is trying to cause a safety-related incident.” I found this relevant to the FBI’s warning about modern vehicles becoming increasingly vulnerable to remote exploits. Before cars commonly featured USB ports, diagnostic ports, Bluetooth, or Wi-Fi, the main concern when designing a car was safety, and safety remains the most important aspect of vehicle design today. However, as vehicles become more reliant on software, and as more connections to that software are introduced, security becomes another major concern in an industry that hardly focused on it in the past. As I read the FBI’s public service announcement, I wondered whether vehicle manufacturers were slow to address security concerns because security had never been an issue before. Maybe those manufacturers assumed that, as with safety, nobody would deliberately attempt to hack a car. After all, a security flaw in a vehicle is also a safety flaw if an attacker can gain access to the steering or brakes in transit. Now that the FBI has brought these concerns to light, hopefully manufacturers will begin to address them as well.