Software Development and Engineering Blog



HW5: Reflections


Sam Word | 05 Sep 2017

Articles

Reflections

Each of the above readings revolves around detrimental software failures, or potential failures, caused by a multitude of improper software engineering practices. Some of the incidents, such as the accidents described in The Role of Software in Spacecraft Accidents, were financially costly, whereas others, such as the Therac-25 accidents, cost patients their lives. To me, it is obvious that ethical software systems should never cause financial or physical harm to any person or environment. Yet such harm seems to be far too common. As I read these articles, I wondered how these issues slipped through the cracks during development and how they could have been prevented.

Leveson described the most common reasons for security and safety failures best: tight budgets, poor communication, unnecessary software complexity, reused code assumed to be safe, poor or missing specifications, poor engineering standards and practices, and so on. One of the most upsetting things to read as a software engineer, and the most frightening as a consumer, is that when budgets are slim, "the first things to be cut are often system safety, system engineering, mission assurance, and operations, which are assigned a low priority and assumed to be the least critical parts of the project" (Leveson 5). What's worse is that once a system has proven itself successful, the engineers of its descendant systems tend to assume that the new systems will function safely and securely as well. For example, the Mars Climate Orbiter used a navigation system that had worked well for over 30 years, so the focus of the project shifted to meeting budgets and deadlines instead of managing mission risk. Additionally, the Therac-25 radiation therapy machine failed on multiple occasions because its software was assumed to be safe on the strength of the Therac-20's success. That assumption was misguided, however: the same software flaw was present in the Therac-20, which never harmed any patients only because it featured hardware safety checks that were wrongly deemed unnecessary in the Therac-25. To avoid issues like these in the future, systems must be checked for safety regardless of how much of the system was repurposed from an older one. In fact, for safety-critical and security-critical systems, it may be safer to write new software from scratch than to place unearned confidence in reused code.
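To make the Therac-25 lesson concrete, here is a minimal Python sketch (all names are hypothetical, not taken from any real system) of the kind of independent, fail-closed check that the Therac-20's hardware interlocks provided and the Therac-25 dropped. The point is not the specific checks but the structure: the interlock verifies the machine's actual state against the prescription on its own, rather than trusting that the treatment-setup code got everything right.

```python
# Illustrative sketch only; names and checks are invented for this example.
# An interlock should fail closed: any mismatch or out-of-range reading
# blocks the beam, independent of the software that configured the treatment.

class InterlockError(Exception):
    """Raised when the machine's observed state contradicts the prescription."""


def beam_permitted(prescribed_mode, turntable_position, dose_rate, max_safe_rate):
    """Return True only if every independent safety check passes."""
    # Check that the physical turntable matches the prescribed mode,
    # rather than assuming the setup code positioned it correctly.
    if turntable_position != prescribed_mode:
        raise InterlockError("turntable position does not match prescribed mode")
    # Check the measured dose rate against a hard limit, regardless of
    # what the treatment plan claims it should be.
    if dose_rate > max_safe_rate:
        raise InterlockError("dose rate exceeds safe limit")
    return True


# Example: a consistent setup passes; a mismatched one is blocked.
beam_permitted("electron", "electron", dose_rate=50, max_safe_rate=100)   # OK
# beam_permitted("xray", "electron", 50, 100) would raise InterlockError.
```

The design choice worth noting is that the check reads the machine's state directly instead of a flag set by the treatment logic; if the two paths share state, a single software flaw can silently disable both, which is exactly what made the Therac-25's software-only checks inadequate.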

Another point that caught my interest was the following statement from our textbook, on page 368, regarding security challenges: "When considering safety, you can assume that the environment in which the system is installed is not hostile. No one is trying to cause a safety-related incident." I found this relevant to the FBI's warning that modern vehicles are becoming increasingly vulnerable to remote exploits. Before cars commonly featured USB ports, diagnostics ports, Bluetooth, or Wi-Fi, the main concern when designing a car was safety, and safety remains the most important aspect of vehicle design today. However, as vehicles grow more reliant on software, and more connections to that software are introduced, security becomes another major concern in an industry that hardly focused on it in the past. As I read the FBI's public service announcement, I wondered whether vehicle manufacturers were slow to address security concerns because security had never been an issue before. Maybe those manufacturers assumed that, as with safety, nobody would deliberately attempt to hack into a car. After all, a security flaw in a vehicle is also a safety flaw if an attacker can gain access to the steering or brakes while the car is in transit. Now that the FBI has brought these concerns to light, hopefully manufacturers will begin to address them.