Software Bugs – Benign to Lethal

A recent Wired article on the Top 10 Worst Software Bugs in History is an interesting read, but of course I have something to add.

On the list are two medical-device-related software bugs. The first is the Therac-25 bug, wherein patients were inadvertently exposed to large doses of radiation, resulting in at least five deaths. This is a straightforward engineering bug. It’s interesting to note the various causes enumerated at that Wikipedia link; it warms my heart that so many different systems were examined for whether they contributed to the error.

The second one occurred relatively recently (2000). The device’s design was not followed or understood properly, and technicians ended up killing at least eight patients through massive radiation overdoses. More on this one below.

Here are a few links to other disasters or problems caused by software bugs:

  1. USS Yorktown: a US Navy ship suffers a full system shutdown when someone enters 0 into a shipboard database field.
  2. 2003 Blackout in North America: part of the reason the blackout was so extensive was that a software bug kept alarms that were going off from being displayed to operators.
  3. Full-scale Soviet attack: in 1979, NORAD quite literally sat stunned as it witnessed what appeared to be a full-scale Soviet missile attack.
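The Yorktown failure is usually attributed to an unvalidated zero that propagated into a division and crashed the ship’s management software. The function names below are mine, and this is only a sketch of the bug class, not the ship’s actual code:

```python
def efficiency_unguarded(distance_nm, fuel_gal):
    # A blank field stored as 0 reaches arithmetic code that never
    # checks for it, and the error propagates unhandled.
    return distance_nm / fuel_gal

def efficiency_guarded(distance_nm, fuel_gal):
    # Validating the value at the point of use keeps one bad record
    # from cascading into a system-wide failure.
    if fuel_gal == 0:
        raise ValueError("fuel_gal must be nonzero")
    return distance_nm / fuel_gal

try:
    efficiency_unguarded(120.0, 0)
except ZeroDivisionError:
    print("an unvalidated zero reached a division")
```

One bad field shouldn’t take down a warship; the guarded version fails loudly at the boundary instead of deep inside the arithmetic.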


So you see, computers are keeping us on our toes. Either that or software programming is hard, complicated work. Maybe that.

But back to the 2000 incident in Panama City. Many people frame this as a problem between the physicians and the software, saying that they were using the software in a manner it was not designed for. In fact, according to the Wired article, all three physicians were indicted for murder because they were supposed to double-check the calculations by hand.

But look at the problem in another way:

Multidata’s software allows a radiation therapist to draw on a computer screen the placement of metal shields called “blocks” designed to protect healthy tissue from the radiation. But the software will only allow technicians to use four shielding blocks, and the Panamanian doctors wish to use five.

The doctors discover that they can trick the software by drawing all five blocks as a single large block with a hole in the middle. What the doctors don’t realize is that the Multidata software gives different answers in this configuration depending on how the hole is drawn: draw it in one direction and the correct dose is calculated, draw in another direction and the software recommends twice the necessary exposure.

To me, it sounds like the software designer missed the boat here. Why limit the user to four blocks? Why not detect that users are trying to work around the four-block limit? Why not have the software indicate to the user that the direction of the drawing action influences the outcome? To me, drawing one large block with a hole in the middle sounds like a valid use of the software, so why was this not discovered during requirements elicitation?
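Neither the article nor the court record spells out Multidata’s internals, so the following is purely a sketch of the bug class. The shoelace formula computes a signed polygon area whose sign flips with drawing direction, and dose code that naively sums signed regions would get a different answer for the same physical shape depending on which way the hole was traced. All names here are hypothetical:

```python
def signed_area(points):
    # Shoelace formula: positive for a counter-clockwise polygon,
    # negative for a clockwise one.
    total = 0.0
    for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]):
        total += x1 * y2 - x2 * y1
    return total / 2.0

def exposed_area(outer, hole):
    # Naive code that sums signed areas, silently assuming the hole
    # was drawn in the opposite winding direction from the boundary.
    return signed_area(outer) + signed_area(hole)

square = [(0, 0), (10, 0), (10, 10), (0, 10)]  # counter-clockwise: +100
hole_cw = [(4, 4), (4, 6), (6, 6), (6, 4)]     # clockwise: -4
hole_ccw = list(reversed(hole_cw))             # same shape, opposite direction: +4

print(exposed_area(square, hole_cw))   # 96.0  -- hole subtracted, as intended
print(exposed_area(square, hole_ccw))  # 104.0 -- identical shape, larger result
```

Two drawings of the exact same geometry yield different areas, and the user is never told that tracing direction matters. That is precisely the kind of hidden sensitivity the software should either normalize away or flag.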

If I were on that jury, those physicians would walk free. Who expects someone to use a complex machine to perform an action, only to then double-check the results by hand? Do you double-check your calculator’s results? Today we extend a certain level of trust to computers (misplaced though it may be), and these physicians obviously did not do any of this on purpose. I believe that the machine should have been trustworthy, and that’s where the fault lies.
