views: 771 · answers: 11
Many computer science curricula include a class or at least a lecture on disasters caused by software bugs, such as the Therac-25 incidents or Ariane 5 Flight 501. Indeed, Wikipedia has a list of software bugs with serious consequences, and a question on StackOverflow addresses some of them too.

We study the failures of the past so that we don't repeat them, and I believe that rather than ignoring or excusing these failures, it's important to look at them squarely and remind ourselves exactly how the mistakes made by people in our profession cost real money and real lives.

By studying failures caused by uncaught bugs and bad process, we learn certain lessons about rigorous testing and accountability, and we make sure that our innocent mistakes are caught before they cause major problems.

There are less innocent kinds of failure in software engineering, however, and I think it's just as important to study the serious consequences caused by programmers motivated by malice, greed, or just plain amorality. From these we can learn about the ethical questions that arise in our profession, and how to respond when we are faced with them ourselves.

Unfortunately, it's much harder to find lists of these failures--the only one I can come up with is that apocryphal "DOS ain't done 'til Lotus won't run" story.

What are the worst examples of moral failure in the history of software engineering?

A: 

Pre-Y2K software development where (and I was there) we were told not to "waste" space on a four-digit year because the software "won't be in use for more than 10 years."

S.Lott
That would still give you 99 years. I worked at a place that used single-digit years.
fuzzy lollipop
99 years from when? It was the late '80s.
S.Lott
Start counting from 0, and remember to add the offset (1988)?
pete the pagan-gerbil
This isn't a case of moral turpitude so much as lack of foresight. How about the DOS for the TRS-80 Model 4, which allotted three bits for the year (running, I think, from 1980 to 1987)?
David Thornley
@pete the pagan-gerbil: Not the hack we were told to use.
S.Lott
@David Thornley: I think intentionally bad engineering is right at the edge of immoral. The design patterns for dates that we used when I was a young programmer were all flawed in numerous ways. Those folks should not have been designing software. They were creating software that was more liability than asset. It was right at the edge of being an immoral use of the company's investment dollars.
S.Lott
@S.Lott: Except that it wasn't intentionally bad engineering, just bad engineering. Your organizational superiors were making a trade-off that you saw (correctly) as bad. They weren't trying to hurt the company, they just made bad decisions. I've been amazed before at how badly large organizations largely composed of well-meaning and intelligent people can screw up.
David Thornley
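The cost of that "saved" space is easy to demonstrate. Below is a minimal sketch (the function names and the pivot value are mine, not from the thread) of how two-digit-year arithmetic fails once dates cross 2000, alongside the "windowing" trick that many Y2K remediation projects actually shipped:

```python
# Two-digit years break any arithmetic that crosses the year 2000.
def age_two_digit(birth_yy, current_yy):
    # Naive subtraction: someone born in '75, in year '05, comes out as -70.
    return current_yy - birth_yy

# A common Y2K remediation, "windowing": pick a pivot, and treat values
# below it as 20xx and values at or above it as 19xx. (Pivot of 30 is an
# illustrative choice; real systems picked pivots to fit their data.)
def expand_year(yy, pivot=30):
    return 2000 + yy if yy < pivot else 1900 + yy

assert age_two_digit(75, 5) == -70                 # nonsense result
assert expand_year(5) - expand_year(75) == 30      # correct: 2005 - 1975
```

Note that windowing is itself another deferred time bomb: it only works until the data spans more than 100 years.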
A: 

Zero based Arrays and Lists ;-)

fuzzy lollipop
not a moral failure
Paul Nathan
+1  A: 

I look at a lot of these issues and blame management. I see programmers pressured to work on projects that they know are beyond their ability, large programming teams working on serious projects without any testing personnel, and ultimately someone with no experience with or knowledge of the challenges developers face telling them what to do.

Spencer Ruport
Blaming Management is too easy. "Management are dumb, they can't code, blah blah blah"
John
@John: You've said nothing I disagree with :)
Jason Punyon
Easy? Of course. Incorrect? No. It's been a problem for well over a decade with no real shift towards improvement.
Spencer Ruport
"A score of 12 is perfect, 11 is tolerable, but 10 or lower and you've got serious problems. The truth is that most software organizations are running with a score of 2 or 3, and they need serious help" - http://www.joelonsoftware.com/articles/fog0000000043.html
Spencer Ruport
@Jason, @Spencer... techy people have ALWAYS blamed management. But techies are as dumb for thinking that management not knowing C++ makes them inferior as managers are for thinking developers are interchangeable.
John
How do you separate out the moral culpability? Much of what went on with the radiation machines was the desire to cut costs, including removing hardware interlocks and replacing them with software.
David Thornley
A: 

Little Endian vs Big Endian byte ordering.

fuzzy lollipop
where's the moral failure in that?
peterchen
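For anyone who hasn't been bitten by this in practice, the difference is easy to show; this sketch uses Python's `struct` module purely for illustration:

```python
import struct

value = 0x01020304

# The same 32-bit integer serialized under each convention:
little = struct.pack("<I", value)  # least-significant byte first
big = struct.pack(">I", value)     # most-significant byte first

# The two layouts are byte-for-byte reversals of each other, which is
# why exchanging binary data between conventions without converting
# silently corrupts every multi-byte value.
assert little == b"\x04\x03\x02\x01"
assert big == b"\x01\x02\x03\x04"
assert little == big[::-1]
```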
+19  A: 

A fairly big one is the Sony BMG copy protection scandal, which installed a rootkit on Windows.

This software was automatically installed on Windows desktop computers when customers tried to play the CDs. It interfered with the normal way the Microsoft Windows operating system plays CDs by installing a rootkit, which created vulnerabilities for other malware to exploit.

Whoever authorised and wrote that software knew they were up to no good.

Greg Beech
+4  A: 

IBM's business relations with Nazi Germany.

Rook
A whole lot of companies did business with Nazi Germany, and the militaristic government of Japan, for that matter.
David Thornley
@David, so then that makes it moral?
Rook
@The Rook: That makes it part of a much larger situation, and makes it a lot messier. It also brings up the question of when it's immoral to sell somebody something and when it's moral to refuse to. There's also the question of what IBM knew at the time (before 1939 if we're talking about the Polish subsidiary). It's not a simple moral situation.
David Thornley
@Rook, no that makes it not software engineering. The software would have been extant, the businesspeople decided to sell it to the Nazis.
Novelocrat
+11  A: 

I thought of this and was disgusted. It isn't Nazis or Rootkits, but this is a moral failure that helps perpetuate the incorrect notion stakeholders have that progress in software development must always be accompanied by something visible to the end user.

Jason Punyon
One of my professors talked about this--apparently the company he worked for was paid millions of dollars to speed up their application...and all they did was comment out that one line of code.
Amanda S
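The trick the professor described, an artificial delay planted so that its removal can later be sold as an "optimization," can be sketched like this (entirely hypothetical code; the function name and delay are invented for illustration):

```python
import time

def generate_report(rows):
    """Process the rows, then stall for no reason at all."""
    processed = [row.upper() for row in rows]
    # The deliberately planted brake: delete this one line and the
    # application "speeds up" dramatically -- for a consulting fee.
    time.sleep(0.01 * len(rows))  # hypothetical artificial delay
    return processed
```

The output is identical with or without the `sleep`, which is exactly what makes the scam invisible to the customer.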
+6  A: 

Developers who, every single day, cross over to the "dark side" and build crimeware to sell to criminals are, I think, severely morally challenged. In other endeavours, making or possessing tools meant for committing crimes is illegal; strangely, this doesn't seem to apply to malware and botnet infrastructure. And just because it's legal doesn't make it moral.

Ben Fowler
A: 

Software that banks used (and still use) to perform "program trades" and to create magical "low-risk" structured debt products full of toxic assets.

Both of these classes of software were partially to blame for the real estate credit bubble and, thus, the current nearly-worldwide recession.

richardtallent
Here is an article by wired on the formula that killed wall street http://www.wired.com/techbiz/it/magazine/17-03/wp_quant?currentPage=all
tbischel
That software was just a smokescreen for willful fraud.
Daniel Newby
It wasn't the software. Read the article tbischel linked to. There was a pretty formula, sure. It was widely applied, sure. However, that was not the fault of the formula or the software. People did warn about the problems, and were ignored. Managers liked the formula because it was simple, and didn't pay attention to the drawbacks and uncertainties. Money managers optimized their portfolios to reduce risk in the best 99% of cases, and nobody asked what would happen 1% of the time. Some companies found the formulas urging caution, and disregarded that. It was one big greed rush.
David Thornley
+8  A: 

Digital Rights Management is immoral in that it replaces our fair-use rights with weird technical restrictions.

Since it removes our rights, one can make the case that it would be immoral.

http://craphound.com/msftdrm.txt

It's impossible to tell "legitimate" from "illegal" use of media. Yet. Everyone thinks they have a DRM scheme that somehow allows some people to do things and magically prevents others from doing them.

S.Lott
A: 

How about the fact that, despite previous failures, we perpetuate them on smaller and larger scales? All the idealistic academic ideas and "lessons learned" get booted the first moment a project slips behind schedule.

It all starts so innocently: unrealistic deadlines, lone coders, changing specs, and so on and so forth. Soon you get a piece of code that is approved because it passed a first test for success. Then we pile more crap code on top and start applying bandages on top of bandages.

Any attempts to reason with the "bean counters" that software engineers need to step back and do fixes fall on deaf ears. The business users start screaming that their needs are more important, so we keep plowing ahead.

Honestly there is no end to it. Every now and then you get legislators trying to step in and fix those issues but it's all smoke and mirrors.

For example, companies get audited on HIPAA compliance: some clueless person walks between cubes looking for papers face up on people's desks while an Oracle installation is open to the world with default passwords.

SOX compliance is another example of monumental failure. Financial companies do not process information in the US anymore; all of it is outsourced to the lowest bidder in the Far East. Anyone needing privileged information need only pay some belittled employee out there $50, and the next day they will receive not one but hundreds of SSNs, addresses, credit cards, etc.

In essence it all comes down to money. Human life has a price, and business knows it. We keep building shoddy houses, shoddy cars, and bad software as long as someone can make a dollar.

hdk
It's amazed me for years. There are lots of books on software project management in any bookstore; none are perfect, but most are fairly good. There's a lot of knowledge about how software should be managed. There's a distinct shortage of software production that's managed that way.
David Thornley