Many computer science curricula include a class or at least a lecture on disasters caused by software bugs, such as the Therac-25 incidents [1] or Ariane 5 Flight 501 [2]. Indeed, Wikipedia has a list of software bugs with serious consequences [3], and a question on StackOverflow [4] addresses some of them too.
We study the failures of the past so that we don't repeat them, and I believe that rather than ignoring them or excusing them, it's important to look at these failures squarely and remind ourselves exactly how the mistakes made by people in our profession cost real money and real lives.
By studying failures caused by uncaught bugs and bad process, we learn certain lessons about rigorous testing and accountability, and we make sure that our innocent mistakes are caught before they cause major problems.
There are less innocent kinds of failure in software engineering, however, and I think it's just as important to study the serious consequences caused by programmers motivated by malice, greed, or plain amorality. That way we can learn about the ethical questions that arise in our profession, and how to respond when we face them ourselves.
Unfortunately, it's much harder to find lists of these failures--the only one I can come up with is that apocryphal "DOS ain't done 'til Lotus won't run" story [5].
What are the worst examples of moral failure in the history of software engineering?
A fairly big one is the Sony BMG copy protection scandal [1], which installed a rootkit [2] on Windows.
This software was automatically installed on Windows desktop computers when customers tried to play certain Sony BMG music CDs. It interfered with the normal way in which the Microsoft Windows operating system plays CDs by installing a rootkit, which in turn created vulnerabilities for other malware to exploit.
Whoever authorised and wrote that software knew they were up to no good.
[1] http://en.wikipedia.org/wiki/Sony_BMG_CD_copy_protection_scandal

I thought of this [1] and was disgusted. It isn't Nazis or rootkits, but this is a moral failure that helps perpetuate the incorrect notion stakeholders have that progress in software development must always be accompanied by something visible to the end user.
[1] http://thedailywtf.com/Comments/The-Speedup-Loop.aspx

Digital Rights Management replaces our fair-use rights with arbitrary technical restrictions. Since it removes rights we already have, one can make the case that it is immoral.
http://craphound.com/msftdrm.txt
It's impossible (so far) to tell "legitimate" from "illegal" use of media. Everyone thinks they have a DRM scheme that somehow allows some people to do things and magically prevents others from doing them.
Developers who, every single day, cross over to the "dark side" and build crimeware to sell to criminals are, I think, severely morally challenged. In other endeavours, making or possessing tools meant for committing crimes is illegal; strangely, this doesn't seem to apply to malware and botnet infrastructure. And just because something is legal doesn't make it moral.
IBM's business relations with Nazi Germany. [1]
[1] http://en.wikipedia.org/wiki/IBM#Business_relations_with_Nazi_Germany

I look at a lot of these issues and blame management. I see programmers pressured to work on projects that they know are beyond their ability, large programming teams working on serious projects without any testing personnel, and ultimately someone with no experience with or knowledge of the challenges developers face telling them what to do.
Software that banks used (and still use) to perform "program trades" and to create magical "low-risk" structured debt products full of toxic assets.
Both of these classes of software were partly to blame for the real estate credit bubble and, thus, the current nearly worldwide recession.
Pre-Y2K software development where (and I was there) we were told not to "waste" space on a four-digit year because the software "won't be in use for more than 10 years."
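The defect being described is trivial to reproduce. Here is a minimal sketch (with hypothetical function names) of how storing years as two digits breaks date arithmetic at the century boundary:

```python
# Hypothetical illustration of the classic Y2K defect: with years stored
# as two digits, arithmetic across the century boundary goes wrong.

def years_elapsed_2digit(start_yy, end_yy):
    """Naive subtraction on two-digit years, as in much pre-Y2K code."""
    return end_yy - start_yy

def years_elapsed_4digit(start_yyyy, end_yyyy):
    """The same calculation with full four-digit years."""
    return end_yyyy - start_yyyy

# An account opened in 1999 and checked in 2000:
print(years_elapsed_2digit(99, 0))       # -99: the account appears to predate itself
print(years_elapsed_4digit(1999, 2000))  # 1: the correct answer
```

The two saved bytes per record cost the industry an estimated hundreds of billions of dollars in remediation when the assumption that the software "won't be in use" finally failed.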
How about the fact that, despite previous failures, we keep perpetuating them on both smaller and larger scales? All the idealistic academic ideas and "lessons learned" get booted the moment a project slips behind schedule.
It all starts so innocently: unrealistic deadlines, lone coders, changing specs, and so on. Soon you get a piece of code that is approved because it passed its first test for success. Then we pile more crap code on top and start applying bandages on top of bandages.
Any attempts to reason with the "bean counters" that software engineers need to step back and do fixes fall on deaf ears. The business users start screaming that their needs are more important, so we keep plowing ahead.
Honestly there is no end to it. Every now and then you get legislators trying to step in and fix those issues but it's all smoke and mirrors.
For example, companies get audited on HIPAA compliance: some clueless person walks between cubicles looking for papers left face up on people's desks, while an Oracle installation sits open to the world with default passwords.
SOX compliance is another example of monumental failure. Financial companies do not process information in the US anymore; all of it is outsourced to the lowest bidder in the Far East. Anyone needing privileged information need only pay some belittled employee out there $50, and the next day they'll receive not one but hundreds of SSNs, addresses, credit card numbers, etc.
In essence it all comes down to money. Human life has a price, and business knows it. We will keep building shoddy houses, shoddy cars, and bad software as long as someone can make a dollar.
Zero-based arrays and lists ;-)