+12  Q:

## Pivotal Suboptimal Decisions in the History of Software

Throughout the history of software development, it sometimes happens that some person (usually unknown, probably unwittingly) made what, at the time, seemed a trivial, short-term decision that changed the world of programming. What events of this nature come to mind, and what has been our industry's response to mitigate the pain?

Illustration (the biggest one I can think of): When IBM designed the original PC, and decided to save a couple of dollars in manufacturing costs by choosing the half-brain-dead 8088 with 8-bit-addressable memory, instead of one of the 16-bit options (8086, 680n, etc.), dooming us to 20 years of segment:offset address calculations.

(In response, a lot of careers in unix platform development were begun.)

Somewhere toward the other end of the scale lies the decision someone made to have a monster Shift Lock key at the left end of the keyboard, instead of a Ctrl key.

+26  A:

Allocating only 2 digits for the year field.

And the mitigation was to spend huge amounts of money and time just before the fields overflowed to extend them and fix the code.

I'd upvote this one 1,000,000 if I could, after having to spend my New Year's Eve at work to babysit all the systems that might crash at the stroke of midnight.
In the mid-80s, I was out-voted on a decision to use 4 digits for the year, because they wanted to fit more data into a single 80-column punch card. Yes, they were still using punch cards then.
You won't be saying that in 2038...
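For illustration, here is a minimal Python sketch of the "windowing" mitigation many shops applied during Y2K remediation instead of widening the field: two-digit years on either side of a pivot are mapped into the appropriate century. The function name and the pivot value 50 are illustrative choices, not anyone's actual remediation code.

```python
def expand_year(yy: int, pivot: int = 50) -> int:
    """Map a two-digit year to a four-digit year using a pivot window.

    Years >= pivot are assumed to be 19xx, years below it 20xx.
    This only postpones the ambiguity; it does not remove it.
    """
    return 1900 + yy if yy >= pivot else 2000 + yy

print(expand_year(99))  # 1999
print(expand_year(3))   # 2003
```

The catch, of course, is that windowing just moves the overflow to wherever the pivot sits.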
+4  A:

Apple ousting Steve Jobs (the first time) to be led by a succession of sugar-water salesmen and uninspired and uninspiring bean counters.

+20  A:

Paul Allen deciding to use the / character for command line options in MS DOS.

Excuse my ignorance, but what has been the negative effect of that decision?
See the Microsoft answer below, making interoperability more difficult than it needed to be in a roundabout way.
It led to ambiguity in command lines, especially once MS DOS programmers started allowing "-" as command line options. It forced Microsoft to use \ for path separators, which conflicts with Unix, URIs, regular expressions, and XPath, leading to hacks like @"C:\..." and Regex.Escape.
I once provided tech support, and users cannot distinguish between slashes and backslashes. Since they have different meanings in different OSes and commands, typing commands is a minefield. It's gotten so bad browsers have to allow both; try http:\\stackoverflow.com.
Not to mention all the dofusses (dofii?) who wonder why 'char* str = "C:\foo\bar";' doesn't do what they expect.
How about Paul Allen selecting "\" as the path separator. Apart from the compatibility issues, "\" is an AltGr character on most European keyboards.
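The double-escaping the comments above complain about is easy to demonstrate. A small Python sketch (raw strings and `re.escape` being Python's analogues of C#'s @"..." and Regex.Escape):

```python
import re

# A backslash is an escape character in string literals, so a Windows
# path must either double every backslash or use a raw string:
path = "C:\\Program Files\\App"
raw = r"C:\Program Files\App"
assert path == raw

# In a regular expression the backslash is ALSO an escape character,
# so the same path has to be escaped a second time before matching:
pattern = re.escape(raw)
assert re.match(pattern, path) is not None
```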
+4  A:

Using 4 bytes for time_t and in the internet protocols' timestamps.

This has not bitten us yet - give it a bit more time.

30 year mortgages hit that limit this year (2008).
When is that, 2038 or something? Another present for our grandkids.
Don't worry, we'll be using 64-bit computers before then. A 64-bit time_t won't overflow until the sun goes dark.
And IPv6 no doubt? </scorn>
And flying cars! Oh, you closed the tag. My bad.
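The exact moment a signed 32-bit time_t runs out is easy to compute, since it counts seconds from the Unix epoch and overflows at 2**31 - 1:

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit time_t can hold, as a UTC date:
overflow = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00
```

One second after that, a 32-bit counter wraps to a large negative number, i.e. a date in 1901.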
+3  A:

Using the qwerty keyboard on computers instead of dvorak.

Utter BS. Read this: http://www.reason.com/news/show/29944.html ; you've been flim-flammed, friend.
Not utter BS. A few percentage points is still a few percentage points, even if they go to great lengths to brush over that; and since they don't source that claim, I'm not convinced.
That article also talks only about typing speed, and doesn't even mention repetitive-motion disorders. How could having the statistically most-frequently-used keys on home row fail to reduce the chances of carpal tunnel?
The whole keyboard layout was originally designed to slow down typing anyway. I think that's a bigger flaw than any particular mucked-up layout...
@rally25rs: Misstatement. The idea behind QWERTY is not that it slows down typing, but that it minimizes the odds of adjacent typewriter hammers being struck in close sequence. The end result is that the hands and fingers stay in constant motion. Whether this is good or bad....
So the question then is why not adjust the keyboard layout for the best performance on a computer. Not just the character keys, but all the keys!
+8  A:

There were a lot of suboptimal decisions in the design of C (operator precedence, the silly case statement, etc.), that are embedded in a whole lot of software in many languages (C, C++, Java, Objective-C, maybe C# - not familiar with that one).

I believe Dennis Ritchie remarked that he rethought precedence fairly soon, but wasn't going to change it. Not with a whole three installations and hundreds of thousands of lines of source code in the world.

You forgot = for assignment and == for equality. Oh, and C# has them all too.
Thank you two for things I hadn't remembered when writing my comment.
Leading zeros for octal was a staggeringly bad decision. I'd love to know the rationale for this.
+1000 for scourge of the earth
+14  A:

Microsoft deciding to use backslash rather than forwardslash as the path delimiter. And failing to virtualize the drive letter.

Technically the choice of backslash was constrained by their earlier mistake of using slash to specify command line options. Which they copied from both DEC operating systems and CP/M.
+5  A:

Deciding that HTML should be used for anything other than marking up hypertext documents.

+2  A:

Thinking that a password would be a neat way to control access.

Versus what at the time? Punchcards?
Depends what you mean by 'at the time'. This is a suboptimal decision that has been made repeatedly. Besides, using two digits for the year probably made sense at the time (versus doubling memory requirements).
+5  A:

Gary Kildall not making a deal with IBM to license CP/M 86 to them, so they wouldn't use MS-DOS.

+2  A:

Every language designer who has made their syntax different when the only reason was "just to be different". I'm thinking of S and R, where comments start with #, and _ is an assignment operator.

Languages with comments starting with # are pretty common in the Unix world.
Well, the rest of the S/R syntax is more like C, so I wish they had stuck to that. I work a lot with inter-language translators, and this is a needless headache, especially the underscore. They also made '.' an alphabetic character. Grr...
I would say the real problem is languages that derive their (awful) syntax from C, just to avoid being different.
+1  A:

Deciding that "network order" for multi-byte numbers in the Internet Protocol would be high order byte first.

(At the time, the heterogeneous nature of the net meant this was a coin-toss decision. Thirty years later, Intel-derived processors so completely dominate the marketplace that low-order-byte first seems like it would have been the better choice.)

"Network order" was de facto decided by Sun, who at the time were using M680x0 and SPARC processors, both of which put the high-order byte first.
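The difference between the two orders is just which end of the integer comes first on the wire, as a short Python sketch shows (`struct` format characters: `>` big-endian/network order, `<` little-endian):

```python
import socket
import struct

n = 0x0A0B0C0D

big = struct.pack(">I", n)     # network (big-endian) order
little = struct.pack("<I", n)  # order used by Intel-derived CPUs

print(big.hex())     # 0a0b0c0d
print(little.hex())  # 0d0c0b0a

# socket.htonl performs the same host-to-network conversion; packing
# its result in native order always yields network order:
assert struct.pack("=I", socket.htonl(n)) == big
```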
+2  A:

Lisp's use of the names "CAR" and "CDR" instead of something reasonable for those basic functions.

I think Lisp might have been more popular and had greater influence on other programming languages if it had not been so "weird".
Contents of Address Register, and Contents of Decrement Register - the two halves of the 36-bit word on the IBM 704. CADDAR never bothered me. And newer languages are continually "discovering" what Lisp had 40 years ago. But I take your point...
A:

Netscape's decision to support Java in their browser.

I'd say "Netscape's decision to BADLY support Java in their browser", myself.
+3  A:

Important web sites like banks still using "security questions" as secondary security for people who forget their passwords. Ask Sarah Palin how well that works when everybody can look up your mother's maiden name on Wikipedia. Or better yet, find the blog post that Bruce Schneier wrote about it.

I just love when they are case sensitive too. Did I capitalize that answer when I typed it in 4 years ago? hmmm...
+11  A:

Actually, the 8088 and 8086 have the same memory model and the same number of address bits (20). The only difference is the width of the external data bus, which is 8 bits for the 8088 and 16 bits for the 8086.

I would say that the use of inconsistent line endings by different operating systems (\n - UNIX, \r\n - DOS, \r - Mac) was a bad decision. Eventually Apple relented by making \n the default for OS X, but Microsoft is stubbornly sticking to \r\n. Even in Vista, Notepad cannot properly display a text file using \n as the line ending.

The best example of this problem is the ASCII mode of FTP, which just adds \r to each \n in a file transferred from a UNIX server to a Windows client, even when the file already contained \r\n.

Not to mention what ASCII mode of FTP does to a binary file!
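The corruption described above is easy to reproduce. A hypothetical sketch of the naive conversion (the function name is mine, not FTP's): every \n becomes \r\n without checking whether a \r is already there, so a DOS file gets mangled.

```python
def naive_ascii_mode(data: bytes) -> bytes:
    """Blindly convert Unix line endings to DOS ones, the way a naive
    FTP ASCII-mode transfer does."""
    return data.replace(b"\n", b"\r\n")

# A file that already has DOS line endings comes out doubled:
dos_file = b"line one\r\nline two\r\n"
print(naive_ascii_mode(dos_file))  # b'line one\r\r\nline two\r\r\n'
```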
+3  A:

HTML as a browser display language.

HTML was originally designed as a content markup language, whose goal was to describe the contents of a document without making too many judgments about how that document should be displayed. Which was great, except that appearance is very important for most web pages and especially important for web applications.

So, we've been patching HTML ever since with CSS, XHTML, Javascript, Flash, Silverlight and Ajax all in order to provide consistent cross-browser display rendering, dynamic content and the client-side intelligence that web applications demand.

How many times have you wished that browser control languages had been done right in the first place?

A:

Re-arranging the letters on the keyboard to slow down typing, back in the original mechanical typewriter days, and carrying that over to digital computers.

The History of Qwerty

+4  A:

Microsoft's decision not to add *NIX-like execute/no-execute file permissions and security in MS-DOS. I'd say that ninety percent of the Windows viruses (and spyware) we have today would be eliminated if every executable file had to be marked as executable before it could even execute (much less wreak havoc) on a system.

That one decision alone gave rise to the birth of the Antivirus industry.

Why do people feel the need to use "*nix" or something similar instead of just typing the word "unix"?
*nix means unix-like, so it's a broader term.
Originally, people used *nix because the name "Unix" was a zealously protected trademark.
-1 from me. The only case where I can see this helping is for files surreptitiously "placed" (e.g. using an application's macro system to "Save As" to EXPLORER.EXE somewhere earlier in the PATH). But those locations should have read-only perms anyway. And if a malicious program can change the PATH, then it can set an execute bit on a file.
+13  A:

Ending Alan Turing's career when he was only 42.

Wow, that was my first thought when I saw the question too.
Let us thank all the homophobes out there for holding back computer science by decades.
A:

The term Translation Lookaside Buffer (which should be called something along the lines of Page Cache or Address Cache).

+1  A:

Microsoft copying the shortcut keys from the original Mac but using Ctrl instead of a Command key for Undo, Cut, Copy, Paste, etc. (Z, X, C, V, etc.), and adding a near worthless Windows key in the thumb position that does almost nothing compared to the pinky's numerous Ctrl key duties. (Modern Macs get a useful Ctrl key (for terminal commands), and a Command key in the thumb position (for program or system shortcuts) and an Alt (Option) key for typing weird characters.) (See this article.)

A:
I seem to remember that IBM did the same with PC Juniors at one point.
.. Apple Lisas too
I have that game!
A:

Having a key for Caps Lock instead of Shift Lock. In effect it's a Caps Reverse key, whereas with Shift Lock it could have been controllable.

+1  A:

Null-terminated strings
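Two of the classic costs of NUL termination can be shown even from Python, via `ctypes`: embedded NUL bytes silently truncate the data, and finding a string's length means scanning every byte (what C's strlen does; the Python re-implementation below is just an illustration of that scan).

```python
import ctypes

# A C string stops at the first NUL byte, so embedded NULs silently
# truncate the data:
data = b"hello\0world"
assert ctypes.c_char_p(data).value == b"hello"

def strlen(buf: bytes) -> int:
    """O(n) length scan, the way C's strlen has to work."""
    n = 0
    while buf[n] != 0:
        n += 1
    return n

assert strlen(data) == 5  # "world" is invisible past the NUL
```

A length-prefixed representation avoids both problems, at the cost of storing the length.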

+3  A:

Microsoft's decision to use "C:\Program Files" as the standard folder name where programs should be installed in Windows. Suddenly working from a command prompt became much more complicated because of that wordy location with an embedded space. You couldn't just type:

cd \program files\MyCompany\MyProgram


Anytime you have a space in a directory name, you have to encase the entire thing in quotes, like this:

cd "\program files\MyCompany\MyProgram"


Why couldn't they have just called it c:\programs or something like that?

cd \progra~1\my-company\my-prog
@Tobias You're missing the point. A tilde-numeric after a shortened-string is even further from an elegant solution.
You live in the wrong country, my friend. Come to Germany and you get C:\Programme. No spaces.
A:

Null References - a billion dollar mistake.
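The core of Hoare's complaint is that the null sneaks into the type system far from where it blows up. A tiny illustrative sketch (names are mine):

```python
def find_user(users: dict, name: str):
    # Returns None when the user is absent - the "null reference".
    return users.get(name)

user = find_user({}, "alice")
try:
    user.upper()  # the crash happens here, not at the lookup
except AttributeError as exc:
    print(exc)
```

Languages with explicit optional types (Option, Maybe, T?) force the absent case to be handled at the lookup instead.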

+1  A:

Microsoft's decision to base Windows NT on DEC VMS instead of Unix.

There's a lot of nice things in both VMS and NT that are missing in Unix. Of course, the opposite is also true. Also, I don't think they consciously decided to clone VMS. DEC fired one of the most brilliant OS designers ever, MS scooped him up, so naturally his decisions influenced NT.
I mean, how could they consciously decide to base Windows NT on VMS if they didn't even decide to build Windows NT? After all, what Dave Cutler was hired to do was build a next-generation OS/2 for the N10 processor.
+5  A:

EBCDIC, the IBM "standard" character set for mainframes. The collation sequence was "insane" (the letters of the alphabet are not contiguous).

It actually makes sense if you are familiar with how the characters were entered on a punched card. There was a sort of logical contiguity that just isn't apparent if you look at it in terms of flat integers.
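The non-contiguity is easy to see from Python, which ships the EBCDIC code page cp037: the letters I and J are eight code points apart, so 'A'..'Z' is not one unbroken range the way it is in ASCII.

```python
# In EBCDIC (code page 037), I is 0xC9 but J jumps to 0xD1:
i, j = "IJ".encode("cp037")
print(hex(i), hex(j))  # 0xc9 0xd1
assert j - i != 1

# In ASCII the same pair is adjacent:
assert ord("J") - ord("I") == 1
```

The gaps fall exactly where the punched-card zone rows changed, which is the "logical contiguity" the comment above refers to.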
+1  A:

7 bits for text. And then "fixing" this with code pages. Encoding issues will kill me some day.
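The code-page problem in one sketch: the byte values above 0x7F mean different characters depending on which code page you decode them with, and nothing in the file tells you which one is right.

```python
# The same byte under two different code pages:
b = b"\x82"
print(b.decode("cp437"))   # 'é' under the IBM PC code page
print(b.decode("cp1252"))  # a low quotation mark under Windows Western
```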

A:

We have that covered by helpful libraries these days, but the moron at Netscape who decided that cookie-expiration dates should be human-readable should take measures never to meet me in person.

+3  A:

Netscape's decision to rewrite their browser from scratch. This is arguably one of the factors that contributed to Internet Explorer running away with browser market share between Netscape 4.0 and Netscape 6.0.

+4  A:

DOS's 8.3 file names, and Windows' adoption of using the file extension to determine which application to launch.