+21  A: 

I once tried to write an MP3 decoder. It didn't work.

PlacidBox
Assuming the pun was intended, you get my upvote, sir; nice one. It's not like these subjective questions deserve a serious answer anyway.
Seldaek
I don't see the pun...
Adam Rosenfield
"unsound", I guess. Not the best pun I ever heard :P
Blorgbeard
Bonus points for being tough to detect, though.
Erik Forbes
Teehee. Well done that man, and well spotted that punfinder :) +1
Andrew Rollings
+4  A: 

The one I have just started on.

  1. No source control.
  2. All source is edited live. To guard against mistakes, backup files like db-access.php.070821 litter the source tree.
  3. The code is exceptionally brittle: there is very little in the way of error checking, and no fallback at all when something does fail.
graham.reeds
If you haven't already, get it into your own personal version control. Something like git works great because it doesn't require a server. Periodically commit snapshots. Then when someone fucks up, brightly offer to get them an older, working version.
Schwern
Be sure to wait a few days while they flounder about trying to undo their mistake. It makes the resulting facial expressions much more interesting.
Erik Forbes
It's a distributed internet project but with real cash. I've been hammering on about svn and so far nothing has happened.
graham.reeds
just break the build. Then let it be broken for a week. Then maybe they'll see the benefit of using version control =D
Alex Baranosky
Absolutely not. There is real money involved in this project.
graham.reeds
+13  A: 

I maintain ExtUtils::MakeMaker. MakeMaker is certainly not the worst code I've had to maintain; it's actually an engineering marvel. However, it is in that unique class of coding horrors wherein the most mission-critical code is also the most terrifying.

MakeMaker is the installer for most Perl modules. When you run "Makefile.PL" you are invoking MakeMaker. If MakeMaker breaks, Perl breaks. Perl runs on everything, so MakeMaker has to run on everything. When I say everything I mean EVERYTHING. Every bizarre Unix variant. Windows 95 on up. And VMS. Yes, VMS.

What does MakeMaker do? Makefile.PL is a Perl program that writes a Makefile which contains shell commands, which often run perl, to build and install a Perl module. Let me repeat: It writes shell commands to run perl. Perl, the language which replaces shell scripts.
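
For a sense of scale, a typical Makefile.PL is tiny; the module name below is invented, but the shape is real. Everything described above hides behind that single WriteMakefile call:

    # A typical Makefile.PL: a short Perl program whose only job
    # is to write a Makefile. (Module name invented for illustration.)
    use ExtUtils::MakeMaker;
    WriteMakefile(
        NAME         => 'Foo::Bar',
        VERSION_FROM => 'lib/Foo/Bar.pm',  # pulls $VERSION from the module
    );

Running "perl Makefile.PL" executes that program, and the Makefile it writes shells back out to perl for many of its build and install steps.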

Oh, it can also compile and link C code. And it can also statically link Perl modules into perl. Oh, and it can manage RCS checkouts. Oh, and roll tarballs of your distribution... and zip files. And do all this other stuff vaguely related to installing modules.

And it has to do all this in a portable, backwards compatible fashion. It has to deal with variants of and bugs in...

  • make (GNU make, BSD make, nmake, dmake, mms, mmk to name a few)
  • shell
  • perl
  • the filesystem (if you don't think that's a big deal, try VMS)
  • C compilers & linkers

It absolutely, positively cannot fail and must remain 100% backwards compatible.

Oh, and it has very little in the way of a real extension API, so it also has to remain compatible with the ad hoc Makefile hackery people have had to do to extend it.

Why does it do all this? Fifteen years ago, when Perl ran only on Unix, this seemed like a great idea. Why write a whole build system when you can just use make? Perl's a text-processing language; we'll just use it to write a Makefile!

Fortunately there is a replacement, Module::Build, and I pinned my hopes on it swiftly killing MakeMaker. But its uptake has been slow and the community very resistant to the change, so I'm stuck maintaining MakeMaker.

Schwern
A: 

I'm maintaining a scheduling web application we use on our intranet. When I was asked whether I could remove an agent from the scheduler, I thought, sure, why not. When I took a look at the source code, I discovered that every hour of this agent's day was coded separately. So was every day of his week. And so was every week of every agent in the region. And so was each of the roughly five regions. HTML files holding ASP code, all over the place.

One day I took some time to estimate how many lines of code were in these various files, and I came up with about 300,000. Three hundred thousand lines of once-handwritten, then copied-and-pasted code.

But this number convinced my manager pretty quickly that we needed a new scheduling app.

Tobias
+7  A: 

What’s the most unsound program you’ve had to maintain?

Everything I've ever written!

Seriously. The more I read blogs, listen to podcasts, and follow sites like this, the more I learn every day. And every day I basically realize everything I wrote yesterday is wrong in some way. I feel for the poor saps that are maintaining the things I wrote early in my career.

Rob
Oh, me too. But there's a category difference between bad code and code that's fundamentally, horrifyingly wrong. It takes more than inexperience to write that kind of code; it takes huge amounts of unwarranted self-confidence.
Robert Rossney
+1  A: 

I used to be a COBOL programmer (shudder). All of our code fell into the "unsound" category. In COBOL, you have no namespaces, all variables are global and there's a lot of mandatory duplication of filenames and other resources. To call a procedure, you set global variables, call the procedure, and then inspect the contents of those global variables (or others that might get set).
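
To make that calling convention concrete, here is a minimal sketch of the set-the-globals, call, inspect-the-globals dance (in Perl rather than COBOL, and with invented names):

    # COBOL-style "call" as described above: no parameters, no return
    # values, only shared globals. (Perl stands in for COBOL here;
    # all names are invented.)
    our ($WS_CUSTOMER_ID, $WS_BALANCE, $WS_STATUS);

    sub LOOKUP_BALANCE {
        # Reads its "argument" from one global, writes "results" to others.
        if ($WS_CUSTOMER_ID == 42) {
            $WS_BALANCE = 100.00;
            $WS_STATUS  = 'OK';
        }
        else {
            $WS_STATUS = 'NOT-FOUND';
        }
    }

    $WS_CUSTOMER_ID = 42;   # set the inputs...
    LOOKUP_BALANCE();       # ...call the procedure...
    print "$WS_STATUS\n";   # ...then inspect what it left behind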

The worst, though, was maintaining a COBOL program written before I was born (I was born in 1967), whose exclusive method of flow control was the GOTO. It was an absolute mess and impossible to follow. Minor changes to a variable type could take days to work out. There were no automated tests, and manual test plans were never saved, so every change required that a new manual test plan be written out, followed exhaustively, and turned in with the code.

Ironically, this is what makes COBOL so successful. COBOL is often executed by Job Control Language (JCL). Since COBOL is so weak, programs don't do a lot, so JCL would allocate some disk space (often down to the cylinder level) and execute a small COBOL program to read data and then write out just the data you need. Then JCL might call a sort program to sort the resulting file. Then another COBOL program would be called to read the sorted file and summarize the data and maybe re-extract the needed results. And maybe JCL would be used again to move the file somewhere else, and yet another COBOL program would get called to read the results and store them in a database, and so on. Each COBOL program tended to do only one thing, and a primitive version of the Unix pipeline model was created -- all because COBOL is too hard to maintain or do anything complicated with. We had loose coupling and tight cohesion (between programs, not in them) because it was almost impossible to write COBOL any other way.

Ovid
A: 

Maintaining ASP applications for a company that hires developers to maintain the work of its previous hires... None of these applications is documented, nor are there any comments.

Every function is copied and pasted into every ASP page, so no shared functions are defined whatsoever... Every day I'm hobbled by their environment, because I first have to remote into one server to bypass the DMZ. From there I have to remote into the production server, where I make the changes.

MysticSlayer
+3  A: 

I once had to maintain a legacy C application which had previously been written and maintained by some programmers who had lost the will to program (and possibly to live). It had too many WTFs to mention, but I do remember a boolean function which, in various special cases, would return TRUE+1, TRUE+2, etc.
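
For anyone who hasn't met this particular horror, here is a sketch of the hazard, in Perl rather than the original C, with an invented function. Any caller that compares the result against TRUE with == misbehaves on the special cases, even though the value is still "true":

    use constant TRUE  => 1;
    use constant FALSE => 0;

    # A "boolean" that returns TRUE+1 in a special case.
    # (Invented example; the original was C.)
    sub is_valid {
        my ($code) = @_;
        return FALSE    if $code == 0;
        return TRUE + 1 if $code == 99;   # special case: returns 2
        return TRUE;
    }

    print "surprise!\n" if is_valid(99) != TRUE;   # fires: 2 != 1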

Then I read Roedy Green's essay "How To Write Unmaintainable Code" and laughed a lot, until I realised that the reason I found it funny was that I recognised most of the examples from the code I was maintaining. (That essay has become a bit bloated over years of additions, but it's still worth a look.)

MikeJ-UK
A: 

I once got called in to help track down a periodic crash in an EDIF reader. Almost immediately I started getting a headache. The original author seemed to feel that yacc was going to penalize him for whitespace, and his yacc grammar was a dense, unreadable mess. I spent a couple of hours formatting it, adding rules for missing terminals as they emerged, and structuring the declarations to avoid stack growth (a yacc parser's stack stays bounded with left-recursive rules but grows with right-recursive ones); voila, the crash was extinguished.

So remember, for every time you wait while yacc processes your grammar, there will be thousands of runs with the generated parser. Don't be cheap with the whitespace!

Don Wakefield
A: 

Anything I've ever written that was originally supposed to be a quick prototype and ended up staying around a while. My problem domain requires a lot of throwaway prototyping by its nature. For these prototypes, it's sometimes reasonable to violate every best practice and rule of good style: just get it done, and clean up later if the prototype ends up being worth keeping. Occasionally, though, these prototypes are very difficult to get working properly, yet end up being keepers. In those cases I usually put off refactoring or rewriting the thing indefinitely, because I'm afraid I'll never get it working again. Further lessening my motivation is that my boss is a domain expert who doesn't program at all.

dsimcha
A: 

A PHP/MySQL-driven online contact management system, in which:

  • the contact table had no natural key;
  • numerous database fields contained composite data in the form of delimited strings that subsequently had to be parsed by application code;
  • the HTML and the logic were intertwined;
  • almost no functions were used; instead, code was cut and pasted into dozens of source files;
  • data was not sanitized, so fields with (e.g.) embedded vertical tabs caused the XML returned by AJAX calls to malfunction;
  • and, to top it all off, files contained dozens if not hundreds of empty statements consisting of closing braces immediately followed by semicolons: "};"
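
To make the composite-data point concrete, here is a sketch of the parsing every page had to repeat (invented column contents, and Perl standing in for the app's PHP):

    # The "composite data" smell: one database column holding three
    # values, pipe-delimited, re-split in application code on every
    # page. (Invented data; the real app was PHP.)
    my $row = { phones => '555-0100|555-0199|555-0123' };
    my @phones = split /\|/, $row->{phones};
    print "$_\n" for @phones;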

George Jempty
A: 

The interpreter for a CAD/CAM geometry-processing language (P1 = 10,10; P2 = 20,20; L1 = P1,P2; that sort of thing), written in Microsoft BASIC Professional Development System (PDS), with minimal-length variable names (it ran out of single letters quickly, so it moved on to double letters: PP, PQ, PR, anyone?). And, to be fair, some comments. In Italian.

Funnily enough, it actually worked, and I was able to add some functionality to it. But it was like amateur dentistry: painful, and certainly not recommended...

metadaddy
+1  A: 

Right out of grad school at Lucent, I was given a compiler and an interpreter to maintain, both written in PL/I. The language being compiled described a complex set of integrity constraints, and the interpreter ran those constraints against a large data set that would later be formed into the initial database controlling the 4ESS switch. The 4ESS was, and still is, a circuit switch for long-distance voice traffic.

The code was a jumble. There was a label to branch to called "NORTH40". I asked the original developer what it meant.

"That's where the range checks are done, you know, the checks to make sure each field has a correct value."

"But why 'NORTH40'?"

"You know, 'Home, home on the range.'"

"Huh?"

Turned out 'NORTH40' meant a farm's north 40 acres, which in his city-bred mind had an obscure connection to a cattle ranch.

Another module had two parallel arrays called TORY and DIREC, which were updated in parallel and so were an obviously misguided attempt to model a single array containing pairs of data. I couldn't figure out the names and asked the developer. Turned out they were meant to be read together: "direc-tory". Great.

The poor guys who had to write the integrity constraints had no user manual to guide them, just an oral tradition plus hand-written notes. Worse, the compiler did no syntax checking, so they very often wound up specifying a constraint that was quietly mapped into the wrong logic, with very bad consequences.

Worse was to come. As I delved into the bowels of the interpreter, I discovered it was built around a giant sort process. Before the sort was an input process that generated the sort input data from the raw data and the integrity constraints. So if you had a table with 5,000 trunk definitions in it, and each trunk record had three field values that had to be unique across the whole input data set, the input process would create 3 * 5,000 = 15,000 sort input records, each a raw data record prefixed by the integrity constraint number and a copy of the field value to be sorted. Instead of doing three 5,000 record sorts, it did one 15,000 record sort. By the time you factored in hundreds of intra- and inter-table integrity constraints, and some very large tables, you had a combinatorial nightmare.
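
A sketch of that fan-out, with invented field names and Perl standing in for the PL/I, makes the arithmetic vivid: each raw record is emitted once per uniqueness constraint, so the constraints multiply the data before the single giant sort even starts:

    # Each record goes out once per constraint, prefixed with the
    # constraint number and the value that must be unique. (Field
    # names and data are invented; the original was PL/I.)
    my @records = map { +{ trunk => "TRK$_", circuit => "CKT$_", port => $_ } } 1 .. 5000;
    my @constraints = (
        [ 1, 'trunk'   ],
        [ 2, 'circuit' ],
        [ 3, 'port'    ],
    );

    my @sort_input;
    for my $rec (@records) {                 # 5,000 records...
        for my $c (@constraints) {           # ...times 3 constraints
            my ($id, $field) = @$c;
            push @sort_input, join '|', $id, $rec->{$field},
                map { $rec->{$_} } sort keys %$rec;
        }
    }
    printf "%d raw records became %d sort records\n",
        scalar(@records), scalar(@sort_input);   # 5,000 in, 15,000 out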

I was able to refactor a bit, document the language, and add syntax checking with comprehensible error messages, but a few months later I jumped at a transfer to a new group.

Jim Ferrans
A: 

I once worked on a CAD application written in BASIC where the company policy was that every program must begin with the statement:

ON ERROR RESUME

jMM