views:

478

answers:

18

I hate the way this sounds but Joel says "It's important to remember that when you start from scratch there is absolutely no reason to believe that you are going to do a better job than you did the first time." I want to not believe this. Am I really doomed? Have you done a better job the second time around? Have you failed to do better? What were signs that each was happening? Should I suck it up and sink the time, money, and effort into plugging a very leaky ship instead?

+4  A: 

In my opinion, based on gut instinct and various anecdotal third-hand experience, 'size' is the key aspect here.

If you rewrite something small from scratch, you will do it better, based on the experience you gained doing it the first time.

If you rewrite something big from scratch, you will fail miserably, for innumerable reasons.

I am not sure where the knee of the curve is for the transition from small to big (and success to failure). (Possible guesses: when multiple developers are involved, or when multiple interacting components are involved, or when the people doing the rework are not the same as the people who did the original work, or when the original product is old enough that it doesn't have the same number/quality of tests as newer code. All these things contribute negatively, and all of them are correlated with bigger/older projects.)

(Corollary: thus the only way to improve/rewrite something big, is to do it one small component at a time.)

Brian
It of course is big.
ojblass
Is it only because it already failed once that you say this?
ojblass
My impression is that everyone in the universe who has tried to 'rewrite something big from scratch' has failed. Give the question a few hours and see what other people think; I could be way off.
Brian
I will let it simmer. Damn you Joel, I hope you have a sleepless night if you read this.
ojblass
I've seen a million-line product get rewritten successfully, although a lot was refactored rather than rewritten and the new version wasn't a 100% improvement.
Mark Ransom
I am picking this for numerous reasons but mostly for being the first to point out the complexity as a parameter in success.
ojblass
+2  A: 

@Brian: I would have to wonder: can you rewrite small components of something big better the second time around?

There's something to be said about the quality of tested, production level code. I think that's Joel's point. No matter how easy the component is to recreate, you will forget about that one edge case that breaks the system.

Nick Stinemates
+1  A: 

It's interesting that Joel says "you". Yes, re-writing large legacy systems is problematic, because no one person knows exactly what the system does. But to expect that an individual developer won't learn from mistakes, and would be unable to do a better job the second time around on an individual project, is overly cynical, IMHO.

EDIT: OK, I just looked at the citation (d'oh.) Joel is definitely using a plural "you" there. Specifically, he's talking about Netscape 6.0, which took three years to develop because they did a re-write of a large legacy system. And it was indeed a disaster.

Dan Breslau
I have seen very few of his opinions that I can fault. :(
ojblass
Replied in the answer. Yes, he was right -- and he was talking about a team re-writing a large legacy system.
Dan Breslau
Of course he meant "you", because "we" will get it done better obviously ;)
Robert Gould
@Robert :-) The point I was trying to make, though, was that in the quotes, it sounded like he was referring to individuals rewriting their own code. From reading the article, it was clear he was talking about rewrites of legacy systems -- a much different scenario.
Dan Breslau
+2  A: 

I think for big projects, "rewrite from scratch" is just not feasible. But there should be steady refinement, one module at a time, so that in effect everything gets brushed up eventually.

And I am absolutely convinced that you only get things right the third time around.

Thilo
The Goldilocks syndrome: too hot, too cold, just right.
ojblass
+9  A: 

there is absolutely no reason to believe that you are going to do a better job than you did the first time

Sometimes there is a reason. That reason might be:

  • You did a shocking job the first time, and you don't think it's possible to do something that bad again
  • You were solving a different problem the first time; now your needs are different and you need to solve a different problem
  • You didn't know exactly what problem you were solving, and now you do

In such cases you have significant assurance that a second attempt will have a better outcome.

However if you're just fixing something because your current system isn't quite what you would like it to be, you need to be careful that you don't waste time building a system just as bad as the old one. If you want to go ahead, try to take measures to reduce this risk (select the right language(s) for the job, analyse what went wrong last time, introduce better practices, etc.).

Artelius
I do agree, probably the third reason is the most frequent. When you have attacked the problem once, you certainly will get a better understanding of the problem itself. And that does help in doing it in a better way the second time.
Canopus
Hubris makes me want to believe all three reasons.
ojblass
+1 for #3, I've been there several times
Nifle
A: 

I usually do it better the second time around. I gained experiences the first time and can do a better design. After all, there's a reason why Joel says that programmers really like rewriting things: They like nice code and elegant solutions, and the code becomes nicer with every rewrite.

Having said that, I don't think it always makes sense from an outside perspective. The rewrite might take too much time and might not be worth it for the company.

+2  A: 

Joel's article raises some very pertinent points and I mostly agree with his interpretation. The point to take away from his article is not that 'You will always fail the second time around', rather that 'don't throw away your first attempt because of inane reasons such as architectural problems, inefficiency and code ugliness'.

Joel's point seems to be that when rewriting a project of this scale from scratch you should not start with the assumption that you are going to have an easier time than the first time around.

That said, I think I always write my code better the second time. I am able to learn from the first time and apply new/more efficient techniques of accomplishing the same tasks. However, when working on rewriting a large project, I won't be the only person writing it and there are bound to be some people doing it for the first time, who would make the same mistakes that were made in the earlier version.

Cerebrus
+2  A: 

I agree with what Joel said. If there are problems in the current software, start fixing them. Start with small changes, and over a period of time the overall software will get better. If you start from scratch, there's a good chance you'll miss the good parts of the current software. Also, writing from scratch will take a long time, and in that time you need to maintain the current software as well, so it's an overhead.

I have done this in practice: we had a very badly written, crappy piece of software. Rewriting it from scratch was a big problem, as nobody had full knowledge of it. We started with small stuff, and over time it was looking great.

Bhushan
+4  A: 

Where a new system concept or new technology is used, one has to build a system to throw away, for even the best planning is not so omniscient as to get it right the first time. Hence plan to throw one away; you will, anyhow.

-- Fred Brooks, The Mythical Man-Month

Alex B
That sounds so biblical it deserves Bible-style quoting: Fred 4:23, NIV
ojblass
This is so true, no amount of architecture and design can replace a throw away prototype.
Robert Gould
Been a while since I read that book, but didn't it refer to the "second system" being a bloated, unachievable wishlist pile of dross?
Dead account
This dude needs to learn about agile development :P
cwap
+1  A: 

I would argue that you will always do a better job on a re-write (with a huge caveat).

The reason you will do better is that you will have a better idea of the failings of the previous version (the warts) and a clearer understanding of the requirements.

The caveat is that if the system is not fully testable (via unit tests, for example) and not fully documented, then you will have a very hard (probably impossible) time getting the new system working "right".

So it comes down to a matter of scope - if the scope is small enough then it is likely you will do a better job. You may have a huge project, but if you can fix small pieces in isolation then you have a chance at making it better. However the issue there is that each piece will be probably "broken" by the history that it needs to support when interfacing with other pieces.

TofuBeer
I think the focus on "you" is an important aspect of this. My second time will be better for small things.
ojblass
+2  A: 

One big pitfall to avoid is the "second system syndrome" (named so by Fred Brooks), which essentially means that on the second try you're inclined to include all those bells, whistles and advanced algorithms you didn't think about/didn't have time for/didn't know the first time, so you'll end up with a super-complex behemoth (if it gets finished at all).

Another danger (especially for new programmers; I speak from experience here) is redesigning a system without fully understanding the system or its requirements. Do you know why the old system was designed the way it was?

If you remember to keep things simple and know what you're doing, I think you have a reasonable chance to make things better.

Niki
+2  A: 

It's hard to judge the Joel quote without any prior context. I'll give my opinion from a sustainable software development point of view which is: re-writing things is good and you should do it whenever you feel like things have spun out of control.

  • When things get out of control, you often write the same bit of code better the second time around because you have learnt from your previous experience
  • If you don't "roll back" and start afresh, and decide to continue on, you may just be adding to the mess. Developers often get so far into their first attempt that they think they can code their way out of it. If you are in that situation, roll back and start again; it'll be much easier to write the code you want, and it won't turn out looking like a plate full of spaghetti
  • Get into the habit of "spiking" out code - forget about code quality and tests. Just spend a pre-determined amount of time to code a simple "hacked" version and then start again and you'll find that it'll be easier second time around when you do it with quality and testing in mind

I'm sure what I've just said isn't within the context of the quote, as it probably refers to systems many years old, not to rewriting the entire company's code base from scratch(!!)

digiarnie
+1 for spiking, it's probably the single most universally usable method for developing software, or anything for that matter.
Robert Gould
+1  A: 

I think it makes more sense to say a refactor can be better than a rewrite. After all, if you are rewriting, it means you are throwing everything out. At that point you are basically flipping a coin as to the bugs you will introduce in all the new code.

Even if you are porting software, you can often just wrap some chunk of code in a library wrapper and make it available to the software port. If you don't reuse code, then you are just wasting time doing the same thing twice. Refactoring your system in chunks, treating each interface as a library or service that can be swapped out can often have vast leaps and bounds in terms of performance and reliability.
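A minimal sketch of the "wrap a chunk behind a swappable interface" idea described above. The names (`TaxCalculator`, `LegacyTaxCalculator`, `NewTaxCalculator`) are hypothetical, purely for illustration; the point is that call sites depend on the interface, so a legacy chunk and its rewrite can be exchanged one seam at a time:

```python
from typing import Protocol

class TaxCalculator(Protocol):
    """The seam: callers depend on this interface, not on either implementation."""
    def tax(self, amount: float) -> float: ...

class LegacyTaxCalculator:
    """Wraps the old, battle-tested code so it keeps serving traffic as-is."""
    def tax(self, amount: float) -> float:
        return round(amount * 0.0825, 2)  # old logic, edge cases and all

class NewTaxCalculator:
    """The rewritten chunk; it must match legacy behaviour before it replaces it."""
    def tax(self, amount: float) -> float:
        return round(amount * 0.0825, 2)

def checkout_total(amount: float, calc: TaxCalculator) -> float:
    # Call sites only see the interface, so swapping implementations
    # is a one-line change (or a config flag), not a rewrite of the callers.
    return amount + calc.tax(amount)

# Verify the new chunk against the old one before swapping it in.
assert checkout_total(100.0, LegacyTaxCalculator()) == checkout_total(100.0, NewTaxCalculator())
```

Each interface migrated this way shrinks the legacy surface without a big-bang cutover, which is the "vast leaps in performance and reliability" path rather than the coin-flip.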

Zak
+1  A: 

While most people normally write better code the second time around, I think Joel was trying to make a point that there's this mystical, yet unfounded faith in the "complete rewrite," as if that would solve all problems and automatically result in beautiful, elegant, and quality code. Many of the pragmatic issues with this have already been addressed, but I think there's also a significant psychological issue at hand here: I think a lot of developers tend to get overwhelmed easily and can lose faith in a large, complicated system much too quickly. As Joel pointed out, there are other, much more productive ways of improving a large software system (e.g. refactoring code, optimizing and tweaking algorithms), but writing fresh code would seem much more "satisfying" than trying to work with a large, complicated codebase.

In other words, complete rewrites aren't inherently a Bad Thing, but they aren't always the cure-all that many (including myself) have sometimes thought they were.

htw
But occasionally they are; rewrites can make life better. It's the natural cycle of things: some die and others are born. Without rewrites we'd all still be using Fortran and COBOL.
Robert Gould
Interesting point. I guess I should have said that they aren't *always* a cure-all—edited to reflect that.
htw
A: 

It's taken me approx. 12 months to rewrite a 30k-line console client app written in C into a C# WPF app. Not all that time went to the rewrite, as the original app required maintenance and small updates as well. 20% of the effort was on the app's core functionality; the rest was "discovering" and accommodating obscure features used by 1 or 2 users.

That's a small app example. The big question is, as the size of the original app increases, whether the rewrite effort is linear, logarithmic, or exponential. Joel's article points towards exponential, which is a bit scary, as the 30k app I rewrote was the client side of a 300k server app that also desperately needs a rewrite (desperately == costing the business tens if not hundreds of thousands annually).

sipwiz
+1  A: 

Of course it's better the second time, for three important reasons.

  1. Solid Use Cases and Scope Definition. You understand the User Stories better, because you've seen them in action.

  2. Clear, Clean Business Rules. So you can simplify things, and you know what you can leave out (and what should be left out.)

  3. Most importantly, Firm Patterns. You have a solid handle on the patterns and techniques that you will use consistently throughout, based on the final 25% of the first try, and the subsequent refactorings, while it all came together.

I don't understand Joel's assertion. It's like saying that practicing a composition for the piano doesn't make you any better.

Conversely, of course hubris and pride can cause your wheels to fall off, but that's a social problem, not a software development problem.

le dorfier
A: 

Of course you will do better the second time around. That's what refactoring is all about.

JP Alioto
+3  A: 

I just watched a presentation David Heinemeier Hansson gave at RailsConf 08 about dealing with legacy software. In this presentation he talks about exactly what this question is about: how to deal with old, messy code. One point he makes is that code becomes legacy the moment you commit it. It becomes legacy without changing -- you change, and get more experienced. That's why, if you rewrite your software from scratch, then two years later, when you're done, you will again think the code is ugly, because you will have improved even further.

Try to take parts of your code and improve those parts in small steps. If you end up with new, better code that does less than your old code, your customers -- if there are any left -- will not be very happy.

Tobias Schulte