views: 296
answers: 8

We should develop on slow boxen because it forces us to optimize early.

As Randall Hyde points out in The Fallacy of Premature Optimization, there are plenty of misconceptions around the Hoare quote:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

In particular, even though machines these days scream compared with those in Hoare's day, that doesn't mean "optimization should be avoided." So does my respected colleague have a point when he suggests that we should develop on boxes of modest speed? The idea is that performance bottlenecks are more irritating on a slow box, and so they are more likely to receive attention.

+20  A: 

This should be community wiki since it's pretty subjective and there's no "right" answer.

That said, you should develop on the fastest machine available to you. Yes, anything slower will introduce irritation and encourage you to fix the slowdowns, but only at a very high price:

Your productivity as a programmer is directly related to the number of things you can hold in your head, and anything that slows your process or impedes you at all lengthens the time you have to hold those ideas in short-term memory, making you more likely to forget them and have to go re-learn them.

Waiting for a program to compile allows the stack of bugs, potential issues, and fixes to drop out of your head as you get distracted. Waiting for a dialog to load, or a query to finish interrupts you similarly.

Even if you ignore that effect, you've still got the truth of the latter statement: early optimization will leave you chasing yourself round in circles, breaking code that already works, and guessing (often with poor accuracy) about where things might get bogged down. Design your code properly in the first place, and you can forget about optimization until it's had a chance to settle for a bit, at which point any necessary optimization will be obvious.

Paul McMillan
That's my opinion too. Development speed is often crucial for being competitive. I'd like to add, however, that it might be good to *test* things on slow machines, especially if the code is a user interface.
liori
Test on what your customers are going to use, which probably means low-end machines with restricted user accounts (if you're using Windows) or user accounts with no sudo privileges (OSX/Linux/other Unix-based systems).
David Thornley
A: 

Depends on your time to delivery. If you are in a 12-month delivery cycle, then you should develop on a box with decent speed, since your customers 12 months from now will have better "average" boxes than the current "average".

As your development cycle approaches "today", your development machines should approach the current "average" speed of your clients' boxes.

jmucchiello
Someone doesn't like programming for the real world? Yes, it would be nice to always have the biggest, baddest box for development. That's not good if your customers can't run the app when you finish it.
jmucchiello
+2  A: 

I guess it would depend on what you're making and what the intended audience is.

If you're writing software for fixed hardware (say, console games) then use equipment (at least test equipment) that is similar or the same as what you will deploy on.

If you're developing desktop apps or something in that realm then develop on whatever machine you want and then tune it afterward to run on the desired min-spec hardware. Likewise, if you're developing in-house software, there is likely to be a min-spec for the machines that the company wants to buy. In that case, develop on a fast machine (to decrease development time and therefore costs) and test against that min-spec.

Bottom line, develop on the fastest machine you can get your hands on, and test on the minimum or exact hardware that you'll be supporting.

SnOrfus
A: 

I typically develop on the fastest machine I can get my hands on.

Most of the time I'm running a debug build, which is slow enough already.

tfinniga
+10  A: 

Slow computers are not going to help you find your performance problems.

If your test data is only a few hundred rows in a table, your db will cache it all and you'll never find badly written queries or bad table/index design. If your server application is not multi-threaded, you will not find that out until you stress-test it with 500 users, and the same goes for an app that bottlenecks on bandwidth.
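For instance (a minimal sketch in Python, using SQLite purely for illustration; the "orders" table and its columns are invented), a query that looks instant against a few hundred cached rows can hide a full table scan that only hurts at realistic volumes:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (id INTEGER PRIMARY KEY,"
        " customer_id INTEGER, total REAL)"
    )

    # Load a realistic volume, not the few hundred rows a cache will hide.
    conn.executemany(
        "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
        ((i % 10_000, i * 0.01) for i in range(1_000_000)),
    )

    # EXPLAIN QUERY PLAN exposes the full table scan...
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?",
        (42,),
    ).fetchall())  # plan reports a scan over the whole table

    # ...which an index fixes, no matter how fast the dev box is.
    conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = ?",
        (42,),
    ).fetchall())  # plan now reports a search using the index

Problems like that show up with realistic data volume, not with a slower CPU.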

Optimization is "A Good Thing", but as I say to new developers who have all sorts of ideas about how to do it better: "I don't care how quickly you give me the wrong answer." Get it right first, then make it faster when you find a bottleneck. An experienced programmer is going to design and build it reasonably well to start with.

If performance is really critical (real time? millisecond transactions?), then you need to design and implement a set of benchmarks and tools to scientifically prove to yourselves that your changes are making it faster. There are far too many variables out there that affect performance.
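Even the standard library's timeit can be the start of such a benchmark (a sketch; the two string-building functions are placeholders for a "before" and "after" implementation):

    import statistics
    import timeit

    def concat_naive(items):
        out = ""
        for s in items:  # builds the result one piece at a time
            out += s
        return out

    def concat_join(items):
        return "".join(items)

    data = [str(i) for i in range(10_000)]

    for fn in (concat_naive, concat_join):
        # Take several samples so noise on the box doesn't mislead you.
        samples = timeit.repeat(lambda: fn(data), number=100, repeat=5)
        print(f"{fn.__name__}: median {statistics.median(samples):.4f}s "
              "per 100 calls")

Numbers like these, tracked across changes, beat any amount of intuition about what "feels slow".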

Plus there's the classic programmer excuse they will bring out: "but it's running slow because we have deliberately picked slow computers; it will run much faster when we deploy it."

If your colleague thinks it's important, give him a slow computer and put him in charge of "performance" :-)

james
Without slow developer boxes, when would I have time to go on SO? ;)
JoelFan
+1  A: 

for the love of Codd, use profiling tools, not slow development machines!
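For example, a minimal sketch with Python's built-in cProfile (slow_report is a made-up stand-in for whatever code is actually slow):

    import cProfile
    import pstats

    def slow_report():
        total = 0
        for i in range(2_000_000):
            total += i % 7
        return total

    profiler = cProfile.Profile()
    profiler.enable()
    slow_report()
    profiler.disable()

    # Show the five functions that actually consumed the time.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)

The profiler tells you where the time goes on any machine, fast or slow.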

Steven A. Lowe
Nice! http://en.wikipedia.org/wiki/Edgar_F._Codd
Ewan Todd
A: 

Optimization should be avoided, didn't that give us Vista? :p

But in all seriousness, it's always a matter of tradeoffs. Important questions to ask yourself:

What platform will your end users be using? Can I afford to drop cycles? What will happen if I do?

I agree with most that initial development should be done on the fastest or most efficient (not necessarily the same) machine available to you. But for running tests, run them on your target platform, and test early and often.

Moki
+1  A: 

If you are programming on hardware that is close to the final test and production environments, you tend to find that there are fewer nasty surprises when it comes time to release the code.

I've seen enough programmers get side-swiped by serious but unexpected problems caused by their machines being way faster than those of most of their users. But I've also seen the same problem occur with data: the code is tested on a small dataset and then "crumbles" on a large one.

Any differences in development and deployment environments can be the source of unexpected problems.

Still, since programming is expensive and time-consuming, if the end user is running slow, out-of-date equipment, the better solution is to deal with it at testing time (and schedule a few early tests just to check usability and timing).

Why cripple your programmers just because you're worried about missing a potential problem? That's not a sane development strategy.

Paul.

Paul W Homer