
Programmers often whine that if only they had a faster machine, more memory, or more cores, they would be much happier (they never say more productive).

Can anyone think of a programming benefit, for whatever language/toolchain, of having two idle quad-core servers available in addition to the programmer's workstation?

I can't, because most compilers/interpreters are single-threaded, so to speak, and follow a compile/syntax-check phase and then a link/binding sequence. And no ability to farm out the compile phase, where many source files are compiled, exists even in modern IDEs.

+5  A: 

distcc: a fast, free distributed C/C++ compiler. Systems really do build much faster.

You could just as well run distributed build tools across your developers' machines, since they are unlikely to all be building at the same time... but having dedicated build clusters is way cooler.

It also makes daily automated builds practical and will help you score highly on the Joel Test.
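
Roughly speaking, distcc sits in front of your normal compiler and farms preprocessed source files out to the other machines, while linking still runs locally. A minimal sketch of how that might look (the host names, subnet and job count are made up for illustration, and exact options can vary by distcc version):

    # On each idle server: start the distcc daemon and allow the office
    # subnet (this subnet is hypothetical).
    distccd --daemon --allow 192.168.1.0/24

    # On the developer workstation: list the machines work may be farmed out to...
    export DISTCC_HOSTS='localhost server1 server2'

    # ...then build with distcc in front of the real compiler, with enough
    # parallel jobs to keep the remote cores busy.
    make -j16 CC="distcc gcc" CXX="distcc g++"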

Will
I did not know this existed. Excellent.
There are many others. Here's a handful, but more can be found by googling: http://wiki.gridengine.info/wiki/index.php/Distributed-Compilation
Will
+2  A: 

Is your code modular? If so, your compilation can definitely be farmed out. You just have to build several modules in parallel. Only the later (less-intensive) phases, like linking, cannot be parallelized.

And things don't stop after compilation. You can run various analyses and you can run your tests. All of these are easily parallelizable.

As for IDE support... I hope you don't need to recompile the entire project in your day-to-day use of the IDE. You can benefit from those servers by using them for automated builds.
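
For instance, one of the idle servers could run a nightly (or per-commit) job that builds the modules in parallel and then runs the tests and analysis, keeping all of it off the workstation. A rough sketch, where the paths and make targets are all hypothetical:

    # Hypothetical automated build, run on a spare server via cron or a CI
    # tool; the project layout and targets are made up for illustration.
    cd /srv/checkouts/myproject
    make -j8            # compile the independent modules in parallel
    make -j8 check      # run the test suite, also parallelizable
    make analysis       # static analysis / lint on the freshly built tree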

Martinho Fernandes
Yes. Automated builds might be a good idea, along with analyzing the code or running unit tests on the just-completed build, which normally keep a programmer's workstation busy and therefore unproductive.
+1  A: 

In the earliest years of my programming career I believed there was a direct relation between a workstation's horsepower and my personal productivity. As my experience grew I found I was totally wrong. What matters more is that productivity is gained from the tools.

But yes, I would be happier if my workstation had more MHz/memory than it currently has. Why? Comfort. From a development point of view, I doubt it would free up even a couple of minutes per day just because the compiler runs faster.

Andrejs Cainikovs
The faster my workstation is, the more likely I am to play games on it. Which is not productive.
Any box I can put my hands on can run this: http://en.wikipedia.org/wiki/Nethack, so I can't use low specs to prevent myself from playing it... :)
Martinho Fernandes
Now we are showing our experience. I used to play Rogue back in the good old days on a Control Data box using the PLATO educational system, which gave the game primitive graphics. You actually had a 3D effect walking round the dungeon levels. Monsters and objects were not rendered specially.
Or it might have been called Moria or something. The memory synapses are offline.
I think I'll upvote this answer for the incredibly good taste demonstrated in the comments...
Martinho Fernandes
A: 

Systems exist that make several separate networked devices appear as a single machine, which is termed a "single system image". This is especially widespread on Linux, e.g. Kerrighed.

Will
I wish Windows did this properly without asking for extra licence fees or weird configuration setups. It would be nice to somehow add the servers to my workstation and have the kernel farm threads/programs out to all the cores, not just the ones present on my dev box.