views:

501

answers:

16

Why would developers need the latest or best in software and hardware?

When this came up in discussion on my team, I thought we needed something more stable than "the latest" can provide, even if that means sacrificing some measure of "best".

A: 

Developers need to be running the same software that production will run. A developer working against a personal SQL Server 2008 database may be shocked when he tries to deploy something to production (which is SQL Server 2000) and finds that his code doesn't work because the prod system doesn't allow 2008 syntax. The same goes for other software: if prod is not on the current version, devs need to stick with the version prod has unless they are doing the code changes for the upgrade.
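
For example (an illustrative sketch only; the table and column names are hypothetical), both of the constructs below were introduced in SQL Server 2008 and are rejected by SQL Server 2000:

    -- DECLARE with an inline initializer and the DATE type: SQL Server 2008 and later only
    DECLARE @start DATE = '2010-01-01';

    -- Multi-row VALUES list (table value constructor): SQL Server 2008 and later only
    INSERT INTO dbo.Orders (OrderId, PlacedOn)
    VALUES (1, @start),
           (2, '2010-01-02');

A developer who tests only against 2008 would never see the error until the code hit the 2000 box.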

Devs often need better hardware than users, but they also need access to a typical user machine to test on. "It works on my machine" is an excuse that doesn't fly with the user who doesn't have access to the same kind of power.

HLGEM
A: 

The most subjective thing here is "best". For some that might mean newest and quickest; for others it might mean stable and slow (deliberately slow hardware is good for developers!).

blissapp
I disagree that deliberately slow hardware is good for developers. I assume you mean so that they don't write programs that run fast on their machines, but not on the customers' machines? The better way to ensure this is to develop on a fast machine, but test on a slower one. Developing on a slow machine will cut productivity.
Mark Byers
+100 to Mark. Nothing is more irritating than a stodgy manager that wants to give developers slow machines, working under the mistaken assumption that otherwise they'd be writing bloated, slow code.
Adam Robinson
We'll have to agree to disagree then, I love developing my front end using my netbook. It guarantees that the interface will be nice and snappy. The problem with my desktop is that it's way better than standard consumer hardware. Don't get me wrong, everything has its place, but I heartily recommend the slow hardware thing.
blissapp
There's nothing subjective about the word "best"; it has a very objective definition: http://dictionary.reference.com/browse/best
fuzzy lollipop
@blissapp, I mean no disrespect here, but if you can't develop a "snappy" user interface without hamstringing yourself with substandard hardware, I would suggest allowing someone else to do that.
Adam Robinson
@fuzzy lollipop - OK then, what's the best operating system? If "best" is a fully objective measure, you'll be able to answer that off pat.
blissapp
@Adam Robinson, you misunderstand: I enjoy using my lower-grade hardware, especially my little netbook. I take the opportunity to use that hardware when developing portions of my UI to ensure that it works well on crappy hardware.
blissapp
You don't understand: the definition of "best" is not subjective, it is completely defined and agreed upon by everyone who uses the English language. That is what dictionaries are for. Your statement was not about the "best" OS, it was that the word "best" is subjective. You are just being argumentative for the sake of being argumentative.
fuzzy lollipop
I now see what area of pedantry you're going down, but you're the only one arguing or discussing the meaning of the actual word. I'm not meaning to be argumentative, but you've spun the discussion off at a tangent. Tip - "best" is quoted.
blissapp
+8  A: 

Personally, I think it's better to have hardware and platform software that most closely replicates the production platform. Developing on a 64-bit server isn't a good idea when deploying to a 32-bit platform, or vice versa. If the server is running Apache 2.10 and PHP 5.2.4, then it's not a good idea to develop against Apache 2.11 and PHP 5.3.0. If you're developing on a fast box with scads of memory and a local database, you don't get a feel for the performance when deployed to a less powerful server with a remote database.

Mark Baker
I agree. +1 from me.
Ninefingers
I don't think a developer has to use a slow machine just to judge the performance (maybe a bit egotistical, but I don't want to be stuck with a slow computer). But one should always try it on hardware that is close to what the program will run on in the end.
Fabian
-1 The developer should run the best hardware and software and MUST test and debug on the target platform(s). Otherwise all iPhone developers would be required to develop on an iPhone?!? That's impractical.
OscarRyz
@Fabian I wouldn't deliberately go for a slow machine, just try to reflect the production environment as closely as I can, so no deliberate performance boosts to the dev servers/tools... desktop editors and separate build servers can be fast/big/kewl (if I could get the budget for it)... though for web testing I'm still stuck with needing to support IE6 <sigh> so no latest tools for me
Mark Baker
@Mark Baker My sympathies re IE6
Fabian
In this case the "best" tools would be a testing environment that closely matches production. The production environment doesn't have to compile, but the developer does. The "best" tools doesn't necessarily mean one environment any more than a carpenter having the "best" tools would mean one really good hammer.
David
A: 

If you're wasting a lot of time waiting for your computer to compile or do other stuff, a hardware upgrade makes sense. You (almost) never need the very newest stuff, but something current is always nice. Just ask yourself how much your developers earn and how much money you waste if they are waiting on the computer.
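
As a rough, back-of-the-envelope illustration (these numbers are made up): a developer who costs $50 an hour and loses 30 minutes a day waiting on a slow machine wastes roughly $6,000 over a 240-day working year, which dwarfs the price of a memory or disk upgrade.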

With software I would be much more conservative and try important upgrades (OS, dev environment,..) on a different computer and see if something breaks.

Fabian
+14  A: 

Yes, because otherwise you fail point 9 on The Joel Test.

9: Do you use the best tools money can buy?

If compiling takes even 15 seconds, programmers will get bored while the compiler runs and switch over to reading The Onion, which will suck them in and kill hours of productivity.

Even minor frustrations caused by using underpowered tools add up, making programmers grumpy and unhappy. And a grumpy programmer is an unproductive programmer.

Dolph
But what does best mean? Newest? Most featureful? Most expensive? Or most tried and tested?
Mark Byers
If another tool makes you objectively more productive, it's *better*. Simple.
Dolph
Joel Test is a very controversial thing. I can only vouch for 8 (quiet working environment). 1, 2, 4 are good. The rest depends.
Developer Art
Joel is a very, very smart guy. However, I disagree with this point. You certainly need GOOD tools (not just "good enough"), but it's usually not a good investment to buy the very best available.
Eric J.
http://dictionary.reference.com/browse/best - "most advantageous, suitable"
fuzzy lollipop
If you can't measure a productivity difference between a "good enough" tool and the "best" tool, then the "best" tool is a very poor investment.
Dolph
@Eric J.: Investment in tools (hardware/software) is never wise in a financial sense. Whatever you buy is worth around $15 on eBay after a couple of years. We need to talk about investment as something that makes your developers more productive.
Developer Art
@Developer Art: 8 (quiet working environment) is the only one I *can't* vouch for. And I personally interpret 7 (a spec) as unit-testing.
Dolph
"Yes, because otherwise you fail point 9 on The Joel Test." - don't idealise the JT.
John
/agree with John. Just cause Joel says it, doesn't mean it's true. It shouldn't be "Joel said this --> it's correct", it should be "Hey this is correct because of this reason. Also, Joel said this."
Claudiu
I wrote this answer with a bit of sarcasm, not expecting people to be so offended by a blog post from August 2000...
Dolph
@Developer: Just because my PC XT was worthless 15 years ago doesn't mean it wasn't a wise investment. I can clearly calculate how much it cost me and how many hours of development time it let me bill to my clients. I could also calculate how much my subsequent PC AT (now also a scrap heap) saved me in compile time, etc. There's a clear calculation that a wise business person should make when purchasing new hardware.
Eric J.
A: 

I've had jobs where I've had relatively recent and good hardware. I've also had jobs where I've tried to juggle half a dozen large windows on a 15" display, on a system that swapped whenever I needed to edit an image. Guess which made me more productive?

Developers might not need the "latest" or the "best", but they do need something decent.

David Dorward
A: 

Depends on what the need is - if developers want the best software/hardware for job satisfaction, then it's important to recognise that good developers are generally very interested in technology and keen to be on the leading edge. Job satisfaction for this class of developer is related not just to salary, but also to the conditions in which they work - of which the tools (software/hardware) they use are a part. It can be a lot cheaper to provide up-to-date technology than high salaries.

However, if your team members believe that applications must be deployed in the production environment on the latest and unproven technology, then they are wrong. It's critical to deploy on proven, stable technology, even if that means staying a few revisions behind (the exception being the need to keep current with security patches).

Kevin
A: 

Yeah, of course we do! I use L'Oreal every day, because I'm worth it!

It depends - there are arguments either way. If you're developing, say, the Linux kernel, yes you do. Why? You need to support the latest hardware out there, the next best idea, etc. If you're developing a game, you need access to the latest DirectX-compatible hardware, etc.

By contrast, if you're building line-of-business apps, you possibly don't. You'll be supporting clients on out-of-date hardware and usually an out-of-date operating system (XP), so actually, the best approach is to mimic the deployment environment.

This is subjective because no two software development projects are the same. Broadly: if you're doing something where the hardware really matters, newer is usually better; if not, it makes no odds.

Ninefingers
A: 

It has more value to developers than to the business, usually.

We, developers, want to stay sharp and have the latest skills that increase our market value.

Business wants to minimize costs and avoid upgrading to new versions every year because it yields no value (most of the time).

It is a conflict of interests and should be taken as such. No less, no more.

Developer Art
+3  A: 

For most businesses, it's neither wise to be on the bleeding edge of tools and technologies, nor to fall too far behind.

If a firm goes with the latest and greatest development platform and language, a lot of time will go into the developers learning to leverage that platform, including working through the inevitable bugs and the lack of tutorials, best practices, etc. That's way cool for the dev team but probably not what most companies want to spend salary on.

On the other hand, if you wait too long to take advantage of the latest development platform and language, your developers are less (sometimes much less) efficient than they could be and will likely develop a feeling of being considered second-rate, or at least feel that they work for a second-rate company.

On the hardware side, the very latest technology is usually not worth the price tag. Personally I buy hardware that's 1-2 generations behind the "best" for home use (I don't REALLY need 150 FPS for my favorite game, but I do need to take my family to Disneyland). That same rule seems to work well for business purchases as well. It might not be worth having an SSD (quite yet) for many development scenarios, but if your devs are struggling along with 1 GB of RAM, something is amiss.

Eric J.
+1  A: 

Hardware, absolutely. Nothing burns more cash than a developer sitting around waiting on things to complete/work/reboot/whatever. So no question, 100%, new hardware (I'd say every 2 years) for the dev. New PCs are cheap, people aren't.

Dev environment (Visual Studio) - I would argue yes as well. The newer versions have features focused on developing more, faster, and at higher quality. Even more so now, they can target specific framework versions. That's a big win for older systems built on .NET 2.0: you can still use VS 2008 or 2010, so there's less worry about whether it'll work in prod or not.

(Warning: this'll probably piss some people off.) In regards to older software (such as the SQL 2000 someone commented on), I would argue from a security/admin standpoint: "why are you on 10-year-old software?" It wasn't designed to live in this world; that's why there are newer versions (are you on Windows 2000? I hope not). If the only reason is "well, it's because it's in production", then you should immediately begin exploring ways to upgrade to newer systems, period. I've heard time and time again how it's a horrible idea ... until it goes out of support, no one knows how to upgrade it, and there are no longer tools to upgrade/migrate/recover/whatever. Cut the excuses and get that done as part of your development. Upgrading platforms should be part of the lifecycle of an application, and that risk can be nullified by normal test deployments. Been there, done that, it works -- that's what UAT is for :-)

jeriley
Databases are rarely upgraded immediately, as upgrading is extremely risky and the data is business critical; there are still people using SQL Server 6.5 out there. It's hard to completely test a database with thousands of tables and thousands of queries, stored procs, etc. to make sure a new version will not break anything. We haven't upgraded because testing showed it would break something critical and we can't fix it without changing the process completely. Clients aren't willing to pay for that (it's not broke, so why should I pay to fix it?), so no upgrades for now.
HLGEM
And just for the record, I would have taken the time to fix the problem and upgraded, but I got overruled.
HLGEM
Right, and that's a HUGE risk, and that happens a lot... it often ends up biting those who allowed it to sit -- if you've barked that up the chain and documented it, you've done your job. It's not all that complicated to extract a good set of test data and pound on it. A good DBA will be able to do that in an afternoon (or at least schedule it). The last db I dealt with was a few TB with about 10 GB of data being written every day. We still managed to pull some of it out ... then we pulled ALL of it out "just to be sure". It works, you can do it, but it's pay now or pay later :-)
jeriley
A: 

Yes and no.

Due to the type of software development I do (embedded), I tend to break this problem up into two categories--host and target. I develop my code on the host. I execute my code on the target. Always.

As far as host development goes, the latest and greatest can help cut down on software development time. As for target development, this is what the code runs on ... period. If it is not responsive or does not run properly on it, then the software must be fixed (ok sometimes the h/w does too, but that costs much more $$$).

Your mileage may vary. Hope this helps.

Sparky
A: 

Do developers need the latest and best tools?...

Definitely

...Some people on our team seem to think we need the best, while I think we need things stable.

If it is not stable, I wouldn't call it the best.

The developer MUST test and debug in an environment as close as possible to production.

But that doesn't mean he has to develop in it!

What if you were an iPhone developer? You couldn't develop on an iPhone.
What if you were deploying to a 100+ core server? Would you need a 100+ core machine to develop?

We should always use the best tools for the job and we should always test and debug in an environment as close as possible to production.

OscarRyz
A: 

I am more productive with a stable environment than with new stuff. Constantly upgrading hardware and tools takes a lot of time and causes distraction. My sweet spot for the production environment seems to be a bit over a year behind on software and slightly more on hardware. (I am always trying out new stuff, but not in production.)

One exception was Windows 7 and an SSD, which I got as quickly as possible.

adrianm
A: 

All other points aside regarding software and the absolute best hardware, at least start by getting the developers 2 or 3 monitors for each workstation. This can go a long way in terms of developer satisfaction with their setup, is relatively low cost, and low risk in terms of how it affects the development environment vs the production environment. Also it is an easy upgrade to do soon while you sort out the more complex changes for later.

Two square or widescreen monitors are good; three square ones are awesome.

jigglee
A: 

No, we don't need the best. However, we will likely want the best tools. "Best", though, is quite subjective, as it can mean all kinds of different things. Some may want the newest, some may want the most expensive, some may want the highest bang for the buck, etc. We all have subjective preferences. Just look at how many different places one could get a cup of coffee, a hamburger, or a pizza in Canada or the U.S., and you'll see that preferences can be all over the map.

Some may want the best to be more productive, some may want the best to prevent tool envy, some may want the best in order to be proud of where they work, etc. There are many reasons for wanting such things, but I doubt that they are needed. If someone put up a $1,000,000 prize to see whether someone could program for a year using very old tools, I doubt there would be a shortage of people willing to try.

JB King