Why would developers need the latest or best in software and hardware?
When this came up in discussion in my team, I thought we need things more stable than "the latest" can provide, even if that means sacrificing some measure of "best".
Developers need to be running the same software that production will run. A developer working against a personal SQL Server 2008 database may be shocked when he tries to put something on production (which is SQL Server 2000) and finds his code doesn't work because the prod system doesn't allow 2008 syntax. The same goes for other software: if prod is not on the current version, devs need to stick with the version prod has unless they are doing the code changes for the upgrade.
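One lightweight safeguard (a minimal sketch, not something from the original answer: it assumes Python with pyodbc installed, and the DSN name is purely illustrative) is to check that the database you develop against reports the same major engine version as production before any 2008-only syntax such as MERGE, DATETIME2, or DECLARE @x INT = 0 creeps into the code:

    # Minimal sketch with assumed names: fail fast if the dev database engine
    # is newer than production. Assumes pyodbc is installed and a DSN called
    # "dev_db" exists; both are illustrative, not from the original answer.
    import pyodbc

    PROD_MAJOR_VERSION = 8  # SQL Server 2000 reports ProductVersion 8.x

    conn = pyodbc.connect("DSN=dev_db")
    row = conn.cursor().execute(
        "SELECT CAST(SERVERPROPERTY('ProductVersion') AS varchar(32))"
    ).fetchone()
    dev_major = int(row[0].split(".")[0])

    if dev_major > PROD_MAJOR_VERSION:
        raise RuntimeError(
            "Dev database reports version %s, but production is SQL Server "
            "2000 (8.x); 2008-only syntax will not run there." % row[0]
        )

Run something like this as part of the build or on developer login, and the version mismatch surfaces long before deployment day.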
Devs often need better hardware than users, but they also need access to a typical user machine to test on. "It works on my machine" is an excuse that doesn't fly with the user who doesn't have access to the same kind of power.
The most subjective thing here is "best". For some that might mean newest and quickest; for others it might mean stable and slow (deliberately slow hardware is good for developers!).
Personally, I think it's better to have hardware and platform software that most closely replicates the production platform. Developing on a 64-bit server isn't a good idea when deploying to a 32-bit platform, or vice versa. If the server is running Apache 2.10 and PHP 5.2.4, then it's not a good idea to develop against Apache 2.11 and PHP 5.3.0. If you're developing on a fast box with scads of memory and a local database, you don't get a feel for the performance when deployed to a less powerful server with a remote database.
If you're wasting a lot of time waiting for your computer to compile or do other work, a hardware upgrade makes sense. You (almost) never need the very newest stuff, but something current is always nice. Just ask yourself how much your developers earn and how much money you waste while they are waiting on the computer.
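To put a rough number on that (the figures below are invented for illustration, not taken from anywhere in this discussion), the break-even point of an upgrade is easy to estimate:

    # Back-of-the-envelope sketch with invented numbers: how long until a
    # faster machine pays for itself in recovered developer time?
    hourly_cost = 50.0               # assumed fully loaded cost of a developer, $/hour
    minutes_waiting_per_day = 30.0   # assumed time lost to slow builds, swaps, reboots
    working_days_per_year = 230      # assumed working days per year
    upgrade_price = 2000.0           # assumed cost of a decent hardware refresh, $

    wasted_per_year = hourly_cost * (minutes_waiting_per_day / 60) * working_days_per_year
    payback_months = upgrade_price / (wasted_per_year / 12)

    print("Waiting costs roughly $%.0f per developer per year" % wasted_per_year)
    print("A $%.0f upgrade pays for itself in about %.1f months" % (upgrade_price, payback_months))

With those made-up inputs the waiting costs around $5,750 a year, so a $2,000 machine pays for itself in a few months; plug in your own salaries and wait times.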
With software I would be much more conservative: try important upgrades (OS, dev environment, ...) on a different computer first and see if anything breaks.
Yes, because otherwise you fail point 9 on The Joel Test.
9: Do you use the best tools money can buy?
If compiling takes even 15 seconds, programmers will get bored while the compiler runs and switch over to reading The Onion, which will suck them in and kill hours of productivity.
Even minor frustrations caused by using underpowered tools add up, making programmers grumpy and unhappy. And a grumpy programmer is an unproductive programmer.
I've had jobs where I've had relatively recent and good hardware. I've also had jobs where I've tried to juggle half a dozen large windows on a 15" display, on a system that swapped whenever I needed to edit an image. Guess which made me more productive?
Developers might not need the "latest" or the "best", but they do need something decent.
Depends on what the need is - if developers want the best software/hardware for job satisfaction, then it's important to recognise that good developers are generally very interested in technology and keen to be on the leading edge. And job satisfaction for this class of developer is related not just to salary, but also to the conditions in which they work - of which the tools (software/hardware) they use are a part. It can be a lot cheaper to provide up-to-date technology than high salaries.
However, if your team members believe that applications must be deployed to production on the latest and unproven technology, then they are wrong. It's critical to deploy on proven, stable technology, even if that means staying a few revisions behind (the exception being the need to keep current with security patches).
Yeah, of course we do! I use L'Oreal every day, because I'm worth it!
It depends - there are arguments either way. If you're developing, say, the Linux kernel, yes you do. Why? You need to support the latest hardware out there, the next big idea, etc. If you're developing a game, you need access to the latest DirectX-compatible hardware, etc.
By contrast, if you're building line-of-business apps, you possibly don't. You'll be supporting clients on out-of-date hardware and, usually, an out-of-date operating system (XP), so actually the best approach is to mimic the deployment environment.
This is subjective because no two software development projects are the same. Broadly: if you're doing something where the hardware really matters, newer is usually better; if not, it makes little difference.
It has more value to developers than to the business, usually.
We, developers, want to stay sharp and have the latest skills that increase our market value.
Business wants to minimize costs and avoid upgrading to new versions every year because it yields no value (most of the time).
It is a conflict of interests and should be taken as such. No less, no more.
For most businesses, it's neither wise to be on the bleeding edge of tools and technologies, nor to fall too far behind.
If a firm goes with the latest and greatest development platform and language, a lot of time will go into the developers learning to leverage that platform, including working through the inevitable bugs and the lack of tutorials, best practices, etc. That's way cool for the dev team, but probably not what most companies want to spend salary on.
On the other hand, if you wait too long to take advantage of the latest development platform and language, your developers are less (sometimes much less) efficient than they could be and will likely develop a feeling of being considered second-rate, or at least feel that they work for a second-rate company.
On the hardware side, the very latest technology is usually not worth the price tag. Personally I buy hardware that's 1-2 generations behind the "best" for home use (I don't REALLY need 150 FPS in my favorite game, but I do need to take my family to Disneyland). That same rule seems to work well for business purchases too. It might not be worth having an SSD (quite yet) for many development scenarios, but if your devs are struggling along with 1 GB of RAM, something is amiss.
Hardware, absolutely. Nothing burns more cash than a developer sitting around waiting on things to complete/work/reboot/whatever. So no question, 100%, new hardware (I'd say every 2 years) for the dev. New PCs are cheap, people aren't.
Dev environment (Visual Studio) - I would argue yes as well. The newer versions are focused on developing more, faster, and at higher quality. Even better, they can now target specific framework versions: a big win for older systems built on .NET 2 ... you can still use VS 2008 or 2010, so there's less worry about whether it'll work in prod or not.
(Warning: this'll probably piss some people off.) In regards to older software (such as someone commented on SQL 2000), I would argue from a security/admin standpoint: "why are you on 10-year-old software?" It wasn't designed to live in this world; that's why there are newer versions (are you on Windows 2000? I hope not). If the only reason is "well, it's because it's in production", then you should immediately begin exploring ways to upgrade to newer systems, period. I've heard time and time again how it's a horrible idea ... until it goes out of support, no one knows how to upgrade it, and there are no longer tools to upgrade/migrate/recover/whatever. Cut the excuses and get that done as part of your development. Upgrading platforms should be part of the lifecycle of an application, and the risk can be managed by normal test deployments. Been there, done that, it works -- that's what UAT is for :-)
Yes and no.
Due to the type of software development I do (embedded), I tend to break this problem up into two categories--host and target. I develop my code on the host. I execute my code on the target. Always.
As far as host development goes, the latest and greatest can help cut down on software development time. As for target development, this is what the code runs on ... period. If it is not responsive or does not run properly on it, then the software must be fixed (ok sometimes the h/w does too, but that costs much more $$$).
Your mileage may vary. Hope this helps.
Do developers need the latest and best tools?...
Definitely
...Some people in our team seem to think we need the best, while I think we need things stable.
If it's not stable, I wouldn't call it the best.
The developer MUST test and debug in an environment as close as possible to production.
But that doesn't mean he has to develop in it!
What if you were an iPhone developer? You couldn't develop using an iPhone.
What if you were to deploy to a 100+ cores server? Would you need a 100+ cores machine to develop?
We should always use the best tools for the job and we should always test and debug in an environment as close as possible to production.
I'm more productive with a stable environment than with new stuff. Constantly upgrading hardware and tools takes a lot of time and causes distraction. My sweet spot for the production environment seems to be a bit over a year behind on software and slightly more on hardware. (I am always trying out new stuff, but not in production.)
One exception was Windows 7 and an SSD, which I got as quickly as possible.
All other points aside regarding software and the absolute best hardware, at least start by getting the developers 2 or 3 monitors for each workstation. This can go a long way in terms of developer satisfaction with their setup, is relatively low cost, and low risk in terms of how it affects the development environment vs the production environment. Also it is an easy upgrade to do soon while you sort out the more complex changes for later.
2 squares or widescreens are good; 3 squares are awesome.
No, we don't need the best. However, we will likely want the best tools. "Best", though, is quite subjective, as it can mean all kinds of different things. Some may want the newest, some may want the most expensive, some may want the biggest bang for the buck, etc. We all have subjective preferences. Just look at how many different places one could get a cup of coffee, a hamburger, or a pizza in Canada or the U.S., and you'll see tons of choice; what people prefer can be all over the map.
Some may want the best to be more productive, some may want the best to prevent tool envy, some may want the best in order to be proud of where they work, etc. There are many reasons for wanting such things, but I doubt they are needed. If someone put up a $1,000,000 prize to see whether someone could program for a year using very old tools, I doubt there would be a shortage of people willing to try.