views: 252
answers: 6

I have secured the budget to upgrade the individual workstations and laptops. While newer, bigger screens were welcomed with enthusiasm, the thought of re-installing tools and settings caused most of them to blanch, and I got one "Do I really have to?".

How much downtime do you usually have when you move to a new machine?

Do you employ tools or scripts to set up your dev environment, tools, DBs, debuggers etc., specifically for a Windows environment?

Is there a standard image that you keep and then let devs move in and tweak the machine as necessary?

+2  A: 

The last time I upgraded to a new machine, I think it took about 4 hours to get most of the necessary tools reinstalled. Over time, I've had to re-install quite a few more tools, but I think it's worth it.

If you can get a ghost/image of the default tool set (Visual Studio 2003-2008, Eclipse, NetBeans, or whatever you're using), and all the major service packs, that would help a lot with the initial setup.

I think the downtime is definitely worth it; a new, faster machine will make anyone more productive.

Andy White
+2  A: 

You can have zero downtime by keeping both machines available, though productivity will dip while the new one is set up.

This depends on the number of tools needed by the development team. Tools such as Rational Software Architect can take hours to install on their own. Having the developers list the applications they need before moving in can help you optimize your deployment strategy. Both machines should be available for a fixed period of time so that developers can keep working while kicking off long-running installs at the same time.

Creating a standard image based on the list provided to you can improve efficiency. Having the relevant software on a share could also let them cherry-pick as needed and give the development team the feeling that they can go back as necessary.

Tools to assist in capturing user settings exist. I have only ever had experience with Doctor Mover. If you have 100 or more developers to move, it may be worth the cost. I can't complain too much, but it wasn't perfect.

ojblass
+2  A: 

One day is generally enough for upgrades. I keep digital copies of VS.NET, which makes it much easier to install.

When it comes to other tools, it's generally better just to go to the websites and install the latest versions.

Also, it's a good idea to install tools as you need them instead of trying to install everything at the same time.

dr. evil
so you make .iso images of things and then mount them and install over the net? good idea. probably could create a mixed .iso that has all the util installs (typically just a setup.exe) on it then?
MikeJ
Yes, that's what I do. Generally I don't bother with a multiple installer or anything like that. To me it's convenient getting rid of all physical requirements such as DVDs. But if you want, you can drop a simple batch file to kick off all required installers.
dr. evil
nice. maybe if I package this up as a group exercise, we can build a script or a front end that can shell out to the various installs.
MikeJ
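A driver script like the one discussed above could be sketched in Python; the share layout (one folder per tool, each with a `setup.exe`) and the `/quiet` switch are assumptions, since silent-install flags vary by vendor:

```python
import os
import subprocess


def find_installers(share_root):
    """Collect setup executables under an install share (hypothetical
    layout: one folder per tool containing setup.exe or install.exe)."""
    installers = []
    for dirpath, _dirs, files in os.walk(share_root):
        for name in files:
            if name.lower() in ("setup.exe", "install.exe"):
                installers.append(os.path.join(dirpath, name))
    return sorted(installers)


def run_installers(share_root, dry_run=True):
    """Kick off each installer in turn; dry_run only prints what would run."""
    for path in find_installers(share_root):
        if dry_run:
            print("would run:", path)
        else:
            # /quiet is a common but not universal silent-install switch
            subprocess.run([path, "/quiet"], check=False)
```

Run it with `dry_run=True` first to review the list before letting it loose on a fresh machine.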
+1  A: 

I have never had a problem with just getting a list of all the software a particular user uses. In fact, I have never found the base install to be much of an issue. The part I tend to spend the most time on is re-configuring all of the user's custom settings (very common with developers, I find). This is where it is very valuable to have the old machine around for a while, so that the user can at a minimum remote-desktop to it and see how they had things set up.

Wally Lawless
I would agree. Do you know of any tool to find/capture these tweaks?
MikeJ
I wish I could find such a thing. Again, it comes down to the lack of any standard way of tracking such changes. A tool like this would either have to just copy everything, or be tailored to the specific tools that are being migrated. If you find something let me know ;-)
Wally Lawless
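Lacking any standard way of tracking such changes, such a tool ends up being a copy script tailored to the tools in use. A minimal sketch, assuming you maintain a per-tool list of settings locations (the example paths below are illustrative, not exhaustive):

```python
import os
import shutil

# Illustrative per-tool settings locations to migrate; extend per developer.
SETTINGS_PATHS = [
    r"%APPDATA%\Microsoft\VisualStudio",
    r"%USERPROFILE%\.eclipse",
]


def backup_settings(paths, dest):
    """Copy each existing settings file or folder into dest;
    returns the list of source paths actually copied."""
    os.makedirs(dest, exist_ok=True)
    copied = []
    for raw in paths:
        src = os.path.expandvars(raw)
        if not os.path.exists(src):
            continue  # tool not installed for this user
        target = os.path.join(dest, os.path.basename(src))
        if os.path.isdir(src):
            shutil.copytree(src, target)
        else:
            shutil.copy2(src, target)
        copied.append(src)
    return copied
```

Restoring is the same copy in reverse; the hard part remains curating the path list per tool, which is exactly the "tailored to the specific tools" caveat.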
+4  A: 

My company essentially virtualized in order to stop wasting so much time with upgrades/system failures.

Whenever a desktop/laptop failed, we'd have to spend the better part of a day fixing it and reloading the software.

So, we went out, bought iMacs for everyone, and loaded Parallels (a VMware-like product for OS X) on them. Then we made a standard dev image for everyone and just copied it to everyone's machines.

Essentially, if anyone's configuration got messed up, we just loaded in a fresh image and kept on truckin'. Saved a lot of time.

Some additional benefits:

  1. When new software is out, we just make a new image and distribute it. No OS re-installs or anything like that.
  2. If the hardware changes, it doesn't matter; just move the image.
  3. You can run multiple OSes concurrently for testing.
  4. You can take "snapshots" of your current image and revert if you really mess something up.
  5. Multiple builds on the same machine, since you can run multiple OSes.

Surprisingly the overhead of a virtualized system is quite low.

We only run the software on a real machine for performance tuning/testing purposes.

Snazzer
this is very creative. A lot of folks don't want to move off XP, but how is the performance? Do you give up to virtualization what you gain by keeping XP?
MikeJ
Honestly, the virtualization is so good that most of the time I really can't tell the difference. It's important to ensure that you have enough ram to dedicate to your virtual machine, other than that it's pretty flawless. Quite incredible how much time it's saved.
Snazzer
+1  A: 

Depending on how your team works, I would highly recommend that every user receiving a new computer get the latest source tree from your source control repository rather than copying entire directories. And I would also recommend doing that before actually sending the old workstation elsewhere, or even disconnecting it.

One of the great things about tools like CVS and SVN is that it is quite easy for developers to end up with an unofficial "personal branch" from things that are not properly checked in, merged, etc.

While it will cost time to deal with the shift if things are not properly synchronized, it is an invaluable opportunity to catch those things before they come back to haunt you later.

Uri
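One way to catch such stragglers before decommissioning the old machine is to parse `svn status` output and flag anything modified, unversioned, or conflicted. A sketch, assuming the standard column layout of `svn status` (first column is the item status code, path starts after the status columns):

```python
import subprocess


def uncommitted_paths(status_output):
    """Return paths flagged by `svn status` as modified, added, deleted,
    replaced, unversioned, missing, or conflicted (codes M/A/D/R/?/!/C)."""
    flagged = []
    for line in status_output.splitlines():
        if line and line[0] in "MADR?!C~":
            # Path begins after the 8-character status-column prefix.
            flagged.append(line[8:].strip())
    return flagged


def check_working_copy(path="."):
    """Run svn status on a working copy and report anything not
    safely committed to the repository."""
    out = subprocess.run(["svn", "status", path],
                         capture_output=True, text=True).stdout
    return uncommitted_paths(out)
```

An empty result means the working copy is clean; anything else deserves a check-in or a deliberate decision before the old machine goes away.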
we've had this problem in the past and it's a good point. We set up a CI machine to mitigate this problem, since we don't allow private builds to be released except for debug/performance tracking. I am wondering, though, if we should check in the .isos for installation.
MikeJ