tags:
views: 295
answers: 5

I tend to think that keeping up-to-date with the latest version is a good thing. That doesn't mean upgrading as soon as a new version is released, but doing so within a relatively short timeframe afterwards (3-6 months).

Upgrades can be for application server, .NET Framework, Java SDK, database, third-party components, etc.

I've found that vendors generally provide a reasonably straightforward upgrade path from one version to the next, and that upgrading becomes more problematic when the upgrade must skip several versions (e.g. .NET Framework 1.1 -> 3.5 SP1).

If an application is working well in production without any problems, I can understand the desire not to change anything and to avoid the overhead of regression testing. However, with the more widespread adoption of automated unit testing, if a new version offers significant benefits in terms of performance and/or features, is it worth upgrading?

In my experience, products are not usually kept up-to-date, and once the particular version of a product is no longer supported, the task of upgrading to the latest version tends to be quite time-consuming and expensive.

My question is:

Is it a good idea to keep up-to-date with the latest versions of major system components such as operating systems, application servers, databases, frameworks, etc.?

Edit

Currently we have Oracle Application Server 10.1.2.x, which is certified for Java SDK versions up to 1.4.2. The latest version of Java is now 1.6 (with 1.7 currently in the works), and Oracle Application Server is up to 10.1.3 (which is certified for SDK 1.5+).

Upgrading to at least SDK 1.5 would allow new applications built on the existing infrastructure to use more current versions of supporting frameworks such as Hibernate 3.3 or iBATIS 2.3.4, as well as later versions of standards such as JSP and JSF.

If these upgrades are not done, any new application being developed will be limited to old versions and will consequently add to the time and cost of future upgrades.
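To make the SDK dependency concrete, here is a minimal sketch of the Java 5 language features (generics, the enhanced for-loop, annotations) that newer framework versions are typically written against. The CustomerDao and Customer classes are hypothetical placeholders, not part of the system described above, and none of this compiles under SDK 1.4.2.

    // Hypothetical sketch only: CustomerDao and Customer are illustrative
    // placeholders, not classes from any real project.
    import java.util.ArrayList;
    import java.util.List;

    public class CustomerDao {

        // Generics (new in Java 5) remove the Object casts that 1.4-era code needs.
        public List<String> activeCustomerNames(List<Customer> customers) {
            List<String> names = new ArrayList<String>();
            // Enhanced for-loop: also a Java 5 feature.
            for (Customer c : customers) {
                if (c.isActive()) {
                    names.add(c.getName());
                }
            }
            return names;
        }

        // Annotations (new in Java 5) are the mechanism newer persistence
        // frameworks use to declare mapping metadata.
        @Override
        public String toString() {
            return "CustomerDao";
        }

        // Minimal placeholder type so the sketch is self-contained.
        public static class Customer {
            private final String name;
            private final boolean active;

            public Customer(String name, boolean active) {
                this.name = name;
                this.active = active;
            }

            public String getName() { return name; }
            public boolean isActive() { return active; }
        }
    }

Frameworks built against Java 5 (annotation-based mapping in particular) assume these features, which is one reason staying on SDK 1.4.2 also tends to pin the framework versions available to new applications.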

+1  A: 

The "right" answer to this question honestly depends on the exact package / system you're talking about, and how you have it configured. Also, some upgrades can break backward compatibility - so the question is: "How much is going to break and how long will it take to fix it?"

Honestly, as a rule of thumb I prefer to attempt an upgrade in a development environment and then deploy to a testing environment to see how much work it will be.

The real reason I prefer to keep everything at the current version is this:

products are not usually kept up-to-date, and once the particular version of a product is no longer supported, the task of upgrading to the latest version tends to be quite time-consuming and expensive.

I agree, and as such prefer to do a little work each time something can be upgraded rather than waiting and having to sort through everything later on.

Brett Bender
+2  A: 

I've found empirically that it's usually not worth the hassle to do incremental upgrades of a whole OS on personal Linux boxes. After one too many times of yum or apt-get hosing my system during a system-wide upgrade, I generally limit myself to upgrading only those packages that I need. Especially as a non-GUI user, I can't remember the last time there was a killer feature in Ubuntu itself rather than in (say) R or gcc.

Servers are another matter -- for security patches, at least, you need to stay on the cutting edge.

In terms of software, upgrading to a new version immediately doesn't give you as much of a head start as one might think. As you deal with the various breakages introduced by the new version, you'll be clearing the underbrush with a machete, posting answers on Stack Overflow and elsewhere.

Then someone else follows 3 months later and benefits from the path that you and others have cut, and from all the answers that early adopters have posted online. Your effective lead is (maybe) only a month.

(The exception might be something like the iPhone or Facebook platforms, where dev time is measured in days and first-mover advantage is huge.)

IMO, with hardware it's probably even more important to wait a while until you get a step-function jump. Wait until you reach the equivalent of a 'quarter tank of gas' before upgrading, as the longer you wait, the bigger the step-function jump for your applications. The very first generation of Intel Macs might have been one of the few times an early upgrade was warranted, because the jump was so noticeable overnight.

All that said, I think being an early adopter of concepts (especially math) is critical (obviously). Software is fine to play around with on your local box, but IMO definitely wait at least 2-3 months after release before bringing it into production.

ramanujan
+1  A: 

It's definitely a good idea to keep fairly close to the current release. By not upgrading you avoid any immediate issues, but you store up an enormous amount of work for later in the project/application life cycle. Typically, those who make the conservative decision today will be long gone by the time the absolutely critical need to upgrade occurs, so they 'get away free'. It is up to us as technical resources to point out that short-term pain is better than a complete project meltdown later.

My shining example of this is a massive system written on Oracle 8.0, which is still on Oracle 8.0 today - it fails regularly, it's next to impossible to patch as that version is too old, and it will be thrown away as soon as anyone has the political cojones to make it so. The cost of not upgrading is currently running at >$1M/year in downtime, specialist support, and lack of compatibility with other technologies.

The bottom line is that not upgrading today means big dollars being spent later.

MrTelly
+1  A: 

Upgrade on a test system first. Validate. Wash, rinse, and repeat.

ojblass
+2  A: 

A good reason to upgrade is that old systems become unsupported. Java 1.4.2, for example, is already at its end of service life, and while there's Java SE for Business (which is basically the same product, but with longer, paid support), it's a good idea to upgrade.

There are few things worse than having a major security problem found in an unsupported product that you require and not having any upgrade path, because newer versions aren't tested.

I've seen a few companies pay major bucks to get support for some products (OS, hardware, ...) after the end of life of those products, just because they didn't have any upgrade strategy at hand.

Joachim Sauer