I tend to think that keeping up to date with the latest version is a good thing. That doesn't mean upgrading as soon as a new version is released, but within a relatively short timeframe afterwards (3-6 months).
Upgrades can be for the application server, .NET Framework, Java SDK, database, third-party components, etc.
I've found that vendors generally provide a reasonably straightforward upgrade path from one version to the next, and that upgrading becomes more problematic when the upgrade has to skip several versions (e.g. .NET Framework 1.1 -> 3.5 SP1).
If an application is working well in production without any problems, I can understand the desire not to change anything and to avoid the overhead of regression testing. However, with the more widespread adoption of automated unit testing, is it worth upgrading when a new version offers significant benefits in terms of performance and/or features?
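For example (just a sketch, with a hypothetical OrderService standing in for real business logic, written in JUnit 3.x style so it still runs on JDK 1.4), tests like this can be re-run unattended after each upgrade instead of relying on a full manual regression pass:

    import junit.framework.TestCase;

    // A minimal sketch: OrderService and its pricing rule are hypothetical,
    // but the point is that tests like this can be re-run automatically after
    // an upgrade to confirm that existing behaviour still holds.
    public class OrderServiceTest extends TestCase {

        // Hypothetical class under test; in a real project this would live
        // in the production code base, not in the test file.
        static class OrderService {
            double calculateTotal(double subtotal, double taxRate, double shipping) {
                return subtotal * (1 + taxRate) + shipping;
            }
        }

        public void testTotalIncludesTaxAndShipping() {
            OrderService service = new OrderService();
            assertEquals(115.00, service.calculateTotal(100.00, 0.10, 5.00), 0.001);
        }
    }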
In my experience, products are not usually kept up to date, and once the version in use is no longer supported, upgrading to the latest version tends to be quite time-consuming and expensive.
My question is:
Is it a good idea to keep up to date with the latest versions of major system components such as operating systems, application servers, databases, frameworks, etc.?
Edit
We currently have Oracle Application Server 10.1.2.x, which is certified for use with up to Java SDK 1.4.2. The latest version of Java is now 1.6 (with 1.7 in the works), and Oracle Application Server is up to 10.1.3 (which is certified for SDK 1.5+).
Upgrading to at least SDK 1.5 would allow new applications built on the existing infrastructure to use more current versions of supporting frameworks such as Hibernate 3.3 or iBATIS 2.3.4, as well as later standards for JSP, JSF, etc.
If these upgrades are not done, new applications are limited to the old versions, which will only add to the time and cost of future upgrades.
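To make the constraint concrete, here is a minimal sketch (the Customer entity and its fields are hypothetical) of the kind of code that becomes possible once the runtime is at SDK 1.5 or later: generics, the enhanced for loop, and annotation-based mapping of the sort Hibernate 3.3 supports via the JPA annotations. None of this compiles on a 1.4.2 VM.

    import java.util.ArrayList;
    import java.util.List;

    import javax.persistence.Entity;
    import javax.persistence.Id;

    // Hypothetical entity: annotation-based mapping (Hibernate Annotations / JPA)
    // requires Java 5, so it is unavailable while we are held at SDK 1.4.2.
    @Entity
    public class Customer {

        @Id
        private Long id;

        private String name;

        public Long getId() { return id; }
        public String getName() { return name; }

        // Generics and the enhanced for loop are also Java 5 features.
        public static List<String> namesOf(List<Customer> customers) {
            List<String> names = new ArrayList<String>();
            for (Customer c : customers) {
                names.add(c.getName());
            }
            return names;
        }
    }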