When a new version of a framework or language appears (e.g. .NET 3.5, SQL Server 2008), what approach do people take to deciding when to adopt or upgrade?
Generally developers will say "as soon as possible": they want it on their CV, and from a management perspective giving them what they want provides a motivation boost. Commercially, though, there is often little incentive (few clients demand the latest version), and from a cost perspective (retesting, training) there is often a disincentive.
I'm particularly thinking of ongoing systems and projects (such as in a software house) which exist and evolve over years, where the "new projects use the new technology" approach doesn't apply.
Are people driven by specific requirements (the need for a new feature, a potential or existing client demanding support for it), do they formally assess each new version (and if so, against what criteria), or do they upgrade as a matter of routine (and if so, when: leading edge or bleeding edge)?
Do people think that not being on the latest version of something should be considered technical debt and managed as such?
Or is "if it ain't broke don't fix it" a valid approach?