views:

141

answers:

6

When a new version of a framework or language appears (e.g. .NET 3.5, SQL Server 2008), what approach do people take to deciding when to adopt/upgrade?

Generally, developers will say "as soon as possible" (they want it on their CV, and from a management perspective, giving them what they want provides a motivation boost), but commercially there is often little incentive (few clients demand the latest version), and from a cost perspective (retesting, training) there is often a disincentive.

I'm particularly thinking of ongoing systems and projects (such as in a software house) which exist and evolve over years, where the "new projects use the new technology" approach doesn't work.

Are people driven by specific requirements (the need to use a new feature, or a potential or existing client demanding support for it), do they formally assess it (in which case, what are the criteria?), or do they upgrade as a matter of routine (in which case, when: leading edge vs. bleeding edge)?

Do people think that not being on the latest version of something should be considered technical debt and managed as such?

Or is "if it ain't broke don't fix it" a valid approach?

+2  A: 

When the benefits of upgrading (more features, or a bugfix you need) outweigh the risks/costs involved (new issues, breaking existing code).

Visage
So based on that you would potentially stay on an old version forever if there was no specific need?
Jon Hopkins
Absolutely, yes.
Visage
@Tyrannosaurs I think for business reasons that would be true. However, with a legacy technology you will find fewer people to maintain it and fewer vendors to support it. That's why people upgraded from horses to cars.
John Nolan
Yes. There's an obvious caveat: if the gap between the version you are using and the current version gets too large, you risk struggling to get support for the version you use. This risk should be regularly assessed.
Visage
This is why there are still databases out there running SQL Server 6.5 or Oracle 8. It can be very expensive to upgrade to a new version of database software (especially testing all the applications that touch the data, not just one), and unless it has some new functionality you need (or the necessary support is getting hard to come by), why put existing business-critical data at risk for no reason?
HLGEM
+2  A: 

We look at the support lifecycle costs: for how long are the older versions supported, and at what cost? Platforms like Windows and Java tend to move fast compared to mainframe environments, and part of the cost of doing business on those platforms is performing periodic upgrades. In a rational world, that is!

New versions can have killer features we need, but that is rare in enterprise development. The main positive selling point of new versions (as opposed to negative ones, such as expired support) tends to be greater developer efficiency, which is hard to measure. Against that, as you indicate, the cost of retraining must be considered, not only for the initial developers but, crucially, for maintenance. In each upgrade, some applications tend to be left behind as too critical to retire and too expensive/fragile to upgrade. Over time, the growing number of platforms and versions you have to support increases overall technical debt, regardless of their age.

Another criterion for upgrading to new versions (which you note) is the ability to attract and retain staff. In the current economic climate that's playing second fiddle, but it still cannot be ignored completely. You want at least a seasoning of enthusiastic and knowledgeable developers.

Pontus Gagge
+4  A: 

Read up on Technical Debt. This is a simple cost-benefit decision.

The "if it ain't broke don't fix it" is a common management policy that says "tomorrow's dollars aren't worth as much as today's, so don't plan for future improvements." Eventually technical debt accumulates to the point where the product can no longer limp along.

The most common breaking point is when some piece of the infrastructure is no longer supported. By then, incremental change is impossible.

Reinventing from scratch is a new capital investment; fixing existing code is an operating expense. The accounting rules force management to make technically crazy decisions.

Open source software requires especially careful technical management, since there's no official "support sunset" announcement from a vendor like Oracle/Sun. Bad technical management, of course, leads to technical bankruptcy.

S.Lott
All of the "bad things" in the world can be traced back to accountants! :-) For example: Why is there crappy software? The bean counters said "ship it!" to make the quarter's numbers.
Dan
I prefer to think of it as "technically crazy" not "bad". Although, there are some accounting policies that actually are bad: i.e., "undeveloped" land is a liability and useless "development" is an investment and looks better on the books than holding land as "wilderness". Other than that example, most accounting isn't "bad", it just reinforces some crazy policies.
S.Lott
Open Source software has support issues, also, and you'll get the best support on relatively recent versions. You can always hire somebody to support that 8-year-old Linux system with the oddball applications, and they can learn enough, but it'll cost you big.
David Thornley
+1  A: 

I think the killer question is whether your app will survive long term if you NEVER upgrade the platform/language version. If you think it can't, you may as well upgrade sooner rather than later, as it will only become harder.

Think about how long your app should be actively developed until you need a full rewrite. If you never plan to rewrite it, I would upgrade continually. Consider how difficult it will become to find the best developers if you are working in an outdated technology. Consider how new framework/language features could speed up your development process in the long term, for a bit of short term pain.

JonoW
+1  A: 

When you develop for Microsoft-based platforms, like a Windows Forms app for Windows or an ASP.NET web app for Windows Server, a good time to migrate is every two major versions of the OS. For example, if your app was developed for Windows 2000, you ought to migrate for Vista, though XP can be skipped. Similarly, if it was designed for XP SP2, you can safely ignore Vista and target Win 7. Microsoft rarely (if ever) breaks an incremental OS update, so an app running on today's OS will almost certainly run on the next one, but don't count on the one after that. (If it still ran, how would M$ make money???)

(Source: self... Windows developer for over 5 yrs)

Mugunth Kumar
+2  A: 

When you really need to. .NET 1.0 was crappy, 1.1 was a nice upgrade, but web development with VS2003 was not so smooth. Things improved with VS2005 and .NET 2.0, and I still see many developers and companies sticking to .NET 2.0. The previous versions were still too fresh; version 2.0 was mature tech. So, if you were happy with 1.1, why would you upgrade? If you are happy now with 2.0, why upgrade to 3.5 or 4.0?

smok1