views: 241
answers: 5
Given that every software project has only so many programmer-hours dedicated to it, how much would you spend on making sure the product is backward compatible with previous versions? There are several points to consider:

  • Does the age of the software affect your decision? Will you invest less time in backward compatibility when the program is newer?
  • Is the decision based solely on the number of clients with installed copies?
  • Do you make an active effort to produce code and file formats that support future changes?
  • When you're developing v1.0, do you try to build it in a way that makes it easier for v2.0 to be backward compatible with v1.0? (Leaving "reserved" fields is an example.)
  • How do you decide, for a given feature, that "No, we aren't going to support that anymore"?
+1  A: 

My take on backward compatibility of software:

1.) If it's a product already widely used by many clients, then I would make sure that the new version still uses the same "base code" (the code that achieves the basic functionality of the application). New features should be factored into this code base, or built on top of it, with as little change to the application's execution environment as possible. You don't want your existing users to have to make a lot of changes to their existing installations. So it's a trade-off between supporting new functionality and forcing a revamp of the client's existing setup and usage process.

2.) In a new product, if possible, identify all likely features of the application right at the beginning, even before v1.0 is out. Decide which features you are going to ship in v1.0 and which ones will be kept for later releases. Wherever possible, keep these later features in mind during design, implementation, and when finalizing the application's output, so that future versions can accommodate them. E.g., leave additional elements/bit fields in your data structures.

-AD.

goldenmean
+6  A: 

The client base is key to determining how much backward-compatibility work you should take on.

Basically, you need to evaluate it like any other non-functional requirement you have to implement, and you need to carefully specify what a "backward compatibility" feature includes:

  • API compatibility. This means that subsequent versions of a library provide the same API that previous versions do, so programs written against the previous version will still be able to compile and run with the new version. In addition to actually leaving the same functions around, this also implies that those functions all do the same thing in the newer version that they did in the older ones.
  • Application Binary Interface, or ABI, compatibility. This means that backward compatibility is preserved at the level of the binary object code produced when you compile the library.
    There is usually some overlap between API and ABI compatibility, but there are important differences. To maintain ABI compatibility, all you have to do is ensure that your program exports all of the same symbols.
    This means all the same functions and globally accessible objects need to be there, so that programs linked against the prior version will still be able to run with the new version.
    It's possible to maintain ABI compatibility while breaking API compatibility. In C code, leave symbols in the C files but remove them from the public headers, so new code that tries to access the symbols will fail to compile, while old code that users compiled against the previous version will continue to run.
  • Client-server protocol compatibility. This means that a client using the version of the network protocol provided in the older releases will continue to function when faced with a newer server, and that newer client programs will continue to work with an older server.
  • Data format compatibility. Newer versions of the code need to be able to work with data files written out by older versions, and vice versa. Ideally you should also be able to build some forward compatibility into data formats. If your file-handling routines can ignore and preserve unrecognized fields, then new functionality can modify data formats in ways that do not break older versions. This is one of the most critical kinds of compatibility, simply because users become very upset when they install a new version of a program and suddenly cannot access their old data.

If you combine the previous criteria (nature of the backward compatibility) with the nature of your client base, you can decide that:

  • If your clients are internal to your company, the need is lower, and 2.0 can break significant functions.

  • If your clients are external, a 2.0 might still break things, but you may need to provide a migration guide.

  • At the extreme, if your clients are the whole world, as I already mentioned in this SO question about Java, you may end up providing new functionality without ever deprecating the old, or even preserving BUGS of your old products, because clients' applications depend on those bugs!


  • Does the age of the software affect your decision? Will you invest less time in backward compatibility when the program is newer?
    I believe this has to do with what is already deployed: a recent program will have fewer backward-compatibility needs to deal with than one which has been around for 20 years.

  • Is the decision based solely on the number of clients with installed copies?
    It should be based on a business case: if a migration is needed because of a lack of backward compatibility, can it be "sold" effectively to your clients (because of all the new shiny features it brings)?

  • Do you make an active effort to produce code and file formats that support future changes?
    Trying to predict "future changes" can be very counter-productive and quickly borders on YAGNI (You Ain't Gonna Need It): a good set of migration tools can be much more effective.

  • When you're developing v1.0, do you try to build it in a way that makes it easier for v2.0 to be backward compatible with v1.0? (Leaving "reserved" fields is an example.)
    For the internal applications I have worked on, no. A parallel run is our way to ensure "functional" backward compatibility. But that is not a universal solution.

  • How do you decide, for a given feature, that "No, we aren't going to support that anymore"?
    Again, for internal applications the decision process can be very different than for an externally deployed product. If a feature does not bring any added value for the business, an internal "coherency" task is set up to check, with every other internal application, the cost of migrating away from it (i.e. no longer using that feature). The same task is much harder to do with clients outside your organization.

VonC
+1  A: 

A lot. If you don't want to piss off every one of your loyal customers!

Alex Baranosky
+2  A: 

The more your system is used day-to-day, the more you should focus on it.

The more your system is deeply embedded in the core processes of your clients, the more you should focus on it.

The more your system has competitors, the more you should focus on it.

The more users that use older versions, the more you should focus on it.

The more complex and deeper buy-in there is for a client to your system, in terms of how big of an impact your software has on their business, the more you should focus on backward compatibility.

If you can't ease them onto new versions through attractive pricing, etc., it might be worth weighing the risk of forcing everyone to upgrade.

Like Vista, or Office 2007. Those were terrific in helping me to Apple.

Jas Panesar
A: 

My experience is with complex shrink-wrap systems with relatively few (100 - 5000) users.
Marketing often has a "gotta have it" attitude on backward compatibility without a full appreciation of the lifecycle costs. For example, the savings from preserving bugs in your system for the sake of the current user base can easily be dwarfed by the support costs those bugs create for new users over the lifetime of the system.

Lee