I just finished listening to a very eye-opening Hanselminutes podcast about the definition of "Done". So my question to everyone is: when do you consider a piece of software to be "done"? Is it when it's fully unit tested? Is it when it's completely documented? What measurement do you use in your development process to determine the done-ness of your software?
Surely dependent on context and purpose of the software?
Lunar Lander (the real thing) would have a very different definition of Done to Lunar Lander the Flash game.
Where I work, DONE is defined by a committee of non-technical managers. You can imagine the fun and games.
When the client(1) considers it done, it's checked in, backed up, and documented.
Also: "done" rarely exists in web dev.
(1) where client may be an internal PM or such
Unit tests, integration tests, web tests, peer QA, and end-user review in the sprint review. Peer QA decides whether anything else is necessary, and all tests must pass in the CI environment. This is on a Scrum web project.
A good measurement is code churn. Using your source control software, measure the rate of change: how many lines of code are being removed, added, or changed per day. Graph this over time. As you approach a release, the curve should trend downwards, giving an indication of stability and readiness to ship. This assumes you are actually testing well and making changes to fix bugs or respond to change requests. If your user acceptance testers and your integration/unit tests are still regressing and re-testing, but you aren't having to make code changes (because they aren't finding anything that necessitates one), you are probably ready to ship.
If big chunks of code are churning a few days before an arbitrary or externally driven ship date, look out!
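Since the answer doesn't say how to collect the numbers, here is a minimal sketch, assuming a git repository (git isn't specified above; any VCS with per-commit diff stats would work). It sums lines added and removed per day from `git log --numstat`, which you could then graph and watch for a downward trend:

```python
#!/usr/bin/env python3
"""Rough daily code-churn report (illustrative sketch, assumes git)."""
import subprocess
from collections import defaultdict

def daily_churn(repo_path="."):
    # %ad with --date=short gives the author date as YYYY-MM-DD;
    # --numstat emits "added<TAB>removed<TAB>path" for each changed file.
    log = subprocess.run(
        ["git", "log", "--numstat", "--date=short",
         "--pretty=format:DATE %ad"],
        cwd=repo_path, capture_output=True, text=True, check=True,
    ).stdout

    churn = defaultdict(int)
    current_date = None
    for line in log.splitlines():
        if line.startswith("DATE "):
            current_date = line[5:]
        elif line.strip() and current_date:
            added, removed, _path = line.split("\t", 2)
            if added != "-":  # "-" marks binary files; skip them
                churn[current_date] += int(added) + int(removed)
    return dict(sorted(churn.items()))  # chronological order

if __name__ == "__main__":
    for date, lines_changed in daily_churn().items():
        print(f"{date}  {lines_changed:6d} lines changed")
```

Feed the output into any plotting tool; a curve that refuses to flatten near the ship date is the warning sign described above.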
When the software can be used to satisfy the requirements that define the system.
But I've always thought, "software is never done, it just reaches an acceptable level of incompleteness."
From a development viewpoint, 'done' is described quite well by my friend and mentor Simon Baker, here.
Alistair Cockburn, Jeff Patton and Mike Cohn also have the following collected views:
Shippable quality, which has to be exercised in a go-live, forces teams to really focus on ensuring that incremental work is more carefully thought through.
'Done' is something that everyone quoted above would be the first to agree differs per team and project. However, to know that a given piece of work is done, the team must conduct an exercise at the start to flesh out the measure of done-ness and list its criteria.
In so doing, everyone agrees by consensus on what an acceptable completion point is; whether that includes noting the task in Excel or writing documentation (or not) becomes an implementation detail for that team or project. The overriding thing is that everyone's understanding of Done is uniform.
Equally, assuming you reach that definition by consensus, it can also be changed as required by consensus.
When the check clears?
Seriously, every time you write a piece of software, you should have defined what "done" means. First. If you have a customer, then there should be a contract -- specific, measurable, agreed, and testable -- that defines done.
If you don't know where you're going, how will you know when you get there?
Each project will have its own definition of done. Ours is: code complete (compiles successfully, etc.), unit tested (or some kind of local testing if that's not possible), and released within one of our packages (so it's available to the other teams).
But the MOST important thing about a definition of done is that all parties agree on what it is (team, product owner, manager, etc.) and that it acts as a kind of public contract; publishing it on a team portal is a good idea.
Any piece of software at any time is always 80% done. At least, that's what my experience teaches ...