views:

1287

answers:

15

What practically useful tips, tools, techniques, and general advice could you give to development managers and contractors who want to make sure that the quality component of the classic "budget, time, scope, quality" equation gets addressed appropriately as part of the initial project plan?

For instance, setting aside some budget and time for the delivery of prototypes or a proof of concept, or including automated unit test coverage of the proposed software as part of the project plan.

Many thanks.

This question is not about process improvement tips. The question is in essence what you need to include in the initial project plan in terms of time, tasks, budget, resources, contingency, etc. to make sure that everyone is clear on how quality is going to be defined, achieved and proven as part of the project. Thank you for the answers so far, appreciated.

A: 

Have you looked into the agile methodology?

Also, while this does not directly address your question and since this is on my mind at the moment, please make time for Code Review. It's a very important process that will pay dividends throughout the project's life cycle.

Ian P
+1  A: 

Including QA is not an easy sell for projects that don't already include it, even though it should be.

That being said, my company has a system of daily builds, which go to an integration department on a weekly basis. The daily builds also go to separate engineering testers (in our department) on a daily basis. We have two levels of testing: daily 'smoke'/regression testing and weekly 'formal' testing.

No matter what, you're most likely going to have to hire testers if you don't already have them.

unforgiven3
+4  A: 

So far the best approach for me has been to follow the principles of the Agile methodology: you need to work in short (2-5 week) iterations. At the end of each iteration the quality of your application should be so good that it could be shipped if needed.

Address all quality problems IMMEDIATELY. Don't wait for a 'stabilization' phase at the end of the project, as you will never have enough time for that. If your testers find a problem, it has to be recorded and fixed with all due haste.

P.S. Rule of thumb: to get solid quality in your application, you need 1 tester for every 3 developers. You could work well with a 4:1 ratio. Anything worse than 5:1 means that you will probably not get enough testing done and therefore the quality will suffer.

P.P.S. Unit tests and automated functional testing help with quality a lot, and they can be done by the developers. Just be aware that there IS an overhead (especially with functional testing automation) and that you will need an experienced team to successfully use these techniques.
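
As a rough, hypothetical sketch of that overhead difference (the names parse_order and OrderFlowFunctionalTest are invented for illustration; any language and test framework would do), a unit test is nearly free to write and run, while an automated functional test drags in fixture setup:

    # Hypothetical sketch: a cheap unit test vs. a heavier automated functional
    # test. parse_order and the test data are made up for illustration.
    import unittest


    def parse_order(line):
        """Toy production function: 'sku,qty' -> (sku, int(qty))."""
        sku, qty = line.split(",")
        return sku.strip(), int(qty)


    class ParseOrderUnitTest(unittest.TestCase):
        # Unit test: fast, no setup, cheap for developers to write and run.
        def test_parses_sku_and_quantity(self):
            self.assertEqual(parse_order("ABC-1, 3"), ("ABC-1", 3))


    class OrderFlowFunctionalTest(unittest.TestCase):
        # Functional test: exercises a whole flow. The fixture setup is where
        # the automation overhead mentioned above tends to live.
        def setUp(self):
            self.orders = []  # stand-in for a real database or service

        def test_order_is_recorded(self):
            self.orders.append(parse_order("ABC-1, 3"))
            self.assertIn(("ABC-1", 3), self.orders)


    if __name__ == "__main__":
        unittest.main()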

Ilya Kochetov
A: 

Use test-driven development instead of development-driven testing (it really helps!).
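
In case the term is new to anyone reading, here is a minimal sketch of the test-first rhythm, with an entirely hypothetical apply_discount function: the test is written first and fails, then just enough code is added to make it pass, then you refactor with the test as a safety net.

    # Minimal TDD sketch; apply_discount is a made-up example function.
    import unittest


    class DiscountTest(unittest.TestCase):
        # Step 1 ("red"): written before apply_discount exists, so it fails.
        def test_ten_percent_discount(self):
            self.assertEqual(apply_discount(100.0, 10), 90.0)


    # Step 2 ("green"): the simplest implementation that makes the test pass.
    def apply_discount(price, percent):
        return price * (100 - percent) / 100.0


    if __name__ == "__main__":
        unittest.main()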

ComSubVie
+13  A: 

Not sure I understand the question, since it seems to contain strange management lingo.

Short version:

  • Be agile.
  • Build quality into the process.

Long version:

  • Do not make complete specifications. They will be incomplete, wrong, a huge time sink, and are mainly useful for covering one's ass. (Of course, this does not apply in some niche domains such as embedded programming in transportation, energy or health).
  • Requirements will change. Put a prototype in the hands of your users as soon as possible, then make small (one month or less) iterations. Build your process around changing requirements.
  • Practice test-driven development. You will need at least one experienced TDD practitioner on the team, with some authority. That is the best way to get good test coverage.
  • Keep track of the technical debt. Pad feature development time with time needed to repay technical debt.
  • Practice continuous integration. Do not allow the quality of the trunk branch to drop between releases. Be ready to release at any time; that will force you to keep the quality high (a minimal gate script is sketched after this list).
  • Make code reviews a mandatory part of the integration process. Do not allow any code that did not pass code review to be merged. That will keep developers on their toes about code quality.
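
As a hedged illustration of the continuous integration point (not part of the answer itself; it assumes a Python project using the standard unittest runner), the gate that keeps the trunk releasable can be as small as a script that refuses integration whenever the test suite fails:

    #!/usr/bin/env python3
    # Minimal sketch of a "keep the trunk releasable" gate: run the automated
    # tests and refuse the merge/build if anything fails.
    import subprocess
    import sys


    def main():
        # Run the whole test suite; any project-specific runner would do here.
        result = subprocess.run([sys.executable, "-m", "unittest", "discover", "-v"])
        if result.returncode != 0:
            print("Tests failed: quality gate NOT passed, do not merge.")
            return result.returncode
        print("All tests passed: safe to integrate.")
        return 0


    if __name__ == "__main__":
        sys.exit(main())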
ddaa
While being agile is great, you will miss out on ROI if you don't pay attention to lean-agile: http://en.wikipedia.org/wiki/Lean_software_development
gradbot
Why is that? I thought agile was all about focusing on doing the most valuable part first, as long as the customer finds the price acceptable.
Thomas Eyde
Specifications are sometimes mandatory. If the customer wants a fixed price contract, then you need a contract with them for the development. That contract is the specification.
MatthieuF
A: 

Continuous integration with automated testing.

Dennis S.
+2  A: 

Quality that is planned for comes down mostly to design and testing, and should cover most of the following:

  1. Well Defined Test Plans/Test Cases that are accepted by Users as valid Real World examples
  2. Proper Unit Testing, Internal Testing
  3. Proper User Acceptance Testing
  4. Stress Testing
  5. Proper Migration Control
  6. Configuration Management
  7. Change Management
  8. Formal Deviation Management
  9. User Training & Documentation

Best Regards

mm2010
+3  A: 

You can't "project-manage" quality into the software, at least not in the sense that adding testing or QA by itself improves quality. Stepping on a scale does not make you lose weight.

But you can do many things regarding project management.

  • Make sure QA has a voice in actual decisions, and that it reviews software development plans, architecture, detailed design for components, etc. You can't just hand them code and say "now bring me quality".
  • Automate as much as you can. This makes the process less error-prone.
  • Let the build be fully automated ("press of a button"), including tests. Let the tests be repeatable and extensible.
  • Have a bug tracking system and a revision control system.
  • Make sure that each bug fix results in a regression test being added (see the sketch after this list).
  • Code review can do wonders for code quality.
  • Have an anonymous feedback channel from individual developers to upper management. Often there are issues that need to be addressed, but are silenced to make a manager "look better".
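
A hypothetical sketch of the "regression test per bug fix" point above (the tracker id BUG-1234 and normalize_username are invented for illustration): naming the test after the ticket keeps the link between the bug report and the test visible forever.

    # Sketch: a regression test added together with the fix for a (made-up)
    # bug "BUG-1234: leading/trailing spaces made logins fail".
    import unittest


    def normalize_username(name):
        # The fixed production code.
        return name.strip().lower()


    class RegressionBug1234(unittest.TestCase):
        def test_username_with_surrounding_spaces_is_accepted(self):
            # Reproduces the original defect's input; must stay green forever.
            self.assertEqual(normalize_username("  Alice "), "alice")


    if __name__ == "__main__":
        unittest.main()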
phjr
+1  A: 

At this level of the project, quality is a very soft concept and not addressable by concrete measures.

The project requirements must identify the required quality standards and assign a coefficient for the importance of each requirement.

A global quality acceptance standard has to be defined in terms of allowed defects, e.g. no type 1 defects, up to 10 type 2 defects with a workaround, or up to 5 type 2 defects without a workaround available, etc.

Let them define in advance what they will do if the quality goal is not met, and prevent them from throwing extra resources at it at the last moment. Force them to state clearly what the target quality is, what acceptable quality (i.e. an acceptable quality reduction) is, and under which conditions the time/budget will be extended.

The budgeting is more difficult. I usually allocate the same amount of time as the design (functional plus technical) to testing. Allocating time for bug-fixing is a two-edged sword. Prepare for questions like: "What! You already expect to deliver bad quality that has to be fixed?" if the suits aren't used to software development projects.

If you already have statistical data available, use the development-to-bug-fixing ratio from previous projects to size a "buffer" task for bug fixing. Don't forget to allocate installation and support tasks for the testing period.

The first deliverable of testing is the test plan. During the preparation phase for testing, define for each requirement how to test it: preparation, actions to be performed, testing limits, combinations, performance, etc. Have the test plans reviewed by the client and the developers; this helps all sides ensure the functionality is fully understood.
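
Purely as an illustration of what such a per-requirement test record might look like (the field names are assumptions, not taken from this answer), here is a lightweight sketch in Python:

    # Illustrative only: one way to record, per requirement, how it will be
    # tested and to roll up coverage during the testing phase.
    from dataclasses import dataclass, field


    @dataclass
    class TestCase:
        requirement_id: str                           # links back to the requirements register
        preparation: str                              # fixtures, data, environment needed
        actions: list = field(default_factory=list)   # steps to perform
        expected_result: str = ""
        limits: str = ""                              # boundary values, performance targets
        status: str = "not run"                       # not run / passed / failed


    plan = [
        TestCase(
            requirement_id="REQ-042",
            preparation="Test client account with empty order history",
            actions=["Submit an order of 3 items", "Open the order overview"],
            expected_result="Order appears with correct totals",
            limits="Overview renders in under 2 seconds",
        ),
    ]

    executed = sum(1 for tc in plan if tc.status != "not run")
    print(f"{executed}/{len(plan)} planned test cases executed")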

During the testing phase, execute these tests and keep track of the coverage and the test results.

I've had good experiences with this approach, since it leaves most of the definition of what acceptable quality is to the client. I try to keep expectations very low, to be sure of earning success in the end even in the worst case. Corporate IT is not a fun world.

Oli
+1  A: 

Caveat: based on experience as a developer, not in program management.

Including these metrics and constraints as part of the delivery would really help in ensuring quality:

  1. Partial contracts - see how the first one goes before awarding the next - this sets expectations
  2. Response times
  3. scalability numbers
  4. CPU and memory constraints
  5. Max network bandwidth consumption
  6. Zero warnings from compilers at max warning level?
  7. Use of source control tool / static analysis tool at the very minimum
  8. Automated builds with max time required for builds
  9. No memory errors as validated using application verifier tool / valgrind etc
  10. Not one crash
  11. No runaway logs
  12. Type and number of machines which will be required to run the program
  13. If user data is involved, specs on that too, e.g. an emailing program should send a 10MB attachment within 2 seconds over an Ethernet link (a sketch of an automated check for this follows the list)
  14. Acceptable bug counts - 0 sev1,2 bugs. 10 sev3 bugs. 0 review bugs.
  15. Type and number of automated test cases
  16. Max no of concurrent users and max no of users
  17. Adherence of ALL code to a particular coding style (any will do, but some should be there)
  18. Prototyping oriented designs and construction
  19. Fine grained task breakup for estimation
  20. Reputation and background of tech leads and managers on the project
  21. Code review time (done preferably by a competent person you trust) + factoring in time required for modifications
  22. Clause to be able to reject a module after 50% completion based on code review and design conformance
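
As a hedged sketch of how a contractual performance number such as item 13 could be turned into an automated check (send_attachment and the 2-second budget are assumptions used purely for illustration):

    # Hypothetical performance check against a contractual latency budget.
    import time
    import unittest


    def send_attachment(payload: bytes) -> None:
        # Stand-in for the real emailing/transfer code under test.
        time.sleep(0.01)


    class AttachmentPerformanceTest(unittest.TestCase):
        def test_10mb_attachment_sent_within_budget(self):
            payload = b"x" * (10 * 1024 * 1024)   # 10 MB of dummy data
            start = time.perf_counter()
            send_attachment(payload)
            elapsed = time.perf_counter() - start
            self.assertLess(elapsed, 2.0, "contractual 2-second budget exceeded")


    if __name__ == "__main__":
        unittest.main()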

If you ask me, there is nothing more important than a customer's insistence on code reviews and quality. Reviewing the designs and code fairly thoroughly, and having a stick to beat them with, is your best assurance of quality. Knowing that your code will be reviewed by someone more important than the colleague in the next cube always helps.

A shop I once worked with even had the clients calling in to make sure the developers knew the basics, and insisting on training for those who did not. (A very valid criterion, depending on the type of shop you will be awarding the contract to.)

Agile-based development of the UI would be best, as that takes care of the heavy feedback cycles that UI work involves most of the time. This is the one piece of work that might be impossible to design up-front, unless of course you are able to provide ALL the screens to your customer upfront. Even providing a sample / spec site / spec program that most closely matches your requirement would help. (If you could summarize the characteristics of this sample you love and then spell them out, that would also help.)

Thinking back, these are the items I feel could have remedied some of the more unfortunate projects I have seen.

PS: The coding style should avoid all known bad code smells, like functions bigger than x lines, pointy code, missing error checks, using exceptions to convey normal errors, use of global variables, goto, etc.

+2  A: 

Have a listen to Measuring and Managing Project Quality on the PM Podcast. It includes an excellent paper on managing quality on projects.

The basic idea is that if you work in a gold dust factory, you can measure quality by how many bags of gold dust go out the door at exactly 1 kg, how many over and how many under. This is obviously very important if you are selling your gold by the bag and not by the gram, especially since your customers will return (and maybe sue you over) underfilled bags, but happily keep overfilled ones.

This quality measurement concept also applies to producing cars, bags of titanium poisonate, or boxes of My Little Ponies.

However projects are by definition unique. Software development projects are one of a kind and often experimental in nature. This means quality measurement is difficult.

You'll have a few hard measurements such as "your system must have an uptime of 99% in the first year", "your software must work on 100% of SOEs deployed across the organisation". Unfortunately you won't know how you are going on these until the end of the project when it is very expensive to change anything.

Instead you need to focus on "quality indicators". These indicators plus the hard measurements all go into your plan. Your plan should list out exactly what measurements and indicators you'll be assessing and when. Also how you will communicate and change them.

Your quality indicators can be very simple, such as:

1) Create a requirements register at the start. All requirements must be listed, along with which deliverables in the scope they relate to. (Do this and get a score of 100%.)

2) Create a detailed BRS for at least 30% of the requirements (hopefully the more complex ones :). (0% BRS gives a score of 0%, 30% gives a score of 100%.)

3) Undertake five reviews of the requirements register with the full team during the project: one at the beginning, one at the end and the other three staged throughout the project. (Easy to work out that 3 full reviews halfway through the project means 100%.)

4) Undertake three full reviews of the issues and risk register with the project sponsor during the project.

5) Undertake weekly meetings (part of your communication plan as well) with the sponsor. Each week produce a 1-page status report with progress against milestones, budget, and the top 5 issues and top 5 risks.

These are five things we know we should do, but they're easy to let slip when you "just need to get it out the door". The idea of the quality plan is that you don't slip on what you agreed to do.

Work out your indicators and list them all at the start of the project. Track and report how you are going during the project execution.

Going back to your original question of how you include quality in the PM plan: in your plan you need to describe the use of measurements and in-flight indicators, and how you will track and report these. Then keep a separate quality register to list and track all of your indicators and measurements. Your plan should also describe how changes will be managed and who approves adding or removing items from the register.*

* Hint: it should be your sponsor, your change review board or your steering committee, i.e. someone who is not the PM.

Mark Nold
+6  A: 
Martin Bauer
+1  A: 

If it applies to your project, be sure to scope out beta testing cycles in the project schedule. Be aware that beta testing consumes substantial headcount, but can be a productive way of catching real-world bugs.

Beta cycles require time to recruit, vet, and train users. Only a fraction of them will actually contribute useful bugs, so you will need to gather a larger pool than you might think. And your developers and QA testers will have to devote time to communicating with your beta testers.

mseery
A: 

It sounds a lot like you're looking for a silver bullet. The question seems overly general; surely what methods will allow you to "define, support and monitor the quality of the deliverables" will very much depend on the nature of the project.

Nick Higgs
Nick, not quite. Brooks' "No Silver Bullet" talks about the development of technology to tackle software complexity and increase developer productivity. This question is about structuring a project to include QA from day one. Whilst certain QA methods may differ depending on the type of software →
Totophil
there are certainly such things as "continuous integration" or "testing as close to the real thing as possible" that can be included in the project plan and, from a PM point of view, have a material impact on the project scope, time, budget and quality.
Totophil
I believe Brooks' paper refers to developments in both technology and managerial techniques. While I agree that certain methods are usually a good idea, the compilation of a definitive checklist that will ensure quality on an arbitrary project seems a pretty ambitious aim.
Nick Higgs
Nick, the definition of technology is wider; managerial techniques are also technology. Thinking about the matter, they are very similar to a programming language, in the sense that a programming language is just a bunch of concepts created by rational thought, and so are Scrum and Waterfall.
Totophil
This question asks: what planning technology to use to improve coverage of quality?
Totophil
A: 

At the bare minimum, you should have some kind of quality or feature test plan (e.g. Writing a System Test Plan).

This is just the bare minimum to be able to call yourself a 'professional'. If the budget was tight on a project and I couldn't get unit testing, this would be the fall-back 'cheaper' testing protocol I would go for.

Hope this helps.

--LM

louism