Hi, I just stumbled on a principle I can't understand.

Does "Test what you fly, fly what you test" mean that you should develop and test for the real thing all the time?

Thinking about this makes me wonder:

  1. Should we prepare for production conditions in advance?
  2. Should we launch the system on day one (perhaps without informing end users)?

For example,

  1. Build tools to ensure error logs can be retrieved.
  2. Ensure error logs can be analyzed (statistical tools and/or a good log-level design).
  3. Ensure we store a history of the changes made to the system.
  4. Ensure we have a short update cycle in case of bugs.

Are there more examples that would ensure a low-risk launch of a new system?

I'm a bit confused. That's all.

Nasa

+13  A: 

It means that if you don't test exactly what you plan to launch, you don't know how what you do launch will behave.

A similar principle is also expressed as "dogfooding" or "eating your own dogfood." Assuming your product is something which people in your company would use, get them using it long before you launch. They're likely to be a much better source of usability bugs, data-corruption bugs, etc. than a QA team which has very specific tasks and might well not hit all the corner cases that real users do.

In addition, it means that by the time you launch, your internal needs will have forced you to work out which support tools you need (logging etc).

Jon Skeet
+2  A: 

An example of TWYF would be to make sure that functional testing is performed on the Release configuration, not (only) the Debug configuration - or whatever those configurations are called at your site. Even if the only differences between Release and Debug are assertion checking or extra logging, you still can't be sure that software tested in Debug works in Release, because of something as subtle as a timing issue.
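A sketch of the trap, using Python rather than a C Release build: the `-O` switch strips `assert` statements much as `NDEBUG` removes C's `assert()`, so an assertion with a side effect behaves differently in the two configurations:

```python
# An assert whose condition has a side effect -- the classic
# Debug-vs-Release divergence described above.
events = []

def record(tag):
    events.append(tag)  # side effect hidden inside the assertion
    return True

# In a normal ("Debug") run this calls record(); under `python -O`
# ("Release") the whole statement is compiled away and events stays
# empty, so the tested behaviour is not the shipped behaviour.
assert record("self-check ran")

print(events)
```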

FWYT means that when you're satisfied with the quality of a release build candidate, you ship that build, rather than doing a fresh "production master" and hoping that the configuration of the two builds was the same.

Graham Lee
Compiler over-optimization is also dangerous, especially with GCC. I've been bitten by -O3 way too many times to trust anything other than release-configuration tests.
Robert Gould
+2  A: 

My last project had the mantra "We sell telephones, not simulators", to drive home the point that we should always test our code on the target hardware. In reality, only those coders who enjoyed getting their hands dirty with hardware would actually do this, and the daily production builds would invariably fail about half the time. Sometimes this would hold up the whole project while the production testers tried to get to the bottom of the problem.

The other mantra was "we're not at home to Mr Cockup", which was a laugh, given that he seemed to have taken up permanent residence.

Noel Walters
Thanks. Real world stories are insanely useful. I can tell that this is real. I will think about mantras for my current project. Thanks again.
Flinkman
A: 

I thought NASA's motto was: don't test what you fly - but document the test procedure, and test that the documents match the procedure.
And when the mass of test documents is greater than the vehicle weight, it's ready to fly.

(At least it was for Hubble.)

Martin Beckett