Or maybe the problem is with our implementation of it. I'm new to the team that wants to go through this process, and it seemed to confuse some of the members of the team, including me. We were given what I would consider a user acceptance test: a list of manual actions to click through to see whether everything is working. I was expecting something simpler than that: we'd just divide up the sections of the application and generally sweep across it looking for huge errors. No scripting involved, just clicking through each page, maybe filling things out, and making sure it submits OK.

And that gets me to my other point: it seems to me that smoke testing is basically worthless. I would think that instead you'd have unit tests, or automated tests that run through a requirements list, to make sure this sort of thing happens correctly. Even if those weren't finished, at the very least I'd think the developer who checked in the code would have done a mini smoke test to make sure their own functionality worked in the first place. We brought this up in the meeting, so you can imagine the confusion, and it gets me back to my question: what value are we going to get out of this type of smoke testing?

+9  A: 

The value of any testing is to increase your confidence in an implementation. A "smoke test" of this sort is there to increase your confidence that the thing actually got built correctly, and that there are no major, embarrassing errors, like the UI not coming up at all. It's basically a low-effort test to confirm that nothing really major went wrong.

It may not sound very useful, but I've seen heavily tested code fail a "smoke test" because of, for example, an unexpected failure in the build or a corrupted image. Usually when it's being demonstrated to an important customer.

Charlie Martin
Great point, so essentially the app could be tested and working in an existing environment, but the settings on the deployment server might be incorrect, or something else might keep the app from working in that environment. In that case, though, how deep do you go? We've got 150 pages, with 15 roles...
rball
...which was brought up because we essentially have until the end of the day today to get this tested. Your example also doesn't quite fit, because we're testing on a staging server, not production. There could be another problem on the production machine that didn't...
rball
...come up on staging. I also don't see why an automated suite wouldn't work better in this case. It'd be more specific, and a lot faster, I'd think?
rball
But does it give the boss confidence it's not going to embarrass him? "Smoke test" doesn't have a rigorous definition: it's whatever test the responsible person asks for before they'll accept it. It's just a slang term for a lightweight test.
Charlie Martin
"Usually when it's being demonstrated to an important customer." Yes, I can confirm that this is typically the quickest and most reliable method of finding critical errors in any application you may be building.
philistyne
+3  A: 

We do this type of smoke testing where I work.

The big value is that we can make sure that all of the pieces fit together before turning it over to the users for user acceptance testing. The app I'm thinking of lives on three servers, plus talks to a legacy system. Making sure that we can run all the way through, and that every piece can talk to the other pieces it needs to, is an easy and important test.
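For illustration only, here's a minimal sketch of that kind of connectivity check in Python. The hostnames, ports, and URL are hypothetical stand-ins for the three servers and the legacy system; the real list would come from your own environment.

    # Hedged sketch: hypothetical endpoints standing in for the three
    # servers and the legacy system mentioned above.
    import socket
    import urllib.request

    TCP_ENDPOINTS = [
        ("app-server.example.com", 443),      # hypothetical
        ("db-server.example.com", 5432),      # hypothetical
        ("legacy-host.example.com", 1521),    # hypothetical
    ]
    HTTP_PAGES = ["https://app-server.example.com/login"]  # hypothetical

    def can_connect(host, port, timeout=5):
        """Return True if a plain TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    failures = []
    for host, port in TCP_ENDPOINTS:
        if not can_connect(host, port):
            failures.append(f"cannot reach {host}:{port}")
    for url in HTTP_PAGES:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                if resp.status != 200:
                    failures.append(f"{url} returned {resp.status}")
        except OSError as exc:
            failures.append(f"{url} failed: {exc}")

    if failures:
        raise SystemExit("SMOKE TEST FAILED:\n" + "\n".join(failures))
    print("smoke test passed: every piece is reachable")

The point isn't these specific checks; it's that the whole round trip can be verified in seconds before anyone starts clicking.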

Dave
+5  A: 

If you define a smoke test as "run through the basic features of the system to make sure they work," then I think there is value in this. Unit tests, while necessary, are not all-inclusive; they don't find integration bugs. As for automating the testing, I have yet to see a system that can be completely tested with automation, and even if you could, such automation takes time and effort to build.

Basically, the value I see in it is this: let's make sure that the changes we made to the code today didn't break anything that was working yesterday. Depending on how disciplined your coding practices are, that is not always a given.

Craig H
+1  A: 

I'll assume that your meaning of "smoke test" is the same as mine -- i.e. try to blow the program up in any way you can.

I learned the value of this 20+ years ago when I had just finished (or so I thought) a major piece of code for a WYSIWYG editor. I proudly showed it to my officemate (Hey Dbell!), who said, "Neat!" He immediately created a paragraph with about 1000 characters in it, copied it thousands of times into a document of about 32MB, deleted everything, undid the deletion, and the program absolutely exploded.

I was completely horrified and said, "You can't do that!". He grinned and said, "Why not?"

Finding and fixing the problem (which turned out to be easy to replicate once you knew what the threshold events looked like) uncovered a subtle and serious bug in memory management. Simple click-the-button testing would never have uncovered it.

The modern-day equivalent of this is fuzz testing (http://en.wikipedia.org/wiki/Fuzz_testing). It is not a substitute for step-by-step requirements testing, but it will often expose problems that no other method will.
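For a flavor of the idea, here is a tiny, hypothetical fuzzing sketch in Python. The parse_document() function is a made-up stand-in for whatever code is under test, and serious fuzzing would use a dedicated tool; the loop just shows the shape of the technique.

    import random
    import string

    def parse_document(text):
        """Hypothetical stand-in for the code under test."""
        return len(text)

    def random_input(max_len=10_000):
        """Random printable junk, biased toward large inputs
        (think Dbell's 32MB document)."""
        n = random.randint(0, max_len)
        return "".join(random.choices(string.printable, k=n))

    for i in range(1_000):
        data = random_input()
        try:
            parse_document(data)
        except Exception as exc:  # any unexpected crash is a finding
            print(f"run {i}: input of length {len(data)} raised {exc!r}")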

Peter Rowell
The weird part here is that we're supposed to be avoiding edge cases in our smoke tests. I totally agree with you that giving it to another person (preferably a non-programmer) would be ideal, though I'd honestly expect that to happen right after a piece of code is checked in.
rball
No, that's not the generally accepted meaning of "smoke test".
Michael Borgwardt
+1  A: 

Smoke testing exists because putting broken software in the hands of users can be extremely costly: it can waste many people's time, and it can take management involvement, meetings, and so on to fix. Spending a little time on every build to make sure it isn't broken can save you a lot of time when a problem does occur.

The general rule is: it's always cheaper to find problems as early as possible.

And unless you have tried it exactly as a user would, you cannot be sure that it will work exactly as it should. There are always unexpected issues that only a human can catch: visual issues, installation issues, and so on.

Thus, automated tests should always be complemented by a (small) manual test before putting the work in the hands of users.

+5  A: 

To answer the question "what is a smoke test?", imagine that you were testing a piece of electrical or electronic equipment.

Plug it in and turn the power on:

  • Do you see it switch on (e.g. the screen, or the power indicator)? Good!
  • Do you see any smoke coming out of it? Bad!

And that's all! A "smoke test" is a minimal test: not a rigorous test.

The value of a "smoke test" is that it's cheap, or cost-effective: for 1% of the cost of a complete test, it catches 90% of the most likely errors. For example, in a primitive toaster factory you might do:

  • Costly test of the initial design
  • Costly test of the prototype
  • Costly test of the first item off the mass-production line
  • Cheap, smoke test of every subsequent item off the mass-production line

I'm not sure what place "smoke tests" have these days in software development, now that automated testing has become more popular. In the old days, people advocated "daily build" as a QA measure (specifically, to help with "continuous integration") ... and then, advocated "daily build and smoke test" as a way of doing better than just a daily build (i.e., do a daily build and verify whether it runs at all) ... but these days, better than that might be "daily build and an extensive automated test suite".
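For what it's worth, even the cheap kind of smoke test can be scripted. Here's a minimal sketch, assuming a hypothetical staging site and error markers; it only checks that each page loads without an obvious error page, which is roughly what a human doing a gentle browse would catch:

    import urllib.request

    PAGES = [
        "http://staging.example.com/",         # hypothetical URLs: replace
        "http://staging.example.com/login",    # with the pages a tester
        "http://staging.example.com/reports",  # would actually click through
    ]
    ERROR_MARKERS = ["Server Error", "Stack trace", "Exception"]

    failures = []
    for url in PAGES:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                body = resp.read().decode("utf-8", errors="replace")
                if resp.status != 200:
                    failures.append(f"{url} returned {resp.status}")
                elif any(marker in body for marker in ERROR_MARKERS):
                    failures.append(f"{url} shows an error page")
        except OSError as exc:
            failures.append(f"{url} failed to load: {exc}")

    if failures:
        raise SystemExit("SMOKE TEST FAILED:\n" + "\n".join(failures))
    print(f"smoke test passed: {len(PAGES)} pages look OK")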

ChrisW
Agreed. It seems that most of the answers are describing far more than a "smoke test".
chills42
I agree with you too; I don't think we should be writing scripts. Just gently browse the sections of the site and make sure no glaring errors come out.
rball
Ah, but you can make a script that gently browses the section of the site and makes sure no glaring errors come out. Then you're down from a bunch of steps to exactly 1 ("run the smoke test"). Is it worth it? Maybe.
Ian Varley
+1  A: 

Smoke testing is definitely worth the effort.

Here's an excellent article by Steve McConnell called "Daily Build and Smoke Test". This was what introduced me to the idea.

For more info on Continuous Integration and daily builds take a look at Martin Fowler's excellent intro to the topic.

HTH

cheers,

Rob

Rob Wells
See, I think our definition is broken then. We're essentially testing right before a large build is pushed out the door. I could definitely see the benefit of smoke testing daily, or even pushing builds off to an independent QA team each day with a list of changes to check.
rball
I also totally agree with the CI part. I'm new to the team, and as far as I can tell there's no CI taking place.
rball
+1  A: 

In my opinion, a "smoke test" just before a final release, whether the release is to a user or to a QA team, is absolutely essential. Automated testing is only as good as the scripts that were written, and there is always the potential that something got missed.

A simple interface flaw could taint the QA team's or the user's perception of your app and cause them to suspect or question fundamental components. Without a once-over before releasing, you open yourself up to having your software perceived as poor quality just because of a few small interface bugs.

Jamie Barrows
A: 

Like any best practice, you really need to believe in it to follow through and see its returns. A smoke test, or Build Verification Test, should be run with every build, at least daily; if you don't have a daily build, run one every day anyway to check that your environment is up.

Knowing what you're capable of testing on any given day of your test cycle is the benefit. If you don't run a smoke test every day to check basic connectivity and functionality, you won't be able to report that you're blocked on testing, and the development team won't have direction on which bugs are the highest priority to fix.

I would recommend keeping a recorded timeline metric of your smoke test results against blocking defects and test execution rate. This may keep you from looking like the bottleneck, and it could save your job if anyone takes it seriously enough.

ModelTester