I'm currently working on a critical monthly accounts payable report. The report I just wrote will be the basis for how much my client pays his vendors. So far I have spent about 5 hours building automated tests and have found zero errors. Do you think I am already spending too much time testing? What would be the ideal time frame for testing?
I remember seeing some statistics on the percentage of time spent testing across different projects. It varied broadly from about 30% to 50% of total development time, with smaller projects taking a smaller percentage. This is consistent with my experience as well.
Regards
There is no "exact" amount of numbers need to be spent on writing test codes after writing the function specification. Since you have created several test cases as well as encountering minimum / zero errors so far, i would say you're in the right track.
Those test cases will act as a "safety net" when you add more code or refactor existing code.
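For example, a minimal safety-net test might look something like this (the `vendor_total` function and the figures are made up here, just to illustrate the shape of such a test):

```python
import unittest

def vendor_total(invoices):
    """Sum the amounts due across a vendor's open invoices."""
    return sum(invoice["amount"] for invoice in invoices)

class VendorTotalTests(unittest.TestCase):
    def test_sums_open_invoices(self):
        invoices = [{"amount": 120.00}, {"amount": 80.50}]
        self.assertEqual(vendor_total(invoices), 200.50)

    def test_no_invoices_means_nothing_owed(self):
        self.assertEqual(vendor_total([]), 0)

if __name__ == "__main__":
    unittest.main()
```

If a later refactoring breaks either behaviour, a test like this fails immediately instead of the error surfacing in the report.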
Ideally you should spend as little time as possible writing tests, which means your code should be testable using simple and straightforward unit tests. You achieve that by reducing the cyclomatic complexity of your code.
So try to concentrate on writing code that requires as few unit tests as possible: simple, clean code with low cyclomatic complexity.
There isn't a standard ratio of how much time should be spent writing tests versus code; the only real measure is test coverage. If your code is simple, your unit tests will also be simple, you'll need fewer of them, and you'll therefore spend less time writing them.
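As a rough sketch of what that means in practice (the discount rules and names below are invented purely for illustration), a function with one branch per case can often be rewritten as a single data-driven lookup, and the test gets simpler along with it:

```python
import unittest

# Higher cyclomatic complexity: one branch per vendor tier.
def early_payment_discount_branchy(tier):
    if tier == "gold":
        return 0.05
    elif tier == "silver":
        return 0.03
    elif tier == "bronze":
        return 0.01
    else:
        return 0.0

# Lower cyclomatic complexity: a single lookup with a default.
DISCOUNT_BY_TIER = {"gold": 0.05, "silver": 0.03, "bronze": 0.01}

def early_payment_discount(tier):
    return DISCOUNT_BY_TIER.get(tier, 0.0)

class DiscountTests(unittest.TestCase):
    def test_known_and_unknown_tiers(self):
        cases = [("gold", 0.05), ("silver", 0.03),
                 ("bronze", 0.01), ("unknown", 0.0)]
        for tier, expected in cases:
            with self.subTest(tier=tier):
                self.assertEqual(early_payment_discount(tier), expected)

if __name__ == "__main__":
    unittest.main()
```

The lookup version has nothing left to branch on, so a single table-driven test exercises every case.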
Before testing your code becomes non-trivial.
I usually start testing a feature as soon as I add it.
I try not to test my code with anything other than automated tests. When I'm building a feature, instead of trying it out by hand as I go, I try it out with tests. That way it's no more work than you would have done anyway, and you have the tests afterward.
After that I add tests as I discover bugs, or occasionally to cover bugs that I think could be introduced by a careless maintainer. The idea is for testing to help you, not get in your way!
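For example, if I discovered that the report blew up for a vendor with no invoices, I'd pin the fix down with a regression test along these lines (the `amount_due` function and the bug itself are hypothetical, just to show the pattern):

```python
import unittest

def amount_due(invoices):
    """Total payable for one vendor; must tolerate an empty invoice list."""
    if not invoices:  # an earlier version (hypothetically) crashed on empty input
        return 0.0
    return sum(invoices)

class AmountDueRegressionTests(unittest.TestCase):
    def test_vendor_with_no_invoices(self):
        # Regression test: guards against the empty-list crash coming back.
        self.assertEqual(amount_due([]), 0.0)

    def test_vendor_with_invoices(self):
        self.assertEqual(amount_due([100.0, 250.0]), 350.0)

if __name__ == "__main__":
    unittest.main()
```

The test documents the bug and keeps a careless edit from reintroducing it.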
There is no fixed amount of time to spend on testing. It's a little like asking, "How long should it take to write a feature?" It really depends on how complex the thing being written is and how wide its surface area is. The more the user can do with your tool, the more you should test it.
Testing of anything end-user facing should be of two sorts:

1) Automated testing: regression tests, unit tests, etc.
2) Manual testing: wait until you are mostly done, then try to hit all the corners the way a user would. You won't cover everything in your automated tests and might not notice side effects there, so you need a human eye on it before shipping.
Rather than deciding how much time to spend testing, decide what you think needs to be tested and spend whatever time that takes.