views: 161
answers: 4

Hello everyone :)

I was reading http://stackoverflow.com/questions/2512504/tdd-how-to-start-really-thinking-tdd and I noticed many of the answers indicate that tests + application should take less time than just writing the application. In my experience, this is not true. My problem, though, is that some 90% of the code I write has a TON of operating system calls. Mocking these up takes much longer than just writing the code in the first place, sometimes 4 or 5 times as long to write the test as to write the actual code.

I'm curious if there are other developers in this kind of a scenario.

A: 

You don't have to achieve 100% code coverage. If a piece of code is a simple wrapper around an OS call, then at some point you have to assume that the OS call will do what it's supposed to do (i.e. you don't need to test the simple wrapper).

Now, if you have complex logic around that OS call, then it would make sense to mock the OS call and unit test the logic. Of course, if you have a good abstraction layer on top of the OS call, this wouldn't be hard.
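
A minimal sketch of that kind of abstraction layer, assuming a Win32 reporting tool; the names here (`SystemInfo`, `Win32SystemInfo`, `DescribeDisk`) are hypothetical, not from the thread:

```cpp
#include <windows.h>
#include <cstdint>
#include <string>

// Thin seam over the OS call the report code needs.
struct SystemInfo {
    virtual ~SystemInfo() = default;
    virtual std::uint64_t FreeDiskBytes(const std::wstring& drive) const = 0;
};

// Production implementation: a trivial wrapper around the real API.
// This layer is so thin that it is usually not worth unit testing.
struct Win32SystemInfo : SystemInfo {
    std::uint64_t FreeDiskBytes(const std::wstring& drive) const override {
        ULARGE_INTEGER freeBytes{};
        ::GetDiskFreeSpaceExW(drive.c_str(), &freeBytes, nullptr, nullptr);
        return freeBytes.QuadPart;
    }
};

// The logic worth testing takes the interface, so a test can pass in a fake.
std::string DescribeDisk(const SystemInfo& sys, const std::wstring& drive) {
    return sys.FreeDiskBytes(drive) < (1ull << 30) ? "low disk space"
                                                   : "disk space ok";
}
```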

You just have to focus your efforts on the pieces that give you the best bang for your maintenance buck.

Joel Martinez
And therein lies the problem. 90% of the code I write does nothing but suck data out of the OS and print it.
Billy ONeal
@BillyONeal Abstract the OS out and TDD the rest. It will pay off. Maybe not from day one, but (hopefully) there will be a day when your codebase is so large that it helps you develop quicker, or the day will come when someone else has to take over, and then it will be of great benefit.
Rickard von Essen
A: 

Unfortunately, this is NOT language agnostic. In properly mockable languages (my experience is with Perl), mocking ANYTHING - including system calls - is, given a proper mocking library, VERY cheap, fast and easy.

DVK
This is language agnostic. Even if it's easy to do the mock, you still have to do the work to set up the mock. If it's a 5 line method with 4 O/S calls, then you still have to set up 4 mocks for every freaking test.
Billy ONeal
A proper mocking system records its own mocks - which is why I said it's dependent on the language. Can't do that in C++ as far as I know, but you can in Perl.
DVK
@DVK: It can be done in C++ -- it's creating the sample data that the API calls are supposed to return that takes the time, not the "ExpectCall" stuff.
Billy ONeal
@BillyONeal - that's what I mean. In Perl, you can take ANY random Perl program, plug mocking infrastructure ON TOP of it, and run it in "record mocking" mode.
DVK
Yes, you can do the same in C++ with `#define`s. Even with a perfect mocking library, you still need to create sample input data and mimic the functionality of every O/S call in the target code. That has absolutely nothing to do with the language you choose.
Billy ONeal
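
To illustrate where that time actually goes, here is a sketch using Google Mock as an example framework (not mentioned in the thread); every name in it (`ProcessApi`, `BuildProcessReport`, and so on) is hypothetical. The expectation wiring is the quick part; authoring realistic sample data for each call is what adds up:

```cpp
#include <gmock/gmock.h>
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical wrapper interface over the OS calls the report code uses.
struct ProcessApi {
    virtual ~ProcessApi() = default;
    virtual std::vector<std::string> ListProcessNames() = 0;
    virtual std::uint64_t WorkingSetBytes(const std::string& name) = 0;
};

// Hypothetical code under test: pulls data out of the OS and formats it.
std::string BuildProcessReport(ProcessApi& api) {
    std::string out;
    for (const auto& name : api.ListProcessNames())
        out += name + "\t" + std::to_string(api.WorkingSetBytes(name)) + "\n";
    return out;
}

struct MockProcessApi : ProcessApi {
    MOCK_METHOD(std::vector<std::string>, ListProcessNames, (), (override));
    MOCK_METHOD(std::uint64_t, WorkingSetBytes, (const std::string&), (override));
};

TEST(ProcessReport, ListsEveryProcess) {
    MockProcessApi api;
    // The "ExpectCall stuff" is one line per call...
    EXPECT_CALL(api, ListProcessNames())
        .WillOnce(testing::Return(std::vector<std::string>{"a.exe", "b.exe"}));
    EXPECT_CALL(api, WorkingSetBytes(testing::_))
        .WillRepeatedly(testing::Return(1024));
    // ...but inventing believable sample data for 4-5 OS calls per method
    // is where the time described above actually goes.
    EXPECT_EQ(BuildProcessReport(api), "a.exe\t1024\nb.exe\t1024\n");
}
```
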
+4  A: 

In general, when people find that TDD makes a piece of work take longer, it is because they have an improper definition of "done" or of "piece of work." Typically these people believe in the myth of "code complete."

Anyone can bang out some code faster than they can bang out some code and some tests. However, typing is not really where the time goes. If you start measuring the whole time a feature takes - from concept to deployment, soup to nuts - you will stop having the experience of TDD "taking longer."

Also, the OS thing isn't that important, as Joel implies: mock out the OS so that you can test your complex uses of the OS calls, but don't bother testing the OS unless you have a reason you need to call out an assumption as a test.
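
A minimal sketch of what that can look like with a hand-written fake and no framework at all; every name here (`Clock`, `FakeClock`, `TimedOut`) is hypothetical:

```cpp
#include <cassert>

// Thin seam over the one OS facility this piece of logic needs.
struct Clock {
    virtual ~Clock() = default;
    virtual unsigned long Ticks() const = 0;   // e.g. wraps GetTickCount
};

// The "complex use" worth testing: has a timeout elapsed?
bool TimedOut(const Clock& clock, unsigned long start, unsigned long limit) {
    return clock.Ticks() - start >= limit;
}

// Hand-rolled fake: just a settable value, written once and reused everywhere.
struct FakeClock : Clock {
    unsigned long now = 0;
    unsigned long Ticks() const override { return now; }
};

int main() {
    FakeClock clock;
    clock.now = 500;
    assert(!TimedOut(clock, 100, 1000));   // only 400 ticks elapsed
    clock.now = 1200;
    assert(TimedOut(clock, 100, 1000));    // 1100 ticks elapsed
}
```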

What exactly do you mean by that? What else is there to deploy other than the product? I'm a one-man development team here and the only "deployment" process I have is 1. build the code, and 2. put it on the FTP server. Perhaps in a big team with many developers working on the same code the tests help you know you won't break somebody else's code, but I do not have that particular problem.
Billy ONeal
Regarding your last paragraph, I agree. The problem is the time it takes to actually set up the mocks.
Billy ONeal
@BillyONeal: Do you really just make a change, build it, and deploy it hoping/"knowing" that what you deliver will work correctly every time? Unless you are writing code that nobody uses, there's at least one more step: someone gets what you've put on FTP, uses it, and decides if it was good enough or not. If you wrote a bug, the time it takes to deploy a feature is measured from when you started working on the feature until you found and fixed the bug in that feature, not until the time you build it and throw it over the wall for someone to try.
Also, do you never spend any time doing design or analysis work before actually coding? What about documentation?
I know this is blunt but my basic point is that if you have the experience of TDD increasing cost, you probably are not doing TDD.
@MaxGuernseyIII: Documentation for what I spend most of my time working on consists of "Click the Go button". No, most of the time I do not spend time doing analysis work. As I said, 99% of what I spend time doing just sucks data out of Windows APIs and dumps it into a report. There's no complicated logic around things. And mocking each API call required takes at least 5 times as long as writing the code in the first place.
Billy ONeal
+1  A: 

TDD isn't mocking. Sometimes, good TDD employs mocks, but plenty of TDD can be done without mocks, and if you're confronting too much mocking with your TDD, perhaps you need to go "old school" and write simpler tests.
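
For the grab-data-and-print-it case, a "simpler test" often means pushing the OS data in as plain values and testing the formatting as a pure function, with no mocks at all. A minimal sketch; `FormatProcessLine` is a hypothetical name, not from the thread:

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Pure function: everything it needs arrives as parameters, so the test
// needs no mocks and never touches the OS.
std::string FormatProcessLine(const std::string& name, std::uint64_t bytes) {
    return name + "\t" + std::to_string(bytes / 1024) + " KB";
}

int main() {
    assert(FormatProcessLine("notepad.exe", 2048) == "notepad.exe\t2 KB");
    assert(FormatProcessLine("idle", 0) == "idle\t0 KB");
}
```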

Carl Manaster
99% of the code I write does nothing but grab API data and print it. How on earth would you test code like that without mocks?
Billy ONeal