This question about unit tests sparked another thing that's been bothering me. I've gone back and forth on three ways to write unit tests that hit a database.
- Create mock objects and plug them in. This has the advantage of not needing a database, but it's time-consuming and I'm not sure how much return on investment I'm getting. I've been getting into IoC and Moq a little bit, but it still seems painful.
- Create setup and teardown database scripts that build known cases, and test against those. Again, this can be time-intensive, but the scripts are still usually easier to create than mock objects. And other people at work can still run the tests, assuming they have SQL Server on their localhost.
- Manually check the dev database and adjust the unit tests to match. Intensely manual work, but if I have a "test set" that doesn't change, it seems to work OK. On my machine, at least :-).
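For option 1, here's roughly what I mean by plugging mocks in, a minimal sketch using Moq and xUnit. The `IOrderRepository`/`BillingService` names are made up for illustration; the point is that the test never touches a database:

```csharp
// Hypothetical repository interface and service, purely for illustration.
using Moq;
using Xunit;

public interface IOrderRepository
{
    decimal GetTotal(int customerId);
}

public class BillingService
{
    private readonly IOrderRepository _orders;
    public BillingService(IOrderRepository orders) => _orders = orders;

    public decimal TotalWithTax(int customerId) =>
        _orders.GetTotal(customerId) * 1.08m;
}

public class BillingServiceTests
{
    [Fact]
    public void TotalWithTax_AppliesRate()
    {
        // No database needed: the repository is stubbed out.
        var repo = new Mock<IOrderRepository>();
        repo.Setup(r => r.GetTotal(42)).Returns(100m);

        var service = new BillingService(repo.Object);

        Assert.Equal(108m, service.TotalWithTax(42));
    }
}
```

The IoC part is what makes this possible at all: because `BillingService` takes the repository through its constructor instead of newing one up, the test can hand it a fake.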
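And for option 2, one variation I've seen that cuts down on teardown scripts: wrap each test in a `TransactionScope` and never commit, so the inserts roll back automatically. This is a sketch with a placeholder connection string and table, assuming SQL Server on localhost and xUnit (which runs the constructor and `Dispose` around every test):

```csharp
// Option 2 variant: setup inserts known rows, teardown is an automatic
// rollback when the TransactionScope is disposed without Complete().
using System;
using System.Data.SqlClient;
using System.Transactions;
using Xunit;

public class CustomerRepositoryTests : IDisposable
{
    private readonly TransactionScope _scope;
    private readonly SqlConnection _conn;

    public CustomerRepositoryTests()
    {
        _scope = new TransactionScope();
        // Placeholder connection string -- adjust for your localhost instance.
        _conn = new SqlConnection(@"Server=localhost;Database=DevDb;Integrated Security=true");
        _conn.Open(); // enlists in the ambient transaction automatically

        // Setup: insert a known row the test can assert against.
        using var cmd = new SqlCommand(
            "INSERT INTO Customers (Id, Name) VALUES (999, 'Test Customer')", _conn);
        cmd.ExecuteNonQuery();
    }

    [Fact]
    public void FindById_ReturnsSeededCustomer()
    {
        using var cmd = new SqlCommand(
            "SELECT Name FROM Customers WHERE Id = 999", _conn);
        Assert.Equal("Test Customer", (string)cmd.ExecuteScalar());
    }

    // Teardown: disposing the scope without calling Complete() rolls
    // back everything the test inserted.
    public void Dispose()
    {
        _conn.Dispose();
        _scope.Dispose();
    }
}
```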
I know option 1 is the "proper" way to do unit tests, but of the three it's probably the option I've used the least (although my latest projects have used IoC, so that door is open to me). I realize a lot of it depends on exactly what is being mocked and what is being tested, but what am I missing here?
If context helps, I'm in a C# shop, writing in-house applications, with only a few developers.