views: 542
answers: 5

I consider myself still pretty new to the TDD scene, but no matter which method I use (a mock framework or stubbing my own objects) I find I have to write a lot of code to create mock data. I like the idea of loading up objects to create an in-memory database, but what I don't like is cluttering up my tests with a ton of code whose sole purpose is creating mock data. This is especially true when the data needs to account for all the different cases.

I'd love some suggestions for a better way of doing this.

It would seem to me that I should be able to load the data once into a known state from some data store, then use a snapshot of that state, loaded in the test setup/initialize before each test method executes. That would satisfy proper testing practices while providing convenience, and it would let me focus on writing tests instead of writing code to create test data "by hand".
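A minimal sketch of that load-once, snapshot-per-test idea (in Java for illustration, since the thread is .NET; all names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: load a known data set once, then hand each test a
// fresh copy from setup, so no test can pollute the shared baseline.
class CustomerFixture {
    // Loaded once, e.g. from an embedded file or seed script (illustrative data).
    private static final List<String> MASTER = List.of("Alice", "Bob", "Carol");

    // Called from each test's setup; returns a mutable snapshot.
    static List<String> snapshot() {
        return new ArrayList<>(MASTER);
    }
}
```

Because each test gets its own copy, a test that mutates its snapshot leaves the master data untouched.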

A: 

I know exactly what you mean. I think a good approach to this problem is to have a separate MockFramework project that houses all your mock data, outside the test project. That way you can generate mock data separately (keeping it in memory if you want, or not) and reference the mock framework from the test project. If you use a third-party framework to do this, all the better, but you can still wrap that third-party framework in your own mock framework. That gets all the "glue" that creates the mock data out of your tests, so the tests can really be only what they need to be.
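A tiny sketch of that layout (Java for illustration; imagine `MockOrders` living in its own MockFramework project, with all names hypothetical):

```java
// Hypothetical sketch: all mock-data creation lives in one class outside the
// test project, so tests only ask for the data they need by intent.
class MockOrders {
    record Order(int id, double total) {}   // stand-in domain type

    static Order typical() { return new Order(1, 99.95); }
    static Order empty()   { return new Order(2, 0.0); }
}

// In the test project the glue disappears; a test states only its intent:
//   MockOrders.Order o = MockOrders.typical();
```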

Joseph
This solves the issue of clutter, but I still need to mock up all the data, just in a separate project. Maybe, as you suggested, I could use a third-party framework to load up the data and translate it to my object model. NDbUnit could work, as suggested by webjedi.
Mark J Miller
+1  A: 

You can have Builder class(es) that help you build the instances you need, in this case the ones related to the repository. Have the builder use appropriate defaults, and in your tests override only what you need. This helps you avoid mixing every single case of "data" together for all the different tests, which introduces problems, because there are usually cases that aren't compatible across different tests.

Update 1: Take a look at www.markhneedham.com/blog/2009/01/21/c-builder-pattern-still-useful-for-test-data
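A minimal test-data-builder sketch along those lines (Java for illustration, though the thread is .NET; all names are hypothetical):

```java
// Hypothetical domain type used only for this sketch.
class Customer {
    final String name;
    final boolean active;
    Customer(String name, boolean active) { this.name = name; this.active = active; }
}

// Builder with sensible shared defaults; each test overrides only what it cares about.
class CustomerBuilder {
    private String name = "default-name";   // shared default
    private boolean active = true;          // shared default

    CustomerBuilder name(String n) { this.name = n; return this; }
    CustomerBuilder inactive()     { this.active = false; return this; }
    Customer build()               { return new Customer(name, active); }
}
```

A test that needs an inactive customer writes `new CustomerBuilder().inactive().build()` and inherits every other default, so incompatible special cases never pile up in one shared data set.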

eglasius
Thanks for the link, that helped me to better see what you meant. This still requires me to write the objects by hand, just differently. However, you make a good point about not putting every single case into the "data".
Mark J Miller
Notice that the point of having defaults is to let you share that config/data without having it all mixed up with specific cases.
eglasius
+1  A: 

If you are using .NET, try NDbUnit.

You populate your store once, and then it reverts your DB to a known state at test time, for each test. The Autumn of Agile screencast series shows this in pretty good detail.

Or you can do this manually: build a stored procedure (or whatever) that truncates your tables and copies the known data back in, and call it from your teardown method.
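The manual truncate-and-reload approach can be sketched like this (Java for illustration; a `List` stands in for the real table so the idea stays runnable, and all names are hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of "revert to a known state around each test".
// With a real database this would be a TRUNCATE plus a bulk insert,
// e.g. wrapped in a stored procedure called from teardown.
class KnownStateDb {
    static final List<String> BASELINE = List.of("row1", "row2");
    final List<String> table = new ArrayList<>();

    void resetToKnownState() {
        table.clear();          // stand-in for TRUNCATE TABLE
        table.addAll(BASELINE); // stand-in for copying the baseline data back in
    }
}
```

Running the reset in setup or teardown guarantees every test starts from the same two baseline rows, no matter what the previous test left behind.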

Webjedi
That would work only for integration tests.
eglasius
Freddy Rios is right about your second comment. NDbUnit is close, but I can't seem to find any docs. I downloaded the code sample from Autumn of Agile, and it looks like I need an XSD; I'm already using Entity Framework, so I'd have to copy everything from the XSD into EF.
Mark J Miller
The XSD is only for purposes of spitting out the current data to an XML file and then reading it back in after the test is done. @Freddy: Dunno about that... I'm doing it from step one; I wouldn't consider that an integration test per se.
Webjedi
A: 

Thanks for all the suggestions; I think the solution requires a little bit of everything. I don't want these tests to end up being regression tests, but without some kind of existing data store everything still boils down to creating the data by "manually" building the objects.

What would really be nice is a framework that let me use my existing DAL either to script the data to code for me, or to get the data into memory and access it like an in-memory database.
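A rough sketch of that in-memory-database idea (Java for illustration; the interface and names are hypothetical stand-ins for the existing DAL): give the in-memory store the same interface as the real data-access layer, so the code under test can't tell the difference.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical DAL interface, standing in for the real data-access layer.
interface ProductDal {
    String findName(int id);
}

// In-memory implementation used only by tests; a Map plays the role of a table.
class InMemoryProductDal implements ProductDal {
    private final Map<Integer, String> rows = new HashMap<>();

    InMemoryProductDal add(int id, String name) { rows.put(id, name); return this; }

    @Override public String findName(int id) { return rows.get(id); }
}
```

Tests build up only the rows they need via `add`, then hand the fake DAL to the code under test in place of the real one.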

Mark J Miller
A: 

Untils.org covers this way better than I ever could.

Their whole guide is actually very good.

But basically, if your units require "a lot of data" they may not be unit tests anymore. I'd recommend trying to test the smaller pieces individually.

Trampas Kirk