Everyone loves unit testing. But testing the persistence of entities is a bit different: you are testing a process that crosses multiple layers and uses different languages, and your tests have side effects (rows are added, modified, and so on).

I would like to know how you do this. For example, do your tests create a whole new database schema and drop it each time? Do you have to maintain SQL scripts for test schema creation and keep them in step with your production database? Do you even test against the same database product that you use in production? Do you randomly generate your entities' state, or always use the same values? How do you configure your tests to ensure that they are executed against the test database instead of the production one?

There are probably a bunch of important questions I haven't thought of in this area. For the benefit of points-chasers, I will mark the answer that seems to have the fewest side effects and be easiest to implement.

+1  A: 

It is pretty much impossible to unit test data persistence, so I usually do it on the integration level.

Regarding the database, in my current project the integration test suite indeed drops the whole schema and recreates everything from scratch (this is used when the tests are run from the build server). However, you can also run the tests against an already created database - this makes sense if you are testing/debugging from your machine and don't want to waste time or lose test data. You SHOULD maintain your database scripts (they should be the same as the ones used for production) - this way you test your scripts as well as your .NET code. In general the scripts don't create any data (apart from static data, maybe) - it should be part of the tests to create test data, do some operations on it, and verify expectations; that way you can run your tests against any database with the correct schema. When creating test data we usually take random identifiers and unique fields and hardcode everything else.

Regarding environment management, you should already have some mechanism in place to configure the database connection (so that you can have test and production environments) - there are many ways to do it, including Microsoft products and in-house solutions - so you should use the same way to configure your build machine.

Grzenio
I guess this is pretty much what I imagined. A fair bit of work to implement, and potentially annoying to maintain, I would guess. But no pain, no gain!
David
A: 

My approach is to create a set of integration tests which test the data access layer (repositories and mapping). I think this is very important if you use an ORM tool with a "convention over configuration" approach - like POCO mapping in EF. I have a DB initialization script which creates a new database (same as the current development DB) and an initial set of test data. This script runs only once, at the beginning of the test run, and the database is deleted at the end of it. Each integration test uses a transaction which is rolled back at the end of the test, so the test data is the same for all tests. To validate data in the DB I'm using helper classes with a standard SqlCommand approach. To use SqlCommand you have to use the ReadUncommitted transaction isolation level for your tests, because you usually don't share the connection between SqlCommand and the EF context.

Ladislav Mrnka
Thanks - interesting. I'm a little unsure why you roll back the transactions. You then have to use the ReadUncommitted isolation level as you say. Why not just run some SQL to delete rows from the relevant tables using your SqlCommand helper classes?
David
Because in that case you have to write clean-up code for each test, which has the same result as a rollback.
Ladislav Mrnka
+1  A: 

For the last six years or so, I've mainly used NHibernate for persistence.

On the unit test level, I use SQLite in-memory to test the persistence/entity mappings, and on the integration test level I've used the real database server (both locally, and on the build server).

In both cases, I set up the database with the scripts that NHibernate/Fluent NHibernate can create for you for each test (and kill the DB afterwards in the integration case). This takes longer to run, but IMHO the risk of clean-up code gone bad is worse (BTW, there's a discussion about this in the xUnit Test Patterns book).

Martin R-L
But this works only if you use a model-first approach, doesn't it?
Ladislav Mrnka
Yes. I'm a DDD/"Model + POCOs first" guy myself.
Martin R-L