I want to write unit tests with NUnit that hit the database. I'd like to have the database in a consistent state for each test. I thought transactions would allow me to "undo" each test so I searched around and found several articles from 2004-05 on the topic:

These seem to revolve around implementing a custom NUnit attribute that rolls back DB operations after each test executes.

That's great but...

  1. Does this functionality exists somewhere in NUnit natively?
  2. Has this technique been improved upon in the last 4 years?
  3. Is this still the best way to test database-related code?


Edit: it's not that I want to test my DAL specifically, it's more that I want to test pieces of my code that interact with the database. For these tests to be "no-touch" and repeatable, it'd be awesome if I could reset the database after each one.

Further, I want to ease this into an existing project that has no testing in place at the moment. For that reason, I can't practically script up a database and data from scratch for each test.

A: 

I would call these integration tests, but no matter. What I have done for such tests is have my setup methods in the test class clear all the tables of interest before each test. I generally hand write the SQL to do this so that I'm not using the classes under test.
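As a sketch of that setup approach (the table names and connection string here are invented for illustration), the hand-written cleanup might look like:

```csharp
using System.Data.SqlClient;
using NUnit.Framework;

[TestFixture]
public class OrderRepositoryTests
{
    // Hypothetical connection string for illustration.
    private const string ConnectionString =
        "Server=.;Database=Test;Integrated Security=true";

    [SetUp]
    public void ClearTables()
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            // Hand-written SQL so the classes under test aren't involved.
            // Delete child tables before parents to satisfy FK constraints.
            using (var command = new SqlCommand(
                "DELETE FROM OrderItems; DELETE FROM Orders;", connection))
            {
                command.ExecuteNonQuery();
            }
        }
    }

    // [Test] methods go here; each starts from empty tables.
}
```

Only the tables this fixture actually touches need to appear in the DELETE list.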

Generally, I rely on an ORM for my datalayer and thus I don't write unit tests for much there. I don't feel a need to unit test code that I don't write. For code that I add in the layer, I generally use dependency injection to abstract out the actual connection to the database so that when I test my code, it doesn't touch the actual database. Do this in conjunction with a mocking framework for best results.

tvanfosson
Unfortunately this approach is not practical for my projects (hundreds of tables, procedures, gigs of data). This is too high-friction to justify on an existing project.
Michael Haren
But your unit tests should be broken up into smaller, more focused classes that don't touch all of the tables. You only need to deal with the tables this particular test class touches.
tvanfosson
Also, retrofitting unit tests on existing projects is probably best done on an "as needed" basis -- like when you need to refactor or fix a bug. Then you can write a "box" of tests around the existing code to guarantee that your changes don't break things (or fix the bug).
tvanfosson
I wish that were true. I really do. Plus, I don't want to have to write lots and lots of fixture code just to get the db into a "ready to go" state.
Michael Haren
tvanfosson: yes, I agree.
Michael Haren
+1  A: 

I just went to a .NET user group where the presenter said he used SQLite with its in-memory option in test setup and teardown. He had to fudge the connection a little and explicitly destroy it, but it gave a clean DB every time.
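A minimal sketch of that idea, assuming the System.Data.SQLite provider (the table and schema here are made up):

```csharp
using System.Data.SQLite;
using NUnit.Framework;

[TestFixture]
public class InMemoryDbTests
{
    private SQLiteConnection connection;

    [SetUp]
    public void SetUp()
    {
        // An in-memory SQLite database exists only while this connection
        // is open, so it must be held open for the duration of each test.
        connection = new SQLiteConnection("Data Source=:memory:;Version=3;");
        connection.Open();
        using (var create = new SQLiteCommand(
            "CREATE TABLE Orders (Id INTEGER PRIMARY KEY, Total REAL);",
            connection))
        {
            create.ExecuteNonQuery();
        }
    }

    [TearDown]
    public void TearDown()
    {
        // Explicitly destroying the connection discards the database,
        // giving the next test a clean one.
        connection.Dispose();
    }
}
```

The code under test would need to accept this connection (e.g. via dependency injection) rather than open its own.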

http://houseofbilz.com/archive/2008/11/14/update-for-the-activerecord-quotmockquot-framework.aspx

nportelli
A: 

For this sort of testing, I experimented with NDbUnit (working in concert with NUnit). If memory serves, it was a port of DbUnit from the Java platform. It had a lot of slick commands for just the sort of thing you're trying to do. The project appears to have moved here:

http://code.google.com/p/ndbunit/

(it used to be at http://ndbunit.org).

The source appears to be available via this link: http://ndbunit.googlecode.com/svn/trunk/

Scott A. Lawrence
+18  A: 

NUnit now has a [Rollback] attribute, but I prefer to do it a different way. I use the TransactionScope class. There are a couple of ways to use it.

[Test]
public void YourTest() 
{
    using (TransactionScope scope = new TransactionScope())
    {
        // your test code here
    }
}

Since you never call Complete() on the TransactionScope, it rolls back automatically. This works even if an assertion fails or some other exception is thrown.

The other way is to use [SetUp] to create the TransactionScope and [TearDown] to call Dispose on it. This cuts out some code duplication but accomplishes the same thing.

[TestFixture]
public class YourFixture
{
    private TransactionScope scope;

    [SetUp]
    public void SetUp()
    {
        scope = new TransactionScope();
    }

    [TearDown]
    public void TearDown()
    {
        scope.Dispose();
    }


    [Test]
    public void YourTest() 
    {
        // your test code here
    }
}

This is as safe as the using statement in an individual test because NUnit will guarantee that TearDown is called.

Having said all that, I do think that tests that hit the database are not really unit tests. I still write them, but I think of them as integration tests, and I still see them as providing value. One place I use them often is in testing LINQ to SQL code. I don't use the designer; I hand-write the DTOs and attributes, and I've been known to get them wrong. The integration tests help catch my mistakes.

Mike Two
After using this approach for a couple weeks, I'm very happy with it, thanks again!
Michael Haren
I ended up using a very similar pattern, but with a base class that deals with the database trivia, including setting up the connections and whatnot.
Bruno Lopes
The only problem is surely that if you don't commit, you cannot then test that the data has been committed to the database? I.e., I'd like my test to call code that calls the DB, then do some asserts on the DB to verify the data, but finally roll back all those changes when the test or test suite is complete. Though it's a valid point to say these aren't really unit tests. Personally I generally mock the DAL, but it's useful to have explicit DB tests that aren't run on an automated run.
tjmoore
@tjmoore - as long as you query the data using the same connection you can see the rows since you are "inside" the transaction. In the sample code in the answer above you would call something that did some inserts perhaps and then query the data back and check that it contains the expected values. Once the test completes and the `TransactionScope` rolls back nothing will be left. Since your query is enlisted in the same transaction as the insert the rows will be found. Generally I also mock the DAL, but as you said, sometimes you just have to prove the insert is going to work.
Mike Two
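The point in the comment above can be sketched like this (the table, schema, and connection string are invented for illustration):

```csharp
using System.Data.SqlClient;
using System.Transactions;
using NUnit.Framework;

[TestFixture]
public class OrderInsertTests
{
    // Hypothetical connection string for illustration.
    private const string ConnectionString =
        "Server=.;Database=Test;Integrated Security=true";

    [Test]
    public void InsertIsVisibleInsideTheTransaction()
    {
        using (var scope = new TransactionScope())
        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open(); // enlists in the ambient transaction

            using (var insert = new SqlCommand(
                "INSERT INTO Orders (Total) VALUES (9.99)", connection))
            {
                insert.ExecuteNonQuery();
            }

            // Queried inside the same transaction, so the row is visible.
            using (var check = new SqlCommand(
                "SELECT COUNT(*) FROM Orders WHERE Total = 9.99", connection))
            {
                Assert.AreEqual(1, (int)check.ExecuteScalar());
            }
        } // scope.Complete() was never called, so everything rolls back here
    }
}
```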
Aha, you're right, just tried it out. Thanks.
tjmoore
Of course, what I've realised is the code I have under test uses a DAL that maintains the DB connections (actually via the Enterprise Library) and without a re-write just for the tests, the connection is opened and closed on each DAL operation. Ah well, another approach then.
tjmoore
@tjmoore - It should still work. That's the thing about `TransactionScope`. When a new connection is created it will automatically enlist in the current `TransactionScope`. Since the transaction scope will span multiple connections you will need to enable the MSDTC (Microsoft Distributed Transaction Coordinator). But it will work. I've used this code in that situation.
Mike Two
A: 

Consider creating a database script so that you can run it automatically from NUnit as well as manually for other types of testing. For example, if using Oracle then kick off SqlPlus from within NUnit and run the scripts. These scripts are usually faster to write and easier to read. Also, very importantly, running SQL from Toad or equivalent is more illuminating than running SQL from code or going through an ORM from code. Generally I'll create both a setup and teardown script and put them in setup and teardown methods.
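One way to wire that up is to shell out to SqlPlus from the NUnit setup and teardown methods. A rough sketch (the credentials, TNS alias, and script names are placeholders):

```csharp
using System.Diagnostics;
using NUnit.Framework;

[TestFixture]
public class ScriptedDbTests
{
    private static void RunScript(string scriptPath)
    {
        // Launch SqlPlus against the test schema and run the script.
        var info = new ProcessStartInfo(
            "sqlplus", "user/password@testdb @" + scriptPath)
        {
            UseShellExecute = false
        };
        using (var process = Process.Start(info))
        {
            process.WaitForExit();
            Assert.AreEqual(0, process.ExitCode,
                "Script failed: " + scriptPath);
        }
    }

    [SetUp]
    public void SetUp() { RunScript("setup.sql"); }

    [TearDown]
    public void TearDown() { RunScript("teardown.sql"); }
}
```

The same setup.sql and teardown.sql can then be run by hand from Toad or SqlPlus when debugging.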

Whether you should be going through the DB at all from unit tests is another discussion. I believe it often does make sense to do so. For many apps the database is the absolute center of action, the logic is highly set based, and all the other technologies and languages and techniques are passing ghosts. And with the rise of functional languages we are starting to realize that SQL, like JavaScript, is actually a great language that was right there under our noses all these years.

Just as an aside, Linq to SQL (which I like in concept though have never used) almost seems to me like a way to do raw SQL from within code without admitting what we are doing. Some people like SQL and know they like it, others like it and don't know they like it. :)

Mike