I have written quite a bit of code which uses the Linq2Sql table relationships provided just by having foreign keys on my database. However, mocking data for my unit tests is proving laborious, because I have to manually set up all of the relationships in my test harness.

So, I am wondering if writing Linq joins rather than relying on the relationships would give me more easily testable and possibly more performant code.

        var query =
            from orderItem in data.OrderItems
            select new
            {
                orderItem.Order.Reference,
                orderItem.SKU,
                orderItem.Quantity,
            };

        Console.WriteLine("Relationship Method");
        query.ToList().ForEach(x => Console.WriteLine("Reference = {0}, {1} x {2}", x.Reference, x.Quantity, x.SKU));

        var query2 =
            from orderItem in data.OrderItems
            join order in data.Orders
                on orderItem.OrderID equals order.OrderID
            select new
            {
                order.Reference,
                orderItem.SKU,
                orderItem.Quantity,
            };

        Console.WriteLine();
        Console.WriteLine("Join Method");
        query2.ToList().ForEach(x => Console.WriteLine("Reference = {0}, {1} x {2}", x.Reference, x.Quantity, x.SKU));

Both queries above give me the same result, but is one better than the other in terms of performance or testability?
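To make the "manually set up any relationships" pain concrete, here is a minimal sketch using hand-rolled stand-ins for the generated classes (the `Order`/`OrderItem` property names mirror the queries above, but the classes themselves are assumptions for illustration):

```csharp
using System;

// Hypothetical stand-ins for the Linq2Sql generated entities.
public class Order
{
    public int OrderID { get; set; }
    public string Reference { get; set; }
}

public class OrderItem
{
    public int OrderID { get; set; }
    public string SKU { get; set; }
    public int Quantity { get; set; }
    public Order Order { get; set; } // association property
}

public static class Demo
{
    public static string Describe(OrderItem item)
    {
        // Relationship style: this dereference of item.Order is what
        // forces a test to wire up the association by hand.
        return string.Format("Reference = {0}, {1} x {2}",
            item.Order.Reference, item.Quantity, item.SKU);
    }

    public static void Main()
    {
        var order = new Order { OrderID = 1, Reference = "ORD-001" };
        var item = new OrderItem { OrderID = 1, SKU = "ABC", Quantity = 2 };

        // Without this line, Describe() throws a NullReferenceException:
        item.Order = order;

        Console.WriteLine(Describe(item));
    }
}
```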

+1  A: 

In terms of performance, both queries will evaluate to the same SQL (Scott Guthrie has a blog post on how to view the SQL generated by LINQ queries). I don't think that either option is inherently more "testable" than the other. However, I prefer to use the foreign keys and relationships because when using SQL Metal it lets you know really quickly that your database has the appropriate keys.

sgriffinusa
I also prefer to use FKs, but I am talking about mocking my Linq2Sql repository so I can unit test any complex queries I am doing, and also unit test any surrounding logic. Perhaps I am missing a trick in making my code testable; I am just using the code generated in Visual Studio when I create my dbml. Does SQLMetal generate classes that are more easily mockable (and hence testable)?
Antony Scott
SQLMetal will give you true POCO classes instead of hiding everything under the designer as DBML does. However, I don't think that it will save you much as you are worried about relationships between particular instances, not just the structure, in testing.
sgriffinusa
+1  A: 

What are you testing? Linq to SQL's ability to read data? Since Linq to SQL is a thin veneer over the database, the Linq to SQL code itself is generally considered "pristine," and therefore doesn't need to be tested.

I am hugely not in favor of complicating your code in this way just so that you can mock out the Linq to SQL DBML. If you want to test your business logic, it is far better to hook up a test database to the DBML (there is a constructor overload on the DataContext that allows you to do this) and use database transactions to test your data interactions. That way, you can roll the transaction back to undo the changes, leaving the test database in its original state.
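A minimal sketch of that transaction-per-test pattern, using `TransactionScope`. The `MyDataContext` lines are commented out because they assume a real test database; the names and connection string are placeholders:

```csharp
using System;
using System.Transactions;

public static class TransactionalTest
{
    public static bool RunTest()
    {
        using (var scope = new TransactionScope())
        {
            // Hypothetical test-database usage, per the answer above:
            // var db = new MyDataContext("connection string to test DB");
            // db.Orders.InsertOnSubmit(new Order { ... });
            // db.SubmitChanges();
            // ... assert on business logic here ...

            // Deliberately NOT calling scope.Complete(): disposing the
            // scope rolls the transaction back, leaving the DB untouched.
        }
        return true;
    }

    public static void Main()
    {
        Console.WriteLine(TransactionalTest.RunTest() ? "rolled back" : "committed");
    }
}
```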

Robert Harvey
Interesting, that's something I've been considering recently.
Antony Scott
From a *unit* testing point of view, isolation from the database is a good thing. However, I take your point that from a pragmatic POV, just running against a database is much, much easier. The purist in me wants to call that an integration test, though.
Damian Powell
Agreed, it's really an integration test, and creating a database is also much slower. I have a mock Linq2Sql repository which uses generic lists, into which I am "injecting" objects in lieu of the records which would be in the database. What I am struggling with is how to easily re-create the POCO objects (if you can call the Linq2Sql generated classes that!) along with all the implied relationships you get with FKs.
Antony Scott
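A hypothetical minimal version of the list-backed fake repository described in the comment above: generic lists exposed as `IQueryable` so the explicit-join query from the question runs unchanged against LINQ to Objects. All names here (`IOrderRepository`, `FakeOrderRepository`, the entities) are assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical entity stand-ins mirroring the question's columns.
public class Order { public int OrderID; public string Reference; }
public class OrderItem { public int OrderID; public string SKU; public int Quantity; }

public interface IOrderRepository
{
    IQueryable<Order> Orders { get; }
    IQueryable<OrderItem> OrderItems { get; }
}

// Generic lists standing in for the database tables.
public class FakeOrderRepository : IOrderRepository
{
    public readonly List<Order> OrderList = new List<Order>();
    public readonly List<OrderItem> OrderItemList = new List<OrderItem>();
    public IQueryable<Order> Orders { get { return OrderList.AsQueryable(); } }
    public IQueryable<OrderItem> OrderItems { get { return OrderItemList.AsQueryable(); } }
}

public static class Queries
{
    // The join method from the question, written against the interface
    // so it works with both the real data context wrapper and the fake.
    public static List<string> OrderLines(IOrderRepository data)
    {
        var query =
            from orderItem in data.OrderItems
            join order in data.Orders
                on orderItem.OrderID equals order.OrderID
            select string.Format("Reference = {0}, {1} x {2}",
                order.Reference, orderItem.Quantity, orderItem.SKU);
        return query.ToList();
    }

    public static void Main()
    {
        var repo = new FakeOrderRepository();
        repo.OrderList.Add(new Order { OrderID = 1, Reference = "ORD-001" });
        repo.OrderItemList.Add(new OrderItem { OrderID = 1, SKU = "ABC", Quantity = 2 });
        Queries.OrderLines(repo).ForEach(Console.WriteLine);
    }
}
```

Note that the join needs no association properties at all, which is why the explicit-join style sidesteps the relationship-wiring problem in tests.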
+1  A: 

I don't think either approach has an advantage in either performance or testability. The first form is easier to read though, and so I would personally go with that. It's a subjective matter though.

It seems to me that your problem lies with being able to setup your data in an easy way, and have the foreign key values and entity references remain consistent. I don't think that's an easy thing to solve. You could write some sort of framework which creates object proxies and uses the entity metadata to intercept FK and related entity property setters in order to sync them up, but before you know it, you'll have implemented an in-memory database!
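Short of a full proxy framework, a tiny builder can at least keep the two sides consistent for test data: one method sets both the FK value and the entity reference together. A sketch, again using hypothetical stand-in entities:

```csharp
using System;

// Hypothetical entity stand-ins mirroring the question's columns.
public class Order
{
    public int OrderID { get; set; }
    public string Reference { get; set; }
}

public class OrderItem
{
    public int OrderID { get; set; }
    public string SKU { get; set; }
    public int Quantity { get; set; }
    public Order Order { get; set; }
}

public static class TestData
{
    public static OrderItem ItemFor(Order order, string sku, int quantity)
    {
        return new OrderItem
        {
            OrderID = order.OrderID, // FK column value
            Order = order,           // association reference, set in step
            SKU = sku,
            Quantity = quantity
        };
    }

    public static void Main()
    {
        var order = new Order { OrderID = 7, Reference = "ORD-007" };
        var item = TestData.ItemFor(order, "XYZ", 3);
        Console.WriteLine(item.OrderID == item.Order.OrderID); // both sides agree
    }
}
```

This only covers one direction (item to order); syncing collections on the parent side as well is where the "in-memory database" slope starts.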

Damian Powell
That is exactly the problem I face, and you're right, it's a big task to build some kind of framework to "wire up" all the relationships.
Antony Scott
I have started using the Moq framework and stuck with the query syntax and explicit joins. This is working nicely for me right now and allows me to have tests around all of my business logic (including the LINQ queries).
Antony Scott