views: 419
answers: 2
I'm working on a large-scale, performance-critical ASP web application with a mostly denormalized database (lots of data is duplicated across tables for performance reasons). There is almost no sense of any kind of n-tiered design, and I can't touch the database; it's carved in stone.

Now I need to build some additional applications in C# (.NET 3.5) that access the very same data. While doing so, I've been asked to build a new "core" that can later be used when moving on to MVC, so I guess it would be a good idea to introduce some kind of data access layer holding the entity classes and a repository responsible for the CRUD actions.
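To make the idea concrete, here's the kind of generic repository I have in mind — just a sketch over a LINQ to SQL DataContext; the actual entities and connection details would come from our own mapping:

```csharp
using System.Linq;
using System.Data.Linq;

// Minimal generic repository wrapping a LINQ to SQL DataContext.
// T must be a mapped entity type known to the context.
public class Repository<T> where T : class
{
    private readonly DataContext _context;
    private readonly Table<T> _table;

    public Repository(DataContext context)
    {
        _context = context;
        _table = context.GetTable<T>();
    }

    // Callers can compose further Where/OrderBy clauses before execution.
    public IQueryable<T> GetAll()
    {
        return _table;
    }

    public void Add(T entity)
    {
        _table.InsertOnSubmit(entity);
    }

    public void Delete(T entity)
    {
        _table.DeleteOnSubmit(entity);
    }

    // Flushes all pending inserts/updates/deletes in one unit of work.
    public void Save()
    {
        _context.SubmitChanges();
    }
}
```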

I've been reading about the repository pattern and LINQ to SQL, so I guess I know the very basics. The new applications were, of course, due two weeks ago, so I can't really take tons of time to create a huge framework or attend classes before I get to work. I'm willing to read a book a day, but I need to get started quickly.

My question is: am I on the right path in thinking about LINQ to SQL and the repository pattern as a solution? Why not generate the entities using your favorite code generator and query the database old-style using disconnected recordsets? Do I lose control over the SQL queries when using LINQ to SQL? Should I worry about performance? Am I completely on the wrong track?

I hope I explained the issue at hand properly. If there are any open questions, feel free to ask.

+1  A: 

There are tons of mapping tools out there intended to help with this task, but for practical maintainability you're probably better off just doing it yourself (especially since it's unlikely that this database model will change).

If I were in your situation, I'd create plain classes that represent what you're storing in your database, and then create classes with static methods for your data layer. Each method does exactly one task your application needs (add, edit, get, search, etc.) and uses the built-in SqlCommand/SqlDataReader objects to do it, preferably calling stored procedures. The 'get' type methods then return instances (or lists) of those entity classes.
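For example, a 'get' method along these lines — the Customer class, stored procedure name, and connection string are placeholders for whatever your schema actually has:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// Plain entity class representing one table's data.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Static data-layer class: one method per task the application needs.
public static class CustomerData
{
    // In practice, read this from configuration.
    private const string ConnectionString = "...";

    public static List<Customer> GetAll()
    {
        var result = new List<Customer>();
        using (var conn = new SqlConnection(ConnectionString))
        using (var cmd = new SqlCommand("dbo.Customer_GetAll", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    result.Add(new Customer
                    {
                        Id = reader.GetInt32(reader.GetOrdinal("Id")),
                        Name = reader.GetString(reader.GetOrdinal("Name"))
                    });
                }
            }
        }
        return result;
    }
}
```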

This process is more time-consuming than using a framework or LINQ to SQL, but you won't find another way that performs better or is more flexible (since you're actually writing the code; it's not generated).

John
To save a bit of time on the database-call side, I'd use the Microsoft.ApplicationBlocks.Data library and its SqlHelper object to perform the data calls!
Mitchel Sellers
A: 

We faced the exact same issue a few weeks ago. We initially code-generated our entities, DAL, and basic BLL, but the code we would have had to write to load referenced entities and collections would have been extremely complex without a centralized generic solution.

We already had a great home-grown code generator that we were using, and almost all of our business entities weren't aggregates; they were mostly 1:1 with database tables. We really wanted to use LINQ to SQL, but we use attributes for validation, and the LinqToSql designer did not make this possible. Nor did it allow us to send queries across the wire.

So we decided to use our code generator to generate our LINQ to SQL entities. We are also using InterLinq, which allows us to send queries across the wire. We have a repository factory that handles all of our LINQ to SQL queries; basically, it just wraps either a LinqToSql DataContext or an InterLinq context, depending on whether or not we need to send queries across the wire.
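A sketch of that factory idea — illustrative only; the InterLinq side is stubbed out since its concrete API isn't part of this answer, and all type names here are made up:

```csharp
using System;
using System.Linq;
using System.Data.Linq;

// Common abstraction over a local and a remote (over-the-wire) query source.
public interface IQueryableContext
{
    IQueryable<T> Query<T>() where T : class;
}

// Local case: wraps an ordinary LINQ to SQL DataContext.
public class LocalQueryableContext : IQueryableContext
{
    private readonly DataContext _context;

    public LocalQueryableContext(DataContext context)
    {
        _context = context;
    }

    public IQueryable<T> Query<T>() where T : class
    {
        return _context.GetTable<T>();
    }
}

public static class RepositoryFactory
{
    public static IQueryableContext Create(DataContext context, bool overTheWire)
    {
        if (overTheWire)
        {
            // An InterLinq-backed IQueryableContext would be returned here;
            // omitted because its API isn't shown in this answer.
            throw new NotSupportedException("InterLinq context omitted in this sketch.");
        }
        return new LocalQueryableContext(context);
    }
}
```

Consumers then write the same LINQ queries regardless of whether they run locally or get serialized across the wire.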

This solution has worked great for us, and allows us to directly query data and keep it live.

Micah