views:

156

answers:

6

In a project we have implemented the data access layer (DAL) with a visual designer that auto-generates a lot of code (in our case: strong-typed DataSets and DataSetTableAdapters in .NET).

However, with source control I find it troublesome to edit and extend the DAL. We have started coding new data access by manually writing the SQL statements (in our case, ADO.NET SqlCommands, etc.), which seems cleaner to me to edit, and especially to review changes to via source control.
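For context, here is a minimal sketch of the hand-written ADO.NET style I mean. The table, column, and method names are made up for illustration only:

```csharp
using System.Data;
using System.Data.SqlClient;

public static class CustomerData
{
    // Hypothetical hand-coded query: the SQL is plain text in source,
    // so diffs in source control show exactly what changed.
    public static DataTable GetCustomersByCity(string connectionString, string city)
    {
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT Id, Name FROM Customers WHERE City = @City", connection))
        {
            // Parameterized to avoid SQL injection.
            command.Parameters.AddWithValue("@City", city);
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(table);
            }
        }
        return table;
    }
}
```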

But I'm also worried about mixing the methods of data access. What would you suggest? Stick with the auto-generation method, continue converting to 'manual' SQL statements when changes are needed, or something else?

Edit: Inspired by the nice answers that address the general problem of switching data access strategy, I have generalized the formulation of the question.

The handling of the model data is not very object-oriented. We use .NET DataTables instead of custom objects.

+1  A: 

If you are completely rewriting your DAL, then perhaps you should look into a persistence framework such as Spring.NET or NHibernate.

Andrew Hare
+1  A: 

... or SubSonic from Rob Conery :)

See at least the three videos you can find on the project page.

balexandre
+4  A: 

Yes, stick with what you have and refactor as necessary and convenient. I hate to disagree with the other answers and confuse things, but by switching to another data access framework you may just be trading one set of headaches for another. Go with the basics you are comfortable with.

Goblyn27
+2  A: 

Going straight to "naked" ADO.NET and doing all the DAL work manually in code seems like a lot of effort. I would definitely recommend looking at an OR-mapper such as those recommended by others (NHibernate, SubSonic, etc.).

If your backend is SQL Server and you're on .NET 3.0 or higher, you could also look into LINQ to SQL (rumors of its death are greatly exaggerated) for simpler scenarios (a 1:1 mapping from database tables to your data-layer objects), or Entity Framework for more advanced ones (lots of tables, lots of object inheritance in your domain model, complicated mapping scenarios).

EF has been getting a lot of bad rap lately, which is only partly justified, IMHO, and the new version, EF v4 (due out with .NET 4.0 and VS 2010 sometime this year or next), looks really promising.
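To illustrate the simpler 1:1 scenario mentioned above, here's a minimal LINQ to SQL sketch. The `Customer` class and column names are assumptions for the example, not your schema:

```csharp
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Each attributed class maps 1:1 to a table.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int Id;
    [Column] public string Name;
    [Column] public string City;
}

public static class CustomerQueries
{
    public static System.Collections.Generic.List<Customer> InCity(
        string connectionString, string city)
    {
        using (var db = new DataContext(connectionString))
        {
            // The query is composed in C#; LINQ to SQL generates the SQL.
            return db.GetTable<Customer>()
                     .Where(c => c.City == city)
                     .ToList();
        }
    }
}
```

The trade-off versus hand-written SqlCommands: you lose direct control of the SQL text, but mapping changes become ordinary, diffable C# edits.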

Marc

marc_s
+1  A: 

The DAL is usually the most over-engineered part of any application. Keep it simple and build what you need now or reasonably anticipate needing very soon. SqlCommand, SqlDataReader, etc. are still relatively abstract compared to the nuts and bolts of actually connecting to, reading from, and writing to a database.

That said, your code will be more readable if you use just one general approach.

John M Gant
+2  A: 

If the choice is between converting to manual ADO.NET or continuing with DataSets + TableAdapters, I think you're better off staying with the DataSets. You get CRUD for free, and thus spend less time creating and maintaining SQL that doesn't add any value.

From the way you phrase your question, it doesn't sound like you're going for a more object-oriented approach either, which would otherwise be an argument for moving away from the DataSet + TableAdapter approach.

You might also want to do some research/prototyping in the OR-mapper + plain objects space if you have increasingly complex business logic to handle, though it's less effective for a RAD approach. I'd check out LINQ to SQL (if you have a simple schema/object structure and are happy with a 1:1 mapping between objects and tables) or NHibernate if I were you. Entity Framework isn't mature enough yet. The next version will be better, but it is still potentially a long time coming.

Rune Sundling