We are currently embarking on replacing the ADO.NET stack in our C# application with Linq.

Because the application was not architected with a data abstraction layer, there are ADO calls throughout just about every layer of the application, to such a degree that compartmentalizing any one object and trying to convert it to Linq sends you down a labyrinth of rabbit holes.

What I am asking for are strategies or approaches to tackling such wholesale, systemic changes while ensuring proper testing and a minimal 'drop tools' wind-down period (the ability to shelve changes at a moment's notice and come back to them at a later date).

We have toyed with the following:

  • Create a mirror object of each object with the new code, which means maintaining two code bases until the conversion is complete.
  • Prefix all ADO function names with ADO_ and create Linq versions under the original names.
  • Have a system-wide FLAG denoting whether to use ADO or Linq, and wrap every ADO call in if (FLAG) { ADO } else { Linq } (see the sketch after this list), which means going back after the conversion to remove all the ADO references.
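
For concreteness, here is a minimal sketch of what that flag-wrapping option tends to look like. Everything here is hypothetical: the Customer entity, the AppDataContext, the "Main" connection string, and the "UseLinq" app setting.

    using System;
    using System.Configuration;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Data.SqlClient;
    using System.Linq;

    // Hypothetical Linq2SQL-mapped entity.
    [Table(Name = "Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public string Name;
    }

    // Hypothetical DataContext over the existing database.
    public class AppDataContext : DataContext
    {
        public AppDataContext(string connectionString) : base(connectionString) { }
        public Table<Customer> Customers { get { return GetTable<Customer>(); } }
    }

    public static class CustomerData
    {
        private static readonly string ConnStr =
            ConfigurationManager.ConnectionStrings["Main"].ConnectionString;

        // The system-wide flag, read from a hypothetical "UseLinq" app setting.
        private static readonly bool UseLinq =
            "true".Equals(ConfigurationManager.AppSettings["UseLinq"],
                          StringComparison.OrdinalIgnoreCase);

        // Every call site funnels through the flag; after the conversion,
        // the ADO branch and the flag itself still have to be deleted by hand.
        public static Customer GetCustomer(int id)
        {
            return UseLinq ? GetCustomerLinq(id) : GetCustomerAdo(id);
        }

        private static Customer GetCustomerAdo(int id)
        {
            using (var conn = new SqlConnection(ConnStr))
            using (var cmd = new SqlCommand(
                "SELECT Id, Name FROM Customers WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                {
                    if (!reader.Read()) return null;
                    return new Customer { Id = reader.GetInt32(0), Name = reader.GetString(1) };
                }
            }
        }

        private static Customer GetCustomerLinq(int id)
        {
            using (var db = new AppDataContext(ConnStr))
            {
                return db.Customers.SingleOrDefault(c => c.Id == id);
            }
        }
    }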

Every suggestion so far is cringe-worthy.

What do you guys/gals suggest?

NOTE: I removed '(ADO to Linq)' from the title because I am looking for more generic answers and practices, not just ones limited to the ADO-to-Linq conversion used as an example here.

+2  A: 

You can still leverage all of the stored procedures/functions originally created for your solution with Linq2SQL. Using strategies like those described in Michael Feathers' Working Effectively with Legacy Code, you can create boundaries around regions of the application and update within those boundaries.
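
For example, an existing stored procedure can be surfaced as a strongly typed method on the DataContext. This sketch follows the pattern the Linq2SQL designer generates; dbo.GetCustomersByRegion and the Customer entity are hypothetical stand-ins for your own objects:

    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Reflection;

    [Table(Name = "Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public string Name;
        [Column] public int RegionId;
    }

    public class AppDataContext : DataContext
    {
        public AppDataContext(string connectionString) : base(connectionString) { }

        // Wraps the existing dbo.GetCustomersByRegion stored procedure, so the
        // procedures written for the ADO version keep earning their keep.
        [Function(Name = "dbo.GetCustomersByRegion")]
        public ISingleResult<Customer> GetCustomersByRegion(
            [Parameter(Name = "RegionId", DbType = "Int")] int regionId)
        {
            IExecuteResult result = ExecuteMethodCall(
                this, (MethodInfo)MethodInfo.GetCurrentMethod(), regionId);
            return (ISingleResult<Customer>)result.ReturnValue;
        }
    }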

The strategy I have used in the past is to leave the database and its objects alone. I then slowly iterate through the different ADO calls, replacing each with a Linq2SQL call, until the entire application is using Linq2SQL. After that, I find it easier to transform the methods that used to expose and pass DataSets/DataTables so they return more appropriate entities.
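
A sketch of one such replacement step, again with hypothetical names; the payoff is that the method's callers switch from consuming a loosely typed DataTable to consuming real entities:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.Linq;
    using System.Data.Linq.Mapping;
    using System.Data.SqlClient;
    using System.Linq;

    [Table(Name = "Customers")]
    public class Customer
    {
        [Column(IsPrimaryKey = true)] public int Id;
        [Column] public string Name;
        [Column] public bool IsActive;
    }

    public class AppDataContext : DataContext
    {
        public AppDataContext(string connStr) : base(connStr) { }
    }

    public class CustomerQueries
    {
        private readonly string connStr;
        public CustomerQueries(string connStr) { this.connStr = connStr; }

        // Before: the legacy call hands its callers a DataTable.
        public DataTable GetActiveCustomersAdo()
        {
            using (var da = new SqlDataAdapter(
                "SELECT Id, Name FROM Customers WHERE IsActive = 1", connStr))
            {
                var table = new DataTable();
                da.Fill(table);
                return table;
            }
        }

        // After: the same query through Linq2SQL, returning entities.
        public List<Customer> GetActiveCustomersLinq()
        {
            using (var db = new AppDataContext(connStr))
            {
                return db.GetTable<Customer>()
                         .Where(c => c.IsActive)
                         .ToList();
            }
        }
    }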

Another approach (for DataSet/DataTable-heavy solutions) is to keep the ADO calls in place and instead use the extension methods AsQueryable() and/or OfType<T>() to transform the DataSet/DataTable items into appropriate entities, then aggregate those changes into a more cohesive DAL.
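
A sketch of that shaping step, assuming a hypothetical Customer entity. AsEnumerable()/Field<T>() come from System.Data.DataSetExtensions; OfType<DataRow>() works directly on the non-generic Rows collection if you'd rather not take that reference:

    using System.Collections.Generic;
    using System.Data;
    using System.Linq;

    public class Customer { public int Id; public string Name; }  // hypothetical entity

    public static class DataTableShims
    {
        // Shapes rows from a DataSet the existing ADO code already fills,
        // without touching that code.
        public static List<Customer> ToCustomers(DataSet ds)
        {
            return ds.Tables["Customers"].AsEnumerable()   // System.Data.DataSetExtensions.dll
                     .Select(row => new Customer
                     {
                         Id = row.Field<int>("Id"),
                         Name = row.Field<string>("Name")
                     })
                     .ToList();
        }

        // The same projection via OfType<T>(), needing only System.Core.
        public static List<Customer> ToCustomersViaOfType(DataSet ds)
        {
            return ds.Tables["Customers"].Rows.OfType<DataRow>()
                     .Select(row => new Customer
                     {
                         Id = (int)row["Id"],
                         Name = (string)row["Name"]
                     })
                     .ToList();
        }
    }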

Jonathan Bates
+3  A: 

You should really have automated unit tests before making any changes. In fact, you should make no changes to code that doesn't have at least 80% unit test coverage.

In the real world, unit tests often do not exist. On the other hand, doing any refactoring without unit tests can totally screw up your code, making Management even less likely to permit you to make changes in the future. What to do?

Using a tool like ReSharper, you can begin by applying some of the "safer" refactorings. With care, there's no reason you can't repeatedly use "Extract Method" to move your ADO.NET code into separate methods, "Make Method Static" if it wasn't already static, and then either "Move Method" or "Make Method Non-Static" to move the method into a separate class.
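
For instance, ADO.NET code buried in a UI event handler can be moved, step by step and without changing behavior, into a class of its own. All names here are hypothetical:

    using System.Data.SqlClient;

    // After "Extract Method", "Make Method Static", and "Move Method", the
    // ADO.NET code that used to sit inside a button-click handler lives in
    // its own class, where tests can reach it:
    public static class CustomerGateway
    {
        public static void RenameCustomer(string connStr, int id, string name)
        {
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(
                "UPDATE Customers SET Name = @name WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                cmd.Parameters.AddWithValue("@name", name);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }

    // The original handler shrinks to a single delegating call:
    //     private void SaveButton_Click(object sender, EventArgs e)
    //     {
    //         CustomerGateway.RenameCustomer(connStr, customerId, nameTextBox.Text);
    //     }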

Once you've got the code moved out, you can begin to write some automated tests. At this stage, they don't need to be "unit tests" in the strict sense of the term. In particular, these tests should be allowed to work with the database.
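
For example (NUnit here, though any framework works), a first test can exercise the extracted method against a throwaway test database. This reuses the hypothetical CustomerGateway from the sketch above, and the connection string is equally hypothetical:

    using System.Data.SqlClient;
    using NUnit.Framework;

    [TestFixture]
    public class CustomerGatewayTests
    {
        // Points at a throwaway test database, not production.
        private const string TestDb =
            "Server=.;Database=AppTest;Integrated Security=true";

        [Test]
        public void RenameCustomer_UpdatesTheRow()
        {
            CustomerGateway.RenameCustomer(TestDb, 1, "Renamed");

            // Read the row back with plain ADO to verify the change landed.
            using (var conn = new SqlConnection(TestDb))
            using (var cmd = new SqlCommand(
                "SELECT Name FROM Customers WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", 1);
                conn.Open();
                Assert.AreEqual("Renamed", (string)cmd.ExecuteScalar());
            }
        }
    }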

When you're left only with code that can't easily be unit tested, you can then, very carefully, start making that code more testable. You can do things like turn sets of static methods into instance methods of new classes. You can also begin to introduce dependency injection to make it easier to test using mock objects. But be very careful here: you're modifying code that has no automated tests, and you'll be using refactorings that can actually break stuff.
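
One sketch of that shape: put an interface over the extracted data access and hand it in through the constructor, so a fake (hand-rolled below, though a mocking library works just as well) can stand in during tests. All names are hypothetical:

    using System;

    public interface ICustomerGateway
    {
        void RenameCustomer(int id, string name);
    }

    // The class under test receives its dependency instead of newing it up,
    // so its tests never have to touch a database.
    public class CustomerService
    {
        private readonly ICustomerGateway gateway;
        public CustomerService(ICustomerGateway gateway) { this.gateway = gateway; }

        public void Rename(int id, string name)
        {
            if (string.IsNullOrEmpty(name))
                throw new ArgumentException("A customer name is required.", "name");
            gateway.RenameCustomer(id, name);
        }
    }

    // A hand-rolled fake records the call for the test to inspect.
    public class FakeCustomerGateway : ICustomerGateway
    {
        public int RenamedId;
        public string RenamedTo;
        public void RenameCustomer(int id, string name) { RenamedId = id; RenamedTo = name; }
    }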

Once you've got the code adequately tested, you can reorganize it to make better sense, and then modify it to use LINQ if you like.

John Saunders