I have been having some issues with LINQ-To-SQL around memory usage. I'm using it in a Windows Service to do some processing, and I'm looping through a large amount of data that I'm pulling back from the context. Yes - I know I could do this with a stored procedure but there are reasons why that would be a less than ideal solution.

Anyway, what I see basically is that memory is not being released even after I call context.SubmitChanges(). So I end up having to do all sorts of weird things like only pulling back 100 records at a time, or creating several contexts and having them each do separate tasks. If I keep the same DataContext and use it later for other calls, it just eats up more and more memory. Even if I call Clear() on the "var tableRows" array that the query returns to me, set it to null, and call System.GC.Collect() - it still doesn't release the memory.

Now I've read a bit about how you should create DataContexts and dispose of them quickly, but it seems like there ought to be a way to force the context to dump all its data (or all its tracking data for a particular table) at a certain point to guarantee the memory is freed.

Anyone know what steps guarantee that the memory is released?

+11  A: 

A DataContext tracks all the objects it has ever fetched. It won't release them until it is disposed.

This is the right way to go:

using (DataContext myDC = new DataContext())
{
    // Do stuff
} // DataContext is disposed
David B
+5  A: 

If you don't need object tracking, set DataContext.ObjectTrackingEnabled to false. If you do need it, you can use reflection to call the internal DataContext.ClearCache(); be aware that, since it's internal, it's subject to disappear in a future version of the framework. As far as I can tell, the framework itself doesn't use it, but it does clear the object cache.
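For reference, a minimal sketch of calling the internal method through reflection (the extension-class name is my own, and since ClearCache() is internal and undocumented it could stop working in a later framework version):

using System.Data.Linq;
using System.Reflection;

public static class DataContextExtensions
{
    // Invokes the internal DataContext.ClearCache() method via reflection,
    // dropping the context's tracked objects without disposing it.
    public static void ClearCache(this DataContext context)
    {
        MethodInfo clearCache = typeof(DataContext).GetMethod(
            "ClearCache",
            BindingFlags.Instance | BindingFlags.NonPublic);

        if (clearCache != null)
        {
            clearCache.Invoke(context, null);
        }
    }
}

// Usage after each processed batch:
//    context.SubmitChanges();
//    context.ClearCache();
//
// If the data is read-only, setting context.ObjectTrackingEnabled = false
// before running any queries avoids the tracking cost entirely (but then
// SubmitChanges() can't be used).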

Mark Cidade
Note that as the other gentlemen said, it probably is better to use many DataContexts in this situation. But, since the question was how to guarantee releasing memory within one context, the ClearCache() method is closer to the answer.
Sam Schutte
Yes, whether to use one or many DataContexts depends on the amount of data you're dealing with. You can learn how to handle this from Rob Conery's MVC Storefront series: http://blog.wekeroad.com/category/mvc-storefront
Angel Escobedo
+4  A: 

As David points out, you should dispose of the DataContext using a using block.

It seems that your primary concern is having to create and dispose of a bunch of DataContext objects, but this is how LINQ to SQL is designed: the DataContext is meant to have a short lifetime. Since you are pulling a lot of data from the database, it makes sense that there will be a lot of memory usage. You are on the right track by processing your data in chunks.

Don't be afraid of creating a ton of DataContexts. They are designed to be used that way.
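To illustrate, here is a minimal sketch of a short-lived context per chunk; the names MyDataContext, Depts, Dept, and Id are placeholders, not from the original question:

int skipAmount = 0;
const int chunkSize = 100;

while (true)
{
    // A fresh context per chunk: everything it tracked becomes
    // collectible as soon as the using block ends.
    using (MyDataContext context = new MyDataContext())
    {
        var rows = context.Depts
                          .Where(x => x.Dept == "Dept")
                          .OrderBy(x => x.Id)   // stable order so paging is deterministic
                          .Skip(skipAmount)
                          .Take(chunkSize)
                          .ToList();

        if (rows.Count == 0)
            break;   // out of rows

        foreach (var row in rows)
        {
            // make changes to row
        }

        context.SubmitChanges();
        skipAmount += rows.Count;
    }
}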

Stefan Rusek
+1  A: 

Thanks guys - I will check out the ClearCache method. Just for clarification (for future readers), the situation in which I was seeing the memory usage was something like this:

using (DataContext context = new DataContext())
{
   int skipAmount = 0;

   while (true)
   {
      var rows = context.tables
                        .Where(x => x.Dept == "Dept")
                        .Skip(skipAmount)
                        .Take(100)
                        .ToList();

      // break out of the loop when out of rows
      if (rows.Count == 0)
         break;

      foreach (var t in rows)
      {
         // make changes to t
      }

      context.SubmitChanges();
      skipAmount += rows.Count;

      rows.Clear();
      rows = null;

      // at this point, even though the rows have been cleared and the changes have been
      // submitted, the context is still holding onto a reference somewhere to the
      // tracked rows. So unless you create a new context, memory usage keeps on growing
   }
}
Sam Schutte