Let's say I have a query with a very large result set (100,000+ rows) and I need to loop through the results and perform an update:

var ds = context.Where(/* query */).Select(e => new { /* fields */ });

foreach (var d in ds)
{
    // perform update
}

I'm fine with this process taking a long time to execute, but I have a limited amount of memory on my server.

What happens in the foreach? Is the entire result set fetched at once from the database?

Would it be better to use Skip and Take to do the update in portions?

+1  A: 

Yes, the best way is to use Skip and Take, and make sure that after each batch of updates you dispose the DataContext (with a "using" block).

You could check out my question; it deals with a similar problem and has a nice solution: http://stackoverflow.com/questions/2727591/out-of-memory-when-creating-a-lot-of-objects-c
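
A minimal sketch of that pattern, assuming a hypothetical `MyDataContext` with an `Entities` table keyed by `Id` (all placeholder names, not from the question):

```csharp
// Sketch only: MyDataContext, Entities, and Id are hypothetical names.
const int batchSize = 1000;
int processed = 0;

while (true)
{
    // A fresh, short-lived DataContext per batch keeps the identity map
    // (and therefore memory usage) small; "using" disposes it afterwards.
    using (var context = new MyDataContext())
    {
        var batch = context.Entities
            .OrderBy(e => e.Id)   // Skip/Take needs a stable ordering
            .Skip(processed)
            .Take(batchSize)
            .ToList();

        if (batch.Count == 0)
            break;

        foreach (var e in batch)
        {
            // perform update on e
        }

        context.SubmitChanges(); // flush only this batch's changes
        processed += batch.Count;
    }
}
```

One caveat: if the update changes values that the query filters or orders on, Skip-based paging can drift between batches; paging by key (e.g. `Where(e => e.Id > lastId)`) avoids that.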

Bas
+1  A: 

You are basically abusing LINQ to SQL - it was not made for this:

  • All results are loaded into memory.
  • Your changes are written out in one go, once you are done.

This will be slow, and it will use tons of memory. Given the limited amount of memory on your server, it is simply not feasible.

Do NOT load all the data in at once. Run multiple queries with partial result sets (1,000-2,500 items each).

ORMs are not made for mass manipulation.

TomTom
+1  A: 

Could you not use a stored procedure to update everything in one go?
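
If the update can be expressed as set-based SQL, LINQ to SQL can run it on the server without materializing any rows; a sketch using `DataContext.ExecuteCommand` (the table, columns, and values below are hypothetical):

```csharp
// Runs entirely on the database server: no entities are loaded into memory.
// "MyDataContext", "Entities", "Price", and "Category" are placeholder names.
using (var context = new MyDataContext())
{
    int rowsAffected = context.ExecuteCommand(
        "UPDATE Entities SET Price = Price * {0} WHERE Category = {1}",
        1.1m, "books");
}
```

The `{0}`, `{1}` placeholders are passed as SQL parameters, so this stays safe against injection; a stored procedure mapped onto the DataContext would achieve the same thing.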

RemotecUk