views:

567

answers:

4

I have a C# client application that calls a Windows web service written in WCF, which in turn calls a SQL stored procedure. The procedure returns around 1.3 million records; the C# client application keeps them all in memory and runs validations on them one by one. I am getting this error:

System.Exception: An exception has occurred when recalculating the balances, the transaction will be rolled back.


System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.Collections.Generic.List`1.set_Capacity(Int32 value)
   at System.Collections.Generic.List`1.EnsureCapacity(Int32 min)
   at PitToPort.DataLayer.StockpileData.StockpileProfile.CreateStockpileProfileQualityFromAllPartialMovements()
   at PitToPort.DataLayer.StockpileRecalc.Recalc.CreateSP_FOFO_FOLO_NegTransaction(Int32 modelID, StockpileProfile currentStockpileProfile, TransactionsRow drTransactions)
   at PitToPort.DataLayer.StockpileRecalc.Recalc.CreateBalanceFOLO_FOFO_TWAA(TransactionsRow[] drTransactionsRows, Int32 modelID, StockpileProfileList stockpileProfileList)
   at PitToPort.DataLayer.StockpileRecalc.Recalc.CreateBalances()
   at QMastor.PitToPort.StockpileRecalculationBL.RecalcService.CreateBalances()

What might be causing this error, and how can I rectify it? I have checked the stored procedure; it runs fine on its own.

+4  A: 

The stack trace shows the failure inside `List`1.EnsureCapacity`: you are populating a generic List, and it is running out of memory while growing to hold all the results.

The procedure itself may be successful, but may also be returning many, many rows.

You may want to step through your code and inspect the results of the query before initializing the List object.

Without more details, I'm afraid I can't help debug more.

Edit:

You will need to handle the results of the stored procedure in batches. You can still use a SqlDataReader, but give the List a relatively small capacity, say 1000, read from the SqlDataReader until the list is full, process it, then start a fresh list and keep reading:

using (SqlDataReader dr = cmd.ExecuteReader()) { // where cmd is a SqlCommand
  List<SomeType> items = new List<SomeType>(1000);

  while (dr.Read()) {
    // read the values for the row
    SomeType obj = new SomeType(dr["id"], dr["value"], ...);

    // add the object to the list
    items.Add(obj);

    // when the list is full, process it, and create a new one
    if (items.Count >= 1000) {
      Process(items);
      items = new List<SomeType>(1000);
    }
  }

  // process the final, partially filled batch
  if (items.Count > 0) {
    Process(items);
  }
}
Jeff Meatball Yang
Yes, the proc returns a result set of around 1.3 million records
rmdussa
You are going to have to page those records! 1.3 million is A LOT - and well, as the .NET runtime is saying, it has run out of memory.
Jonathan C Dickinson
A: 

You can also read this article:

“Out Of Memory” Does Not Refer to Physical Memory

Lukas Šalkauskas
+1  A: 

You can also use the MemoryFailPoint class to check whether enough memory is likely to be available before allocating big arrays, lists, or other large objects.

http://msdn.microsoft.com/en-us/library/system.runtime.memoryfailpoint.aspx
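A minimal sketch of how that might look. The 1500 MB figure is just an illustrative guess for 1.3 million rows, and `RecalculateBalances()` is a hypothetical stand-in for your own recalculation call:

```csharp
using System;
using System.Runtime;

class Program
{
    static void RecalculateBalances() { /* your large operation here */ }

    static void Main()
    {
        try
        {
            // Ask the runtime whether ~1500 MB is likely to be available
            // before starting; the estimate is hypothetical - tune it.
            using (new MemoryFailPoint(1500))
            {
                RecalculateBalances();
            }
        }
        catch (InsufficientMemoryException ex)
        {
            // Fail fast with a clear message instead of dying with an
            // OutOfMemoryException halfway through the recalculation.
            Console.WriteLine("Not enough memory to recalculate: " + ex.Message);
        }
    }
}
```

Note that MemoryFailPoint only checks availability up front; it does not reserve the memory, so the operation can still fail if conditions change.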

A nice blog post about OutOfMemoryException:

http://blogs.msdn.com/ericlippert/archive/2009/06/08/out-of-memory-does-not-refer-to-physical-memory.aspx

Perica Zivkovic
+2  A: 

Just a question -- do you have to load all 1.3 million rows into memory? That's a lot of data: even at, say, 1 KB per row, the full result set is well over a gigabyte. Small wonder your application is struggling to execute.

Is it possible to refactor your application to load a subset of the data?
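One way to load a subset is keyset paging: pull a fixed-size page per query, keyed on the last ID seen, so only one page is ever in memory. The table, column, and `TransactionRow` names below are hypothetical, purely for illustration:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

class TransactionRow
{
    public int Id;
    public decimal Amount;
    public TransactionRow(int id, decimal amount) { Id = id; Amount = amount; }
}

static class Pager
{
    // Yields pages of up to 10,000 rows; the caller validates each page,
    // then the page becomes eligible for garbage collection.
    public static IEnumerable<List<TransactionRow>> ReadInPages(SqlConnection conn)
    {
        int lastId = 0;
        while (true)
        {
            List<TransactionRow> page = new List<TransactionRow>(10000);
            using (SqlCommand cmd = new SqlCommand(
                "SELECT TOP 10000 Id, Amount FROM Transactions " +
                "WHERE Id > @lastId ORDER BY Id", conn))
            {
                cmd.Parameters.AddWithValue("@lastId", lastId);
                using (SqlDataReader dr = cmd.ExecuteReader())
                {
                    while (dr.Read())
                    {
                        lastId = dr.GetInt32(0);
                        page.Add(new TransactionRow(lastId, dr.GetDecimal(1)));
                    }
                }
            }
            if (page.Count == 0) yield break; // no more rows
            yield return page;
        }
    }
}
```

This keeps peak memory proportional to the page size rather than the full 1.3 million rows, at the cost of multiple round trips to the server.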

Conrad