1) ClientApp makes an async call to an ASP.NET 2.0 web service
2) The web service calls a SQL Server 2005 stored procedure
3) The stored procedure returns its output: a 150MB table

An OutOfMemoryException is thrown by DataAdapter.Fill(...) while it tries to allocate more memory for new rows.

The IIS application pool does not have any maximum-memory restriction set on it.

Are there maximum memory utilization caps set somewhere else at the IIS level? Does a 150MB database table take up a lot more space when read into memory as a DataSet? Is there a scenario (with WCF, perhaps) where the result of the procedure never has to reside in web server memory but could instead be streamed directly to the client?

I would prefer not to have to split the request up into smaller data sets because the client requests them asynchronously. Collecting all the parts would have to happen asynchronously also, and each client would have to implement asynchronous collection for each call.

Any suggestions, best practices, or tips would be appreciated.

A: 

There are per-process memory restrictions that depend on the operating system (32-bit vs. 64-bit). On 32-bit Windows, a process normally gets only 2GB of user-mode address space, no matter how much physical RAM the machine has.

JD
It's on 32-bit Windows Server 2003 with 4GB RAM. 150MB doesn't seem like too much for an otherwise idle web server to handle.
Tion
A: 

The best practices would be: a) don't return so much data, and b) use WCF instead of a four-year-old technology (ASMX web services).

John Saunders
How would WCF take care of the ADO.NET DataAdapter.Fill() exception? The out-of-memory error occurs before the DataSet is fully loaded in the web server's memory; it hasn't even started to be serialized for transport as the web service response. Can WCF be configured to stream the data to the client directly from SQL without all of it being loaded into an in-memory DataSet?
Tion
I know that WCF can stream, by returning System.IO.Stream. If worst comes to worst, you could use ExecuteReader, then construct an XmlWriter over the output Stream and, for each row you get back from Read, write a single XML element with that row's contents. It's more code, but it should work. There are probably better ways as well. Do you really need to return all 150MB in a single call?
John Saunders
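
A rough sketch of the DataReader-to-XmlWriter idea from the comment above. All names here (connection string, stored procedure, element names) are hypothetical, and only the core loop is shown; in WCF you would plug output into the Stream the operation returns (for example via a temp file or a custom pull-based Stream):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Globalization;
    using System.IO;
    using System.Xml;

    static class RowStreamer
    {
        // Writes each row to 'output' as soon as it is read, so the full
        // result set never sits in the web server's memory at once.
        public static void WriteRowsAsXml(string connectionString, Stream output)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand("dbo.GetBigTable", conn)) // hypothetical proc
            {
                cmd.CommandType = CommandType.StoredProcedure;
                conn.Open();
                using (SqlDataReader reader = cmd.ExecuteReader())
                using (XmlWriter writer = XmlWriter.Create(output))
                {
                    writer.WriteStartElement("rows");
                    while (reader.Read())
                    {
                        writer.WriteStartElement("row");
                        for (int i = 0; i < reader.FieldCount; i++)
                        {
                            // Assumes the column names are valid XML element names.
                            writer.WriteElementString(
                                reader.GetName(i),
                                reader.IsDBNull(i)
                                    ? ""
                                    : Convert.ToString(reader.GetValue(i), CultureInfo.InvariantCulture));
                        }
                        writer.WriteEndElement(); // row
                    }
                    writer.WriteEndElement(); // rows
                }
            }
        }
    }

On the client side, the response stream can then be consumed with an XmlReader, again one row at a time, so neither end ever holds the whole table.
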
+4  A: 

Yes, a DataSet uses a lot more memory than the actual data. How much more is hard to quantify, but this question on StackOverflow suggests more than 4 times the original data size. Let's assume that is correct: 150MB of data times 4 = 600MB of memory. When ASP.NET applications use around 800MB of RAM, they start to throw OutOfMemoryExceptions. I'm not sure whether that limit is the same as the application pool memory limit. Have you tried the /3GB switch in boot.ini? (See this article for instructions.)
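
For reference, the /3GB switch is appended to the operating-system line in boot.ini on Windows Server 2003; the partition path below is only an example and will differ per machine:

    [operating systems]
    multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB

It raises the user-mode address space available to large-address-aware processes from 2GB to 3GB, at the cost of kernel address space.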

Also note that if you serialize your DataSet, the serializer may allocate huge buffers for the serialization (up to 10 times the original size; see this article). You indicate that the problem occurs while you read the data, so this is probably not the cause of your error (but it might bite once you solve the out-of-memory error and try to send the data over the wire).

My experience with DataSets is that they can seem like a good idea at first, but you very soon run into problems.

Another (probably better) solution would be to use a DataReader and read one row at a time. Return batches of rows (i.e., use some kind of paging for the data) and experiment with the size of each batch to find the sweet spot between performance and memory usage; a minimal sketch follows below. WCF streaming might also do the trick, but you will need to configure WCF correctly to allow it to return such a massive amount of data in one call.
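
A minimal sketch of that batching idea, assuming SQL Server 2005's ROW_NUMBER() and a hypothetical dbo.BigTable with a sortable Id column (all names are placeholders):

    using System.Data;
    using System.Data.SqlClient;

    static class BatchReader
    {
        // Pages through dbo.BigTable so at most 'pageSize' rows are
        // materialized on the web server per call.
        const string PagedSql =
            @"SELECT *
              FROM (SELECT ROW_NUMBER() OVER (ORDER BY Id) AS RowNum, *
                    FROM dbo.BigTable) AS t
              WHERE t.RowNum BETWEEN @First AND @Last;";

        public static DataTable GetBatch(string connectionString, int pageIndex, int pageSize)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand(PagedSql, conn))
            {
                cmd.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
                cmd.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);
                conn.Open();
                DataTable batch = new DataTable("Batch");
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    batch.Load(reader); // at most pageSize rows in memory
                }
                return batch;
            }
        }
    }

The client would keep incrementing pageIndex until a batch comes back with fewer than pageSize rows.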

Karl