I have a criteria page in my ASP.NET application. When the user clicks the report button, the results are first bound to a DataGrid on a new page, and then that page is exported to an Excel file by changing the response content type.

That normally works, but when a large amount of data comes back, a System.OutOfMemoryException is thrown.

Does anyone know a way to fix this problem, or another useful technique for doing this?

A: 

Write the data to a file on the server, and provide a link to download it?
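A minimal sketch of that approach, assuming the report bytes come from a hypothetical GetExportExcel() helper and that the page has a HyperLink control named DownloadLink:

    // Sketch: save the generated report to a folder on the server
    // and hand the user a link instead of streaming the bytes directly.
    byte[] reportDoc = GetExportExcel();  // hypothetical helper

    // Use a unique name so concurrent users don't overwrite each other's files.
    string fileName = "Export_" + Guid.NewGuid().ToString("N") + ".xls";
    File.WriteAllBytes(Server.MapPath("~/Exports/" + fileName), reportDoc);

    // Point the user at the saved file.
    DownloadLink.NavigateUrl = "~/Exports/" + fileName;
    DownloadLink.Visible = true;

You would also want a scheduled cleanup of old files in that folder, since they accumulate on disk.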

joelt
Can you give a reference (link) for that?
mavera
+2  A: 

This is probably happening because your application is trying to build the entire Excel spreadsheet in memory and then deliver it to the user once that is complete. For large datasets, you can easily use up all of the available memory, which is most likely also causing the application to dramatically slow down for the rest of the users.

Instead, you can try streaming the data to the user. This keeps memory usage roughly constant, regardless of how large your dataset is. You can get the bytes of the Excel spreadsheet, or simply convert your data to CSV, then set the HTTP response content type and stream it to the user.

Here is an example:

byte[] reportDoc = GetExportExcel();

//clear anything already buffered and set the Excel content type
context.Response.Clear();
context.Response.ContentType = "application/vnd.ms-excel";

//set the content disposition header to force a download dialog
context.Response.AddHeader("Content-Disposition", "attachment;filename=" +
                    "Export.xls");

//write the file content byte array to the response stream
context.Response.BinaryWrite(reportDoc);
context.Response.End();

There is a detailed tutorial at http://bytes.com/topic/asp-net/answers/326796-example-streaming-excel-browser-download
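If Excel format isn't strictly required, a CSV can be written to the response row by row, so only one row is ever held in memory. A sketch, assuming the data comes from a SqlDataReader (the connection string, query, and column names are placeholders):

    context.Response.Clear();
    context.Response.ContentType = "text/csv";
    context.Response.AddHeader("Content-Disposition", "attachment;filename=Export.csv");
    context.Response.BufferOutput = false; // don't accumulate the whole response in memory

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SELECT Name, Amount FROM Report", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            context.Response.Write("Name,Amount\r\n");
            while (reader.Read())
            {
                // Each row goes straight to the client; nothing is retained server-side.
                context.Response.Write(reader["Name"] + "," + reader["Amount"] + "\r\n");
            }
        }
    }

Note this sketch does no CSV escaping; fields that can contain commas or quotes would need to be quoted per the usual CSV rules.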

Nate Dudek
A: 

Excel also has a maximum number of rows per worksheet, which I believe is around 65k (65,536 in the .xls format). That may not be the direct cause of your problem, but it could still come up when you export large result sets.

Alex Larzelere