views:

2763

answers:

3

I have created an RDL doc that points at a proc that returns 90,000 rows, and I am getting an out-of-memory exception. Is there a limit to how many rows a report project can handle?

For now I have changed the proc that drives my report to just do a SELECT TOP 90000. My spec calls for a report with 120,000 rows. My report is a matrix.

I swear last week I generated a report with 106,800 rows in it, but now all of a sudden I can't.

I have written a rendering extension, and here is part of the exception when I step into the code:

eInfo: 2/12/2009 12:03:53 PM prairieFyre.ReportActions.RenderReport: Error rendering report
Microsoft.Reporting.WinForms.LocalProcessingException: An error occurred during local report processing.
---> Microsoft.ReportingServices.ReportProcessing.ReportProcessingException: An unexpected error occurred in Report Processing.
---> System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.BinaryWriter.Write(String value)
   at Microsoft.ReportingServices.ReportProcessing.Persistence.IntermediateFormatWriter.ReportServerBinaryWriter.WriteString(String stringValue)
   ....

+2  A: 

I don't think there is a limit other than your hardware configuration. If you're on a 32-bit machine, the worker process handling this report has less than 2 gigabytes of memory to work with, probably closer to 1 gigabyte when you factor in kernel-mode memory. If you're going to be serving up large reports like this, you probably need a 64-bit setup with at least 4 gigs of memory on the box. That setup allows the worker process to allocate more than 2 gigs of usable memory and complete these large requests without issue.
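As a back-of-the-envelope check on those numbers (the overhead figure here is an assumption, not a measured value):

```python
# Rough user-mode address budget for a 32-bit Windows process
# (default 2 GB/2 GB split, no /3GB switch). The overhead figure is an
# assumption covering loaded DLLs, JIT code, GC bookkeeping and stacks.
user_space = 2 * 2**30        # 2 GB of virtual address space per process
assumed_overhead = 1 * 2**30  # hypothetical ~1 GB of runtime overhead

usable = user_space - assumed_overhead
print(usable // 2**20, "MB left for report data")
```

which lines up with the "probably closer to 1 gigabyte" estimate above.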

If a hardware upgrade is not an option, you can also consider these alternatives:

  • You said the report is a matrix report, so it sounds like you are not displaying all that data to the user but aggregating it. Could you pre-aggregate some of this data in the database and then just use SSRS for the display?

  • Since the report is nothing more than an XML file, build the report XML string from within SQL Server or by using some script or process. This may be a lot of work.
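The pre-aggregation idea can be sketched like this. The data shape is hypothetical (in practice the grouping would happen in the stored procedure with `GROUP BY`), but it shows how detail rows collapse into one row per matrix cell before the report ever sees them:

```python
from collections import defaultdict

# Hypothetical detail rows as (region, month, amount) tuples, standing in
# for the rows the stored procedure currently returns to the matrix.
detail_rows = [
    ("East", "Jan", 100.0),
    ("East", "Jan", 250.0),
    ("East", "Feb", 75.0),
    ("West", "Jan", 300.0),
]

def pre_aggregate(rows):
    """Collapse detail rows to one row per (region, month) cell,
    mirroring the aggregation the matrix would otherwise do in memory."""
    totals = defaultdict(float)
    for region, month, amount in rows:
        totals[(region, month)] += amount
    return [(r, m, t) for (r, m), t in sorted(totals.items())]

aggregated = pre_aggregate(detail_rows)
# Four detail rows collapse to three matrix cells; with 120,000 detail
# rows and a modest number of groups, the reduction can be dramatic.
```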

James
A: 

The stack trace indicates that the MemoryStream used to store the report execution result cannot increase (double) its size.

This is usually caused by address-space fragmentation, which often cannot be solved simply by adding hardware.

Microsoft ReportViewer is limited in the amount of data it can process because it stores the intermediate report execution result in a MemoryStream. That stream's backing buffer doubles each time it fills, so every growth step needs a single contiguous free block twice the current size; in a typical .NET application (2 GB address space) the largest such block is no bigger than 256 MB, and often much smaller than that.

This stream holds, for instance, all field values, aggregated values, expressions and images, so its size depends directly on the size of the data sets added to the report.
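The growth behaviour can be sketched in a few lines. This is a simplified model (the initial capacity is an assumption, and it ignores that the old buffer's space is eventually reclaimed), but it shows why the largest contiguous free block, not the total free memory, caps the stream size:

```python
def max_stream_capacity(largest_free_block, initial=256):
    """Largest MemoryStream-style buffer attainable when capacity doubles
    on each growth and the new buffer must fit in one contiguous free
    block. Simplified model of the behaviour described above."""
    capacity = initial
    while capacity * 2 <= largest_free_block:
        capacity *= 2
    return capacity

# With at most a 256 MB contiguous free block, growth stalls at 256 MB,
# no matter how much total free memory the process still has:
print(max_stream_capacity(256 * 2**20) // 2**20, "MB")
```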

To analyze the address space and objects of a .NET application, it is best to debug with WinDBG (Debugging Tools for Windows) and the SOS extensions.

Useful commands for analyzing address-space fragmentation are:

  • !address -summary
  • lm
  • !EEHeap -gc
  • !DumpHeap -stat

In this particular case, it may be possible to reduce the amount of data by pre-aggregating values in the data source rather than in the matrix.

Further tips are given in the Knowledge Base article "You may receive the 'System.OutOfMemoryException' error message when you use SQL Server Reporting Services" [1].

[1]: http://support.microsoft.com/kb/909678 "System.OutOfMemoryException" error message when you use SQL Server Reporting Services

A: 

James, I think you're correct. I had this error running it from my laptop; however, after I deployed to the server, the error went away. Thanks, Steve

Steve