views: 413
answers: 3

I understand this question is fairly vague, since I'm not giving you as much detail as I'd like to, but I'm hoping for some general improvements that can be made to my generation code or to the reports themselves to speed them up. I've asked for more hardware, but have been denied.

public Stream GenerateReport(string reportName, string format)
{
    if (reportName == null)
        throw new ArgumentNullException("reportName");

    // Load the report into a new server-side execution (no snapshot history ID).
    reportExecutionService.LoadReport(reportName, null);

    string extension;
    string encoding;
    string mimeType;
    ReportExecution2005.Warning[] warnings;
    string[] streamIDs;

    // Render the loaded report in the requested format (PDF, Excel, etc.).
    byte[] results = reportExecutionService.Render(format, null, out extension, 
        out encoding, out mimeType, out warnings, out streamIDs);

    return new MemoryStream(results);
}

The reports themselves are taking 6-10 seconds each, and I've narrowed the bottleneck down to Reporting Services itself. Where should I start looking to remove potential speed bottlenecks? Note: some code has been removed to protect the innocent.
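One simple way to confirm where the time goes is to time the two service calls separately; a minimal sketch (reusing the locals declared in GenerateReport above, and writing to the console purely for illustration):

// Sketch only: assumes the local variables declared in GenerateReport.
var timer = System.Diagnostics.Stopwatch.StartNew();
reportExecutionService.LoadReport(reportName, null);
Console.WriteLine("LoadReport took {0} ms", timer.ElapsedMilliseconds);

timer = System.Diagnostics.Stopwatch.StartNew();
byte[] results = reportExecutionService.Render(format, null, out extension,
    out encoding, out mimeType, out warnings, out streamIDs);
Console.WriteLine("Render took {0} ms", timer.ElapsedMilliseconds);

If Render accounts for nearly all of the 6-10 seconds, that points at server-side rendering and the report's dataset queries rather than this client code.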

+3  A: 

Although not directly related to the code you posted, here are a few generic enhancements you should always consider when writing reports in Reporting Services:

  1. Pre-aggregate report tables so that any data the report would have aggregated is already summarized. For instance, if the report data source summarizes thousands of rows and joins multiple tables together, create a pre-aggregated table that joins all the data and summarizes it at the grain the report requires.

  2. If you are passing parameters into the data source, then the aggregated underlying table should have a clustered index that corresponds with how the table will be searched. For instance, if the report only displays data for an individual customer and for a given transaction date range, then the clustered index should be ordered on the customer and transaction date.

  3. Filtering should happen in the data source query, not in the report itself. That is, if you parameterize your report so that it filters data, the parameters should be passed to the database so that it returns a smaller set of data; do not return a large set of data and then filter it. It is easy to make this mistake with a multi-valued parameter, since the out-of-the-box instructions for multi-value parameters are to filter the data AFTER it has been returned to Reporting Services. A small client-side sketch of passing parameters through follows this list.
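As a rough illustration of point 3 from the calling side: any report parameters should be set on the execution so they reach the dataset query and filter at the database. A minimal sketch against the ReportExecution2005 proxy used in the question (the "CustomerID" name and the customerId variable are placeholders, not from the original code):

// Sketch: push the filter value into the server-side dataset query via a
// report parameter, rather than returning everything and filtering inside
// the report. "CustomerID" and customerId are placeholders.
var parameter = new ReportExecution2005.ParameterValue();
parameter.Name = "CustomerID";
parameter.Value = customerId;   // ParameterValue.Value is a string

reportExecutionService.LoadReport(reportName, null);
reportExecutionService.SetExecutionParameters(new[] { parameter }, "en-us");
// ...then call Render exactly as in the question's GenerateReport method.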

I hope you are already doing the above and that this post is redundant. :)

Registered User
Emphasis on #3. It's one of the easiest changes.
Austin Salonen
All excellent places to start, and all design considerations.
RobS
I'm using two competing solutions for report generation. They produce the same results, but one system takes about 0.01 seconds to complete while Reporting Services takes about 0.2 seconds for my simple test report.
Orion Adrian
A: 

If you've narrowed it down to Reporting Services solely based on your client code, I would review the queries / SPs that retrieve your data. I've encountered some pretty nasty queries in my day that looked fairly innocent.

Austin Salonen
A: 

I just did a couple of really nasty reports! I had to join on shady criteria across multiple tables containing a few million rows each.

I ended up creating a console application that does the collection every night for the previous day. The reports just got too heavy with all the logic, so generating them simply took too long. Now speed is not an issue anymore.
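For what it's worth, a rough sketch of that kind of nightly summarization job is below (all table, column, and connection names are invented for illustration); the report's dataset then queries the small summary table instead of the raw transaction tables.

using System;
using System.Data.SqlClient;

// Sketch of a nightly summarization job; table, column, and connection
// names are invented for illustration.
class NightlySummaryJob
{
    static void Main()
    {
        DateTime yesterday = DateTime.Today.AddDays(-1);

        const string sql = @"
            INSERT INTO dbo.DailySalesSummary (SaleDate, CustomerID, TotalAmount, OrderCount)
            SELECT @SaleDate, o.CustomerID, SUM(o.Amount), COUNT(*)
            FROM dbo.Orders o
            WHERE o.OrderDate >= @SaleDate
              AND o.OrderDate < DATEADD(day, 1, @SaleDate)
            GROUP BY o.CustomerID;";

        using (var connection = new SqlConnection("<connection string here>"))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@SaleDate", yesterday);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}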

It depends on the type of report; these three reports only needed yesterday's figures. But as Austin says, the queries are usually the bottleneck.

Another thing to remember is that the report "expires" after a while (this is the default setting), so if you haven't used the report for a while it takes a bit longer to generate; if you run it again straight afterwards, the next one is faster. A workaround is to open the report in Internet Explorer, click Properties, and have a look at the Execution and History settings (they can be tweaked to improve rendering of reports). Be careful, though: if the data is critical you could end up with stale data.
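For completeness, those Execution settings can also be changed through the Reporting Services management web service rather than the browser UI; the sketch below is from memory, so treat the proxy namespace, the managementService instance, and the report path as assumptions:

// Rough sketch against the ReportService2005 management endpoint (assumed
// names): cache the rendered report for 60 minutes so repeated requests
// reuse it. "/Reports/MyReport" is a placeholder path, and caching requires
// the report's data source to use stored credentials.
var expiration = new ReportService2005.TimeExpiration();
expiration.Minutes = 60;
managementService.SetCacheOptions("/Reports/MyReport", true, expiration);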

mhenrixon