views:

483

answers:

4

I'm really at a loss as to how to proceed. I have a very large database, and the table I'm accessing has approx. 600,000 records. The database is accessed by an accounting application, which supplies the SQL query the report uses to access the database.

My report has a linked subreport with restrictions placed in the report header. When this report is run with a very basic query, the average refresh time is 36 minutes. Adding two more items to the query pushes the run time to 2.5 hours.

Here is what I've tried:

  • cleaned up the report, leaving only the absolutely necessary items - no difference
  • removed most formulas (removing the remaining formulas makes no time difference)
  • tried editing the SQL query - wasn't allowed because of the accounting application
  • tried swapping the subreport and main report - didn't work
  • added other groupings - no difference
  • removed groupings - no difference
  • checked all the servers for lack of temp disk space - no issue
  • tried an "on demand" subreport - no change
  • checked parameters (discrete vs. range) and they are as they should be
  • tried bursting indexes, grouping on the server, etc. - no difference
  • the report requires two passes; I've tried, unsuccessfully, to get it down to one

There must be something I'm missing.

There do not appear to be any other modifications I can make to the report using regular Crystal functions. Is there any way to speed up access to the data without going through all 600,000 records? The SQL query that accesses this data is long and has many requests, and it is not something I can change.

Can I add something (a formula?) that nullifies these requests? I'm reaching now...

+1  A: 

A couple of things we've had success with are adding indexes to the database and, instead of importing tables into the report, writing a stored procedure to retrieve the desired results.
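To illustrate why an index helps here, the following is a minimal sketch using SQLite through Python's `sqlite3` module. The `invoices` table and `posted_date` column are made up for illustration; they stand in for whatever accounting table and filter column the report actually uses. The point is visible in the query plan: without the index the engine scans every row, and with it the engine seeks straight to the matching rows.

```python
import sqlite3

# Hypothetical stand-in for the accounting table the report filters on.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE invoices "
    "(id INTEGER PRIMARY KEY, customer TEXT, posted_date TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO invoices (customer, posted_date, amount) VALUES (?, ?, ?)",
    [("ACME", f"2023-01-{d:02d}", d * 10.0) for d in range(1, 29)],
)

# Without an index, filtering on posted_date forces a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM invoices WHERE posted_date = '2023-01-15'"
).fetchall()
print(plan)  # plan detail shows a scan of the whole table

# With an index on the filter column, the engine can seek directly.
conn.execute("CREATE INDEX idx_invoices_posted_date ON invoices (posted_date)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM invoices WHERE posted_date = '2023-01-15'"
).fetchall()
print(plan)  # plan detail now names idx_invoices_posted_date
```

On a 600,000-row table, the difference between a scan and an index seek on the report's restriction columns can be dramatic.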

mattruma
+1  A: 

If indexes and stored procedures don't get you where you need to be, you have reached the "denormalise until it works" part of life with a database. You might want to look at creating an MI database with tables optimized for your reporting needs, plus some data transformation scripts that extract the data from production into your MI database. Depending on what you're running, Oracle and Microsoft have tools to help you do this.
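A minimal sketch of that extract-and-denormalise step, again using SQLite via Python's `sqlite3` module; the `customers`/`invoices` schema and the `customer_totals` reporting table are invented for illustration. The expensive join and aggregation run once in the transformation script, and the report then reads the flat, pre-aggregated table in the MI database.

```python
import sqlite3

# Hypothetical production schema (normalized, expensive to report against).
prod = sqlite3.connect(":memory:")
prod.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE invoices (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'ACME'), (2, 'Globex');
    INSERT INTO invoices VALUES (1, 1, 100.0), (2, 1, 250.0), (3, 2, 75.0);
""")

# The MI (reporting) database holds a pre-joined, pre-aggregated copy,
# so the report never touches the production schema at run time.
mi = sqlite3.connect(":memory:")
mi.execute("CREATE TABLE customer_totals (customer TEXT, invoice_count INTEGER, total REAL)")

# Transformation script: the join/aggregation happens here, not in the report.
rows = prod.execute("""
    SELECT c.name, COUNT(i.id), SUM(i.amount)
    FROM customers c JOIN invoices i ON i.customer_id = c.id
    GROUP BY c.name
""").fetchall()
mi.executemany("INSERT INTO customer_totals VALUES (?, ?, ?)", rows)

print(mi.execute("SELECT * FROM customer_totals ORDER BY customer").fetchall())
# [('ACME', 2, 350.0), ('Globex', 1, 75.0)]
```

In practice the transformation script would run on a schedule (nightly, say), so the report always reads recent but not live data.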

u07ch
+1 This is another good idea! And one we have also done.
mattruma
A: 

Do you know what the SQL query is? If so, you can move the report outside the accounting application and paste the query directly into the Command in the database expert. I've had to do this in a couple of cases with another application I work with.

SarekOfVulcan
A: 

We use Crystal Reports with a billing system, and we had queries in the database that took over 1.5 hours to complete. That doesn't even take into account the rendering/formatting of the reports.

We created materialized views and force the client to refresh them daily. A materialized view is basically a database view that stores the returned dataset; the dataset is not refreshed unless you explicitly tell it to refresh.
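SQLite has no native materialized views, but the idea can be emulated with a snapshot table that is rebuilt only on demand, which is the behaviour described above. This is a hypothetical sketch (the `billing` table, `billing_mv` snapshot, and `refresh()` helper are all invented): the expensive aggregation runs only inside `refresh()`, e.g. from a nightly job, never at report time.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE billing (account TEXT, amount REAL)")
conn.executemany("INSERT INTO billing VALUES (?, ?)",
                 [("A", 10.0), ("A", 20.0), ("B", 5.0)])

def refresh():
    # Rebuild the snapshot from the live data. The expensive query runs
    # only here; the report just reads billing_mv.
    conn.executescript("""
        DROP TABLE IF EXISTS billing_mv;
        CREATE TABLE billing_mv AS
            SELECT account, SUM(amount) AS total
            FROM billing GROUP BY account;
    """)

refresh()
print(conn.execute("SELECT * FROM billing_mv ORDER BY account").fetchall())
# [('A', 30.0), ('B', 5.0)]

# New rows are invisible to the report until the next refresh.
conn.execute("INSERT INTO billing VALUES ('B', 95.0)")
print(conn.execute("SELECT total FROM billing_mv WHERE account = 'B'").fetchone())
# (5.0,)  -- still the old snapshot
refresh()
print(conn.execute("SELECT total FROM billing_mv WHERE account = 'B'").fetchone())
# (100.0,)
```

On Oracle or SQL Server the same pattern is available natively (materialized views / indexed views), with the database handling the refresh scheduling for you.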

contactmatt