views:

84

answers:

2

In our project, which generates massive reports in Oracle, we use permanent tables to hold intermediate results. For example, to generate one report we run a few queries that populate an intermediate table, and in the final step we join that table with huge application tables. The intermediate tables are cleared before the next report run. We have a few concerns about performance:
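
Roughly, the flow looks like this (the table and column names below are made up purely for illustration):

    -- Populate the intermediate table from earlier query steps
    INSERT INTO report_intermediate (customer_id, period_total)
    SELECT customer_id, SUM(amount)
    FROM   staging_orders
    WHERE  order_date >= DATE '2010-01-01'
    GROUP  BY customer_id;

    -- Final step: join the intermediate results with a large, partitioned application table
    SELECT a.customer_id, a.customer_name, i.period_total
    FROM   app_customers a
    JOIN   report_intermediate i ON i.customer_id = a.customer_id;

    -- Cleared before the next report run
    TRUNCATE TABLE report_intermediate;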

  1. These intermediate tables are transactional and don't have statistics. Is it a good idea to join them with application tables that are partitioned and have up-to-date statistics?

  2. We need the results stored in the intermediate tables to be available across requests from the UI, so we are not in a position to use Oracle's global temporary tables.

Any thoughts on what could be done would be appreciated.

+4  A: 

If the intermediate tables reach a relatively consistent number of rows, you could compute statistics on them once, after the population process finishes, and then use DBMS_STATS.LOCK_TABLE_STATS to lock those values in place. Any overestimation of row counts won't matter after the tables are truncated for the next reload, and the locked statistics will still be in place when the process starts querying the tables for the next cycle.
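A minimal sketch of that approach, assuming an intermediate table called REPORT_INTERMEDIATE owned by REPORT_USER (both names are placeholders):

    -- One-off: gather statistics after a representative load, then lock them
    -- so later truncate/reload cycles keep using the same plan inputs.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => 'REPORT_USER',
                                    tabname => 'REPORT_INTERMEDIATE');
      DBMS_STATS.LOCK_TABLE_STATS(ownname => 'REPORT_USER',
                                  tabname => 'REPORT_INTERMEDIATE');
    END;
    /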

dpbradley
A: 

You could also look at dynamic sampling (available from 9i onwards); see the following links:

Ask Tom Magazine Article

See the section on "Query Plans with Temporary Tables"
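
As a rough illustration, a dynamic sampling hint can be attached to the report query itself so the optimizer samples the unanalyzed table at parse time; the table names below are placeholders:

    -- Ask the optimizer to sample the intermediate table (alias i) at level 4
    -- instead of relying on stored statistics.
    SELECT /*+ dynamic_sampling(i 4) */
           a.customer_id, a.customer_name, i.period_total
    FROM   app_customers a
    JOIN   report_intermediate i ON i.customer_id = a.customer_id;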

Paul James