views: 172
answers: 6

I access several tables remotely via DB Link. They are highly normalized, and the data in each is effective-dated. Of the millions of records in each table, only a subset of ~50k are current records.

The tables are internally managed by a commercial product that will throw a huge fit if I add indexes or make alterations to its tables in any way.

What are my options for speeding up access to these tables?

+2  A: 

Archive data that's no longer current. (Or if that's not acceptable, data that exceeds some staleness threshold suitable for your requirements.)

Jeff Sternal
+4  A: 

You will need to look at the execution plans. You may be able to change the join order, add criteria, or provide hints to make a query faster, but without the explain plan you don't know why it is slow, so you don't even know IF you can make it faster.
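
For example, a rough sketch of how you might inspect a plan and experiment with a hint. The link name remote_db, the table and column names, and the NULL-end-date convention for current rows are all placeholders, not details from the question:

    -- Generate and display the execution plan for a slow distributed query.
    EXPLAIN PLAN FOR
      SELECT *
        FROM employees@remote_db e
        JOIN salaries@remote_db s ON s.emp_id = e.emp_id
       WHERE e.effective_end_date IS NULL;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- If the plan shows rows being shipped across the link before they are
    -- filtered, a hint such as DRIVING_SITE can push the join to the remote
    -- database so only the result travels over the network:
    SELECT /*+ DRIVING_SITE(e) */ *
      FROM employees@remote_db e
      JOIN salaries@remote_db s ON s.emp_id = e.emp_id
     WHERE e.effective_end_date IS NULL;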

Grant Johnson
+2  A: 

Could you take a daily dump of the records you need into your own database / tables?
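
A minimal sketch of that idea, assuming a hypothetical link named remote_db and that current rows are the ones with a NULL end date (adjust to your actual effective-dating scheme):

    -- One-time setup: a local copy of just the current rows,
    -- which you can index however you like.
    CREATE TABLE local_employees AS
      SELECT * FROM employees@remote_db
       WHERE effective_end_date IS NULL;

    CREATE INDEX local_employees_ix ON local_employees (emp_id);

    -- Nightly reload via DBMS_SCHEDULER (simple delete-and-reinsert;
    -- TRUNCATE would need EXECUTE IMMEDIATE inside the block).
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'refresh_local_employees',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN
                              DELETE FROM local_employees;
                              INSERT INTO local_employees
                                SELECT * FROM employees@remote_db
                                 WHERE effective_end_date IS NULL;
                              COMMIT;
                            END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY; BYHOUR=2',
        enabled         => TRUE);
    END;
    /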

rjmunro
+6  A: 

I think you're stuck between a rock and a hard place here, but in the past the following has worked for me:

You can pull down a snapshot of the current data at specified intervals, every hour or nightly or whatever works, and add your indexes to your own tables as needed. If you need real-time access to the data, then you can try pulling all the current records into a temp table and indexing as needed.
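
A rough sketch of the temp-table variant, again with a placeholder remote_db link and NULL-end-date convention standing in for your actual schema:

    -- One-time setup: a global temporary table whose contents are private
    -- to your session; copy the remote table's structure without its rows.
    CREATE GLOBAL TEMPORARY TABLE tmp_current_employees
      ON COMMIT PRESERVE ROWS
      AS SELECT * FROM employees@remote_db WHERE 1 = 0;

    -- Indexes on the temp table are yours to define.
    CREATE INDEX tmp_current_employees_ix ON tmp_current_employees (emp_id);

    -- At the start of each session, pull across only the current rows.
    INSERT INTO tmp_current_employees
      SELECT * FROM employees@remote_db
       WHERE effective_end_date IS NULL;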

The extra overhead of copying from one database into your own may dwarf the actual benefit, but it's worth a shot.

Juliet
+9  A: 

You could try creating materialized views over the DB link for the subsets of the tables you need, and then query those instead.
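
Something along these lines, where remote_db, the object names, and the NULL-end-date filter are placeholder assumptions:

    -- Local, indexable materialized view holding only the current rows,
    -- fully refreshed over the link once an hour.
    CREATE MATERIALIZED VIEW mv_current_employees
      BUILD IMMEDIATE
      REFRESH COMPLETE START WITH SYSDATE NEXT SYSDATE + 1/24
    AS
      SELECT * FROM employees@remote_db
       WHERE effective_end_date IS NULL;

    -- Unlike the vendor's tables, this copy is yours to index.
    CREATE INDEX mv_current_employees_ix ON mv_current_employees (emp_id);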

carson
+1  A: 

What about creating a materialized/indexed view? That might help a bit.

Lars Mæhlum
An indexed view is SQL Server terminology; in Oracle they are called materialized views.
OMG Ponies