views: 273
answers: 1

What is the best way to replicate the update stream of a large (6 TB) Oracle DB into another, non-DBMS system? I don't need to "bulk load" the Oracle DB; I merely want to flow all updates into another home-grown system in near realtime (10 s latency or less). Updates happen at a rate of 150 rows/second, representing tens of megabytes per second.

For clarity, let me emphasize that I am not replicating from one DB to another. This is an application integration problem: I need to replicate from a DB into an in-house, non-DB application. I've thought of using an enterprise service bus, but that seems inappropriate.
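
To make the shape of the problem concrete, here is a rough sketch of the kind of feed I have in mind. It is only an illustration: the table and column names (source_table, last_updated) are made up, it assumes the cx_Oracle driver, and I'm not wedded to polling.

    import time
    import cx_Oracle  # assumes the Oracle client libraries are installed

    POLL_INTERVAL = 5  # seconds; keeps latency under the 10 s target

    def push_to_matrix_system(rows):
        # placeholder for the home-grown consumer
        for row in rows:
            print(row)

    def run_feed(user, password, dsn):
        conn = cx_Oracle.connect(user, password, dsn)
        cursor = conn.cursor()
        # start from the newest change already in the table
        cursor.execute("SELECT MAX(last_updated) FROM source_table")
        last_seen = cursor.fetchone()[0]
        while True:
            time.sleep(POLL_INTERVAL)
            cursor.execute(
                "SELECT id, parent_id, payload, last_updated "
                "FROM source_table WHERE last_updated > :since "
                "ORDER BY last_updated",
                since=last_seen)
            rows = cursor.fetchall()
            if rows:
                push_to_matrix_system(rows)
                last_seen = rows[-1][3]

Polling like this obviously can't see deletes or rows updated without bumping the timestamp, which is part of why I'm asking about proper replication options.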

+2  A: 
ninesided
I will take a look at these products. However, they don't quite seem like the right thing, because the system receiving the updates is not a DB and needs to read the replicated data; receiving the transaction logs would require that the logs be parsed in order to retrieve the data, wouldn't it?
Sorry, I thought you just wanted to back up the logs on a different file system. What do you want to do with them once they're on this 'other system'??
ninesided
The original database contains a hierarchy of information (a links to b, b links to c and d, etc.). The secondary system wants to represent the data as a large matrix in order to perform linear algebra on it. The feed is for moving the data (in near realtime) from the DB into the linear algebra system.
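
(To illustrate what the receiving side does with the feed - this is not our actual code - assuming the rows arrive as (child_id, parent_id, value) triples with small integer ids, they end up in something like a scipy sparse matrix:)

    from scipy.sparse import coo_matrix

    def rows_to_matrix(rows, n):
        # rows: iterable of (child_id, parent_id, value); ids are in range(n)
        children = [r[0] for r in rows]
        parents = [r[1] for r in rows]
        values = [r[2] for r in rows]
        # n x n sparse matrix with entry (child, parent) = value
        return coo_matrix((values, (children, parents)), shape=(n, n))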
and this needs to be realtime? You can't just create a view that represents your matrix and query that?
ninesided
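
(For reference, the view idea would boil down to something like this on the consuming side, where matrix_view is a hypothetical view that flattens the hierarchy into matrix coordinates; it could be polled the same way as the sketch in the question:)

    def snapshot_matrix_view(conn):
        # one-shot read of the flattened hierarchy over a cx_Oracle connection
        cursor = conn.cursor()
        cursor.execute("SELECT row_idx, col_idx, value FROM matrix_view")
        return cursor.fetchall()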
Also, you have to consider the "cost" of replication: unless you've got some nice hardware, replicating over the network is going to throttle your write times.
ninesided