I have an Oracle 10g database with about 11,000,000 rows of data. It's good for our usual performance tests, but I now have to test our app for a customer that will have an estimated 350,000,000 rows of data.
Is there a tool or technique that I can use to "bulk up" the tables in my database to this size? Ideally, I'd like to use the existing data's distribution of values for certain fields as a "model" for the new data.
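I know I could hand-roll something crude myself. For example, a single pass like the following (table, column, and sequence names are made up for illustration) re-inserts a table's own rows 32 times with fresh keys, which would take 11M rows to roughly 352M while keeping the existing values in the non-key columns:

```sql
-- Rough sketch with a hypothetical ORDERS table and ORDERS_SEQ sequence:
-- cross-join the table against a 32-row generator, so each existing row
-- is re-inserted 32 times with a newly generated primary key.
INSERT INTO orders (order_id, customer_id, amount, order_date)
SELECT orders_seq.NEXTVAL,
       o.customer_id,   -- existing column values are copied as-is
       o.amount,
       o.order_date
FROM   orders o
CROSS JOIN (SELECT LEVEL AS copy_no FROM dual CONNECT BY LEVEL <= 32);
COMMIT;
```

The problem is that this only duplicates rows exactly (and a 340M-row insert would need a lot of undo space anyway). Every non-key column ends up with the same handful of repeated values, which isn't the same as generating *new* data that follows the original distributions. That's why I'm hoping for a proper tool.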
I found a reference to a tool called DataShark from Hardball Software that does exactly what I need, but it appears the company is defunct and the tool is no longer available. Does anyone know of any similar tools out there?