views: 66
answers: 2
I have an Oracle 10g database with about 11,000,000 rows of data. It's good for our usual performance tests, but I now have to test our app for a customer that will have an estimated 350,000,000 rows of data.

Is there a tool or technique that I can use to "bulk up" the tables in my database to this size? Ideally, I'd like to use the existing data's distribution of values for certain fields as a "model" for the new data.

I found a reference to a tool called DataShark from Hardball Software that does exactly what I need, but it appears as if the company is defunct and the tool no longer exists. Does anyone know if there are any similar tools out there?

+1  A: 

Just to get you going, is the really simple solution out of the question:

INSERT INTO myTable SELECT * FROM myTable;
Winston Smith
I don't think this will work, as it would duplicate primary key values and multiply foreign key values.
Patrick Cuff
I intended the query as an illustration of the idea - i.e. simply copy your existing data - not the actual complete solution. Of course, you'd have to generate new primary keys. It may or may not be a good fit for your data, but I thought it might be worth considering before investing a lot of time in another solution.
Winston Smith
+1: duplication of existing data will probably be closer to reality than random values.
Vincent Malgrat
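
A minimal sketch of that approach, assuming a hypothetical table my_table with primary key column id and a sequence my_seq (foreign keys would need similar remapping); running it repeatedly doubles the row count each time:

INSERT INTO my_table (id, col1, col2)
SELECT my_seq.NEXTVAL,  -- fresh primary key for each copied row
       col1,            -- remaining columns copied from the existing data
       col2
FROM   my_table;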
A: 

I always use a combination of the DBMS_RANDOM.VALUE and DBMS_RANDOM.STRING functions to generate values, and a row generator such as ...

-- generates 1000 rows, one per level of the hierarchy
SELECT ROWNUM rn
FROM   dual
CONNECT BY LEVEL <= 1000;

Between the two you can generate most of what you need.
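
For example, a minimal sketch combining the two, assuming a hypothetical target table big_table (id, amount, label); the names, value range, and row count are placeholders:

INSERT INTO big_table (id, amount, label)
SELECT ROWNUM,                                  -- surrogate key
       ROUND(DBMS_RANDOM.VALUE(0, 10000), 2),  -- random number within a range
       DBMS_RANDOM.STRING('U', 10)             -- 10 random uppercase characters
FROM   dual
CONNECT BY LEVEL <= 100000;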

David Aldridge
I was considering this, but was curious whether there was a tool I could use rather than "rolling my own".
Patrick Cuff