views: 24

answers: 2

Our production database is about 25 GB and has roughly 700 tables. I want to import about 2% of the data from every table into my local database for development/testing purposes, in a way that preserves the foreign key constraints. How can I do this?

Your comments/suggestions will be warmly welcomed!

+1  A: 

You will find no fully automated way of doing this. You will need to understand the structure and business rules of your database in order to determine how to reduce the data.

I would suggest running through your highest-level tables that have a good spread of data, picking a few and working down from there.

To actually perform the data import/export, I would consider taking a backup and copy of the database, then running a series of delete statements until it's down to your required size. Then reorganise, shrink, and back up again to restore locally.
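As a rough sketch of that copy-delete-shrink workflow, here is the same idea run against a throwaway SQLite file (the `orders` table and the keep-every-50th-row sampling rule are made-up assumptions; on SQL Server you would restore a copy of the backup, run the deletes, then `DBCC SHRINKDATABASE` and `BACKUP DATABASE` again, with `VACUUM` playing the shrink role here):

```python
import os
import sqlite3
import tempfile

def shrink_copy(db_path: str, keep_fraction: float = 0.02) -> None:
    """Delete most rows from a *copy* of the database, then reclaim space."""
    conn = sqlite3.connect(db_path)
    try:
        # Keep roughly keep_fraction of the rows (every 50th row for 2%).
        step = int(1 / keep_fraction)
        conn.execute("DELETE FROM orders WHERE id % ? != 0", (step,))
        conn.commit()
        conn.execute("VACUUM")  # reclaim the freed pages, like a shrink
    finally:
        conn.close()

# Build a throwaway "production copy" to run the workflow against.
path = os.path.join(tempfile.mkdtemp(), "copy.db")
conn = sqlite3.connect(path)
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, note TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, "x" * 100) for i in range(1, 10001)])
conn.commit()
conn.close()

before = os.path.getsize(path)
shrink_copy(path, keep_fraction=0.02)
after = os.path.getsize(path)

conn = sqlite3.connect(path)
remaining = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
conn.close()
print(remaining, after < before)
```

The point of the `VACUUM`/shrink step is that deleting rows alone does not shrink the file; the freed pages have to be reclaimed before the copy is cheap to move to a dev machine.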

Robin Day
A: 

As per Robin, deleting is much easier than scripting selective inserts.

If you have RI (referential integrity) in place and don't have cascading deletes, you can work backwards easily with nested criteria.

Start off this way ...

1

delete from table1 where table1PK in
    (somecriteria for table1 deletion)

2

delete from table2 where table2PK in
    (select table2PK from table2
     where table1PK in (somecriteria for table1 deletion) -- same as above
    )

-- etc ... continue nesting down the tree

but when you actually come to run the deletions, you obviously need to reverse the order to

N .. 2 .. 1
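A minimal runnable sketch of this pattern, using Python's stdlib `sqlite3` with foreign keys enforced and no cascading deletes (the schema mirrors the pseudo-SQL above; the criteria `table1PK > 1` is an arbitrary stand-in for "somecriteria"):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite does not enforce FKs by default
conn.execute("CREATE TABLE table1 (table1PK INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE table2 (
    table2PK INTEGER PRIMARY KEY,
    table1PK INTEGER NOT NULL REFERENCES table1(table1PK))""")
conn.executemany("INSERT INTO table1 VALUES (?)", [(1,), (2,), (3,)])
conn.executemany("INSERT INTO table2 VALUES (?, ?)",
                 [(10, 1), (11, 2), (12, 3)])

criteria = "SELECT table1PK FROM table1 WHERE table1PK > 1"  # rows to purge

# Statements are written top-down (1, 2, ...), nesting the same criteria,
# but they must run bottom-up: children first, or the parent delete
# would violate the foreign key constraint.
deletes = [
    f"DELETE FROM table1 WHERE table1PK IN ({criteria})",     # step 1
    f"DELETE FROM table2 WHERE table2PK IN "                  # step 2
    f"(SELECT table2PK FROM table2 WHERE table1PK IN ({criteria}))",
]
for stmt in reversed(deletes):  # N .. 2 .. 1
    conn.execute(stmt)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM table1").fetchone()[0],
      conn.execute("SELECT COUNT(*) FROM table2").fetchone()[0])
```

Running the list in forward order instead would fail on step 1, since rows in table2 still reference the parent rows being deleted; that is exactly why the reversed order matters.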

This could still be a lot of work for 700 tables, but usually most of the data lives in a handful of tables - you just need to focus on the large ones.

HTH

nonnb