I tried searching here for a similar solution but didn't find one, so I was wondering: what is the best way to accomplish the following?

I have a table with 17 million+ rows, each with a unique ID. We have recently created a new table that will be used in conjunction with the previous table, where the foreign key of the new table is the unique ID of the old table.

For example:

table1 (id, field1, field2, field3, ...)
table2 (table1_id, field1, ...)

The problem is that since we are migrating this into a live environment, we need to backfill table 2 with a row containing the ID from table 1 for each row in table 1. For example, if table 1 has a row (1, test, null), table 2 now needs a row (1, null, ...), and so on for each row in table 1. The main issue is that the IDs are not all sequential in table 1, so we will have to read from table 1 and then insert into table 2 based on the IDs we find.

Is there any easier way to go about this? Thanks in advance, Joe

Also, to clarify: table 2 will hold new data, and the only thing it will carry over from table 1 is the ID, to keep the foreign key relationship.

Also, this is SQL Server 2000.

+2  A: 

I am not sure I am exactly following you, but would something like this work for you?

INSERT INTO table2 SELECT field1, field2, field3... FROM table1

If I am understanding correctly you want a record in table2 for every record in table1. This will do just that. Just match up your fields in the select in the right order, and specify constants for any fields in table2 that you don't have in table1.
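For instance, a minimal sketch (the column names here are hypothetical), assuming table2 has a table1_ID column plus one extra field you want filled with a constant:

-- copy one row per table1 record, supplying a constant for the extra column
INSERT INTO table2 (table1_ID, fieldA)
SELECT id, 'constant value'
FROM table1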

HTH. Let me know if I am not understanding and I'll try to help again.

Ryan Guill
A: 

With that many rows you may run into issues with transaction log space and with the length of time large insert transactions take to run.

If run time is a constraint, I'd seriously recommend using bcp (or whatever tool is applicable depending on the platform).

Select out the IDs from the original table, use them to build a bcp file for the extension table, then bcp it in.

You may find it more performant to bcp in files of 10,000 records instead of one humongous file with 17,000,000 rows.

Also, you can do this in the background before go-live, and write a T-SQL job to pick up any rows that may have been inserted after you took the snapshot of IDs.
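As a rough sketch of the bcp commands (server, database, and file names below are all placeholders; -b sets how many rows are committed per batch):

:: export the IDs from the original table
bcp "SELECT id FROM mydb.dbo.table1" queryout ids.dat -c -S myserver -T

:: load them into the new table, committing every 10,000 rows
bcp mydb.dbo.table2 in ids.dat -c -b 10000 -S myserver -T

Note that with -c the data file has to line up with table2's column layout, so if table2 has more columns than just the ID you'll likely need a format file (-f) or empty placeholder fields in the file.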

Binary Worrier
A: 

You need to read this article.

http://stackoverflow.com/questions/346659/what-are-the-most-common-sql-anti-patterns#346679

The main issue is that the IDs are not all sequential in table 1, so we will have to read from table 1 and then insert into table 2 based on the IDs we find.

Yes, look at my answer in the above article and write a key-walking loop using Item #2.

Make sure that when you write the insert statement you provide a field list, as I say in Item #1.
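A minimal sketch of such a key-walking loop, assuming SQL Server 2000 syntax, positive integer IDs, an index on the ID columns, and the table1/table2 names used elsewhere on this page:

-- walk table1's keys in batches so each insert transaction stays small
DECLARE @lastID int, @rows int
SET @lastID = 0
SET @rows = 1

WHILE @rows > 0
BEGIN
    INSERT INTO table2 (table1_ID)
    SELECT TOP 10000 ID
    FROM table1
    WHERE ID > @lastID
    ORDER BY ID

    SET @rows = @@ROWCOUNT   -- capture before the next statement resets it

    SELECT @lastID = MAX(table1_ID) FROM table2
END

Each pass copies one batch and remembers the highest ID inserted so far, so the transaction log only ever has to hold about 10,000 rows at a time.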

David B
A: 

If I understand correctly, you want one record in table2 for each record in table1. Also, I believe that apart from the reference to table1, table2 should initially contain blank rows.

So assuming

table1 (ID, field1, field2, ...)
table2 (ID, table1_ID, fieldA, fieldB,...)
-- where table1_ID is a reference to ID of table1

After creating table2, you can simply run this insert statement:

insert into table2(table1_ID) 
select ID from table1

kristof
A: 

What if there are a lot of values to insert from textboxes and one value comes from another table? Example: table1 (curriculum_id, year_implemented, course); table2 (student_id, last_name, first_name, course, curriculum_id).

I need to insert the value from table1 (curriculum_id).

If I were to do this: insert into table2(table1_ID) select ID from table1

How about the other textbox values in the insert statement?

eva