views: 52
answers: 2
I have two tables with identical structure and a VERY LARGE number of fields (about 1000). I need to perform two operations.

1) Insert all rows from the second table into the first. Example:

INSERT INTO [1607348182]
SELECT * 
FROM _tmp_1607348182;

2) Update the first table from the second table. But for the update, I can't find the proper SQL syntax.

Queries like:

Update [1607348182]
set [1607348182].* = tmp.*
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

or

Update [1607348182]
from [1607348182]
inner join _tmp_1607348182 as tmp on tmp.recordid = [1607348182].recordid

are invalid.
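
The explicit per-column form of UPDATE ... FROM is valid, but with about 1000 fields I can't list every column by hand (col1 and col2 below are just placeholders):

UPDATE t
SET t.col1 = tmp.col1,
    t.col2 = tmp.col2  -- ...and so on for every column
FROM [1607348182] AS t
INNER JOIN _tmp_1607348182 AS tmp ON tmp.recordid = t.recordid;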

+2  A: 

Would it work to delete everything from the master table that had an ID in temp, then do an insert with the new data?
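
A minimal sketch of that approach, assuming recordid identifies the rows in both tables:

-- Remove the rows that will be replaced, then re-insert them from the temp table.
DELETE FROM [1607348182]
WHERE recordid IN (SELECT recordid FROM _tmp_1607348182);

INSERT INTO [1607348182]
SELECT * FROM _tmp_1607348182;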

Kendrick
Yes, I came to the same solution and implemented it. Thank you.
amuliar
+2  A: 

Not sure you'll be able to accomplish this without using dynamic SQL to build out the update statement in a variable.

This statement will return a list of columns based on the table name you put in:

select name from syscolumns
where [id] = (select [id] from sysobjects where name = 'tablename')
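
On SQL Server 2005 and later, the sys.columns catalog view returns the same list; syscolumns still works there as a compatibility view:

select name from sys.columns
where object_id = object_id('1607348182')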

Not sure if I can avoid a loop here... you'll need to load the results from above into a cursor and then build the query from it. Roughly:

declare @query nvarchar(max), @name sysname
set @query = 'update [1607348182] set '
declare col_cur cursor for select name from syscolumns
    where [id] = (select [id] from sysobjects where name = '1607348182')
open col_cur
fetch next from col_cur into @name  -- @name holds the current column name
while @@FETCH_STATUS = 0
begin
    set @query = @query + quotename(@name) + ' = tmp.' + quotename(@name) + ','
    fetch next from col_cur into @name
end
close col_cur deallocate col_cur

When the query is done being built in the loop, trim the trailing comma, append the FROM/JOIN clause, and run the result with exec sp_executesql @query.
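
A sketch of that finishing step, assuming recordid is the join key as in the question:

-- Drop the trailing comma, append the join to the temp table, then execute.
set @query = left(@query, len(@query) - 1)
    + ' from [1607348182] inner join _tmp_1607348182 as tmp'
    + ' on tmp.recordid = [1607348182].recordid'
exec sp_executesql @query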

Just a little warning... building dynamic SQL in a loop like this can get a bit confusing. For troubleshooting, put a select @query inside the loop and watch the query get built up.

edit: Not sure if you'll be able to do all 1000 columns in one update... there are limits (varchar(8000)?) on the size @query can grow to. You may have to divide the work so it handles a block of columns at a time: put the columns from the syscolumns select into a temp table with an id, and build your dynamic SQL so each statement updates 20 (or 50?) columns at a time.
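
One way to batch it, sketched on the assumptions that paging on syscolumns.colid is acceptable (gaps left by dropped columns just make some batches smaller) and that the SET list can be built with the classic SELECT string-concatenation trick:

declare @start int, @maxcol int, @query nvarchar(max)
set @start = 1
select @maxcol = max(colid) from syscolumns
    where [id] = (select [id] from sysobjects where name = '1607348182')
while @start <= @maxcol
begin
    set @query = null
    -- Concatenate 'col = tmp.col' pairs for the next block of 50 columns.
    select @query = coalesce(@query + ', ', '')
            + quotename(name) + ' = tmp.' + quotename(name)
    from syscolumns
    where [id] = (select [id] from sysobjects where name = '1607348182')
      and colid between @start and @start + 49
      and name <> 'recordid'  -- skip the join key
    if @query is not null
    begin
        set @query = 'update [1607348182] set ' + @query
            + ' from [1607348182] inner join _tmp_1607348182 as tmp'
            + ' on tmp.recordid = [1607348182].recordid'
        exec sp_executesql @query
    end
    set @start = @start + 50
end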

Another alternative would be to use Excel to mass-build this. Run the column select and copy the results into column A of a spreadsheet. Put '= in column B, tmp.[12331312] in column C, copy column A into column D, and a comma into column E. Copy the entire spreadsheet into Notepad, and you should have the column list of the update statement built out for you. Not a bad approach if this is a one-shot event; not sure I'd rely on it as an ongoing solution.

M.E.
It seems to be the best solution, because there is no native SQL expression to do this. I also had trouble with other SQL Server limitations related to variable-sized columns. (I also realize my problems originate from a bad DB schema.) So your solution looks best, because I can cache this SQL statement and reuse it later in other requests.
amuliar
About the varchar(8000) limitation: I can use varchar(MAX) anyway.
amuliar
Good point, I've been working in legacy databases recently and forgot about varchar(max). In regards to the poor DB schema... I'm assuming you're working from an old one (you rarely see tables named with bare numbers outside of flat files/IDBMS). A nightly import from the legacy system into a normalized database might be a good option to consider; otherwise you're looking at a relatively large amount of time spent working around this poor schema, added to every project you do.
M.E.