In a table there are 113 columns, and there are two default records in the table: one for "unknown" and another for "inapplicable". So each column has its own default value to represent unknown and inapplicable.

I don't want to write a regular insert statement to get those two records.

So I tried to insert each column using a cursor.

I got the column names for that table from information_schema.columns and tried to insert the values from the same table at another location using an "insert into select" statement, driven by the column names returned from information_schema:

Declare @col_name varchar(50)

declare my_cur CURSOR for
  select column_name from information_schema.columns
  where table_name = 'tabl' and table_catalog = 'db'
  and table_schema = 'dbo'

open my_cur

fetch next from my_cur
into @col_name

while @@FETCH_STATUS = 0
BEGIN

   -- the problem: @col_name is used here as if it were a column reference
   Insert into db.dbo.tabl (@col_name)
   select @col_name from openrowset('sqlncli', 'server=my_server; trusted_connection=yes;', db.dbo.tabl)

   fetch next from my_cur into @col_name
END

close my_cur
deallocate my_cur
go

But I did not realize that @col_name would be treated as a string, rather than as an object (a column).

Is there any workaround for this case, or any alternative solution?

+1  A: 

You will have to generate the INSERT statement as dynamic SQL and then execute it:

Declare @InsertStatement VarChar(Max)

SET @InsertStatement = ''

-- @col_name comes from the cursor in the question
SET @InsertStatement = @InsertStatement + ' Insert into db.dbo.tabl (' + @col_name + ') '
SET @InsertStatement = @InsertStatement + ' select ' + @col_name + ' from openrowset(''sqlncli'', ''server=my_server; trusted_connection=yes;'', db.dbo.tabl) '

Exec Sp_SQLExec @InsertStatement
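
In the posted cursor, this block would replace the static INSERT inside the WHILE loop, so one column is copied per iteration. EXEC (@InsertStatement) or sp_executesql would work here in place of Sp_SQLExec as well.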
Raj More
+5  A: 

I think that getting these defaults populated is the least of your problems.

I'd suggest taking a look at this: Fundamentals of Relational Database Design

And if you still want to do this, it might be better to retrieve all the defaults from the linked server, place them in a temp table, and then join to information_schema.columns to populate your table. You'll probably need to transpose the data to make it work.
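
A rough sketch of what that could look like, using the server and table names from the question; the UNPIVOT column list is only illustrative (the real list would be the actual 113 column names, cast to a compatible type where necessary):

-- Pull the default rows from the remote copy into a temp table.
select *
into   #defaults
from   openrowset('sqlncli', 'server=my_server; trusted_connection=yes;', db.dbo.tabl)

-- Transpose the wide rows into (column_name, default_value) pairs so they can
-- be joined to information_schema.columns.
select u.column_name, u.default_value
from   #defaults
unpivot (default_value for column_name in (col1, col2, col3 /* ...the real column names */)) as u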

AlexCuse
Yeah, there are strong hints there that 113 columns with values that aren't related is a bit of a DESIGN FLAW
Emtucifor
I'll definitely tell my developers to reconsider their design. However, I want y'all to know that it is a dimension table in a data warehouse.
whizzing_hornet
Ah, that could change things a bit. But what does your proposed table structure look like? Seems like the cursor you posted would give you 113 rows, each with a single column (out of 113) populated. Do you plan to somehow transform this into a more useful structure afterwards?
AlexCuse
If the way the data naturally needs to be used isn't the way it's stored, then does it really matter if it's a data warehouse dimension table?
Emtucifor
@AlexCuse and @Emtucifor: I don't think factoring out that table will help in any way, because that table would be used in an SSAS cube as a dimension. If factored out, there would be multiple dimensions.
whizzing_hornet
So are you moving this from one dimension table to another? If so, my apologies, I was under the impression that a normal database was involved at some point. The approach you're using still seems 'off' to me, but without really seeing what you want the source and destination tables to look like I can't comment further. Are you looking to take a 113 col source table with a single row and turn it into 113 rows at the destination?
AlexCuse
I was trying to create a clone of a known db structure and fill only the default values into the cloned tables, so that I can execute packages, load data for a certain date range, and compare back against the original (source).
whizzing_hornet
As I went down to the data warehouse level, the number of columns was just too many, and the regular insert script took too long to write. So I thought of writing a cursor over the Information_schema.Columns table, which adds each column of the table into a string; that string is later used in the Insert into Select statement. This way I don't need to write out 113 columns in the Insert statement (see the sketch below). I also made it a stored proc that can take any table and fill in the default values.
whizzing_hornet
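
For reference, a minimal sketch of that single-statement approach, assuming the cloned and source tables have identical definitions (server/database/table names are the ones from the question; the column list is built with FOR XML PATH here instead of a cursor, but a cursor that appends to @cols works the same way):

declare @cols nvarchar(max), @sql nvarchar(max)

-- Build a comma-separated, quoted column list for the target table.
select @cols = stuff((
        select ', ' + quotename(column_name)
        from   information_schema.columns
        where  table_catalog = 'db'
          and  table_schema  = 'dbo'
          and  table_name    = 'tabl'
        order by ordinal_position
        for xml path('')), 1, 2, '')

-- One dynamic INSERT ... SELECT for all 113 columns instead of one per column.
set @sql = N'insert into db.dbo.tabl (' + @cols + N') '
         + N'select ' + @cols + N' from openrowset(''sqlncli'', '
         + N'''server=my_server; trusted_connection=yes;'', db.dbo.tabl)'

exec sp_executesql @sql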