views:

4736

answers:

5

Using C# (VS2005) I need to copy a table from one database to another. Both database engines are MS SQL 2005. For the remote database (the source), I only have execute access to a sproc (stored procedure) to get the data I need to bring locally.

I have more control over the local database, as it's used by the ASP.NET application which needs a local copy of this remote table. We would like it local for easier lookups and joins with other tables, etc.

Could you please explain an efficient method of copying this data to our local database?

The local table can be created with the same schema as the remote one, if it makes things simpler. The remote table has 9 columns, none of which are identity fields. There are approximately 5400 rows in the remote table, and this number grows by about 200 a year. So not a quickly changing table.

+1  A: 

I would first look at using SQL Server Integration Services (SSIS, née Data Transformation Services (DTS)).

It is designed for moving/comparing/processing/transforming data between databases, and IIRC allows an arbitrary expression for the source. You would need it installed on your database server (shouldn't be a problem, as it is part of a default install).

Otherwise, for a code solution, given the data size (small): pull all the data from the remote system into an internal structure, then look for rows which don't exist locally and insert them.
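
A minimal sketch of that pull-and-compare approach (untested; the sproc name `myproc`, table `SomeTable`, and columns `KeyCol`/`Name` are hypothetical stand-ins, and `System.Data`/`System.Data.SqlClient` are assumed to be imported):

    // 1) read the local keys, 2) stream the remote rows, 3) insert only the new ones
    Dictionary<int, bool> localKeys = new Dictionary<int, bool>();
    using (SqlConnection dest = new SqlConnection(csDest))
    {
        dest.Open();
        using (SqlCommand cmd = new SqlCommand("SELECT KeyCol FROM SomeTable", dest))
        using (SqlDataReader r = cmd.ExecuteReader())
            while (r.Read()) localKeys[r.GetInt32(0)] = true;

        using (SqlConnection src = new SqlConnection(csSource))
        using (SqlCommand proc = new SqlCommand("myproc", src))
        {
            proc.CommandType = CommandType.StoredProcedure;
            src.Open();
            using (SqlDataReader r = proc.ExecuteReader())
            {
                while (r.Read())
                {
                    int key = r.GetInt32(r.GetOrdinal("KeyCol"));
                    if (localKeys.ContainsKey(key)) continue;   // already have this row
                    using (SqlCommand ins = new SqlCommand(
                        "INSERT INTO SomeTable (KeyCol, Name) VALUES (@k, @n)", dest))
                    {
                        ins.Parameters.AddWithValue("@k", key);
                        ins.Parameters.AddWithValue("@n", r["Name"]);
                        ins.ExecuteNonQuery();
                    }
                }
            }
        }
    }

At ~5400 rows the whole key set fits comfortably in memory, so the only per-row cost is the insert round trip for genuinely new rows.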

Richard
Yes, a very viable option and one I have used often in the past. For this particular project I didn't want the external dependency of an SSIS job. I am curious how to keep everything in code on this one. Thank you for your response.
Brettski
+4  A: 

Perhaps SqlBulkCopy: use SqlCommand.ExecuteReader to get the reader that you use in the call to SqlBulkCopy.WriteToServer. This is the same as a bulk insert, so very quick. It should look something like this (untested):

    using (SqlConnection connSource = new SqlConnection(csSource))
    using (SqlCommand cmd = connSource.CreateCommand())
    using (SqlBulkCopy bcp = new SqlBulkCopy(csDest))
    {
        bcp.DestinationTableName = "SomeTable";
        cmd.CommandText = "myproc";
        cmd.CommandType = CommandType.StoredProcedure;
        connSource.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            bcp.WriteToServer(reader);
        }
    }
Marc Gravell
Ah, very interesting, I like it. Where am I defining the target command/where the data is going within the defined connection? Or should I read up on SqlBulkCopy for my answer?
Brettski
Oops - yes, I forgot the table name ;-p Will fix...
Marc Gravell
+6  A: 

The Bulk Copy feature of ADO.NET might help you; take a look at these:

MSDN - Multiple Bulk Copy Operations (ADO.NET)

An example article

Canavar
I don't know that this would work without SELECT access on the source database, but it is definitely something to investigate due to its sheer speed compared to other methods!
Redbeard 0x0A
A: 

You probably can't do this, but if you can, DON'T do it with a program. If you have any way of talking to someone who controls the source server, see if they will set up some sort of export of the data. If the data is as small as you say, then XML or CSV output would be 100x better than writing something in C# (or any language).

So let's assume they can't export. Still, avoid writing a program. You say you have more control over the destination: can you set up an SSIS package, or set up a linked server? If so, you'll have a much easier time migrating the data.

If you set up, at a bare minimum, the source as a linked server, you could write a small T-SQL batch:

    TRUNCATE TABLE DestTable

    INSERT INTO DestTable
    SELECT * FROM [SourceServer].[SourceDatabase].[Schema].[SourceTable]

It wouldn't be as nice as SSIS (which gives you a better visual of what's happening), but the T-SQL above is pretty clear.

Since I would not take the programming route, the best solution I could give you, if you absolutely had to, would be:

Use the SqlClient namespace.

Create two SqlConnections, two SqlCommands, and get an instance of one SqlDataReader.

Iterate through the source reader, and execute the destination SqlCommand's insert for each iteration with the values from the reader.

It'll be ugly, but it'll work.
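
A rough sketch of that two-connection loop (untested; `myproc`, `SomeTable`, and columns `Col1`/`Col2` are hypothetical names, with `System.Data`/`System.Data.SqlClient` assumed):

    using (SqlConnection src = new SqlConnection(csSource))
    using (SqlConnection dest = new SqlConnection(csDest))
    {
        src.Open();
        dest.Open();

        using (SqlCommand read = new SqlCommand("myproc", src))
        {
            read.CommandType = CommandType.StoredProcedure;
            using (SqlDataReader reader = read.ExecuteReader())
            using (SqlCommand insert = new SqlCommand(
                "INSERT INTO SomeTable (Col1, Col2) VALUES (@c1, @c2)", dest))
            {
                // prepare the parameters once, reuse them per row
                insert.Parameters.Add("@c1", SqlDbType.Int);
                insert.Parameters.Add("@c2", SqlDbType.NVarChar, 50);
                while (reader.Read())
                {
                    insert.Parameters["@c1"].Value = reader["Col1"];
                    insert.Parameters["@c2"].Value = reader["Col2"];
                    insert.ExecuteNonQuery();   // one round trip per row; ugly but works
                }
            }
        }
    }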

A: 

It doesn't seem to be a huge quantity of data you have to synchronize. Under the conditions you described (only a SP to access the remote DB and no way to get anything else), you can go with Marc Gravell's solution. If the data can only grow and existing data cannot be changed, you can compare the record counts on the remote and local DBs to optimize the operation: if there is no change in the remote DB, there is no need to copy.
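
A hedged sketch of that optimization (untested; `myproc` and `SomeTable` are hypothetical). Note that with only execute access on the source, the remote count may have to be taken by reading the sproc's whole result set, so the saving is mainly in skipping the local writes:

    int localCount;
    using (SqlConnection dest = new SqlConnection(csDest))
    using (SqlCommand cmd = new SqlCommand("SELECT COUNT(*) FROM SomeTable", dest))
    {
        dest.Open();
        localCount = (int)cmd.ExecuteScalar();
    }

    int remoteCount = 0;
    using (SqlConnection src = new SqlConnection(csSource))
    using (SqlCommand proc = new SqlCommand("myproc", src))
    {
        proc.CommandType = CommandType.StoredProcedure;
        src.Open();
        using (SqlDataReader r = proc.ExecuteReader())
            while (r.Read()) remoteCount++;   // only execute access, so count by reading
    }

    if (remoteCount != localCount)
    {
        // counts differ: run the copy (e.g. the SqlBulkCopy approach above)
    }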

Dan