views: 173
answers: 4

I am working on a feature that exports some tables (~50) to a disk file and imports the file back into the database. Export is easy: serialize the DataSet to a file stream. But on import, the table structure has to be determined dynamically. What I am doing now:

foreach table in dataset
   compare the table's schema in the db with the imported dataset
   define a batch command
   foreach row in table
      construct a single insert SqlCommand and add it to the batch command
   execute the batch insert command

This is very inefficient, and I also run into problems converting data types between the imported DataTable and the database table. Is there a better way to do this?

Edit:

In fact, import and export are two functions (buttons) in the program. The UI shows a grid listing many tables; what I need to implement is exporting the selected tables' data to a disk file and importing that data back into the database later.

+2  A: 

Why not use SQL Server's native Backup and Restore functionality? You can do incremental Restores on the data, and it's by far the fastest way to export and then import data again.

There are a lot of very advanced options to cover fringe cases, but at its heart it's two commands: Backup Database and Restore Database.

BACKUP DATABASE mydb TO DISK = 'c:\my\path\to\backup.bak'

RESTORE DATABASE mydb FROM DISK = 'c:\my\path\to\backup.bak'

When doing this against TB-sized databases, it takes about 45 minutes to an hour in my experience. Much faster than trying to go through every row!

Eric
I want to do it like that, but it is a function of the product, by design
static
@static: What do you mean, it's a function of the product?
Eric
@Eric: see my added Edit above
static
A: 

I'm guessing you are using SQL Server? If so, I would:

a) make sure the table names are preserved in the export, and b) look into SqlBulkCopy. It lets you push an entire table in at once, so you can loop through the DataTables and bulk copy each one:

using (SqlBulkCopy copy = new SqlBulkCopy(MySQLExpConn))
{
    // Map source columns to destination columns by ordinal.
    copy.ColumnMappings.Add(0, 0);
    copy.ColumnMappings.Add(1, 1);
    copy.ColumnMappings.Add(2, 2);
    copy.ColumnMappings.Add(3, 3);
    copy.ColumnMappings.Add(4, 4);
    copy.ColumnMappings.Add(5, 5);
    copy.ColumnMappings.Add(6, 6);
    copy.DestinationTableName = ds.Tables[i].TableName;
    copy.WriteToServer(ds.Tables[i]);
}
mcauthorn
This solution almost fits my case, but 1) I cannot use a connection string in my system, as it is private in our data access class, and 2) if a record already exists in the db, I need to update some fields like last_update_user and last_update_time.
static
Hmm, that does change the overall problem. For 1), if you can access the data access class, you can add the function there, expose it to your class, and just hand it a table. For 2), if it also has to update, then you will have to use SQL. I don't know of another way.
mcauthorn
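One way to combine the speed of SqlBulkCopy with the update requirement from the comments is to bulk-load into a staging table and then run a single MERGE. This is only a sketch under assumptions not in the thread: it presumes a SqlConnection can be obtained from the data access layer, and the key and column names (`id`, `some_col`) are placeholders for the real schema.

```csharp
using System.Data;
using System.Data.SqlClient;

static void UpsertTable(SqlConnection conn, DataTable table)
{
    // 1. Create a session-local staging table with the same shape
    //    as the destination (TOP 0 copies structure, no rows).
    using (var cmd = new SqlCommand(
        $"SELECT TOP 0 * INTO #staging FROM [{table.TableName}]", conn))
    {
        cmd.ExecuteNonQuery();
    }

    // 2. Bulk copy the imported rows into the staging table.
    using (var copy = new SqlBulkCopy(conn))
    {
        copy.DestinationTableName = "#staging";
        copy.WriteToServer(table);
    }

    // 3. MERGE staging into the destination: update audit fields on
    //    existing rows, insert new ones. Column names are placeholders.
    string merge = $@"
        MERGE [{table.TableName}] AS target
        USING #staging AS source ON target.id = source.id
        WHEN MATCHED THEN
            UPDATE SET target.some_col = source.some_col,
                       target.last_update_user = SUSER_SNAME(),
                       target.last_update_time = GETDATE()
        WHEN NOT MATCHED THEN
            INSERT (id, some_col) VALUES (source.id, source.some_col);";
    using (var cmd = new SqlCommand(merge, conn))
    {
        cmd.ExecuteNonQuery();
    }
}
```

The temp table lives only for this connection's session, so the data access class would not need to expose anything beyond the open connection.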
A: 

You can use XML serialization, but you will need a good ORM tool like NHibernate to help you with it. XML serialization will maintain the data types and will work flawlessly.

You can read an entire table and serialize all its values into an XML file, then read the XML file back into a list of objects and store them in the database. With a good ORM tool you will not need to write any SQL, and it should work across different database servers as well.

Akash Kava
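For the DataSet case in the question specifically, ADO.NET can round-trip XML with an inline schema without an ORM: writing the schema alongside the data is what lets the column types survive the trip. A minimal sketch (the file name and table contents are arbitrary):

```csharp
using System.Data;

// Export: write the DataSet plus its schema so column types survive.
var ds = new DataSet("Export");
var t = new DataTable("Items");
t.Columns.Add("Id", typeof(int));
t.Columns.Add("Name", typeof(string));
t.Rows.Add(1, "widget");
ds.Tables.Add(t);
ds.WriteXml("tables.xml", XmlWriteMode.WriteSchema);

// Import: the inline schema restores typed columns automatically.
var restored = new DataSet();
restored.ReadXml("tables.xml");
```

Without `XmlWriteMode.WriteSchema`, `ReadXml` would have to infer types and every column would come back as a string, which is exactly the type-conversion problem described in the question.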
A: 

I finally chose SqlCommandBuilder to build the insert commands automatically.

See the SqlCommandBuilder class.

static
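The accepted approach can be sketched as follows. This is only an illustration of the SqlCommandBuilder pattern, not the asker's actual code; the connection comes from elsewhere and the table name is taken from the imported DataTable:

```csharp
using System.Data;
using System.Data.SqlClient;

static void ImportTable(SqlConnection conn, DataTable imported)
{
    // SELECT defines the schema; the builder derives the
    // INSERT/UPDATE/DELETE commands from it automatically.
    var adapter = new SqlDataAdapter(
        $"SELECT * FROM [{imported.TableName}]", conn);
    var builder = new SqlCommandBuilder(adapter);

    // Rows deserialized from a file may arrive Unchanged; flag them
    // as Added so adapter.Update() generates inserts for them.
    foreach (DataRow row in imported.Rows)
    {
        if (row.RowState == DataRowState.Unchanged)
            row.SetAdded();
    }
    adapter.Update(imported);
}
```

This trades some performance against SqlBulkCopy for not having to hand-build an insert statement per table, which matches the dynamic-schema constraint in the question.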