Hi

In a webapp I am building, I have a requirement in which I need to backup and restore around 10-15 tables from a SQL Server database.

My solution:

- Backup: for each table, fill a DataSet and serialize it to a file

- Restore: deserialize the file into a DataSet and load it with SqlBulkCopy

Is this the best way to achieve my goal?
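To make the question concrete, here is a minimal sketch of the serialize/bulk-load idea in a language-neutral form, using Python's built-in sqlite3 as a stand-in for SQL Server and JSON in place of DataSet serialization (the asker's actual code would be .NET):

```python
import json
import sqlite3

def backup_table(conn, table, path):
    """Dump every row of one table to a JSON file (column names + rows)."""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    rows = [list(r) for r in cur.fetchall()]
    with open(path, "w") as f:
        json.dump({"columns": cols, "rows": rows}, f)

def restore_table(conn, table, path):
    """Reload the file into the (empty) table with one bulk insert."""
    with open(path) as f:
        data = json.load(f)
    placeholders = ", ".join("?" for _ in data["columns"])
    conn.executemany(f"INSERT INTO {table} VALUES ({placeholders})",
                     data["rows"])
    conn.commit()
```

In .NET the same shape is DataAdapter.Fill + DataSet serialization on the way out and SqlBulkCopy.WriteToServer on the way back in.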

+2  A: 

Could you not simply move/copy the data back and forth between (backup) database tables? Why deal with files?

IMO a database is almost certainly the best place to store backup data. It could be in the same database (Customer >> CustomersBak) or you could have a small database you copy data across to.
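A quick sketch of that table-to-table approach, again using Python's sqlite3 in place of SQL Server (on SQL Server the snapshot step would be `SELECT * INTO CustomersBak FROM Customers`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers (Id INTEGER, Name TEXT)")
conn.executemany("INSERT INTO Customers VALUES (?, ?)",
                 [(1, "Ann"), (2, "Bob")])

# Snapshot: copy the table into a backup table inside the same database.
conn.execute("CREATE TABLE CustomersBak AS SELECT * FROM Customers")

# Restore: wipe the live table and copy the snapshot back.
conn.execute("DELETE FROM Customers")
conn.execute("INSERT INTO Customers SELECT * FROM CustomersBak")
conn.commit()
```

No files, no serialization format to maintain, and the data never leaves the database engine.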

Two issues you might have with restoring the data back:

1) as sambo mentioned, identity columns.

2) foreign key constraints with other tables that might get out of sync and prevent you from refreshing the original tables
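Issue 2 in a runnable sketch (Python's sqlite3 standing in for SQL Server; sqlite only enforces foreign keys when asked):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.execute("CREATE TABLE Customers (Id INTEGER PRIMARY KEY)")
conn.execute("CREATE TABLE Orders (Id INTEGER PRIMARY KEY, "
             "CustomerId INTEGER REFERENCES Customers(Id))")
conn.execute("INSERT INTO Customers VALUES (1)")
conn.execute("INSERT INTO Orders VALUES (10, 1)")

# Refreshing the parent table while child rows still point at it fails:
try:
    conn.execute("DELETE FROM Customers")
except sqlite3.IntegrityError:
    # Restore order matters: clear/reload children first, then parents.
    conn.execute("DELETE FROM Orders")
    conn.execute("DELETE FROM Customers")
```

On SQL Server the alternatives are restoring in dependency order or temporarily disabling the constraints.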

MikeW
Yes, I don't want to back up/restore the whole database, just selected tables.
James_Dude
+4  A: 

You can use SMO (SQL Server Management Objects) to back up/restore an entire SQL Server database. This article by John Papa covers it in detail.

You can also programmatically run BACKUP/RESTORE commands against the DB.
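As a sketch, these are the T-SQL statements you would issue (through SMO, SqlCommand, or any client library); the database name and path here are placeholders:

```python
def backup_sql(db, path):
    """T-SQL to back up a whole database to a file."""
    return f"BACKUP DATABASE [{db}] TO DISK = N'{path}' WITH INIT"

def restore_sql(db, path):
    """T-SQL to restore the database from that file."""
    return f"RESTORE DATABASE [{db}] FROM DISK = N'{path}' WITH REPLACE"
```

Executing these requires appropriate server permissions and a path the SQL Server service account can reach.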

Mitch Wheat
Yes, I don't want to back up/restore the whole database, just selected tables.
James_Dude
@rid00z: could you explain why?
Mitch Wheat
+1  A: 

You need to make sure you consider the following.

  1. What happens when the schema changes? How do you upgrade the data?
  2. If any of the tables have identity columns, you need to make sure you turn IDENTITY_INSERT on before loading the data.
  3. Watch out for triggers, they may or may not be fired during your bulk loading depending on your settings.
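Point 2 illustrated with Python's sqlite3 standing in for SQL Server (sqlite accepts explicit values for an autoincrement key directly, where SQL Server would need `SET IDENTITY_INSERT dbo.Customers ON` around the load):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Customers "
             "(Id INTEGER PRIMARY KEY AUTOINCREMENT, Name TEXT)")
conn.executemany("INSERT INTO Customers (Name) VALUES (?)",
                 [("Ann",), ("Bob",)])
backup = conn.execute("SELECT Id, Name FROM Customers").fetchall()

conn.execute("DELETE FROM Customers")
# Naive reload that omits the key column: the engine generates NEW ids,
# so foreign keys captured in other backup files now point at wrong rows.
conn.executemany("INSERT INTO Customers (Name) VALUES (?)",
                 [(n,) for _, n in backup])
naive = conn.execute("SELECT Id FROM Customers ORDER BY Name").fetchall()

conn.execute("DELETE FROM Customers")
# Correct reload: insert the saved ids explicitly.
conn.executemany("INSERT INTO Customers (Id, Name) VALUES (?, ?)", backup)
conn.commit()
```

With the identity values preserved, every backed-up foreign key still lands on the right row after restore.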
Sam Saffron