I am upsizing the back end of a client's Access app tomorrow and need to be ready with a plan for a particular problem with this client. The boss needs to be able to take the data file away from the office where the SQL Server is on a regular basis (usually monthly) and run statistical analyses on the data.

I've looked at "Easiest way to copy an entire SQL server Database from a server to local SQL Express", and the solutions there don't work in this scenario because:

  1. can't use a one-time solution (Database Publishing Wizard), as this needs to be scriptable.

  2. can't use any method that depends on the two computers being networked, as that is not an option -- data has to be transferred via USB thumb drive (so, no replication).

  3. can't use any method that depends on running SQL Server management tools from the server console or from the workstation for the backup.

  4. can't connect to the SQL Server database directly for the analysis because the data has to be portable to other locations.

What I think I need is some way to call a script that creates a backup file, then copy the result to the USB drive. I then need a second script to copy the backup file from the USB drive and restore it onto the other SQL Server.

The data being transported is read-only (or, any changes made don't need to get back to the main server), and data won't be updated in the second location. It's currently scripted with a plain old batch file to copy the back-end MDB file, and I need something that is as simple for the user.

It can't have any dependencies on, say, PowerShell (or SQL Server Management Studio), because I don't want anything to have to be installed on the computer the user is running the script from (the script needs to be runnable from a half dozen workstations, and I don't want to have to install something on all of those).

I'm going to be setting up the backup agent to create a backup every night, so I could perhaps copy that file without needing to initiate the backup before copying. Thus I might only need to script the restore on the target computer.

Thoughts, suggestions, pointers?

+5  A: 

You should definitely be able to create something like that.

One part would be a T-SQL BACKUP DATABASE script saved as a .sql file, which you execute from a standard Windows batch (*.bat) or command (*.cmd) file using the sqlcmd command-line tool.

That would be something like this:

backup.sql

BACKUP DATABASE YourDatabase
TO DISK = 'Z:\Backup\YourDatabase.bak'
WITH FORMAT;

The second part would be a .sql file with a T-SQL RESTORE script, basically reading the backup from a given location on disk and restoring it to the SQL Server instance there.

restore.sql

RESTORE DATABASE YourDatabase
   FROM DISK = 'Z:\Backup\YourDatabase.bak'
   WITH REPLACE,   -- overwrite the existing copy on the target instance
     MOVE 'YourDatabase_Data' TO 'C:\MSSQL\Data\YourDatabase.mdf',
     MOVE 'YourDatabase_Log' TO 'C:\MSSQL\Data\YourDatabase_Log.ldf';
GO

Of course, you'll need to adapt those names and paths to your own actual requirements -- but that should give you a hint of how to get started with this endeavor.

To execute one of those .sql scripts using sqlcmd, you need something like:

sqlcmd -S (name of server) -U (login) -P (password) -i (name of script file)

e.g.

sqlcmd -S (local) -U someuser -P top$secret -i backup.sql

(Note the lowercase -i for the input file; uppercase -I is a different switch entirely.)
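And a minimal batch wrapper for the source side might look like the sketch below. The server name, login, USB drive letter (E:), and the assumption that the Z:\Backup path from backup.sql is reachable from the workstation are all placeholders to adjust:

dump_to_usb.cmd

@echo off
rem Sketch only -- server name, login, and drive letters are placeholders.
rem Step 1: run the backup script on the server via sqlcmd.
sqlcmd -S YOURSERVER -U someuser -P top$secret -i backup.sql
if errorlevel 1 goto :failed
rem Step 2: copy the resulting .bak file to the thumb drive.
copy /Y "Z:\Backup\YourDatabase.bak" "E:\YourDatabase.bak"
if errorlevel 1 goto :failed
echo Backup copied to the USB drive.
goto :eof
:failed
echo Something went wrong -- check the messages above.

A mirror-image script on the target machine would copy the .bak off the USB drive and run restore.sql through sqlcmd the same way.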

marc_s
How do you run sqlcmd from a workstation that doesn't have SQL Server installed on it? Is there a way to execute those SQL scripts from the workstation, as opposed to from the server console or from a workstation with the SQL Server management tools installed? Those won't be installed on the workstations the user will be using to copy the data to the USB drive (that's exactly the outside dependency I don't want to introduce).
David-W-Fenton
OK, I see that I have to install the commandline utilities separately for the SQLCMD utility. Since I have to install the SQL Server Native Client, I guess it's not unreasonable to install those at the same time. Unfortunately, the download docs for that (http://www.microsoft.com/downloads/details.aspx?FamilyId=C6C3E9EF-BA29-4A43-8D69-A2BED18FE73C) say it requires Windows Installer 4.5, yet another heavyweight install (that requires a reboot for WinXP workstations). This is looking impossibly unwieldy on the source end (though OK on the target, which will have SQL Server Express installed).
David-W-Fenton
The R2 downloads are at http://www.microsoft.com/downloads/details.aspx?FamilyID=ceb4346f-657f-4d28-83f5-aae0c5c83d52
David-W-Fenton
Even though this answer doesn't fit all the restrictions of my specific situation (I don't want to install the commandline utilities on the workstations), I think it's probably the best answer to the question in its most general form, i.e., without my specific restrictions, so I'm accepting it as the answer.
David-W-Fenton
I could swear I accepted the answer. I have no idea what happened there!
David-W-Fenton
A: 

Why not continue copying the entire database file? This sounds like a small enough operation to get away with a momentary pause: you could shrink the database, detach it, and copy the files directly. On the target system, you could attach the SQL Express database by file name in the connection string.
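A rough sketch of the detach side, with made-up names and paths (note the database is offline while the files are copied):

USE master;
-- Kick out other connections, then detach so the files can be copied.
ALTER DATABASE YourDatabase SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
EXEC sp_detach_db @dbname = 'YourDatabase';
-- (copy YourDatabase.mdf and YourDatabase_Log.ldf away, then re-attach)
CREATE DATABASE YourDatabase
    ON (FILENAME = 'C:\MSSQL\Data\YourDatabase.mdf'),
       (FILENAME = 'C:\MSSQL\Data\YourDatabase_Log.ldf')
    FOR ATTACH;

On the target, a connection string along the lines of "Data Source=.\SQLEXPRESS; AttachDbFilename=C:\Data\YourDatabase.mdf; Integrated Security=True; User Instance=True" (the path is hypothetical) attaches the copied file on first connect.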

Steve
Scriptable? Runnable by a non-administrator? Runnable from a workstation rather than from the server console?
David-W-Fenton
I was considering your last statement about using a nightly backup copy, which suggests it is copied to a share, with only a restore script needed. If the backup plan includes a task to copy the db (e.g., via a T-SQL task in the maintenance plan), and the connection string targets the db file name, then there wouldn't need to be a restore script.
Steve
Well, I haven't created the backup plan yet, since I haven't done the upsizing -- that's tomorrow! But, yes, I figured I'd share the destination folder for the backup, and copying that would be easy -- it wouldn't require any of the things you've recommended about stopping the db server and so forth. Stopping the server and copying the actual files is a complete non-starter in any case, so the cited URL doesn't really help.
David-W-Fenton
+1  A: 

Just a thought, but if he currently goes away with a copy of an MDB file containing all the data, why not carry on like that? You could do it all in Access with a query for each table.

If you set up your “reporting” MDB with each table linked as “tblFoo_LINKED” and a local table called “tblFoo”, you could then run a bit of VBA code that loops through all the tables and executes something like

INSERT INTO tblFoo SELECT * FROM tblFoo_LINKED
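Something like this loop might do it -- an untested sketch that assumes the _LINKED naming convention above, with empty local tables ready to fill:

' Untested sketch: refresh each local table from its linked twin.
Public Sub RefreshLocalTables()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim strLocal As String

    Set db = CurrentDb
    For Each tdf In db.TableDefs
        If Right$(tdf.Name, 7) = "_LINKED" Then
            strLocal = Left$(tdf.Name, Len(tdf.Name) - 7)
            db.Execute "DELETE FROM [" & strLocal & "]", dbFailOnError
            db.Execute "INSERT INTO [" & strLocal & "] SELECT * FROM [" _
                & tdf.Name & "]", dbFailOnError
        End If
    Next tdf
End Sub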

The SQL Server backup/restore would still be the option I'd go for, but I'm just offering a different spin on things which might fit your needs.

Kevin Ross
Doesn't work because the remote user also has to use the existing application. I don't want to have to maintain two versions of the front end, one for use with Jet and one for use with SQL Server. Right now, the app hasn't changed enough for this to matter, but once I start moving things server-side for enhancing performance, it will no longer work. On the other hand, most of the things that will be moved server-side will be things like inserts (using SPROCs), which the user won't be doing. Hmm. Worth a think.
David-W-Fenton
I started giving this more thought, as I don't think my concerns about the main application working are that important. I got it working using DoCmd.TransferDatabase (should be more efficient than a MakeTable and get more accurate data types), but it's not importing the indexes. Hmm. Gotta do some more spelunking....
David-W-Fenton
I'm not accepting this as the answer, even though it's closest to what I ended up choosing as my solution. My reason is that I think it's better to accept the answer that more directly matches the question I asked (which was SQL Server only). But I'm upvoting it because it caused me to think through and test this approach. I'm going to use one of two approaches (I haven't tested the performance yet): A. insert data into empty template MDB with indexes already defined, or B. import with TransferDatabase and add the indexes with DAO.
David-W-Fenton
That’s fair enough; thanks for the upvote. As for your two plans, it’s close, but I think my money would be on importing and then building the indexes, though only just. It will be interesting to hear back with the final results.
Kevin Ross
My concern with building the indexes is simply that it's fussy code. If I could script it out of an existing database, I'd be more likely to use it. I do believe the import would be faster than the INSERTs. Before I fell asleep last night, it did occur to me that TransferText can import to an existing table, so maybe TransferDatabase can, too? I'll have to check on that, because if it can, then that's by far the easiest method (importing into tables in an existing empty shell database).
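Something like this is what I have in mind for option B -- a sketch only, with made-up table, DSN, and field names:

' Sketch of option B: import via TransferDatabase, then rebuild an
' index with DAO (all names here are made up).
Public Sub ImportAndIndex()
    Dim db As DAO.Database
    Dim tdf As DAO.TableDef
    Dim idx As DAO.Index

    ' Pull the table from SQL Server over ODBC.
    DoCmd.TransferDatabase acImport, "ODBC Database", _
        "ODBC;DSN=YourDSN;Trusted_Connection=Yes;", acTable, _
        "dbo.tblFoo", "tblFoo"

    ' Recreate the primary key, which the import doesn't bring along.
    Set db = CurrentDb
    Set tdf = db.TableDefs("tblFoo")
    Set idx = tdf.CreateIndex("PrimaryKey")
    idx.Fields.Append idx.CreateField("FooID")
    idx.Primary = True
    tdf.Indexes.Append idx
End Sub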
David-W-Fenton
A: 

I had this same issue transporting a db between production (server) and test/development (local, in another location), and also transporting a finished db to a hosted server.

As it turns out, I can just transport the .MDF by itself.

  1. Make sure the target server does not already have that db attached; detach or delete it first in SSMS.
  2. Move the .MDF by itself (without the log file).
  3. At the target location (the default is c:\program files...sql..\DATA), make sure all prior instances of the MDF and LDF are moved or deleted -- if it sees an .ldf there, it gets confused.
  4. In SSMS, choose to attach. Press Add, select the .mdf.
  5. At this point, in the box right below, it will show that an MDF and LDF are attached and that the LDF is missing. Click the LDF and press the REMOVE button.
  6. Now the MDF will be attached and a new/empty LDF will be added.

I do this all the time; works perfectly -- saves the trouble of transporting a large .ldf. (I often use dropbox.com instead of a thumb drive and encrypt the file with PKZIP/SecureZip first.)

I'm not sure what happens if you copy the MDF with the server running, though I don't stop it before copying. I'm also not sure how to make this scriptable -- the attach can be scripted, but I'm not sure about deleting the .LDF before proceeding.
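Something like this might make it scriptable, though -- an untested sketch with made-up names; FOR ATTACH_REBUILD_LOG has SQL Server build a fresh, empty log, which sidesteps the stray-.ldf problem:

-- Untested sketch: drop any prior copy on the target, then attach the
-- lone .mdf and let SQL Server rebuild the log (the db must have been
-- shut down cleanly for this to work).
IF DB_ID('YourDatabase') IS NOT NULL
    DROP DATABASE YourDatabase;
GO
CREATE DATABASE YourDatabase
    ON (FILENAME = 'C:\MSSQL\Data\YourDatabase.mdf')
    FOR ATTACH_REBUILD_LOG;
GO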

You could script the server to make a snapshot copy of the database, then transport that MDF knowing that it was not being updated.

Another idea: write a program to insert all the records into a SQL Server Compact Edition file and transport that? I've not tried it, but I've read about it.

pghcpa
While it's probably good for completeness to have this answer here, it doesn't fit the restrictions in my question. It's also no more efficient than the backup file: the database I was doing this for has a 64 MB MDF file, and the backup file is 63 MB! So there's no real savings in taking just the MDF file, and if you take both files, you're copying a lot more data. Also, the backup dump runs in mere seconds, while stopping and starting the server would take quite a long time. Last of all, this has to be done during the working day, and the server can't really be stopped.
David-W-Fenton
I re-read twice more and didn't see where you wanted a "smaller" file -- so anyway... the backup should contain the MDF/LDF combo, so the MDF alone should be smaller. I'm not sure you have to stop the server to get it, and it probably zips way down. But here are a few more ideas if you don't think SQL Server Compact Edition will work: 1. Use SQL Server publisher/subscriber replication to "update" the db -- only changes come over, so it should be quick. If the machines aren't networked, as you said, then have a hosted "hub" -- I use serverintellect.com -- so fast it's almost like being local.
pghcpa
Or 2. Try Sync Services: http://msdn.microsoft.com/en-us/sync/default.aspx -- which it says will sync with SQL Server Compact Edition. If the requirement is no network ever and no Internet ever (only a small portable USB-powered drive, like a Seagate), then I'm not sure how you can avoid copying over the full db, unless you copy out just the records changed since the last update. Hope it helps.
pghcpa
I didn't ask for a smaller file, but you brought up the issue of smaller files by suggesting skipping the LDF file. I'm only concerned with how long the process takes, and copying more data takes longer. But for me, the time it takes to stop/start the server is by far the bigger issue, not least because it can't be done during regular business hours.
David-W-Fenton
Using SQL Server CE would require installing it on the target workstation, or setting it up as a linked server there. I can't see the value in doing that. As of this point, I've entirely abandoned the idea of copying the SQL Server database, and am instead importing the data into an Access database and using that. It's much simpler, as the target workstation doesn't need SQL Server installed at all in that scenario.
David-W-Fenton
At the time I posted this, the server involved did not have Internet access, so using a site like serverintellect.com would not have been an option.
David-W-Fenton
The original scenario was no networked connection between source and target machines and no Internet access on the source machine (possibly not on the target machine, either, but I'm not entirely certain on that). So, copying the whole database seemed to me to be the only solution. There was never any benefit to copying just the deltas, as it's a one-way, read-only operation -- that is, just replacing the target data with the new data was all that was needed. The backup seemed to me to be by far the simplest way to capture the data and load it into another SQL Server.
David-W-Fenton
My understanding is that CE requires no installation. It works like a back-end .ACCDB. It can be replicated, so that when the boss is ready to take a copy, one is always ready to go without stopping the server. It could be delivered via dropbox.com (or an equivalent, such as Groove, now part of SharePoint) so that only the changed bits copy over -- should be very fast. If the 63 MB imported ACCDB works and causes no problem when the source data is in SQL Server, then I guess you found the best solution.
pghcpa