
Hi Friends,

I need to copy a backup file from a server to a share every day, which means I need to create a job that runs on a daily schedule. I have already created a bat file that does the copy. The problem is that I don't know how to get it to run as part of the job. Is there any other way to do that?

Please suggest an approach; I would really appreciate it.

Thanks

+1  A: 

If you just need to run the batch script, set up a Windows scheduled task for it.
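For example, a daily task could be created from the command line along these lines (just a sketch; the time, task name, and batch file path are placeholders):

schtasks.exe /Create /SC DAILY /ST 23:30 /TN CopyBackupToShare /TR "C:\scripts\copy_backup.bat" /RU SYSTEM

Whichever account the task runs under needs rights to both the backup file and the share; SYSTEM often cannot reach a network share, so a domain account may be the better choice.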

Nick
A: 

You can use two commands: at.exe or schtasks.exe

at.exe is easier to use, but is less flexible. The article here talks about the kind of thing you want to do.
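For instance, a daily run with at.exe might look like this (sketch only; the time and path are placeholders):

at 23:30 /every:M,T,W,Th,F,S,Su "C:\scripts\copy_backup.bat"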

schtasks.exe is the command-line version of the Windows scheduled task. You can find more info on it here: http://msdn.microsoft.com/en-us/library/bb736357(VS.85).aspx

at.exe is much less granular: it can run a job at most once a day. schtasks.exe is more complicated, but it is very, very flexible. You can schedule a job to recur every N minutes, and more.

For example, if you wanted to backup every 5 minutes (say for warm standby databases):

schtasks.exe /Create /ST 00:00 /SC MINUTE /MO 5 /RU SYSTEM /TN DatabaseBackup /TR "backup_db.bat" /F
Christopher
+1  A: 

Q1: When do you need to copy the backup file? Q2: How do you know that the backup has been completed at the time you want to perform the copy?

To avoid trying to copy a backup file that hasn't been completed yet, I would make the copy part of the process that creates the backup (with logic for "IF (backup successful) THEN (copy file) ELSE (send panic email)"). You could set it up on a different schedule(r), but then you've got possible timing synchronization issues -- ones that you have to resolve now and that will apply for the next N weeks, months, or years, and things (such as database sizes) tend to change in unpredictable ways over longer stretches of time.
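As a rough sketch of that logic in the batch file itself (the paths are hypothetical, and instead of sending an email it just logs and exits non-zero, so whatever scheduler runs it can flag the step as failed and send the alert):

@echo off
rem Hypothetical paths -- adjust to the real backup file and share.
set BACKUP_FILE=D:\Backups\MyDatabase.bak
set SHARE=\\fileserver\backups

rem IF (backup successful) THEN (copy file) ELSE (raise the alarm)
if not exist "%BACKUP_FILE%" (
    echo %date% %time% backup file missing >> D:\Backups\copy_errors.log
    exit /b 1
)

copy /Y "%BACKUP_FILE%" "%SHARE%"
if errorlevel 1 (
    echo %date% %time% copy to %SHARE% failed >> D:\Backups\copy_errors.log
    exit /b 1
)
exit /b 0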

If (for example) your database backup is being generated by a SQL Agent job, then what you'd do is add a step to the job, after the backup is made, that copies the file -- in this case, by calling your batch file. There are any number of ways to do this, and they're all a bit involved, so I'll just give some highlights:

- Add a second job step with job step type "Operating System (CmdExec)" and call the batch file in there (the step's command line is shown below).
- Add a second job step with job step type "Transact-SQL script (T-SQL)" and use xp_cmdshell to call your batch file.
- Add a second T-SQL step that calls a procedure; that procedure shells out via xp_cmdshell and can do other stuff as necessary.
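For the CmdExec route, the step's command line is just whatever you would type by hand, e.g. (hypothetical path):

cmd.exe /c "C:\scripts\copy_backup.bat"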

The big gotcha here will be security: does the account running SQL Agent have sufficient rights to access the files and folders, do you want xp_cmdshell enabled, what about the SQL Agent proxy account... it can get complex fast, depending on how your environment is configured and how tight you want security to be. Be wary as you go forward with this!

Philip Kelley
xp_cmdshell is a security problem, so I'd be careful with it.
onupdatecascade
A: 

I agree with Philip: use SQL Agent.

Since I didn't want to leave two separate comments on the other posts, I'll post an answer instead.

Mixing OS scheduling (at, schtasks) with SQL operations is riddled with problems. It makes administering and troubleshooting the deployment unnecessarily difficult. The OS schedulers don't follow SQL in the event of a failover (clustering or mirroring). They don't participate in the system backup and restore strategy. They have a separate administrative toolset, meaning the admins have to know to look for them. They are not found when searching for dependencies. And so on and so forth.

Since SQL Server comes with such a rich job scheduling tool as Agent, which has been around for a while and for which the know-how is just abundant, there's really no reason to turn to at.exe and its baby cousin.

The only place where the OS schedulers are even worth considering is when Agent is not available: Express deployments. For those I prefer using timers, since they are persisted in the database and so play better with backup/restore scenarios, but I concede that the OS schedulers are a valid choice too. Again, only for Express.

Remus Rusanu
For what it's worth, I have used the Windows scheduler when there was a hodge-podge of database-related activity to do across multiple servers, something SQL isn't really all that good at. But coordinating the flow of events can be very tricky.
Philip Kelley
If the logic is complex, you can consider using SSIS and having Agent run the SSIS package.
Remus Rusanu
A: 

I have tried using an Agent job to copy to a share. I use net use to make sure the current context has a mapped drive, then I xcopy the files to that share.
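Roughly along these lines (server and path names are placeholders; redirecting the output to a log at least gives something to look at after a manual run):

rem Sketch only -- hypothetical names; /Y and /I suppress prompts, since a command
rem waiting for an answer will hang a non-interactive job step.
net use Z: \\fileserver\backups /persistent:no >> C:\logs\copyjob.log 2>&1
xcopy D:\Backups\*.bak Z:\ /Y /I >> C:\logs\copyjob.log 2>&1
if errorlevel 1 echo %date% %time% xcopy failed >> C:\logs\copyjob.log
net use Z: /delete >> C:\logs\copyjob.log 2>&1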

When I launch the job manually, it says it starts but never completes. The volume of data is such that 10 minutes should be more than enough. I don't see any progress indicators in the Agent log. Any ideas on where I can look to see what is happening?

Using timestamps as an indication, I believe that NO files are being copied. Hmmmm....

Any thoughts?

greg