
I am looking for a robust way to copy files over a Windows network share that is tolerant of intermittent connectivity. The application is often used on wireless, mobile workstations in large hospitals, and I'm assuming connectivity can be lost either momentarily or for several minutes at a time. The files involved are typically about 200KB - 500KB in size. The application is written in VB6 (ugh), but we frequently end up using Windows DLL calls.

Thanks!

+5  A: 

Try using BITS (Background Intelligent Transfer Service). It's the infrastructure that Windows Update uses, is accessible via the Win32 API, and is built specifically to address this.

It's usually used for application updates, but should work well in any file moving situation.

http://www.codeproject.com/KB/IP/bitsman.aspx
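
For what it's worth, the C++ side of queuing a BITS job is fairly small. Here's a rough, untested sketch (the job name, URL, and local path are made up, and it assumes the server exposes the files over HTTP, since BITS transfers to and from HTTP/HTTPS URLs; upload jobs use BG_JOB_TYPE_UPLOAD and need the BITS server extension on IIS):

#include <windows.h>
#include <bits.h>

int main()
{
    CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);

    IBackgroundCopyManager* mgr = NULL;
    CoCreateInstance(__uuidof(BackgroundCopyManager), NULL, CLSCTX_LOCAL_SERVER,
                     __uuidof(IBackgroundCopyManager), (void**)&mgr);

    GUID jobId;
    IBackgroundCopyJob* job = NULL;
    mgr->CreateJob(L"WardFileTransfer", BG_JOB_TYPE_DOWNLOAD, &jobId, &job);

    // The remote URL and local target here are placeholders.
    job->AddFile(L"http://server/public/results.dat", L"C:\\incoming\\results.dat");
    job->Resume();   // BITS queues the job and keeps retrying across network drops

    // A real application would poll job->GetState() (or register a callback) and
    // call job->Complete() once the state reaches BG_JOB_STATE_TRANSFERRED.

    job->Release();
    mgr->Release();
    CoUninitialize();
    return 0;
}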

DannySmurf
A: 

How about simply sending a hash before or after you send the file, and comparing it with a hash computed on the receiving end? That should at least make sure you have an intact file.

If you want to go all out, you could do the same process for small parts of the file. Then, when you have all the pieces, join them on the receiving end.
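
If you end up rolling the hashing yourself, a bare-bones sketch using the Win32 CryptoAPI looks roughly like this (SHA-1 here; the path is made up and error handling is omitted). Running the same routine on both ends and comparing the hex digests tells you whether the copy arrived intact:

#include <windows.h>
#include <wincrypt.h>
#include <stdio.h>
#pragma comment(lib, "advapi32.lib")

int main()
{
    HCRYPTPROV hProv = 0;
    HCRYPTHASH hHash = 0;
    CryptAcquireContext(&hProv, NULL, NULL, PROV_RSA_FULL, CRYPT_VERIFYCONTEXT);
    CryptCreateHash(hProv, CALG_SHA1, 0, 0, &hHash);

    // Hash the file in chunks (path is illustrative).
    HANDLE hFile = CreateFileA("C:\\outgoing\\file.dat", GENERIC_READ, FILE_SHARE_READ,
                               NULL, OPEN_EXISTING, 0, NULL);
    BYTE buf[4096];
    DWORD read = 0;
    while (ReadFile(hFile, buf, sizeof(buf), &read, NULL) && read > 0)
        CryptHashData(hHash, buf, read, 0);
    CloseHandle(hFile);

    // SHA-1 digest is 20 bytes; print it as hex so both sides can compare.
    BYTE digest[20];
    DWORD len = sizeof(digest);
    CryptGetHashParam(hHash, HP_HASHVAL, digest, &len, 0);
    for (DWORD i = 0; i < len; i++)
        printf("%02x", digest[i]);

    CryptDestroyHash(hHash);
    CryptReleaseContext(hProv, 0);
    return 0;
}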

Erik van Brakel
A: 

@Erik van Brakel: That's exactly what BITS does.

DannySmurf
A: 

According to the Code Project site:

Files are transferred only using the HTTP and HTTPS protocols with or without authentication

Unfortunately, this is a deal breaker (at least within the current scope) because the server application does not have a web interface; files must be copied to a public folder, where they are scooped up and processed.

As for comparing hash values, we're already doing that, but only after the file copy has completed successfully. But do keep those suggestions coming!

glaxaco
+8  A: 

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.

Joel Spolsky
+3  A: 

I'm unclear as to what your actual problem is, so I'll throw out a few thoughts.

  • Do you want restartable copies (with such small file sizes, that doesn't seem like it'd be a big deal)? If so, look at CopyFileEx with COPY_FILE_RESTARTABLE.
  • Do you want verifiable copies? It sounds like you already have that by verifying hashes.
  • Do you want better performance? That's going to be tough, as it sounds like you can't run anything on the server. Otherwise, TransmitFile may help.
  • Do you just want a fire-and-forget operation? I suppose shelling out to robocopy, or TeraCopy or something, would work - but that seems a bit hacky to me.
  • Do you want to know when the network comes back? IsNetworkAlive has your answer.

Based on what I know so far, I think the following pseudo-code would be my approach:

sourceFile = Compress("*.*");   // Compress() is a placeholder for zipping everything into one file
destFile = "X:\\files.zip";

int copyFlags = COPY_FILE_FAIL_IF_EXISTS | COPY_FILE_RESTARTABLE;

// CopyFileEx returns 0 on failure; keep retrying until the copy succeeds.
while (CopyFileEx(sourceFile, destFile, NULL, NULL, NULL, copyFlags) == 0) {
   do {
     // optionally, increment a failure counter to break out at some point
     Sleep(1000);
   } while (!IsNetworkAlive(NETWORK_ALIVE_LAN));   // wait for the network to come back (sensapi.h)
}

Compressing the files first saves you from tracking which files you've successfully copied and which you still need to restart. It should also make the copy go faster (smaller total size, and one larger file instead of many small ones), at the expense of some CPU time on both sides. A simple batch file can decompress it on the server side.

Mark Brackett
A: 

Thanks for the RoboCopy and CopyFileEx suggestions. CopyFileEx in particular looks promising. Currently, the client application is using SHFileOperation to do the actual file copy.

glaxaco
A: 

SMS, if it's available, works.

Shawn Simon
+1  A: 

I agree with Robocopy as a solution... that's why the utility is called "Robust File Copy".

I've used Robocopy for this with excellent results. By default, it will retry every 30 seconds until the file gets across.

And by default, a million retries. That should be plenty for your intermittent connection.

It also does restartable transfers, and you can even throttle the transfer with a gap between packets (the /IPG switch), assuming you don't want to eat all the bandwidth while other programs are using the same connection.
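
For example, something along these lines (the paths are made up, and the /IPG value is just an illustration) gives restartable copies with explicit retry settings and a throttle:

robocopy \\workstation\outgoing \\server\public *.* /Z /R:1000000 /W:30 /IPG:50

/Z enables restartable mode, /R and /W set the retry count and the wait between retries (those two happen to be the defaults), and /IPG inserts a delay in milliseconds between packets.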

Robbo
A: 

Does anyone know of a robocopy equivalent for Unix? A minimal "retry x times before giving up" would do. My current "copy solution"

$ tar cf - src/ | (cd dst && tar xvf -)

fails too often on dodgy USB sticks and big(ger) files (GB+). tar then just reports "tried to write xxx bytes, wrote xx" and moves on to the next file.

A: 

Hm, it seems rsync does it, and doesn't need the server/daemon/install that I thought it did - just $ rsync src dst.
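
If a single rsync run still dies partway through, a minimal retry wrapper works (the interval is arbitrary; --partial keeps partially transferred files so the next pass doesn't start from zero):

$ until rsync -a --partial src/ dst/; do sleep 10; done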