I often have to transfer large files (>50 GB, sometimes >100 GB) between drives, both internal and external, during backups of our network's email servers. What is the best method of transferring these files? A command-line tool such as XCOPY? Ideally something robust enough to resume the transfer if it is interrupted by time limits or network issues.

+2  A: 

For a free option, I use SyncToy (from Microsoft). That way, if something fails, it doesn't abort the whole transfer.

The next best for non-repetitive tasks, IMHO, is XCopy (a restartable invocation is sketched below).
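
As a minimal sketch (the source and destination paths are placeholders, not from the question), XCopy's /Z flag puts the copy in restartable mode so an interrupted transfer can pick up where it left off:

    rem /E  copy all subdirectories, including empty ones
    rem /C  continue copying even if individual files error out
    rem /H  include hidden and system files
    rem /Y  suppress overwrite-confirmation prompts
    rem /Z  restartable mode: resume an interrupted copy
    xcopy "D:\MailBackups" "E:\MailBackups" /E /C /H /Y /Z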

routeNpingme
+1  A: 

I have used Teracopy with good success.

David Segonds
+6  A: 

Check out robocopy. From Wikipedia:

robocopy, or "Robust File Copy", is a command-line directory replication command. It was available as part of the Windows Resource Kit, and introduced as a standard feature of Windows Vista and Windows Server 2008.
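
As a minimal sketch (paths are placeholders, and the UNC destination is hypothetical), robocopy's /Z flag enables restartable copies, while /R and /W control how it retries after a network hiccup:

    rem /E     copy all subdirectories, including empty ones
    rem /Z     restartable mode: resume interrupted files mid-transfer
    rem /R:5   retry a failed copy up to 5 times
    rem /W:10  wait 10 seconds between retries
    robocopy "D:\MailBackups" "\\backup01\MailBackups" /E /Z /R:5 /W:10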

driAn
There's even a handy GUI available for it as well!
Harper Shelby
Is there? Where?
Erik Forbes
A: 

I get asked this question every now and again, and I always say the same thing: Microsoft Background Intelligent Transfer Service (BITS). This is the same technology used to deliver large service packs and updates to workstations. Some of its features:

  • Network Throttling
  • Asynchronous Transfers
  • Auto-Resume
  • Priority Levels for Downloads
  • Proven Transfer Mechanism

For those not wanting to deal with the command-line syntax (a minimal sketch of which follows), you can explore wrapper applications, such as SharpBITS.NET, that provide a graphical front end.
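
As a sketch using the bitsadmin command-line tool (the job name, server, and paths here are placeholders): /transfer creates a job and reports progress until the transfer finishes, and BITS resumes the job automatically after network interruptions or reboots:

    rem create a resumable background transfer job named MailBackup
    bitsadmin /transfer MailBackup /download /priority normal \\mailserver\backups\store.edb E:\Backups\store.edb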

Dscoduc
A: 

I use CopyHandler and find it does the job well.

Blorgbeard