I often have to transfer large files (>50 GB, sometimes >100 GB) between drives, both internal and external, during backups of our network's email servers. What is the best method for transferring these files? A command-line tool such as XCOPY? Ideally something robust enough to resume the transfer if it's interrupted by time limits or network issues.
For free, I use SyncToy (from Microsoft). That way, if one file fails, it doesn't abort the whole transfer.
The next best for non-repetitive tasks IMHO is XCopy.
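For a one-off XCopy run, something like this would keep going past individual file errors; the paths here are just placeholders for your source and destination:

```shell
rem /E  copy subdirectories, including empty ones
rem /C  continue copying even if errors occur on individual files
rem /H  include hidden and system files
rem /K  keep file attributes
rem /Y  don't prompt before overwriting
xcopy D:\MailStores E:\Backup\MailStores /E /C /H /K /Y
```

Note that XCopy can't resume a partially copied file after an interruption; it restarts each file from the beginning.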
Check out robocopy. From Wikipedia:
robocopy, or "Robust File Copy", is a command-line directory replication command. It was available as part of the Windows Resource Kit, and introduced as a standard feature of Windows Vista and Windows Server 2008.
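A typical robocopy invocation for this scenario might look like the following; the source, destination, and log paths are placeholders:

```shell
rem /E    copy subdirectories, including empty ones
rem /Z    restartable mode - resumes a partially copied file after interruption
rem /R:5  retry a failed copy up to 5 times
rem /W:10 wait 10 seconds between retries
rem /LOG: write progress to a log file you can review later
robocopy D:\MailStores \\backupserver\MailStores /E /Z /R:5 /W:10 /LOG:C:\Logs\mailbackup.log
```

The /Z (restartable) switch is the key one for very large files over an unreliable link: if the transfer is interrupted, rerunning the same command picks up the partial file where it left off rather than starting over.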
I get asked this question every now and again and I always say the same thing. Microsoft Background Intelligent Transfer Service (BITS). This is the same technology used to deliver large service packs and such to workstations. Some of the features:
- Network Throttling
- Asynchronous Transfers
- Auto-Resume
- Priority Levels for Downloads
- Proven Transfer Mechanism
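On Windows 7 / Server 2008 R2 and later, BITS is scriptable from PowerShell via the BitsTransfer module; a minimal sketch (the UNC share and local path are placeholders) could look like this:

```shell
# Load the BITS cmdlets
Import-Module BitsTransfer

# Queue a background transfer; BITS handles throttling and auto-resume.
# -Asynchronous returns immediately and the job survives reboots.
Start-BitsTransfer -Source "\\mailserver\Backups\store.edb" `
                   -Destination "E:\Backup\" `
                   -Asynchronous `
                   -Priority Normal `
                   -DisplayName "Mail store backup"

# Check on queued jobs later, and complete finished ones
Get-BitsTransfer | Where-Object {$_.JobState -eq "Transferred"} | Complete-BitsTransfer
```

Note that an asynchronous BITS job isn't committed to disk until Complete-BitsTransfer is called on it.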
For those not wanting to deal with the command-line syntax, you can explore wrapper applications, such as SharpBITS.NET, that provide a GUI.