There is a limitation on Windows Server 2003 that prevents you from copying extremely large files, in proportion to the amount of RAM you have. The limitation is in the CopyFile and CopyFileEx functions, which are used by xcopy, Explorer, Robocopy, and the .NET FileInfo class.

Here is the error that you get:

Cannot copy [filename]: Insufficient system resources exist to complete the requested service.

There is a knowledge base article on the subject, but it pertains to NT4 and 2000.

There is also a suggestion to use ESEUTIL from an Exchange installation, but I haven't had any luck getting that to work.

Does anybody know of a quick, easy way to handle this? I'm talking about files larger than 50 GB on a machine with 2 GB of RAM. I plan to fire up Visual Studio and just write something to do it for me, but it would be nice to have something already out there that is stable and well-tested.

[Edit] I provided working C# code to accompany the accepted answer.

A: 

You can try TotalCopy

ChRoss
+9  A: 

The best option is to just open the original file for reading and the destination file for writing, then loop, copying block by block. In pseudocode:

f1 = open(filename1);
f2 = open(filename2, "w");
while( !f1.eof() ) {
  n = f1.read(buffer, buffersize);   // n = bytes actually read
  err = f2.write(buffer, n);         // write only what was read
  if( err != NO_ERROR_CODE )
    break;
}
f1.close(); f2.close();

[Edit by Asker] OK, this is how it looks in C# (it's slow, but it seems to work OK, and it reports progress):

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;

namespace LoopCopy
{
    class Program
    {
        static void Main(string[] args)
        {
            if (args.Length != 2)
            {
                Console.WriteLine(
                  "Usage: LoopCopy.exe SourceFile DestFile");
                return;
            }

            string srcName = args[0];
            string destName = args[1];

            FileInfo sourceFile = new FileInfo(srcName);
            if (!sourceFile.Exists)
            {
                Console.WriteLine("Source file {0} does not exist", 
                    srcName);
                return;
            }
            long fileLen = sourceFile.Length;

            FileInfo destFile = new FileInfo(destName);
            if (destFile.Exists)
            {
                Console.WriteLine("Destination file {0} already exists", 
                    destName);
                return;
            }

            int buflen = 1024;
            byte[] buf = new byte[buflen];
            long totalBytesRead = 0;
            double pctDone = 0;
            string msg = "";
            int numReads = 0;
            Console.Write("Progress: ");
            using (FileStream sourceStream = 
              new FileStream(srcName, FileMode.Open, FileAccess.Read))
            {
                using (FileStream destStream = 
                    new FileStream(destName, FileMode.CreateNew))
                {
                    while (true)
                    {
                        numReads++;
                        int bytesRead = sourceStream.Read(buf, 0, buflen);
                        if (bytesRead == 0) break; 
                        destStream.Write(buf, 0, bytesRead);

                        totalBytesRead += bytesRead;
                        if (numReads % 10 == 0)
                        {
                            for (int i = 0; i < msg.Length; i++)
                            {
                                Console.Write("\b \b");
                            }
                            pctDone = (double)
                                ((double)totalBytesRead / (double)fileLen);
                            msg = string.Format("{0}%", 
                                     (int)(pctDone * 100));
                            Console.Write(msg);
                        }

                        if (bytesRead < buflen) break;

                    }
                }
            }

            for (int i = 0; i < msg.Length; i++)
            {
                Console.Write("\b \b");
            }
            Console.WriteLine("100%");
            Console.WriteLine("Done");
        }
    }
}
jabial
Use at least a 100 KB buffer, better a 1 MB one. This will speed up the file copy enormously!
Aaron Digulla
You've got 2 GB to play with... ;) Seriously, though, Aaron is right - bump up the read/write buffer.
GalacticCowboy
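The buffered loop the comments above describe is not C#-specific; here is a minimal, illustrative Python sketch of the same idea with a 1 MB buffer (the function name and paths are placeholders, not part of any answer above):

```python
# Minimal sketch of the block-by-block copy with a larger buffer.
# A 1 MB buffer keeps per-call overhead low, while still copying in
# bounded chunks rather than letting the OS buffer the whole file.

BUF_SIZE = 1024 * 1024  # 1 MB, per the comments above

def loop_copy(src_path, dst_path):
    """Copy src to dst one buffer at a time; return bytes copied."""
    copied = 0
    with open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        while True:
            chunk = src.read(BUF_SIZE)
            if not chunk:          # empty read means end of file
                break
            dst.write(chunk)
            copied += len(chunk)
    return copied
```

The same loop structure works with any buffer size; only the `BUF_SIZE` constant changes.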
+5  A: 

If you want to write code, one way to optimize is to send the file in chunks (e.g. using MTOM). I used this approach for sending huge files from a data center down to our office for printing.

Also, check the TeraCopy utility mentioned here.
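The chunked-transfer idea can be sketched generically (Python, illustrative only; the chunk size is an arbitrary example and the transport step is deliberately left abstract, since the answer's actual mechanism was MTOM):

```python
# Illustrative sketch: read a large file in fixed-size chunks so the
# whole file never has to sit in memory at once. A transport layer
# (MTOM or anything else) would consume each chunk as it is produced.

CHUNK_SIZE = 64 * 1024 * 1024  # 64 MB, an arbitrary example size

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield (offset, bytes) pairs covering the whole file in order."""
    offset = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:      # end of file
                break
            yield offset, chunk
            offset += len(chunk)
```

The receiving side can reassemble the file by writing each chunk at its offset, or simply by appending, since the chunks arrive in order.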

Gulzar
A: 

MuCommander (http://www.mucommander.com/) can copy files of arbitrary size, and it can also delete directories with names longer than 256 characters...

Aaron Digulla
+1  A: 

Use Cygwin's cp or dd commands:

cp /path/to/file /target
dd if=/path/to/file of=/path/to/new/file
Paul Betts
A: 

Aaron and GalacticCowboy,

It appears that the very low buffer size of the original code is what makes this work. I had pretty much the same code but was using a 1 MB chunk, and my code was failing. When I reduced my buffer to 1024 bytes it worked; if I increase the buffer in the code posted above, it fails. The only problem is the poor performance, with a file copy taking 2 to 4 times longer.

I must say that this is a very serious issue: an operating system that cannot do something as basic as copy a file. It appears Microsoft is simply ignoring the problem and thinks registry hacks are a valid fix. This has been a problem since Windows NT and will probably be a problem in future operating systems. It appears there is more than one cause for this error, which is a concern because it means MS has made several mistakes. Time for MS to get their act together, methinks.

Edit: Actually, the above code does NOT work. It appears to be a timing issue, and the only reason it works sometimes is that the 1 KB buffer makes the copy slow across the network. When the bottleneck is eliminated, it fails in the same place everything else I have tried fails (after copying 18 GB of a 70 GB file).

Michael C
What is the error?
Brannon
See the first post: Cannot copy [filename]: Insufficient system resources exist to complete the requested service.
Michael C
A: 

Just to answer other people's posts: TeraCopy does work, and I assume the other utilities mentioned do too, but when running a scheduled task you get limited feedback when something goes wrong. For example, on a hard-drive-full error TeraCopy simply waits forever and never returns. I had to put a timeout in my script that just says "TeraCopy failed to complete after 4 hours"; I could find no way to get the error message. TeraCopy also failed on the very first weekend run for no apparent reason.

For Gulzar: sending the file in chunks does not work. I tried 1 GB chunks, and Windows manages to remember that you've read the previous 17 chunks (in my case it fails on the 18th or 19th chunk). My script would open the source and destination files, copy a 1 GB chunk, and then close both files. Then it would reopen both, seek to the 2nd GB, and copy again, and so on. I tried various chunk sizes with no luck; it always failed around 18 GB.
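For reference, the reopen-and-seek procedure described here can be sketched like this (Python, illustrative only; names and chunk size are placeholders, and as noted above this approach still failed around 18 GB on Windows Server 2003 in practice):

```python
# Sketch of the per-chunk reopen approach: copy one chunk, close both
# files, reopen them, skip past what was already copied, and continue.
# Shown only to document the procedure, not as a recommended fix.

CHUNK = 1024 * 1024 * 1024  # 1 GB per pass, as in the description

def copy_in_passes(src_path, dst_path, chunk=CHUNK):
    """Copy src to dst one chunk per open/close cycle; return bytes copied."""
    done = 0
    while True:
        with open(src_path, "rb") as src, open(dst_path, "ab") as dst:
            src.seek(done)           # skip chunks copied in earlier passes
            data = src.read(chunk)   # one chunk per pass
            if not data:             # nothing left: finished
                return done
            dst.write(data)
            done += len(data)
```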

Michael C
Any sample code for your issues?
alhambraeidos