I have a Python script that uses shutil.copy2 extensively. Since I use it to copy files over the network, I get frequent I/O errors, which abort the program's execution:

Traceback (most recent call last):
  File "run_model.py", line 46, in <module>
    main()
  File "run_model.py", line 41, in main
    tracerconfigfile=OPT.tracerconfig)
  File "ModelRun.py", line 517, in run
    self.copy_data()
  File "ModelRun.py", line 604, in copy_ecmwf_data
    shutil.copy2(remotefilename, localfilename)
  File "/usr/lib64/python2.6/shutil.py", line 99, in copy2
    copyfile(src, dst)
  File "/usr/lib64/python2.6/shutil.py", line 54, in copyfile
    copyfileobj(fsrc, fdst)
  File "/usr/lib64/python2.6/shutil.py", line 27, in copyfileobj
    buf = fsrc.read(length)
IOError: [Errno 5] Input/output error

How can I keep the program from aborting and have it retry the copy instead?

The code I'm using already verifies that the file was copied completely by checking its size:

def check_file(file, size=0):
    if not os.path.exists(file):
        return False
    if (size != 0 and os.path.getsize(file) != size):
        return False
    return True

while (check_file(rempdg,self._ndays*130160640) is False):
    shutil.copy2(locpdg, rempdg)
+6  A: 

You can use

try:
    ...
except IOError as err:
    ...

to catch the errors and handle them.
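
For example, a minimal sketch of a helper that attempts one copy and reports the failure instead of crashing (copy_with_report is just an illustrative name):

import shutil

def copy_with_report(src, dst):
    # one attempt; returns True on success, False on an I/O error
    try:
        shutil.copy2(src, dst)
        return True
    except IOError as err:
        print "copy of %s failed: %s" % (src, err)
        return False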

Have a look at this.

ThibThib
+6  A: 

Which block is giving the error? Just wrap a try/except around it:

def check_file(file, size=0):
    try:
        if not os.path.exists(file):
            return False
        if (size != 0 and os.path.getsize(file) != size):
            return False
        return True
    except IOError:
        return False # or True, whatever your default is

while (check_file(rempdg,self._ndays*130160640) is False):
    try:
        shutil.copy2(locpdg, rempdg)
    except IOError:
        pass # ignore the IOError and keep going
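
Note that with a bare pass the loop will keep retrying forever if the error is persistent. One way to bound it, reusing the names from the question (the attempt limit and delay below are arbitrary):

import shutil
import time

max_attempts = 5                     # arbitrary cap on retries
for attempt in range(max_attempts):
    try:
        shutil.copy2(locpdg, rempdg)
    except IOError:
        time.sleep(30)               # arbitrary pause before the next attempt
        continue
    if check_file(rempdg, self._ndays * 130160640):
        break                        # copy verified, stop retrying
else:
    raise IOError("copy still failing after %d attempts" % max_attempts)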
Mark Rushakoff