I'm writing a backup script I intend to execute in a cronjob every night.

The script sets sys.stdout and sys.stderr to an output file to keep a log of what happens.

To do the backup I use the following code:

cmd = 'rsync -av --del --stats --filter "- .thumbnails/" ' + \
    '--filter "- *~" --filter "- *.iso" --filter "- lost+found/" ' + \
    '--filter "- .cache/" --filter "- tmp/" --filter "- *.mp3" ' + \
    '--filter "- *.log" ' + srcDir + ' ' + dstDir

print "Executing '"+cmd+"' ..."
try:
    sys.stdout.flush()
    sys.stderr.flush()
    retcode = subprocess.call( cmd, stdin = sys.stdin, stdout = sys.stdout,
        stderr = sys.stderr, shell = True )
    if retcode < 0:
        print >>sys.stderr, "Command was terminated by signal", -retcode
    elif retcode > 0:
        print >>sys.stderr, "Command returned code ", retcode
except OSError, e:
    print >>sys.stderr, "Execution failed:", e
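
For reference, the same call could also be written without going through a shell, by passing the command as an argument list; a minimal sketch, assuming srcDir and dstDir are defined as above:

cmd = [ 'rsync', '-av', '--del', '--stats',
    '--filter', '- .thumbnails/', '--filter', '- *~',
    '--filter', '- *.iso', '--filter', '- lost+found/',
    '--filter', '- .cache/', '--filter', '- tmp/',
    '--filter', '- *.mp3', '--filter', '- *.log',
    srcDir, dstDir ]
retcode = subprocess.call( cmd, stdin = sys.stdin, stdout = sys.stdout,
    stderr = sys.stderr )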

I added print statements before and after the subprocess call. The problem is that the output of the subprocess call appears before the output of the print statements that precede it. I added the flush() calls but they have no effect.
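
Here is a minimal sketch of the kind of reordering involved (the log path and the echo command are only placeholders for illustration):

import subprocess, sys

log = open( '/tmp/demo.log', 'a' )     # block-buffered file object
sys.stdout = sys.stderr = log

print "before the call"               # sits in the file object's buffer
subprocess.call( [ 'echo', 'child output' ], stdout = log, stderr = log )
print "after the call"

log.close()                           # the buffered prints reach the file only here

If the buffered prints are not written out before the call, the child writes straight to the file descriptor and 'child output' lands in the file ahead of 'before the call'.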

Why is this happening and how can I change this behaviour?

A: 

Have you tried putting the flush calls outside the try block?

Ewan Todd
No, because this should have no effect on the output behavior. It looks like the flush is in fact not flushing. Adding >>sys.stdout to the print "Executing" instruction doesn't change the behavior.
chmike
A: 

Why are you printing to stderr? If the subprocess is writing to stdout while you are writing to stderr, that could explain the odd interleaving.

Ned Batchelder
I assume the call blocks until the called command ends. Otherwise I wouldn't be able to get the return code.
chmike
The call does block, as you say, so you can retrieve the return code. But that doesn't explain why you are printing to stderr. If you use the same file as the subcommand, perhaps the output would be properly interleaved.
Ned Batchelder
I'm using the same file because I have the instruction 'sys.stdout = sys.stderr = logFile = open( "/tmp/backup.log", "a" )' at the start of the program. Besides, specifying stdout=sys.stdout etc. sets the log file as the stdout of the called command. It works. It is just that flush() is not really writing the buffered data to disk.
chmike
+1  A: 

I just found the solution here in another Stack Overflow answer.

Replace

sys.stderr = sys.stdout = logFile = open( tmpLogFileName, 'a' )

with

sys.stderr = sys.stdout = logFile = open( tmpLogFileName, 'a', 0 )

This tells Python not to attach any output buffer to the file.
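
A minimal sketch of the fix in context (the log path, srcDir and dstDir are only placeholders):

import subprocess, sys

srcDir = "/home/me/"                   # placeholder source directory
dstDir = "/mnt/backup/"                # placeholder destination directory

# the third argument 0 makes the file object unbuffered, so every
# print statement reaches the log file immediately
sys.stderr = sys.stdout = logFile = open( "/tmp/backup.log", "a", 0 )

print "Executing rsync ..."
retcode = subprocess.call( [ "rsync", "-av", "--stats", srcDir, dstDir ],
    stdout = sys.stdout, stderr = sys.stderr )
print "rsync returned", retcode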

chmike
But this is giving up too soon! :-) We want to help you figure out why flush() isn't working; if you just turn off the buffering, the real problem — whatever it was — won't get fixed. :-)
Brandon Craig Rhodes
:] That's the solution. I'm not giving up. I think the real problem is inside flush. Investigating and solving this is beyond my competence and my needs.
chmike
Well, that was an interesting pitfall, glad you overcame it :-)
RedGlyph