
I have a small script, which is called daily by crontab using the following command:

/homedir/MyScript &> some_log.log

The problem with this method is that some_log.log isn't written until MyScript finishes. I would like the output of the program to be flushed to the file while it's running, so that I can do things like

tail -f some_log.log

and keep track of the progress, etc.

A: 

This isn't a function of bash; all the shell does is open the file in question and pass the file descriptor in as the script's standard output. What you need to do is make sure output is flushed from your script more frequently than it currently is.

In Perl, for example, this can be accomplished by setting:

$| = 1;

See perlvar for more information on this.

Greg Hewgill
+1  A: 

bash itself will never actually write any output to your log file. Instead, the commands it invokes as part of the script will each individually write output and flush whenever they feel like it. So your question is really how to force the commands within the bash script to flush, and that depends on what they are.
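
As an illustration of one way to do that (an assumption on my part, not something from this answer): if the commands in the script are dynamically linked programs that use C stdio, GNU coreutils' stdbuf can usually force them into line-buffered output without changing the script:

stdbuf -oL /homedir/MyScript &> some_log.log

Commands that manage their own buffering (or that are statically linked) will ignore this, so it's a best-effort approach.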

Chris Dodd
Thanks, that cleared up the issue a bit.
noam
A: 

I don't know if it would work, but what about calling sync?

ws
`sync` is a low-level filesystem operation and is unrelated to buffered output at the application level.
Greg Hewgill
Since you can call sync as a command, what does it do, then?
ws
`sync` writes any dirty filesystem buffers to physical storage, if necessary. This is internal to the OS; applications running on top of the OS always see a coherent view of the filesystem whether or not the disk blocks have been written to physical storage. For the original question, the application (script) is probably buffering the output in a buffer internal to the application, and the OS won't even know (yet) that the output is actually destined to be written to stdout. So a hypothetical "sync"-type operation wouldn't be able to "reach into" the script and pull the data out.
Greg Hewgill
A: 

Well, like it or not, this is how redirection works.

In your case, the output of your script is redirected to that file, but it only shows up once the script has finished.

What you want to do is add those redirections inside your script itself, for example:
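
A rough sketch of what that could look like (an illustrative assumption, not part of the original answer), using exec so that everything the script writes from that point on goes straight into the log file:

#!/bin/bash
# Redirect all further stdout and stderr of this script into the log file.
exec > some_log.log 2>&1

echo "starting step 1"   # appears in the log as soon as it is written

Note that programs invoked by the script can still buffer their own output internally, so this mainly guarantees that the file exists and receives data as soon as each command actually writes it.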

alinrus
+1  A: 

Take a look at this answer for links to an expect script called unbuffer and see if it would work for you.
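
A minimal usage sketch, assuming the expect package (and therefore unbuffer) is installed: unbuffer runs the command under a pseudo-terminal, which makes most stdio-based programs fall back to line buffering, so the log fills in as the script runs:

unbuffer /homedir/MyScript &> some_log.log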

Dennis Williamson
A: 

I had this problem with a background process on Mac OS X that is launched via StartupItems. This is how I solved it:

If I run sudo ps aux, I can see that mytool has been launched.

I found that (due to buffering) when Mac OS X shuts down, mytool never passes its output on to the sed command. However, if I execute sudo killall mytool, then mytool does flush its output to sed. Hence, I added a stop case to the StartupItems script that is executed when Mac OS X shuts down:

case "$1" in
  start)
    if [ -x /sw/sbin/mytool ]; then
      # run the daemon
      ConsoleMessage "Starting mytool"
      (mytool | sed .... >> myfile.txt) &
    fi
    ;;
  stop)
    ConsoleMessage "Killing mytool"
    killall mytool
    ;;
esac
Freeman