I've got a script like:

#!/bin/bash
exec /usr/bin/some_binary > /tmp/my.log 2>&1

The problem is that some_binary sends all of its logging to stdout, and buffering means I only see output in chunks of a few lines. This is annoying when something gets stuck and I need to see what the last line says.

Is there any way to make stdout unbuffered before I do the exec, so that it affects some_binary and produces more useful logging?

(The wrapper script is only setting a few environment variables before the exec, so a solution in perl or python would also be feasible.)

Thanks.

+3  A: 

You might find that the unbuffer script that comes with expect helps.
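
A minimal sketch of how that could look in the wrapper (unbuffer ships with the expect package; some_binary is the placeholder from the question):

#!/bin/bash
# unbuffer runs the command on a pseudo-terminal, so its stdio
# sees a terminal and line-buffers instead of block-buffering
exec unbuffer /usr/bin/some_binary > /tmp/my.log 2>&1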

Dennis Williamson
http://expect.sourceforge.net/ is probably a more direct link.
Amber
Thanks, I'll change it.
Dennis Williamson
+1  A: 

Some command-line programs have an option to modify their stdout buffering behaviour; if the C source of the program is available, adding such an option is the way to go ...

# two commands with such options ...
man file | less -p '--no-buffer'
man grep | less -p '--line-buffered'

# ... and their respective source code

# from: http://www.opensource.apple.com/source/file/file-6.2.1/file/src/file.c
if(nobuffer)
   (void) fflush(stdout);

# from: http://www.opensource.apple.com/source/grep/grep-28/grep/src/grep.c
if (line_buffered)
   fflush (stdout);
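
If the wrapped program already exposes such a flag, the wrapper only needs to pass it along; a hypothetical example using grep in place of some_binary (the pattern and input path are made up for illustration):

#!/bin/bash
# --line-buffered makes grep flush stdout after each output line
exec grep --line-buffered 'ERROR' /var/log/source.log > /tmp/my.log 2>&1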

As an alternative to using expect's unbuffer script or modifying the program's source code, you may also try script(1), which runs the command on a pseudo-terminal and thereby avoids the block buffering that stdio applies when stdout is a pipe or a file:

See: Trick an application into thinking its stdin is interactive, not a pipe

# Linux
script -c "[executable string]" /dev/null

# FreeBSD, Mac OS X
script -q /dev/null "[executable string]"
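
Applied to the wrapper from the question, a sketch for Linux (the typescript file is discarded to /dev/null; -q suppresses script's start/done messages):

#!/bin/bash
# script(1) runs the command on a pseudo-terminal and copies its
# output to stdout, so the usual pipe/file block buffering is avoided
exec script -q -c "/usr/bin/some_binary" /dev/null > /tmp/my.log 2>&1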
trevor
Thanks for the tip!
bstpierre
+1  A: 

GNU coreutils (since release 7.5) also includes the stdbuf command to modify a program's I/O stream buffering.

http://www.pixelbeat.org/programming/stdio_buffering/
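
Applied to the original wrapper, a sketch (this assumes some_binary is dynamically linked and uses C stdio, since stdbuf works by preloading a shared library):

#!/bin/bash
# -oL line-buffers stdout; -e0 leaves stderr unbuffered
exec stdbuf -oL -e0 /usr/bin/some_binary > /tmp/my.log 2>&1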

zaga