I would like to run several commands, and capture all output to a logfile. I also want to print any errors to the screen (or optionally mail the output to someone).

Here's an example. The following command will run three commands, and will write all output (STDOUT and STDERR) into a single logfile.

{ command1 && command2 && command3 ; } > logfile.log 2>&1

Here is what I want to do with the output of these commands:

  • STDERR and STDOUT for all commands go to a logfile, in case I need it later. I usually won't look at it unless there are problems.
  • Print STDERR to the screen (or optionally, pipe to /bin/mail), so that any error stands out and doesn't get ignored.
  • It would be nice if the return codes were still usable, so that I could do some error handling. Maybe I want to send email if there was an error, like this:

    { command1 && command2 && command3 ; } > logfile.log 2>&1 || mailx -s "There was an error" [email protected]

The problem I run into is that STDERR loses its identity during I/O redirection. A '2>&1' merges STDERR into STDOUT, and if I instead use '2> error.log' the errors no longer appear in the main logfile.
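To make the trade-off concrete, here is a small demonstration with placeholder commands (the filenames are hypothetical): splitting stderr off with '2>' does separate the errors, but the combined record is gone.

```shell
#!/usr/bin/env bash
# Placeholder commands standing in for command1..command3.
# stdout goes to main.log, stderr goes to error.log --
# but now main.log no longer contains the errors.
{ echo ok; echo boom >&2; } > main.log 2> error.log
```

After this runs, main.log holds only "ok" and error.log holds only "boom"; there is no single file with everything.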

Here are a couple juicier examples. Let's pretend that I am running some familiar build commands, but I don't want the entire build to stop just because of one error so I use the '--keep-going' flag.

{ ./configure && make --keep-going && make install ; } > build.log 2>&1

Or, here's a simple (and perhaps sloppy) build and deploy script, which will keep going in the event of an error.

{ ./configure && make --keep-going && make install && rsync -av /foo devhost:/foo ; } > build-and-deploy.log 2>&1

I think what I want involves some sort of Bash I/O Redirection, but I can't figure this out.

A: 

Try:

command 2>&1 | tee output.txt

Additionally, you can direct stdout and stderr to different places:

command > stdout.txt 2> stderr.txt

command 2>&1 > stdout.txt | program_for_stderr

So some combination of the above should work for you -- e.g. you could save stdout to a file, and stderr to both a file and piping to another program (with tee).

Ether
+2  A: 
(./doit >> log) 2>&1 | tee -a log

This will take stdout and append it to the log file.

The stderr will then get converted to stdout, which is piped to tee, which appends it to the log (if you have Bash 4, you can replace 2>&1 | with |&) and sends it to stdout, which will either appear on the tty or can be piped to another command.

I used append mode for both so that regardless of which order the shell redirection and tee open the file, you won't blow away the original. That said, it may be possible that stderr/stdout is interleaved in an unexpected way.
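The same pattern, with an inline command group standing in for `./doit` (the commands are placeholders): stdout is appended to the log directly by the subshell, while stderr travels through tee, which appends it to the log and prints it.

```shell
#!/usr/bin/env bash
# Placeholder for ./doit: one line to stdout, one to stderr.
# stdout -> appended to log; stderr -> tee -> appended to log + screen.
log=build.log
( { echo "building"; echo "warning: flaky step" >&2; } >> "$log" ) 2>&1 | tee -a "$log"
```

Afterward the log contains both lines, and only the stderr line was printed by the pipeline.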

R Samuel Klatchko
+1  A: 

If your system has /dev/fd/* nodes you can do it as:

( exec 5>logfile.txt ; { command1 && command2 && command3 ;} 2>&1 >&5 | tee /dev/fd/5 )

This opens file descriptor 5 on your logfile. It then runs the commands with standard error redirected to standard out and standard out redirected to fd 5, and pipes the resulting stdout (which now carries only the stderr stream) to tee, which duplicates that output to fd 5, i.e. the log file.
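A runnable sketch of the fd-5 trick with placeholder commands (this assumes a system with /dev/fd nodes, e.g. Linux). Note the `-a` flag on tee: it makes tee append, so it does not truncate what the shell has already written through fd 5.

```shell
#!/usr/bin/env bash
# Placeholder commands: one line to stdout, one to stderr.
# fd 5 is the log; stdout -> fd 5 directly, stderr -> tee -> fd 5 + screen.
log=fd5.log
( exec 5>"$log"
  { echo "regular output"; echo "an error" >&2; } 2>&1 >&5 | tee -a /dev/fd/5 )
```

Without `-a`, tee reopens /dev/fd/5 with truncation and can clobber the stdout lines already written, which matches the symptom described in the comment below.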

Geoff Reedy
When I execute your command, my logfile only contains the error messages that I see on screen. stdout is lost.
tangens
Try `tee -a /dev/fd/5`
Dennis Williamson