views: 604
answers: 2

I am writing some testing scripts and want to catch all error output and write it to an error log as well as all regular output and write that to a separate log. I am using a command of the form

cmd > output.file 2> error.file

The command I am writing test scripts for can cause a segmentation fault. When it segfaults, bash still prints "Segmentation fault" to the terminal.

I want to suppress this message, or redirect it along with standard error.

Is that possible? It must be bash's doing, because both of the command's output streams are already redirected.
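To reproduce this without an actual crashing binary, a child shell that sends itself SIGSEGV can stand in for the segfaulting command (the `kill -s SEGV` stand-in is an illustration, not part of the original setup):

```shell
# Stand-in for a segfaulting program: a child shell that delivers
# SIGSEGV to itself (illustration only -- not the original command).
sh -c 'kill -s SEGV $$' > output.file 2> error.file || true  # exits 139

# Both files end up empty; the "Segmentation fault" line is printed
# by the *parent* shell after it reaps the dead child, so it bypasses
# the child's redirected stderr entirely.
```

After running this, output.file and error.file are both empty, yet the "Segmentation fault" line still appears on the terminal.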

A: 

I don't think the segfault message counts as your program's output from the shell's point of view, so redirection alone won't catch it. For more reliable output capture, use Expect:

http://en.wikipedia.org/wiki/Expect

Aiden Bell
+6  A: 
bash -c 'cmd >output.file 2>error.file' >bash_output.file 2>&1
chaos
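A quick way to verify this wrapper, using a child shell that sends itself SIGSEGV as a hypothetical stand-in for the crashing command: the inner bash is the parent of the process that dies, so the message lands on *its* stderr, which this time is redirected.

```shell
# Hypothetical stand-in crash: a child shell that delivers SIGSEGV
# to itself. The inner bash is the crashing process's parent, so the
# "Segmentation fault" report goes to the inner bash's stderr, which
# the outer redirection captures in bash_output.file.
bash -c 'sh -c "kill -s SEGV \$\$" > output.file 2> error.file' \
    > bash_output.file 2>&1 || true  # exit status 139 (128 + SIGSEGV)

grep -i 'segmentation' bash_output.file  # the message was captured
```

Note that error.file stays empty: the crashing process itself never wrote to stderr; only its parent shell did.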
Your shell-foo is good sir :P
Aiden Bell
Note that the behavior differs between shells: zsh doesn't output the message to a file at all, and plain 'sh' puts the segmentation fault message on the stderr of the process that segfaulted, while bash puts it on the stderr stream of the parent shell. Be very sure you know which shell you're working with, because this trick is apparently not portable.
Michael Trausch
That is exactly the way to do it, but I would favor the $() syntax over backticks ` `.
Jeremy Wall