views: 925

answers: 6

Say I have a directory with hi.txt and blah.txt, and I execute the following command on a Linux-ish command line:

ls *.* | xargs -t -i{} echo {}

The output you will see is:

echo blah.txt
blah.txt
echo hi.txt
hi.txt

I'd like to redirect the stderr output (say 'echo blah.txt' fails...), leaving only the trace from the xargs -t command written to stdout. But the trace appears to go to stderr as well, so this discards it along with the real errors:

ls *.* | xargs -t -i{} echo {} 2> /dev/null

Is there a way to control this, to make the trace go to stdout?

+1  A: 

It looks like xargs -t goes to stderr, and there's not much you can do about it.

You could do:

ls | xargs -t -i{} echo "Foo: {}" 2>&1 >/dev/null | tee stderr.txt

to display the stderr data (including the -t trace) on your terminal as your command runs, and then grep through stderr.txt afterwards to see if anything unexpected occurred, along the lines of grep -v Foo: stderr.txt.
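A concrete sketch of that capture-then-inspect idea (assuming a directory containing hi.txt and blah.txt; the Foo: prefix is just a marker and not part of any real command):

```shell
# 2>&1 first: stderr (including the xargs -t trace) follows the
# original stdout into the pipe; >/dev/null then discards the
# commands' own stdout. tee shows the trace and saves a copy.
ls | xargs -t -I{} echo "Foo: {}" 2>&1 >/dev/null | tee stderr.txt

# Afterwards, drop the expected trace lines; anything left over is
# unexpected error output (empty here, hence the || true).
grep -v 'Foo:' stderr.txt || true
```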

Also note that on Unix, ls *.* isn't how you display everything. If you want to see all the files, just run ls on its own.

C Pirate
+1  A: 

xargs -t echoes the commands to be executed to stderr before executing them. If you want them echoed to stdout instead, you can redirect stderr to stdout with the 2>&1 construct:

ls *.* | xargs -t -i{} echo {} 2>&1
Adam Rosenfield
This runs into the same issue, though: I only want the stderr messages from xargs itself, not all stderr messages.
Roy Rico
+2  A: 

Use:

ls | xargs -t -i{} echo {} 2>&1 >/dev/null

The 2>&1 sends the standard error from xargs to where standard output is currently going; the >/dev/null then sends the original standard output to /dev/null. The net result is that standard output contains the echo commands and /dev/null swallows the file names. We could debate about spaces in file names, and about whether it would be easier to use a sed script to put 'echo' at the front of each line (with no -t option), or whether you could use:

ls | xargs -i{} echo echo {}

(Tested: Solaris 10, Korn shell; should work on other shells and Unix platforms.)
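Incidentally, the order of those two redirections matters: the 2>&1 must come before the >/dev/null. A minimal sketch with a throwaway helper function (not from the answer above) that writes one line to each stream:

```shell
# Helper: one line on stdout, one on stderr.
both() { echo out; echo err 1>&2; }

# 2>&1 first: stderr is pointed at the *current* stdout, then the
# original stdout is discarded; only "err" survives on stdout.
both 2>&1 >/dev/null

# Reversed: stdout goes to /dev/null first, and stderr then follows
# it there; nothing is printed at all.
both >/dev/null 2>&1
```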


If you don't mind seeing the inner workings of the commands, I did manage to segregate the error output from xargs and the error output of the command executed.

al * zzz | xargs -t 2>/tmp/xargs.stderr -i{} ksh -c "ls -dl {} 2>&1"

The (non-standard) command al lists its arguments one per line:

for arg in "$@"; do echo "$arg"; done

The first redirection (2>/tmp/xargs.stderr) sends the error output from xargs to the file /tmp/xargs.stderr. The command executed is 'ksh -c "ls -dl {} 2>&1"', which uses the Korn shell to run ls -ld on the file name with any error output going to standard output.

The output in /tmp/xargs.stderr looks like:

ksh -c ls -dl x1 2>&1
ksh -c ls -dl x2 2>&1
ksh -c ls -dl xxx 2>&1
ksh -c ls -dl zzz 2>&1

I used 'ls -dl' in place of echo to ensure I was testing errors: the files x1, x2, and xxx existed, but zzz did not.

The output on standard output looked like:

-rw-r--r--   1 jleffler rd          1020 May  9 13:05 x1
-rw-r--r--   1 jleffler rd          1069 May  9 13:07 x2
-rw-r--r--   1 jleffler rd            87 May  9 20:42 xxx
zzz: No such file or directory

When run without the command wrapped in 'ksh -c "..."', the I/O redirection was passed as an argument to the command ('ls -ld'), and it therefore reported that it could not find the file '2>&1'. That is, xargs did not itself use the shell to do the I/O redirection.

It would be possible to arrange for various other redirections, but the basic problem is that xargs makes no provision for separating its own error output from that of the commands it executes, so it is hard to do.
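Given that limitation, one workable arrangement is to push the per-command redirection into a small inline shell, so that only the executed command's errors are dropped while the xargs -t trace on stderr survives. A sketch (ls -dl stands in for whatever command you actually run):

```shell
# The file name is passed as a positional parameter ($1), so no
# quoting gymnastics are needed inside the -c script; '_' fills $0.
# xargs's own -t trace still goes to stderr, untouched.
ls | xargs -t -I{} sh -c 'ls -dl "$1" 2>/dev/null' _ {}
```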

The other rather obvious option is to use xargs to write a shell script, and then have the shell execute it. This is the option I showed before:

ls | xargs -i{} echo echo {} >/tmp/new.script

You can then see the commands with:

cat /tmp/new.script

You can run the commands to discard the errors with:

sh /tmp/new.script 2>/dev/null

And, if you don't want to see the standard output from the commands either, append 1>&2 to the end of the command.
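The temporary file can also be skipped by piping the generated commands straight into a shell, collapsing the script-writing idea into one pipeline:

```shell
# Generate one "echo <name>" command per file and hand the stream to
# sh to execute; the shell's stderr is discarded in a single place.
ls | xargs -I{} echo echo {} | sh 2>/dev/null
```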

Jonathan Leffler
wouldn't this redirect all errors? I just want to capture the output from xargs, not all stderr :(
Roy Rico
There isn't a meaningful distinction - there's standard error from xargs and what it runs, so it will indeed redirect 'all errors'. You can write your own xargs if you want it to do something else. As shown, the 'echo' command won't send anything to stderr anyway, so I'm not clear what the issue is -- you probably have simplified the question using echo.
Jonathan Leffler
thanks for the insight and info on how to use make... i think in the end it's just easier to write a mini bash script and have xargs pipe into that. thanks for the writeup tho
Roy Rico
A: 

So I believe what you want to have on stdout is:

  • the stdout from the utility that xargs executes
  • the listing of commands generated by xargs -t

You want to ignore the stderr stream generated by the executed utility.

Please correct me if I'm wrong.

First, let's create a better testing utility:

% cat myecho
#!/bin/sh
echo STDOUT $@
echo STDERR $@ 1>&2
% chmod +x myecho
% ./myecho hello world
STDOUT hello world
STDERR hello world
% ./myecho hello world >/dev/null
STDERR hello world
% ./myecho hello world 2>/dev/null
STDOUT hello world
%

So now we have something that actually outputs to both stdout and stderr, so we can be sure we're only getting what we want.

A tangential way to do this is not to use xargs, but rather make. Echoing a command and then running it is kind of what make does. That's its bag.

% cat Makefile
all: $(shell ls *.*)

$(shell ls *.*): .FORCE
  ./myecho $@ 2>/dev/null

.FORCE:
% make
./myecho blah.txt 2>/dev/null
STDOUT blah.txt
./myecho hi.txt 2>/dev/null
STDOUT hi.txt
% make >/dev/null
%

If you're tied to using xargs, then you need to modify the utility that xargs runs so that it suppresses stderr. Then you can use the 2>&1 trick others have mentioned to move the command listing generated by xargs -t from stderr to stdout.

% cat myecho2
#!/bin/sh
./myecho $@ 2>/dev/null
% chmod +x myecho2
% ./myecho2 hello world
STDOUT hello world
% ls *.* | xargs -t -i{} ./myecho2 {} 2>&1
./myecho2 blah.txt
STDOUT blah.txt
./myecho2 hi.txt
STDOUT hi.txt
% ls *.* | xargs -t -i{} ./myecho2 {} 2>&1 | tee >/dev/null
%

So this approach works, and collapses everything you want to stdout (leaving out what you don't want).

If you find yourself doing this a lot, you can write a general utility to suppress stderr:

% cat suppress_stderr
#!/bin/sh
"$@" 2>/dev/null
% ./suppress_stderr ./myecho hello world
STDOUT hello world
% ls *.* | xargs -t -i{} ./suppress_stderr ./myecho {} 2>&1
./suppress_stderr ./myecho blah.txt
STDOUT blah.txt
./suppress_stderr ./myecho hi.txt
STDOUT hi.txt
%
rampion
i was kinda hoping for an xargs -err=stdout option to pair with -t, but if xargs doesn't have that option, piping to a script rather than the actual command works. thx
Roy Rico
A: 

I also have the same problem.

I am searching for strings like @ser.name@ in the config file and checking them against my properties file using the following command:

grep -P '@[a-z-A-Z0-9.]*@' ABCD.txt --color -o | sed 's/@//g' | xargs -t -i grep {} ABCD.properties -c

It works, but when I try to add further filtering, like "show me the properties with 0 matches", by appending a grep, it fails:

grep -P '@[a-z-A-Z0-9.]*@' ABCD.txt --color -o | sed 's/@//g' | xargs -t -i grep {} ABCD.properties -c | grep "0" -B 1

When I piped the output to a test.txt file, I only saw the counts and not the commands associated with them:

grep -P '@[a-z-A-Z0-9.]*@' ABCD.txt --color -o | sed 's/@//g' | xargs -t -i grep {} ABCD.properties -c > test.txt

A: 

As I understand your problem, using GNU Parallel http://www.gnu.org/software/parallel/ would do the right thing:

ls *.* | parallel -v echo {} 2> /dev/null
Ole Tange