views: 8548
answers: 8

I'm trying to copy a bunch of files below a directory, and a number of the files have spaces and single-quotes in their names. When I try to string together find and grep with xargs, I get the following error:

find .|grep "FooBar"|xargs -I{} cp "{}" ~/foo/bar
xargs: unterminated quote

Any suggestions for a more robust usage of xargs?

This is on MacOS 10.5.3 with BSD xargs.

+15  A: 

find . -print0 | grep -z 'FooBar' | xargs -0 ...

I don't know whether grep supports -z or whether xargs supports -0 on Leopard, but with the GNU tools it's all good.
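
Applied to the original cp example, the full pipeline might look like this (a sketch assuming GNU-style grep -z; the -I{} makes xargs place each filename before the destination directory):

find . -print0 | grep -z 'FooBar' | xargs -0 -I{} cp {} ~/foo/bar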

Chris Jester-Young
Leopard's grep is GNU grep, so it does support "-z", and of course its find(1) and xargs(1) support "-print0" and "-0".
Keltia
+3  A: 

Look into using xargs' --null command-line option together with find's -print0 option.
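
For example (a sketch assuming GNU xargs, where --null is the long form of -0; BSD xargs only spells it -0):

find . -name '*FooBar*' -print0 | xargs --null -I{} cp {} ~/foo/bar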

Shannon Nelson
+12  A: 

You might also be able to combine all of that into a single find command:

find . -iname "*foobar*" -exec cp "{}" ~/foo/bar \;

This will handle filenames and directories with spaces in them. You can use -name to get case-sensitive results.

(These command line arguments will work with GNU find; I don't know if they're available with BSD's or OS X's find.)
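
If the one-cp-per-file overhead matters, a batched variant along these lines is possible (a sketch: "-exec ... {} +" is POSIX, but -t is a GNU cp extension, and the "--" guards against names that begin with a dash):

find . -iname "*foobar*" -exec cp -t ~/foo/bar -- {} +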

godbyk
-exec will work with any find; I've never understood why people use xargs (and just wait until you hit the xargs directory length limit!)
Kendall Helmstetter Gelner
What is an 'xargs directory length limit'? Do you mean the maximum command-line size? If so, xargs is supposed to split its arguments into appropriately sized groups.
ΤΖΩΤΖΙΟΥ
People use xargs because it is typically faster to call an executable 5 times with 200 arguments each time than to call it 1000 times with a single argument each time.
ΤΖΩΤΖΙΟΥ
The answer from Chris Jester-Young ought to be the accepted answer here... BTW, this solution does not work if a filename begins with "-"; at the very least, it needs "--" after cp.
Keltia
+5  A: 

This is more efficient, as it does not run "cp" once per file:

find . -name '*FooBar*' -print0 | xargs -0 cp -t ~/foo/bar
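
Note that -t is a GNU cp extension that lets the target directory come before the source files. BSD/macOS cp has no -t; a rough equivalent, assuming BSD xargs' -J placeholder option, would be:

find . -name '*FooBar*' -print0 | xargs -0 -J {} cp {} ~/foo/bar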
Tometzky
This didn't work for me. It tried to cp ~/foo/bar into whatever find found, not the other way around.
Shervin
A: 

Be aware that most of the options discussed in other answers are not standard on platforms that do not use the GNU utilities (Solaris, AIX, HP-UX, for instance). See the POSIX specification for 'standard' xargs behaviour.

I also find the behaviour of xargs whereby it runs the command at least once, even with no input, to be a nuisance.

I wrote my own private version of xargs (xargl) to deal with the problem of spaces in names (it treats only newlines as separators), though the 'find ... -print0' and 'xargs -0' combination is pretty neat, given that file names cannot contain ASCII NUL '\0' characters. My xargl isn't complete enough to be worth publishing, especially since GNU has facilities that are at least as good.
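
As an aside on the at-least-once behaviour: GNU xargs has an -r (--no-run-if-empty) option that suppresses it (not in POSIX), so something like the following runs ls only if find actually matched something:

find . -name '*FooBar*' -print0 | xargs -0 -r ls -ld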

Jonathan Leffler
A: 

I have found that the following syntax works well for me.

find /usr/pcapps/ -mount -type f -size +1000000c | perl -lpe ' s{ }{\\ }g ' | xargs ls -l | sort +4nr | head -200

In this example, I am looking for the largest 200 files over 1,000,000 bytes in the filesystem mounted at "/usr/pcapps".

The Perl one-liner between "find" and "xargs" backslash-escapes each blank so that "xargs" passes any filename with embedded blanks to "ls" as a single argument.

Bill Starr Fri, 23 Jan 2009, 5:40 pm EST

A: 

The Perl version above won't work for embedded newlines or tabs (it only copes with spaces). For those on, e.g., Solaris where you don't have the GNU tools, a more complete version might be (using sed):

find . -type f | sed 's/./\\&/g' | xargs grep string_to_find

Adjust the find and grep arguments, or substitute other commands as you require; the sed escaping takes care of embedded spaces, tabs, and other special characters (embedded newlines will still split a filename, since sed works line by line).

A: 

I used Bill Starr's answer, slightly modified, on Solaris:

find . -mtime +2 | perl -pe 's{^}{\"};s{$}{\"}' > ~/output.file

This will put quotes around each line. I didn't use the '-l' option, although it probably would have helped.

The file list I was going through might have names containing '-', but not newlines. I haven't used the output file with any other commands yet, as I want to review what was found before I start mass-deleting files via xargs.
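
Once the list has been reviewed, the quoted lines can be fed back through xargs, whose default input parsing honours the double quotes — a sketch (it breaks if a filename itself contains a double quote, and the "--" guards the names that start with '-'):

xargs rm -f -- < ~/output.file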

Carl Yamamoto-Furst