Can `find` perform full-text search? How would you do a search with constraints on both the filename and the file content?

+6  A: 

find . -name whatever -print | xargs grep whatever

Add the `-l` option to grep to get just the filenames.
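
For example, to list just the names of the .log files under the current directory that contain the word "error" (both patterns here are only placeholders):

find . -name '*.log' -print | xargs grep -l error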

smcameron
And if you have spaces in the filename, use `-print0` to find and `-0` to xargs.
JesperE
Yep. Everyday example: `find . -print0 | xargs -0 grep -i string_i_want_to_search`
Olivier Pons
That's unfortunately very inefficient...
static_rtti
How is it very inefficient?
Roger Pate
much, much more efficient than `find ... -exec grep`
glenn jackman
use `grep -H` to force filename output for each match
glenn jackman
@glenn jackman: Are you kidding me???
static_rtti
I guess its inefficiency is in creating a big list of files as an intermediate. However, in another way it is more efficient than the "find -exec" solution, as it starts fewer instances of grep. Which would end up faster in practice is not clear to me (though I suspect grep/xargs). But in most cases the difference will be so small as not to matter.
Michael Anderson
Michael: except it doesn't create a big list of files as an intermediate, and will spawn at least as many grep processes as find -exec (the + version of -exec, which hasn't been mentioned yet, is what would spawn fewer).
Roger Pate
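
The `+` form of `-exec` mentioned above batches many file names into each grep invocation, much like xargs does; a minimal sketch (the patterns are placeholders):

find . -name '*.log' -exec grep -l error {} +
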
+2  A: 
find -name whatever -exec grep --with-filename you_search_for_it {} \;

`{}` is replaced by the file name returned by find

`\;` to terminate the find command

Luc M
Thanks, this is already a bit more efficient, even though I would have preferred a solution that did not involve starting a process each time a filename matches the constraints.
static_rtti
"The find command" is ambiguous here; you should point out that the semicolon is part of the `-exec` syntax, and is escaped for the benefit of the shell interpreting the whole command.
Roger Pate
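
In other words, the semicolon belongs to `-exec`, and quoting it works just as well as backslash-escaping it:

find -name whatever -exec grep --with-filename you_search_for_it {} ';'
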
static_rtti: processes on unix are cheap, and the whole unix methodology revolves around tying together different programs.
Roger Pate
define "cheap".
static_rtti
So cheap that you will have a *very* hard time noticing any difference from solutions that are much harder to set up and use, especially given that the question implies using find is desirable.
Roger Pate
How can you tell, given that you don't know which filesystem I am running this on?
static_rtti
+1  A: 

In some cases globbing will provide enough constraints on your filenames:

shopt -s nullglob    # Bash: prevents "No such file or directory" errors
grep string {.,[jm]*,{one,two}}/{[a-c],[hlz]}?{earth,mars,venus}[[:ascii:]]*atm*.dat

which would search files such as:

./bZmars_321atmBB111.dat
m42a/z3venus-a18atm9.dat
two/aaearth+GHIatm9876.dat
Dennis Williamson
cool answer, thanks! Can that work with recursive searches?
static_rtti
In Bash 4 (and zsh), you can use `**` to recursively match directories (in Bash you have to enable it with `shopt -s globstar`). So you could do `grep popsicle doc/**/treat*/*[bB]*`
Dennis Williamson
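
Spelled out, a recursive search constraining both the filename and the contents could look like this (Bash 4+; the patterns are placeholders):

shopt -s globstar nullglob
grep -l some_string **/*.conf
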
+3  A: 

I would strongly recommend getting hold of `ack` and using it for any findy-greppy-type-stuff that you want to do - I use it every day and can't imagine how I lived without it! In this case it sounds like `ack -G <file-regex> <text-regex>` would do what you want.

Matthew Slattery
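
Following the syntax given in this answer, an illustrative invocation that limits the search to C source files would be:

ack -G '\.c$' some_function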