views: 368

answers: 6

I'm trying to overcome a limitation of our file structure. I want to grep a whole series of files in a known location. If I do a standard grep from the command line

grep -i searchpattern known_dir/s*.sql

I get the following error:

ksh: /usr/bin/grep: 0403-027 The parameter list is too long.

so I have a little for loop that looks like:

searchfor=$1
for i in /$ENV_VAR_DIR/s*.sql
do
  grep -i $searchfor $i
done

When I execute this, I run into a couple of problems:

  1. It gives me the matching line but no file name; I need both.
  2. `-l` obviously gives me just the path/filename; I want to trim the path off, since I am just executing from my home directory.
A: 

grep -H will force the filename to be prepended to each output line, just as it would be with the original single command.

The easiest way to get just the filename, whether from -H or -l, is to have the directory be your current working directory when you run the command:

searchfor=$1
cd /$ENV_VAR_DIR
for i in s*.sql
do
  grep -Hi $searchfor $i
done
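
If changing directory is inconvenient, here is a minimal sketch of another way to trim the path (the demo directory, file, and pattern below are made-up example values, not from the question): prefix each match with just the filename, obtained via POSIX parameter expansion.

```shell
# Demo setup -- hypothetical directory and file for illustration.
mkdir -p /tmp/grepdemo1
printf 'SELECT one;\n' > /tmp/grepdemo1/s1.sql

searchfor=select
for i in /tmp/grepdemo1/s*.sql; do
    # ${i##*/} is POSIX parameter expansion: it strips everything up to
    # the last '/', leaving just the filename to prefix each match with.
    grep -i "$searchfor" "$i" | sed "s|^|${i##*/}:|"
done
```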
chaos
A: 

You want either ack or grin.

Jonathan Feinberg
A: 
ls | while read fname; do
    grep searchpattern $fname /dev/null
done

This does two things for you.

  1. You don't need to expand all the files at once; it processes them one at a time.
  2. By listing /dev/null as a second file, it won't match anything, but it will get grep to print the filename. GNU grep has an option to force filename output, but this will work on any Unix.
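
As a worked illustration of the /dev/null trick (the demo directory and file here are hypothetical): grep sees two files on its command line, so it prefixes each match with the filename, even though /dev/null itself never matches.

```shell
# Demo setup -- hypothetical directory and file for illustration.
mkdir -p /tmp/grepdemo2
printf 'SELECT two;\n' > /tmp/grepdemo2/s2.sql

# With /dev/null as a second file, grep always has >= 2 files on its
# command line, so it prints the filename before each matching line.
ls /tmp/grepdemo2 | while read fname; do
    grep -i select /tmp/grepdemo2/"$fname" /dev/null
done
# prints /tmp/grepdemo2/s2.sql:SELECT two;
```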

The other traditional thing to do with this shell design pattern is to feed the loop with find.

find . -name "*.sql" | while read fname; do
    grep searchpattern $fname /dev/null
done

You might also want to make searchpattern be `$1`.

DigitalRoss
Beware pathnames with spaces!
Jonathan Leffler
True, every `$fname` should be `"$fname"` and the `"*.sql"` should be `'*.sql'`.
DigitalRoss
+1  A: 

find ./known_dir/ -name "s*.sql"|xargs grep -i searchpattern

Vijay Sarathi
Sorry, my mistake: it is `find ./known_dir/ -name "s*.sql" | xargs grep -i searchpattern`
Vijay Sarathi
Works like a charm! I know this is asking a lot, but now is there a way to trim off the directory path so all I have is the filename?
JAdams
This works great; I think I'm good with what I have. In the end, here is what I ended up with: `for i in $1; do find $i -name $2 | xargs grep -i $3; done`
JAdams
I still think you want ack or grin. Go try them out. I don't care about the "accept". :)
Jonathan Feinberg
+2  A: 

I would recommend the following:

$ find known_dir -name 's*.sql' -print0 | xargs -0 grep -i searchpattern

With xargs, you can vary the maximum number of files passed to grep each time using the -n option.

The -print0 and -0 options are a defense against spaces in filenames.

You could even run multiple grep commands in parallel on multiple cores using the -P option.
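
A sketch combining the two options (the directory, file, and counts are example values; -P is supported by GNU and BSD xargs but is not guaranteed by POSIX), with /dev/null added so that filenames are printed portably:

```shell
# Demo setup -- hypothetical directory and file for illustration.
mkdir -p /tmp/grepdemo3
printf 'SELECT three;\n' > /tmp/grepdemo3/s3.sql

# -n 20: at most 20 files per grep invocation; -P 4: up to 4 greps at once.
# /dev/null guarantees grep always sees >= 2 files, so it prints filenames.
find /tmp/grepdemo3 -name 's*.sql' -print0 |
    xargs -0 -n 20 -P 4 grep -i select /dev/null
```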

Andrey Vlasovskikh
Add /dev/null after the search pattern and you get the file names (because there are always at least two files on the command line, /dev/null and whatever was found by find).
Jonathan Leffler
You can also use '-H' on some versions of 'grep' - but not all - to get the file names listed unconditionally.
Jonathan Leffler
+1  A: 

As a hack, you could also add /dev/null to your grep command to get the filename listed with each match; however, your method is going to be slow, since it uses shell looping and starts a grep process per file. find and xargs are your friends here, as already mentioned.

Perhaps you could use the findrepo script, which seems to do what you require.

pixelbeat