I often have to work with directories containing hundreds of thousands of files, doing text matching, replacing, and so on. If I go the standard route of, say
grep foo *
I get an "Argument list too long" error, so I end up doing
for i in *; do grep foo "$i"; done
or
find ../path/ | xargs -I{} grep foo "{}"
But these are less than optimal: they create a new grep process for each file.
This looks like a limitation on the size of the argument list a program can receive, because the * in the for loop works fine (the shell expands it internally and never passes the whole list to a single exec call). But, in any case, what's the proper way to handle this?
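For reference, here is the kind of batching I believe find and xargs can do, so that the argument list is split across a handful of invocations instead of one per file. This is only a sketch, assuming POSIX find/xargs; the directory and file names are made up for the demo:

```shell
# Demo setup in a temporary directory (hypothetical files).
dir=$(mktemp -d)
printf 'foo\n' > "$dir/a.txt"
printf 'bar\n' > "$dir/b.txt"

# -exec ... {} + appends as many filenames as fit in one argument list,
# so grep is spawned only a few times, not once per file.
find "$dir" -type f -exec grep -l foo {} +

# Equivalent with xargs: -print0 / -0 keeps filenames containing spaces or
# newlines safe, and xargs batches arguments up to the system limit.
find "$dir" -type f -print0 | xargs -0 grep -l foo

rm -rf "$dir"
```

Note that without -I{}, xargs passes many filenames per grep invocation, which is exactly what the -I{} form above prevents.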
PS: Don't tell me to use grep -r instead; I know about that. I'm thinking of tools that do not have a recursive option.