I'm trying to figure out the best way to find the number of files in a particular directory when there are a very large number of files (> 100,000).
When there are that many files, running "ls | wc -l" takes quite a long time to execute. I believe this is because it's returning the names of all the files. I'm trying to use as little disk I/O as possible.
I have experimented with some shell and Perl scripts to no avail. Any ideas?
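For reference, the shell attempts looked roughly like this sketch, using `ls -f` to skip ls's sorting (`-f` implies `-a`, so the `.` and `..` entries have to be subtracted) — but it still enumerates every name:

```shell
# count_files DIR: count entries in DIR without ls's sorting overhead.
# -f disables sorting and implies -a, so subtract "." and "..".
count_files() {
    n=$(ls -f "$1" | wc -l)
    echo $((n - 2))
}

count_files /var/tmp   # example directory; substitute your own
```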