Searching the file system without a preexisting index is I/O bound. Otherwise, products ranging from locate to Windows Desktop Search would not exist.
Type

D:\> dir /b/s > directory.lst

and observe how long that command takes to run. You should not expect to beat that without indexing the files first.
One major improvement you can make is to print less often. A minor improvement is not to use capturing parentheses when you are not going to capture anything (a character class such as [Ll] does the job):
my @dirs;

sub Lib_files {
    return unless -d $File::Find::name;
    if ( /^[Ll]ib/ ) {
        push @dirs, $File::Find::name;
    }
    return;
}
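For completeness, here is the callback above wired into File::Find end to end, printing once at the end rather than on every match (a sketch; the starting directory, taken from a command-line argument or the home directory, is an assumption):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

my @dirs;

# Collect directories whose basename starts with "lib" or "Lib".
# File::Find sets $_ to the basename of the current entry.
sub Lib_files {
    return unless -d $File::Find::name;
    push @dirs, $File::Find::name if /^[Ll]ib/;
    return;
}

# Starting point: first argument, falling back to the home directory
# (HOME on Unix-like systems, USERPROFILE on Windows -- an assumption).
my $start = shift @ARGV // $ENV{HOME} // $ENV{USERPROFILE} // '.';

find( \&Lib_files, $start );

# Print once, at the end, instead of inside the callback.
print "$_\n" for @dirs;
```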
On my system, a simple File::Find script that prints the names of all subdirectories under my home directory (about 150,000 files) takes a few minutes to run, compared to

dir %HOME% /ad/b/s > dir.lst

which completes in about 20 seconds.
I would be inclined to use:
use File::Basename;

my @dirs = grep { fileparse($_) =~ /^[Ll]ib/ }
           split /\n/, `dir %HOME% /ad/b/s`;
which completed in under 15 seconds on my system.
If there is a chance that some other dir.exe in %PATH% gets picked up, cmd.exe's built-in dir will not be the one invoked. You can use

qx! cmd.exe /c dir %HOME% /ad/b/s !

to make sure the right dir is invoked.
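Put together as a script, the forced-built-in variant looks like this (a sketch for Windows only; splitting on /\r?\n/ rather than /\n/ is my addition, to strip the carriage returns cmd.exe emits):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Basename;

# Run dir through cmd.exe explicitly so the built-in dir is used,
# bypassing any stray dir.exe that happens to be in %PATH%.
# /a:d lists directories only, /b bare paths, /s recurses.
my $listing = qx! cmd.exe /c dir %HOME% /ad/b/s !;

# Keep only entries whose basename starts with "lib" or "Lib".
# CRLF-aware split is an assumption; plain /\n/ would leave a
# trailing \r on each path.
my @dirs = grep { fileparse($_) =~ /^[Ll]ib/ }
           split /\r?\n/, $listing;

print "$_\n" for @dirs;
```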