views: 1011
answers: 2

How do I find out which directories are responsible for chewing up all my inodes?

Ultimately the root directory will be responsible for the largest number of inodes, so I'm not sure exactly what sort of answer I want.

Basically, I'm running out of available inodes and need to find an unneeded directory to cull.

Thanks, and sorry for the vague question.

+6  A: 

So basically you're looking for which directories have a lot of files? Here's a first stab at it:

find . -type d -print0 | xargs -0 -n1 count_files | sort -n

where count_files is a shell script that does (thanks, Jonathan):

echo $(ls -a "$1" | wc -l) $1
Paul Tomblin
No need for the '-l' parameter to ls; it defaults to one-line-per-file output when stdout is not a tty.
Alnitak
If the hidden files start with '.', this won't find them. I'd probably use echo $1 $(ls -a "$1" | wc -l) to generate the directory name and count on one line (and I might well reverse the order to list the count first). Note the careful use of quotes around the file name where it matters.
Jonathan Leffler
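If you'd rather not maintain a separate count_files script, the same idea can be inlined - a rough sketch along the same lines (it spawns one shell per directory, so it's slower than the xargs version; the trailing sh is just a placeholder for $0):

find . -type d -exec sh -c 'echo $(ls -a "$1" | wc -l) "$1"' sh {} \; | sort -n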
+4  A: 

Here's a simple Perl script that'll do it:

#!/usr/bin/perl -w

use strict;

# Forward declaration so the prototype applies to the recursive call.
sub count_inodes($);
sub count_inodes($)
{
  my $dir = shift;
  if (opendir(my $dh, $dir)) {
    my $count = 0;
    while (defined(my $file = readdir($dh))) {
      next if ($file eq '.' || $file eq '..');   # skip self and parent
      $count++;
      my $path = $dir . '/' . $file;
      count_inodes($path) if (-d $path);         # recurse into subdirectories
    }
    closedir($dh);
    printf "%7d\t%s\n", $count, $dir;            # per-directory entry count
  } else {
    warn "couldn't open $dir - $!\n";
  }
}

# Default to the current directory if no arguments were given.
push(@ARGV, '.') unless (@ARGV);
while (@ARGV) {
  count_inodes(shift);
}

If you want it to work like du (where each directory's count also includes the recursive counts of its subdirectories), change the recursive function to return $count, and at the recursion point say:

$count += count_inodes($path) if (-d $path);
Alnitak
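For reference, the complete du-style variant might look like this - a minimal sketch with just those two changes applied (returning $count, and accumulating at the recursion point):

sub count_inodes($);
sub count_inodes($)
{
  my $dir = shift;
  my $count = 0;
  if (opendir(my $dh, $dir)) {
    while (defined(my $file = readdir($dh))) {
      next if ($file eq '.' || $file eq '..');
      $count++;
      my $path = $dir . '/' . $file;
      # accumulate the subdirectory's total, as du does for disk blocks
      $count += count_inodes($path) if (-d $path);
    }
    closedir($dh);
    printf "%7d\t%s\n", $count, $dir;
  } else {
    warn "couldn't open $dir - $!\n";
  }
  return $count;
}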