What is the fastest way to check whether a folder's size is beyond a specific limit, say 10 MB, 1 GB, 10 GB, etc., without actually calculating the full folder size? Something like quota. A Pythonic solution would be great, but standard UNIX utilities are also welcome.

+1  A: 

A folder's size is still just the total size of the folder's contents.

You may try calling du -s foldername from Python.
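For instance, a minimal sketch using the subprocess module (du_size is a hypothetical helper name; -k forces du to report kilobytes):

import subprocess

def du_size(path):
    # du -s -k prints "<size-in-KB>\t<path>"; take the first field
    out = subprocess.check_output(["du", "-s", "-k", path])
    return int(out.split()[0]) * 1024  # kilobytes -> bytes

if du_size("/your/path") > 10 * 1024 * 1024:
    print("folder is larger than 10 MB")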

S.Mark
+2  A: 

I'd have to say it's impossible: I don't believe any common filesystem caches folder sizes, so whatever you do is going to have to walk the tree in some fashion or another. Using du is probably the fastest method, since all the work happens in C.

If you know the maximum file size expected or supported, you could perhaps optimise a little by counting the entries in each folder rather than summing their sizes, and short-circuiting in the case where there aren't enough files to reach the limit, as sketched below.
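A rough sketch of that idea in Python, assuming a known per-file ceiling (MAX_FILE_SIZE here is an illustrative assumption, not anything the filesystem enforces):

import os

MAX_FILE_SIZE = 2**20  # assumed per-file maximum (1 MiB), purely illustrative

def could_exceed(directory, limit):
    # First pass: count files only. If even at the maximum size they
    # cannot reach the limit, skip the per-file stat calls entirely.
    count = 0
    for root, dirs, files in os.walk(directory):
        count += len(files)
    if count * MAX_FILE_SIZE < limit:
        return False
    # Otherwise fall back to summing actual sizes, bailing out early.
    total = 0
    for root, dirs, files in os.walk(directory):
        for name in files:
            total += os.path.getsize(os.path.join(root, name))
            if total > limit:
                return True
    return False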

SpliFF
and bailing out as soon as you exceed the limit ...
EJP
+3  A: 

You can use du -sb, which still has to calculate the folder size, e.g.:

threshold=1024000   # bytes
path="/your/path"
s=$(du -sb "$path")   # output: "<size-in-bytes>\t<path>" (GNU du)
set -- $s             # split on whitespace; $1 becomes the size field
size=$1
if [ "$size" -gt "$threshold" ]; then
    echo "size of $path greater than $threshold"
fi
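Note that -b (apparent size in bytes) is a GNU du extension; on BSD or macOS you would use du -sk instead and compare against a threshold expressed in kilobytes.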
ghostdog74
+4  A: 
import os
from os.path import join, getsize

def getsize_limited(directory, limit):
    # Walk the tree, summing file sizes, and stop as soon as
    # the running total exceeds the limit.
    total_size = 0
    for root, dirs, files in os.walk(directory, topdown=False):
        for name in files:
            total_size += getsize(join(root, name))
            if total_size > limit:
                return limit, False
    return total_size, True

Example:

size, within_limit = getsize_limited(os.getcwd(), limit=10**6)
J.F. Sebastian