views: 418
answers: 3
I want to quickly find the total size of any folder using Python.

import os
from os.path import join, getsize

def GetFolderSize(path):
    TotalSize = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            try:
                TotalSize += getsize(join(root, name))
            except OSError:
                print("error with file: " + join(root, name))
    return TotalSize

print(float(GetFolderSize("C:\\")) / 1024 / 1024 / 1024)  # size in GB

That's the simple script I wrote to get the total size of the folder; it took around 60 seconds (±5 seconds). Using multiprocessing I got it down to 23 seconds on a quad-core machine.
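The multiprocessing version splits the walk by top-level subdirectory, one subtree per worker process. Roughly like this (a simplified sketch along those lines, not the exact code; helper names are made up):

```python
import os
from os.path import join, getsize
from multiprocessing import Pool

def walk_size(path):
    # Sum the file sizes in one subtree, skipping unreadable files.
    total = 0
    for root, dirs, files in os.walk(path):
        for name in files:
            try:
                total += getsize(join(root, name))
            except OSError:
                pass
    return total

def folder_size_parallel(path):
    # Hand each top-level subdirectory to a worker process, then
    # add the files sitting directly in the root folder.
    entries = [join(path, name) for name in os.listdir(path)]
    subdirs = [e for e in entries if os.path.isdir(e)]
    pool = Pool()
    try:
        total = sum(pool.map(walk_size, subdirs))
    finally:
        pool.close()
        pool.join()
    for e in entries:
        if not os.path.isdir(e):
            try:
                total += getsize(e)
            except OSError:
                pass
    return total

# Usage: print(folder_size_parallel("C:\\"))
```

The split is only as good as the balance between top-level subtrees, which is part of why the speedup stops short of 4x on a quad core.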

Using the Windows file explorer it takes only ~3 seconds (right-click -> Properties to see for yourself). So is there a faster way of finding the total size of a folder, closer to the speed at which Windows can do it?

Windows 7, Python 2.6. (I did search, but most of the time people used a method very similar to my own.) Thanks in advance.

+3  A: 

If you want the same speed as Explorer, why not use the Windows Scripting runtime to access the same functionality through pythoncom, e.g.:

import win32com.client as com

# Ask the Scripting.FileSystemObject COM object for the folder size;
# this is the same facility Explorer-style tools use.
folderPath = r"D:\Software\Downloads"
fso = com.Dispatch("Scripting.FileSystemObject")
folder = fso.GetFolder(folderPath)
MB = 1024 * 1024.0
print "%.2f MB" % (folder.Size / MB)

It works the same way Explorer does; you can read more about the Scripting runtime at http://msdn.microsoft.com/en-us/library/bstcxhf7(VS.85).aspx.

Anurag Uniyal
That works great, amazing actually, but only most of the time. In a directory (C:\Downloads) with a size of 37 GB and 7,000 files your method gets the result almost instantaneously, while the os.walk() way takes a couple of seconds (~3 seconds). But I have problems on other directories such as C:\Windows, C:\Users etc., where it says an exception occurred.
@freakazo, C:\Windows worked on my machine, what error do you get?
Anurag Uniyal
@Anurag, here is the traceback:

Traceback (most recent call last):
  File "Test.py", line 7, in <module>
    print "%.2f MB" % (folder.Size / MB)
  File "C:\python26_32\lib\site-packages\win32com\client\dynamic.py", line 501, in __getattr__
    ret = self._oleobj_.Invoke(retEntry.dispid, 0, invoke_type, 1)
pywintypes.com_error: (-2147352567, 'Exception occurred.', (0, None, None, None, 0, -2146828218), None)

A couple more tests showed that it is folder.Size that's causing the problem; folder.Name, for example, works on the C:\Windows directory.
+3  A: 

I ran the Python code on a 15k-directory tree containing 190k files and compared it against the du(1) command, which presumably goes about as fast as the OS allows. The Python code took 3.3 seconds versus 0.8 seconds for du. This was on Linux.

I'm not sure there is much left to squeeze out of the Python code. Note too that the first run of du took 45 seconds, which was obviously before the relevant i-nodes were in the block cache; this performance is therefore heavily dependent on how well the system manages its store. It wouldn't surprise me if either or both of the following were true:

  1. os.path.getsize is sub-optimal on Windows
  2. Windows caches directory contents size once calculated
msw
It looks like it is indeed slower on Windows: with a 23k-directory tree and 175k files it took around 60 seconds, while the Windows equivalent of du completed in 6 seconds. So Python looks roughly 10x slower than du on Windows and 4x slower on Linux. So yip, it seems that:

  1. os.path.getsize/os.walk is indeed sub-optimal on Windows
  2. Windows does seem to cache directory content sizes
  3. Windows is still just slower than Linux
+7  A: 

You are at a disadvantage.

Windows Explorer almost certainly uses FindFirstFile/FindNextFile to both traverse the directory structure and collect size information (through lpFindFileData) in one pass, making what is essentially a single system call per file.

Python is unfortunately not your friend in this case. Thus,

  1. os.walk first calls os.listdir (which internally calls FindFirstFile/FindNextFile)
    • any additional system calls made from this point onward can only make you slower than Windows Explorer
  2. os.walk then calls isdir for each file returned by os.listdir (which internally calls GetFileAttributesEx -- or, prior to Win2k, a GetFileAttributes+FindFirstFile combo) to redetermine whether to recurse or not
  3. os.walk and os.listdir will perform additional memory allocation, string and array operations etc. to fill out their return value
  4. you then call getsize for each file returned by os.walk (which again calls GetFileAttributesEx)

That is 3x more system calls per file than Windows Explorer, plus memory allocation and manipulation overhead.

You can either use Anurag's solution, or try calling FindFirstFile/FindNextFile directly and recursively (which should perform comparably to a Cygwin or other Win32 port of du -s some_directory).
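Short of calling FindFirstFile/FindNextFile yourself, you can at least drop the redundant calls: a single os.lstat per entry yields both the file type (for the recursion decision) and the size, replacing the separate isdir and getsize stats. A portable sketch (still one system call per file, but not three):

```python
import os
import stat
from os.path import join

def folder_size(path):
    # One os.lstat per entry gives both the mode bits (to decide
    # whether to recurse) and st_size, so no separate isdir/getsize
    # calls are needed.
    total = 0
    try:
        names = os.listdir(path)
    except OSError:
        return total
    for name in names:
        full = join(path, name)
        try:
            st = os.lstat(full)
        except OSError:
            continue
        if stat.S_ISDIR(st.st_mode):
            total += folder_size(full)
        elif stat.S_ISREG(st.st_mode):
            total += st.st_size
    return total
```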

Refer to os.py for the implementation of os.walk, posixmodule.c for the implementation of listdir and win32_stat (invoked by both isdir and getsize.)

Note that Python's os.walk is suboptimal on all platforms (Windows and *nix), up to and including Python 3.1. On both Windows and *nix, os.walk could achieve the traversal in a single pass without calling isdir, since both FindFirst/FindNext (Windows) and opendir/readdir (*nix) already return the file type via lpFindFileData->dwFileAttributes (Windows) and dirent::d_type (*nix).
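Python 3.5 eventually added exactly this single-pass traversal as os.scandir; a sketch of the equivalent, assuming Python 3.5+:

```python
import os

def tree_size(path):
    # entry.is_dir() reuses the type bits returned by the directory
    # scan itself (dwFileAttributes on Windows, d_type on *nix), so
    # no extra stat is needed to decide whether to recurse; on
    # Windows, entry.stat() is also served from the cached find data.
    total = 0
    with os.scandir(path) as it:
        for entry in it:
            try:
                if entry.is_dir(follow_symlinks=False):
                    total += tree_size(entry.path)
                elif entry.is_file(follow_symlinks=False):
                    total += entry.stat(follow_symlinks=False).st_size
            except OSError:
                pass
    return total
```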

vladr
+1 informative.
msw
Indeed, it is very informative!