I have a folder with 100k text files. I want to copy the files with over 20 lines into another folder. How do I do this in Python? I used os.listdir, but of course there isn't even enough memory to load all the filenames into a list. Is there a way to get maybe 100 filenames at a time?
Here's my code:
import os
import shutil

dir = '/somedir/'

def file_len(fname):
    f = open(fname, 'r')
    for i, l in enumerate(f):
        pass
    f.close()
    return i + 1

filenames = os.listdir(dir + 'labels/')
i = 0
for filename in filenames:
    flen = file_len(dir + 'labels/' + filename)
    print flen
    if flen > 15:
        i = i + 1
        shutil.copyfile(dir + 'originals/' + filename[:-5], dir + 'filteredOrigs/' + filename[:-5])
print i
And Output:
Traceback (most recent call last):
  File "filterimage.py", line 13, in <module>
    filenames = os.listdir(dir+'labels/')
OSError: [Errno 12] Cannot allocate memory: '/somedir/'
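(An aside: os.listdir builds the entire name list in memory before returning. On Python 3.5+, os.scandir yields directory entries lazily, one at a time, so it sidesteps this. A sketch combining that with an early-exit line check and the copy step; the function names and the src/dst parameters are illustrative, not from the original script, and this is Python 3 whereas the code above is Python 2:)

```python
import os
import shutil

def has_more_lines_than(fname, limit):
    """Return True as soon as the file is seen to have more than `limit` lines."""
    with open(fname) as f:
        for i, _ in enumerate(f):
            if i >= limit:  # we have now read line number limit + 1
                return True
    return False

def filter_long_files(src, dst, min_lines=20):
    """Copy files with more than min_lines lines from src to dst; return the count."""
    os.makedirs(dst, exist_ok=True)
    copied = 0
    # os.scandir yields entries one at a time instead of building a full list
    with os.scandir(src) as entries:
        for entry in entries:
            if entry.is_file() and has_more_lines_than(entry.path, min_lines):
                shutil.copyfile(entry.path, os.path.join(dst, entry.name))
                copied += 1
    return copied
```

Because has_more_lines_than stops reading at the threshold, a huge file costs no more than a 21-line file to check.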
Here's the modified script:
import os
import shutil
import glob

topdir = '/somedir'

def filelen(fname, many):
    f = open(fname, 'r')
    for i, l in enumerate(f):
        if i > many:
            f.close()
            return True
    f.close()
    return False

path = os.path.join(topdir, 'labels', '*')
i = 0
for filename in glob.iglob(path):
    print filename
    if filelen(filename, 5):
        i += 1
print i
It works on a folder with fewer files, but on the larger folder all it prints is "0"... Works on the Linux server, prints 0 on the Mac... oh well...
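(On the literal question of getting 100 filenames at a time: on Python 3.5+ you can slice a lazy os.scandir iterator with itertools.islice. A sketch, with batched_names being a hypothetical helper name, not anything from the scripts above:)

```python
import os
from itertools import islice

def batched_names(dirpath, batch_size=100):
    """Yield lists of up to batch_size filenames without loading them all at once."""
    entries = os.scandir(dirpath)  # lazy iterator over directory entries
    while True:
        batch = [e.name for e in islice(entries, batch_size)]
        if not batch:
            break
        yield batch
```

Each yielded list holds at most batch_size names, so memory use stays bounded no matter how many files the directory contains.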