I went to upload a new file to my web server, only to get a message back saying my disk quota was full... I wasn't using up my allotted space, but rather my allotted FILE QUANTITY. My host caps my total number of files at about 260,000.

Checking through my folders I believe I found the culprit...

I have a small DVD database application (Video DB by Split Brain) that I installed and hid away on my web site for my own personal use. It apparently caches data from IMDB, and over the years it has secretly amassed what is probably close to a MIRROR of IMDB at this point. I don't know the exact size for certain, but I did have a 2nd (inactive) copy of the program on the host, created a few years back for testing while I was modifying portions of it. The cache folder in that inactive copy alone had 40,000 files totaling 2.3GB. I was able to delete this folder over FTP, but it took over an hour. Thankfully it also gave me some much-needed breathing room.

...But now, as you can imagine, the cache folder for the active copy of this web app likely holds closer to 150,000 files totaling about 7GB of data.

This is where my problem comes in... I use FlashFXP as my FTP client, and whenever I try to delete the cache folder, or even just view its contents, it sits trying to load a file list for a good 5 minutes and then loses its connection to the server...

My host has a web-based file browser, and it crashes when trying to do this... as do free online FTP clients like net2ftp.com. I don't have SSH access on this server, so I can't log in directly to delete the files either.

Does anyone have any idea how I can delete these files? Is there a different FTP program I could download that would have better success... or perhaps a small script I could run that would take care of it?

Any help would be greatly appreciated. Thanks.

+3  A: 

It sounds like it might be time for a command-line FTP utility; one ships with just about every operating system. With that many files, I would write a script for my command-line FTP client that goes to the folder in question and performs a directory listing, redirecting the output to a file. Then use magic (or Perl, or whatever) to process that file into a new FTP script that runs a delete command against each of those files. Yes, it will take a long time to run.
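
For example, the same two-phase idea (capture the listing once, then replay deletes against it) can be sketched with Python's ftplib instead of scripting the stock ftp client. This is a rough sketch, not a drop-in solution: the host, credentials, and cache path below are placeholders for your own details.

    #!/usr/bin/env python
    # Two-phase cleanup: capture the file list once, then delete entry by entry.
    from ftplib import FTP

    HOST = "ftp.example.com"          # placeholder; use your host
    USER = "username"                 # placeholder
    PASS = "password"                 # placeholder
    CACHE_DIR = "/www/videodb/cache"  # hypothetical path to the cache folder

    ftp = FTP(HOST)
    ftp.login(USER, PASS)
    ftp.cwd(CACHE_DIR)

    # Phase 1: grab the listing and keep a local copy, like the
    # redirected directory listing described above.
    names = ftp.nlst()                # may take a while with 150,000 entries
    with open("cache_files.txt", "w") as f:
        f.write("\n".join(names))

    # Phase 2: run a delete against every captured name.
    for name in names:
        if name not in (".", ".."):
            ftp.delete(name)

    ftp.quit()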

If the server supports wildcards, do that instead and just delete *.*.
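
With the stock command-line ftp client that ships with Windows and most Unixes, the wildcard route would look roughly like this (hedged: whether the glob is honored depends on the server, and mdelete still issues one delete per file behind the scenes; "prompt" just turns off the per-file confirmation):

    ftp> open ftp.example.com
    ftp> cd /www/videodb/cache
    ftp> prompt
    ftp> mdelete *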

If that all seems like too much work, open a support ticket with your hosting provider and ask them to clean it up on the server directly.

Having said all that, this isn't really a programming question and should probably be closed.

Justin Scott
+1  A: 

We had a question a while back where I ran an experiment showing that Firefox can browse a directory with 10,000 files over FTP with no problem. Presumably 150,000 will also be OK. Firefox won't help you delete, but it might be helpful for capturing the names of the files you need to delete.

But first I would just try the command-line client ncftp. It is well engineered and I have had good luck with it in the past. You can delete a large number of files at once using shell patterns, and it is available for Windows, Mac OS, Linux, and many other platforms.
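
A session for this cleanup would look something like the following (written from memory of ncftp's shell, so treat the exact commands as approximate; the point is that ncftp's rm accepts shell-style wildcards):

    $ ncftp -u username ftp.example.com
    ncftp> cd /www/videodb/cache
    ncftp> rm *
    ncftp> quit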

If that doesn't work, you sound like a long-term customer; could you beg your ISP for the privilege of a shell account for a week, so you can log in remotely with PuTTY or ssh and blow away the entire directory with a single rm -r command?

Norman Ramsey
+2  A: 

Does anyone have any idea how I can delete these files?

Submit a support request asking them to delete it for you?

Zoredache
+1 for common sense :)
Kev
A: 

Thanks for all the advice... both attempting to get the file list through Firefox and using ncftp failed, simply hanging and then losing the connection.

I called my ISP; they said "yup, OK, we can delete that for you," then hung up. It's now several hours later and nothing has happened...

Is there some way I can do this with a small script that can discover the file names and delete the files without overloading the server?
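
One way to keep the server from choking is to fetch the name list once and then delete in small, paced batches, reconnecting whenever the server drops the session. Below is a rough sketch using Python's ftplib; the host, credentials, cache path, batch size, and pause are all placeholders to tune against your server. Fair warning: if even a single NLST of the folder times out, raising the timeout or asking the host to split the directory may be the only options.

    #!/usr/bin/env python
    # Throttled FTP cleanup: delete in small bursts with pauses between
    # them, and reconnect if the server drops the connection.
    import time
    from ftplib import FTP, all_errors

    HOST, USER, PASS = "ftp.example.com", "username", "password"  # placeholders
    CACHE_DIR = "/www/videodb/cache"  # hypothetical cache path
    BATCH = 200   # files per burst; tune to what the server tolerates
    PAUSE = 2     # seconds to rest between bursts

    def connect():
        ftp = FTP(HOST, timeout=60)
        ftp.login(USER, PASS)
        ftp.cwd(CACHE_DIR)
        return ftp

    ftp = connect()
    names = [n for n in ftp.nlst() if n not in (".", "..")]
    print("found %d files" % len(names))

    deleted = 0
    for name in names:
        try:
            ftp.delete(name)
        except all_errors:
            ftp = connect()   # dropped session: reconnect and retry once
            ftp.delete(name)
        deleted += 1
        if deleted % BATCH == 0:
            time.sleep(PAUSE)

    ftp.quit()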