views: 1578
answers: 3

I run an openSUSE server that uploads zipped source code backups to a Microsoft FTP server every night. I have written a Bash script that does this through a cron job.

I want to delete backed up files that are older than a certain date. How could I do this?

Thanks.

+1  A: 

The following deletes all files under the directory tree rooted at dir whose last modification time was before November 1, 2008:

find dir -type f \! -newermt 2008-11-01 -exec rm '{}' \+

The date/time format should be ISO 8601; I don't know if other formats are accepted.

Adam Rosenfield
The server I'm using doesn't seem to support the find command - from the greeting, it looks like it's NcFTPd. Is this just a command the admin hasn't enabled, or is there something else I can use?
rwmnau
+1  A: 

You can delete files on the FTP server using the delete or mdelete FTP commands. I don't know of a way to select old files as a server-side operation, so one option would be to do an FTP ls to get a list of the files on the server, parse the output to pick out the files that are older than your specified date, and then delete each one using an FTP command.
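For illustration, here is a minimal Perl sketch of that approach using the Net::FTP module (the same module the script linked in another answer relies on). Instead of parsing timestamps out of the ls output, which varies between servers, it asks the server for each file's modification time via Net::FTP's mdtm method; not every FTP server supports the underlying MDTM command, and the host, credentials and cutoff below are placeholders:

#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;

my $host    = 'ftp.example.com';   # placeholder connection details
my $user    = 'backup';
my $pass    = 'secret';
my $max_age = 30 * 24 * 60 * 60;   # 30 days, in seconds

my $ftp = Net::FTP->new($host) or die "Cannot connect to $host: $@";
$ftp->login($user, $pass) or die "Login failed: ", $ftp->message;
$ftp->cwd('/backups')     or die "cwd failed: ",   $ftp->message;

# ls() returns bare file names; mdtm() asks the server for each
# file's modification time (needs MDTM support on the server).
for my $file ($ftp->ls) {
    my $mtime = $ftp->mdtm($file);
    next unless defined $mtime;    # skip directories and the like
    if (time() - $mtime > $max_age) {
        $ftp->delete($file) or warn "Could not delete $file: ", $ftp->message;
    }
}

$ftp->quit;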

If you have a local copy of all the files, then it is probably easier to generate the list of old files locally using find and then delete them one at a time from the server.
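A sketch of that variant, under the same assumptions (placeholder paths and credentials, and a local directory that mirrors the server): it selects local copies older than 30 days with Perl's -M file-age test, which plays the role of find here, and deletes the same file names on the server.

#!/usr/bin/perl
use strict;
use warnings;
use Net::FTP;
use File::Basename;

my $local_dir = '/var/backups';    # local mirror of the FTP directory
my $days      = 30;

my $ftp = Net::FTP->new('ftp.example.com') or die "Cannot connect: $@";
$ftp->login('backup', 'secret') or die "Login failed: ", $ftp->message;
$ftp->cwd('/backups')           or die "cwd failed: ",   $ftp->message;

# -M gives a file's age in days, relative to script start time.
for my $path (glob "$local_dir/*.zip") {
    next unless -M $path > $days;
    my $name = basename($path);
    $ftp->delete($name) or warn "Could not delete $name: ", $ftp->message;
    unlink $path;                  # prune the local copy as well
}

$ftp->quit;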

If you have some control over the FTP server, then using rsync instead of FTP would probably be easier.
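For example, assuming the server also accepts SSH logins (rsync needs a shell or an rsync daemon on the remote side; the paths here are placeholders), you could prune the local mirror with find and then mirror the deletions to the server:

find /var/backups -type f -mtime +30 -delete
rsync -av --delete /var/backups/ user@server:/backups/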

David Dibben
A: 

Unfortunately, deleting old files from an FTP server is not as simple as running find . -mtime +30 -delete, because you usually don't get shell access to your FTP space. Everything must be done via FTP.

Here is a simple Perl script that does the trick:

http://www.nervous.it/2009/08/delete-old-files-from-ftp-server/

It requires the Net::FTP module.