views: 39
answers: 1

We have a crawler that persistently crawls our target sites, and its log files are turning out to be quite huge — over 1 GB in some cases. I'm not comfortable with simply deleting or overwriting them. Any examples of how you've managed large log files?

+2  A: 

Use a cron script to rotate the log files on a daily basis. Basically, you rename your logfile.log to logfile-YYYY-MM-DD.log. This way, instead of one huge logfile, you have smaller ones and are able to find logged messages from a certain time period easily. If you also compress your rotated logs, you will save even more disk space.

nikc
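The rotation described above can be sketched as a small shell script run daily from cron. The directory and file names here (`/var/log/crawler`, `logfile.log`) are placeholders, not from the question:

```shell
#!/bin/sh
# Daily log-rotation sketch: rename the current log with a date suffix,
# compress it, and recreate an empty log so the crawler keeps writing.
# Paths are assumptions; adjust to your setup.
rotate_log() {
    dir="$1"     # directory containing the log
    name="$2"    # base log file name, e.g. logfile.log
    stamp=$(date +%Y-%m-%d)
    base="${name%.log}"
    if [ -f "$dir/$name" ]; then
        mv "$dir/$name" "$dir/$base-$stamp.log"
        gzip -f "$dir/$base-$stamp.log"
        : > "$dir/$name"   # truncate/recreate the active log
    fi
}

# Example cron entry (midnight, every day):
#   0 0 * * * /usr/local/bin/rotate-crawler-log.sh /var/log/crawler logfile.log
if [ $# -ge 2 ]; then
    rotate_log "$1" "$2"
fi
```

Note the crawler must reopen the file (or append via `>>`) for this naive approach to work; otherwise it may keep writing to the renamed file through its old file handle.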
Check out the logrotate docs: http://gd.tuwien.ac.at/linuxcommand.org/man_pages/logrotate8.html. It's a standard tool.
kgb
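A minimal logrotate stanza equivalent to the cron approach above might look like this (the path `/var/log/crawler/logfile.log` is a hypothetical example; drop the file in `/etc/logrotate.d/`):

```
/var/log/crawler/logfile.log {
    daily           # rotate once a day
    rotate 30       # keep 30 old logs, delete older ones
    compress        # gzip rotated logs to save disk space
    dateext         # use -YYYYMMDD suffixes instead of .1, .2, ...
    missingok       # don't error if the log is absent
    notifempty      # skip rotation when the log is empty
    copytruncate    # copy then truncate in place, so the crawler
                    # doesn't need to reopen its log file
}
```

`copytruncate` avoids having to signal the crawler to reopen its log, at the cost of possibly losing a few lines written during the copy.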