There is a set of log files that follow the pattern xxxxYYY, where xxxx is some fixed text and YYY is a sequence number that increases by one and wraps around. Only the last n files are available at any given time.
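For concreteness, the names look something like the following (the "applog" prefix and the three-digit width are made-up stand-ins for my real pattern):

    # directory contents around a wrap -- only the last n files exist:
    #   applog998  applog999  applog000  applog001  applog002
    f=applog042
    seq=${f#applog}       # strip the fixed prefix -> "042"
    echo $((10#$seq))     # force base 10 so a leading zero is not read as octal -> 42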
I would like to write a foolproof script that makes sure all the log files get backed up to another server (via ssh/scp).
Can somebody please suggest the logic or a code snippet (Perl or shell) for it? The constraints are:
=> The script can run every few minutes, so that bursts of traffic do not cause log files to be missed.
=> Rollover needs to be detected so that files are not overwritten on the destination server/directory (a rough sketch of the handling I have in mind follows this list).
=> I do not have superuser access on either the source or the destination box. The destination box does not have rsync installed, and it would take too long to get it installed.
=> Only one log file is updated at a time.
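To show the kind of logic I have in mind (untested sketch; LOGDIR, PREFIX, DEST and the state-file path are placeholders for my real values): each run ships only files it has not seen before, and tags the destination copy with the file's mtime so a wrapped-around sequence number never clobbers an earlier backup.

    #!/bin/sh
    # Untested sketch -- paths and names below are placeholders.
    LOGDIR=/var/tmp/logs              # where the rotating files live
    PREFIX=applog                     # the fixed "xxxx" part of the name
    DEST=user@backuphost:/backup/logs # scp target
    STATE=$HOME/.log_backup_state     # records what has already been shipped

    touch "$STATE"

    # Walk the files oldest-first by mtime; the sequence number wraps,
    # so modification time is the only safe ordering.
    ls -tr "$LOGDIR/$PREFIX"[0-9][0-9][0-9] 2>/dev/null | while read -r f; do
        name=`basename "$f"`
        mtime=`perl -e 'print((stat $ARGV[0])[9])' "$f"`   # portable mtime (no GNU stat assumed)

        # Skip only if this exact name+mtime pair was shipped before; after a
        # wrap the same name reappears with a newer mtime, so it is copied again.
        grep -q "^$name $mtime\$" "$STATE" && continue

        # Suffix the remote name with the mtime so a recycled sequence number
        # never overwrites an earlier copy on the backup box.
        if scp -q "$f" "$DEST/$name.$mtime"; then
            echo "$name $mtime" >> "$STATE"
        fi
    done

The piece I am least sure about is the file that is still being written: its mtime keeps changing, so this sketch would re-copy it under a new suffix on every run until it goes quiet. Suggestions for a cleaner way to handle that (or a better overall approach) are welcome.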