views:

381

answers:

4

A Perl script (which I do not control) appends lines to the end of a text file periodically.

I need my PHP script (which will run as a cron job) to read the lines from that file, process them, and then remove them from the file. But, it seems like the only way to remove a line from a file with PHP is to read the file into a variable, remove the one line, truncate the file, and then rewrite the file.

But what happens if:

  1. PHP reads the file
  2. The Perl script appends a new line.
  3. The PHP script writes the modified buffer back over the file.

In that case the new line would be lost because it would be overwritten when the PHP script finishes and updates the file.

Is there a way to lock a file using PHP in a way that Perl will respect? It looks like the flock() function is PHP-specific.
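
For concreteness, a minimal sketch of that read/process/truncate/rewrite cycle guarded by an advisory flock() lock (orders.log and the processing step are placeholders, and the lock only helps if the Perl side also locks the file before appending):

    <?php
    // Read every line, process it, then empty the file, all under one
    // exclusive advisory lock. "orders.log" is a placeholder path.
    $path = 'orders.log';

    $fh = fopen($path, 'r+');
    if ($fh === false) {
        exit(1);
    }

    if (flock($fh, LOCK_EX)) {
        $lines = array();
        while (($line = fgets($fh)) !== false) {
            $lines[] = $line;
        }

        foreach ($lines as $line) {
            // process_order($line);   // hypothetical processing step
        }

        ftruncate($fh, 0);             // empty the file...
        rewind($fh);                   // ...and reset the file pointer
        flock($fh, LOCK_UN);
    }
    fclose($fh);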

+1  A: 

If the Perl script, which you cannot control, already implements file locking via flock, you are fine. If it doesn't (and I'm afraid that we have to assume that), you are out of luck.

innaM
+3  A: 

Do you have any freedom to change the design? Is removing the processed lines from the file an essential part of your processing?

If you have that freedom, how about letting the Perl-produced file grow? Presumably the authors of the Perl script have some kind of housekeeping in mind already. Maintain your own "log" of what you have processed. Then, when your script starts up, it reads the Perl-produced file up to the point recorded in your "log". Process a record, update the log.
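
A minimal sketch of that idea, remembering a byte offset in a small state file (both file names are placeholders):

    <?php
    // Remember how far into the Perl-produced file we have already read,
    // and only process what is new. Paths are placeholders.
    $logFile    = 'cart.log';       // file the Perl script appends to
    $offsetFile = 'cart.offset';    // our own record of how far we got

    $offset = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

    $fh = fopen($logFile, 'r');
    if ($fh === false) {
        exit(1);
    }

    fseek($fh, $offset);
    while (($line = fgets($fh)) !== false) {
        // process_order($line);                              // hypothetical processing step
        file_put_contents($offsetFile, (string) ftell($fh));  // record progress after each line
    }
    fclose($fh);

If the file is ever emptied by hand, the stored offset would have to be reset as well.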

djna
The file in question is basically a log file for the Perl script (which is a VERY old shopping cart). The cart writes entries to the log file, but never uses them. The manual suggests that you (the end user) periodically clean the file out. Unfortunately, this log file is the only programmatic access point into an otherwise closed system, and the only way to extend the cart's functionality is to parse the incoming log file and process the orders accordingly. We could do as you suggested and leave the entries in the file. We would just have to be sure to manually empty the file periodically.
Nick
A: 

Maybe, instead of working on the same file, you could let your PHP script work on a copy? I imagine it could work with three files:

  1. File written to by perl script
  2. A copy of file 1
  3. A processed version of file 2

Then, when your PHP script starts, it checks whether file 1 is newer than file 2, and if so makes a new copy, processes it (possibly skipping the lines already processed previously), and writes the result to file 3.
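
A rough sketch of that flow, with placeholder paths and the "already processed" count tracked simply by counting the lines in file 3:

    <?php
    // File 1 is written by the Perl script, file 2 is our snapshot of it,
    // and file 3 collects the processed lines. All paths are placeholders.
    $perlFile = 'cart.log';
    $copyFile = 'cart.copy';
    $doneFile = 'cart.processed';

    if (!is_file($copyFile) || filemtime($perlFile) > filemtime($copyFile)) {
        copy($perlFile, $copyFile);

        $alreadyDone = is_file($doneFile) ? count(file($doneFile)) : 0;
        $lines       = file($copyFile);

        foreach (array_slice($lines, $alreadyDone) as $line) {
            // process_order($line);                          // hypothetical processing step
            file_put_contents($doneFile, $line, FILE_APPEND); // remember it as processed
        }
    }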

hlovdal
What about moving the file, instead of copying it? I believe that the Perl script will create the file if it can't find it (because it's been moved). My next question is, what happens if PHP issues the move command at the same moment Perl tries to write? Is there any chance of an entry getting lost?
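
Something along these lines is what I have in mind (paths are placeholders; whether an entry can slip through depends on how the Perl script opens the file):

    <?php
    // Rename the live log out of the way, let the Perl script recreate it,
    // and process the renamed copy at leisure.
    $liveFile = 'cart.log';
    $workFile = 'cart.log.' . time();

    if (is_file($liveFile) && rename($liveFile, $workFile)) {
        foreach (file($workFile) as $line) {
            // process_order($line);   // hypothetical processing step
        }
        unlink($workFile);
    }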
Nick
+1  A: 

Another possibility would be, instead of having the Perl script write to a file, to let it write to a named pipe and have your PHP script read directly from the other end and write the lines out to a real file.
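
On the PHP side, the reader could be as simple as this sketch (the FIFO path is a placeholder; opening a FIFO for reading blocks until the Perl side opens it for writing):

    <?php
    // Read lines from the named pipe and append them to a regular file.
    // The FIFO and output paths are placeholders.
    $fifo    = '/tmp/cart.fifo';
    $outFile = 'cart.log';

    $fh = fopen($fifo, 'r');
    if ($fh === false) {
        exit(1);
    }

    while (($line = fgets($fh)) !== false) {
        file_put_contents($outFile, $line, FILE_APPEND | LOCK_EX);
        // process_order($line);   // or process the line directly here
    }
    fclose($fh);

When the writer closes its end, fgets() returns false and the loop ends, so in practice the script would reopen the pipe and keep going.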

hlovdal
So far, I like this option the best, but the Perl script runs on a machine which I don't have shell access to. I don't suppose one can create a FIFO locally and then FTP it up?
Nick
@Nick, just have a script create the FIFO for you.
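For example, a short sketch using PHP's POSIX extension (the path is a placeholder):

    <?php
    // Create the named pipe if it does not exist yet.
    $fifo = '/tmp/cart.fifo';

    if (!file_exists($fifo)) {
        posix_mkfifo($fifo, 0660);
    }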
daotoad