views: 131

answers: 4

I want to read everything from a text file and echo it. But more lines might be written to the file while I'm reading, so I don't want the script to exit when it reaches the end of the file; instead I want it to wait forever for more lines. Is this possible in PHP?

+1  A: 

This is just a guess, but try passing the output of `tail -f` through with passthru().

But you will need to find a way to flush() your buffer.
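A rough sketch of that idea, reusing the same placeholder path as the popen example below (untested):

<?php
// Sketch of the passthru() approach. PHP's output buffering has to be
// turned off for lines to reach the client as tail emits them.
while (ob_get_level()) {
    ob_end_flush();              // drop any active output buffers
}
ob_implicit_flush(true);         // flush automatically after every output call
passthru("tail -f /where/ever/your/file/is 2>&1");
?>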


IMHO a much nicer solution would be to build an AJAX site.

Read the contents of the file into an array, store the number of lines in the session, and print the contents of the file.

Then start an AJAX request every x seconds to a script which checks the file; if the line count is greater than the stored session count, append the new lines to the page.
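The polling script could look something like this (file name and session key are just examples):

<?php
// Hypothetical poll.php for the AJAX idea above: return only the lines
// added since the last request. Assumes the page that printed the file
// initially stored the line count in $_SESSION['line_count'].
session_start();

$lines = file('text.txt', FILE_IGNORE_NEW_LINES);
$known = isset($_SESSION['line_count']) ? $_SESSION['line_count'] : 0;

if (count($lines) > $known) {
    echo implode("\n", array_slice($lines, $known)), "\n";
    $_SESSION['line_count'] = count($lines);
}
?>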


You could use popen() instead:

$f = popen("tail -f /where/ever/your/file/is 2>&1", 'r');
while (!feof($f)) {
    $buffer = fgets($f);   // fgets() already includes the trailing newline
    echo $buffer;
    flush();
    sleep(1);
}
pclose($f);

The sleep is important; without it the loop will spin at 100% CPU.

Rufinus
Actually I made a solution with tail that works, but it uses an infinite loop that consumes all the CPU; that's why I want some sort of "blocking" call. Calling every X seconds is not enough, I need realtime, and it's not a website so no AJAX.
Martin
`tail -f` is "realtime" and needs neither a loop nor PHP at all.
VolkerK
see my edited answer
Rufinus
That popen solution is almost exactly like mine. The problem is the sleep; it makes the delay too big.
Martin
Then try a usleep() and go as low as you need to, but I would guess the lower you go, the heavier your CPU load gets.
Rufinus
I tried without sleep at all now and it's still slow; maybe it's executing the tail command from PHP that is too slow.
Martin
show some code
Rufinus
It's your code exactly, and then I echo a new line to the file from the command line. I think stream_set_blocking is the solution but I haven't really figured out how to use it yet: http://us2.php.net/manual/en/function.stream-set-blocking.php
Martin
Don't think so: stream_set_blocking does not unblock streams opened with popen (tested in PHP 5.1.6).
Rufinus
Yes, that comment is a little bit worrying. But what I want _is_ blocking. I think the problem is that it reaches EOF and therefore doesn't block.
Martin
A: 

In fact, when you "echo" it, it goes to the output buffer. So what you want is to "append" the new content if it's added while the browser is still receiving output. And this is not possible (though there are some approaches that come close).

Adrián
You do know there is flush()?
Rufinus
Yes, I do :)
Adrián
A: 

I solved it.

The trick was to use fopen() and, when EOF is reached, seek the file pointer back to the last read position and continue reading from there.

<?php
$handle = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);       // send whatever is available right now
        flush();
        $lastpos = ftell($handle);       // remember where the last read ended
    } else {
        // seeking back to the same position clears the EOF flag, so the
        // next fread() picks up anything appended in the meantime
        fseek($handle, $lastpos);
    }
}
?>

Still consumes quite a lot of CPU though; I don't know how to solve that.

Martin
You're busy-looping, of course it consumes all CPU. Try adding a `usleep(50000)` after the `fseek` line. That'll sleep for 50ms, which will *greatly* lower your CPU usage with no notable impact on your latency. To do even better, you'd need to use inotify (assuming you're on a system that supports it).
derobert
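For clarity, this is roughly what the accepted loop looks like with that suggestion folded in (untested sketch, same text.txt as above):

<?php
$handle  = fopen('text.txt', 'r');
$lastpos = 0;
while (true) {
    if (!feof($handle)) {
        echo fread($handle, 8192);
        flush();
        $lastpos = ftell($handle);
    } else {
        fseek($handle, $lastpos);   // clear EOF and try again
        usleep(50000);              // wait 50 ms instead of busy-looping
    }
}
?>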
Yes, I know it's because of the loop; usleep works but I'm not really a fan of sleep solutions. inotify, though, seems to be exactly what I've been looking for; I will look into that tomorrow.
Martin
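For reference, the inotify route could look roughly like this; it needs the PECL inotify extension (Linux only), so treat it as an untested sketch:

<?php
// Block on inotify instead of polling; requires the PECL "inotify" extension.
$file    = 'text.txt';
$handle  = fopen($file, 'r');
$inotify = inotify_init();
inotify_add_watch($inotify, $file, IN_MODIFY);

while (!feof($handle)) {             // print whatever is already in the file
    echo fread($handle, 8192);
}
flush();

while (true) {
    inotify_read($inotify);          // blocks until the watched file is modified
    fseek($handle, ftell($handle));  // re-seek in place to clear the EOF flag
    while (!feof($handle)) {
        echo fread($handle, 8192);
    }
    flush();
}
?>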
A: 

You may also use filemtime(): get the latest modification timestamp, send the output, and at the end compare the stored filemtime() value with the current one.

Anyway, if you want the script to run at the same time as the browser (or client), you should send the output in chunks (fread, flush), then check for changes at the end. If there are any changes, re-open the file and read from the last position (you can get the position with ftell() just outside the while(!feof()) loop).
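A rough sketch of that approach, again with a hypothetical text.txt:

<?php
$file    = 'text.txt';
$lastpos = 0;
$lastmod = 0;                        // force the first read

while (true) {
    clearstatcache();                // filemtime() results are cached
    $mod = filemtime($file);
    if ($mod !== $lastmod) {
        $handle = fopen($file, 'r'); // re-open and continue from the last position
        fseek($handle, $lastpos);
        while (!feof($handle)) {
            echo fread($handle, 8192);
        }
        flush();
        $lastpos = ftell($handle);
        fclose($handle);
        $lastmod = $mod;
    }
    usleep(200000);                  // check a few times a second
}
?>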

Adrián