tags:

views:

276

answers:

2

Hello all,

I make an AJAX POST request to a PHP script on my server. That script in turn makes a GET request to a PHP script on an external server to check the contents of a text file. However, it takes a very long time to come back with a result. Why is this the case?

AJAX Post Request to Script on my server

session_start();
$fileName = $_POST['textFile'];
$result = file_get_contents($_SESSION['serverURL']."fileReader.php?textFile=$fileName");
echo $result;

GET Request to a script on different server

$fileName = $_GET['textFile'];

if (file_exists('text/'.$fileName.'.txt')) {

    $lines = file('text/'.$fileName.'.txt');

    echo $lines[sizeof($lines)-1];

}
else{
    echo 0;
}

These are very simple scripts, and they're only checking a very small text file, so why does it take so long?

I am making other AJAX requests on my site, but these surely can't be causing a problem. Having said that, the returned value of that text file always coincides with the completion of another AJAX request, one that initiates a script that takes a while to complete on my server. But how would these affect each other?! They shouldn't, right?

+1  A: 

Network latency and CPU overload are likely the biggest factors in the delay here.

Every time you open a new connection, you incur the overhead of establishing the connection, as well as a reduced transfer rate (as opposed to reading a file directly).

Also, regarding your "other" AJAX request that takes a while to complete: it can in fact affect everything else on the system. If it takes a long time to finish, it is probably safe to say it is eating up a lot of CPU or disk bandwidth, and any intensive process like that can affect everything else on the system. Especially if you are working with the same files or database tables, you may be thrashing your filesystem or database at that time.

It's really hard to say what the problem is without seeing it. You should time each piece of your system and monitor CPU and memory load during requests, and try to figure out where the bottleneck is.
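A minimal sketch of how you might time each piece, wrapping any suspect call (such as the remote fetch) in a small helper; the helper name and usage URL here are illustrative, not from the question:

```php
<?php
// Sketch: time an arbitrary callable so you can locate the bottleneck.
// Pass the suspect operation in as a closure; the elapsed time is logged.
function timeCall(callable $fn) {
    $start = microtime(true);
    $result = $fn();
    $elapsed = microtime(true) - $start;
    error_log(sprintf('Call took %.3f seconds', $elapsed));
    return $result;
}

// Hypothetical usage: wrap the remote fetch from the question.
// $result = timeCall(function () {
//     return file_get_contents('http://example.com/fileReader.php?textFile=test');
// });
```

Logging the elapsed time for the local work and the remote fetch separately should tell you whether the delay is network latency or your own script.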

Kekoa
I don't think it's to do with any system resource, as these are brand-new servers with nothing running on them. In addition, if I run the final script on the other server through the browser, it returns instantly (< 1 sec). This all seems to suggest it's my PHP logic that isn't working!
Abs
If you suspect it is file_get_contents, you can try using cURL. It is not as easy, though.
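A hedged sketch of the same fetch done with cURL instead of file_get_contents; the function name and timeout values are illustrative choices, not from the comment:

```php
<?php
// Fetch a remote URL with cURL, with explicit timeouts so a slow
// remote host fails fast instead of hanging the whole request.
function fetchRemote($url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);    // give up connecting after 5 seconds
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);          // give up on the whole request after 10 seconds
    $body = curl_exec($ch);
    curl_close($ch);
    return $body; // false on failure
}
```

The explicit timeouts are the main practical win over a bare file_get_contents call here: if the remote server is the bottleneck, the request fails quickly rather than blocking.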
Kekoa
+1  A: 

You don’t have to go over HTTP to get the last line of a file. Why don’t you stay inside the file system and use a function like this to retrieve the last line:

function fileGetLine($filename, $line) {
    if (!is_file($filename) || !is_readable($filename)) {
        // invalid file name or not readable
        return false;
    }
    if (!is_int($line)) {
        // invalid line number
        return false;
    }
    $lines = file($filename);
    $lineCount = count($lines);
    if ($line < 0) {
        // negative line number, count from end
        $line = $lineCount + $line;
    }
    if ($line < 0 || $line >= $lineCount) {
        // line number out of bounds
        return false;
    }
    return $lines[$line];
}

session_start();
echo fileGetLine('text/'.basename($_POST['textFile']).'.txt', -1);
Gumbo
Is this function testing a text file on an external server or just the server the script resides on? How do I initiate it from another server?
Abs
Oh just give it the URL! :)
Abs
Oh, I must have skipped that you’re requesting it from a remote server. Well, then you should consider whether it’s necessary to have that file on a remote server instead of the same server. Or try more efficient file handling, such as `fopen`, and read the file line by line instead of all at once like `file` does. Or use a separate file in which you store just the last line.
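A sketch of the `fopen` approach mentioned above: streaming line by line and keeping only the most recent one, so the whole file never has to sit in memory the way it does with `file`. The function name is illustrative:

```php
<?php
// Read only the last line of a file, streaming line by line.
function lastLine($filename) {
    $fh = @fopen($filename, 'r');
    if ($fh === false) {
        // file missing or unreadable
        return false;
    }
    $last = false;
    while (($line = fgets($fh)) !== false) {
        $last = $line; // keep only the most recent line read
    }
    fclose($fh);
    return $last;
}
```

For large files this stays at constant memory, whereas `file` loads every line into an array just to throw all but one away.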
Gumbo
Ok, I made use of file_get_contents and that did the trick.
Abs