This question is related to one I asked previously; see here.

As a way to implement segmented ajax responses, I wrote code that works as follows:

The client first calls the script which initializes the process. On the server side, startScript.cgi starts generating data, and as it does so it groups the responses into chunks and writes them into individual files indexed sequentially (chunk1.txt, chunk2.txt, etc.). Immediately after startScript.cgi starts this process, the client side begins a second ajax request, sent to gatherOutput.cgi, with parameter ?index=0.

gatherOutput.cgi sees the request, looks in 'chunk'.$index.'.txt', and returns the data. The client outputs this to html, and then issues the next ajax request to gatherOutput.cgi with parameter ?index=1, and so on. This continues until all of the data from startScript.cgi is reported.
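The client-side loop described above might look roughly like this (a sketch; the names `gatherChunks` and `fetchChunk` are hypothetical, and in the browser `fetchChunk` would wrap an XMLHttpRequest to gatherOutput.cgi?index=N):

```javascript
// Sketch of the sequential chunk loop described above.
// fetchChunk(index, cb) abstracts the transport: in a browser it would
// issue an XMLHttpRequest to gatherOutput.cgi?index=N and call cb with
// the response text, or null once the server reports no more chunks.
function gatherChunks(fetchChunk, onChunk, onDone, index) {
  index = index || 0;
  fetchChunk(index, function (chunk) {
    if (chunk === null) {      // no more chunks: startScript.cgi finished
      onDone();
      return;
    }
    onChunk(chunk);            // append this chunk to the page
    gatherChunks(fetchChunk, onChunk, onDone, index + 1);
  });
}
```

With a fake transport that serves three chunks and then null, the callbacks fire in order: chunk 0, 1, 2, then done.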

If gatherOutput.cgi cannot locate "chunk$index.txt", it goes into this loop:

until (-e "$directory/chunk$index.txt")
{
    # nothing: busy-waits until the file appears
}
open my $fh, '<', "$directory/chunk$index.txt"
    or warn "File not found: $!";
# Read file and print, etc...

Note, startScript.cgi runs code which may take a long time to complete, so the point is to stream older output from startScript.cgi while it is still generating new output.

The problem with this is that performance suffers: output takes a while to appear even though it was created long ago. I'm assuming this is due to hard drive access being very slow compared to the CPU operations in startScript.cgi, so gatherOutput.cgi is frequently waiting on the new chunk to be written, or the client is frequently waiting for gatherOutput.cgi to read the files, etc. Though there could be other issues.

Does anyone have any ideas or suggestions to fix this problem? Or if anyone has a different approach to this problem, that'd be great as well.

By the way, startScript.cgi may only be called once: it starts a large system task (via system escapes such as exec, system, or backticks) that keeps running, and can't feasibly be segmented.

+1  A: 

Your gatherOutput.cgi shouldn't drop into a loop when the file doesn't exist. Instead, return a status to the AJAX request indicating that the file doesn't exist yet, and have the client wait (using setInterval or setTimeout) and try again after a short delay.
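Concretely, the retry might look something like this (a sketch with hypothetical names; it assumes gatherOutput.cgi answers with a "not ready" status instead of blocking when the chunk file doesn't exist yet):

```javascript
// Non-blocking version of the chunk loop: when the server says the chunk
// isn't written yet, wait delayMs and ask again, instead of having the
// CGI spin on the filesystem.
// fetchStatus(index, cb) calls cb with one of:
//   { ready: true, data: "..." }  chunk file exists, here is its content
//   { ready: false }              chunk not written yet, retry later
//   { done: true }                the whole task has finished
function pollChunk(fetchStatus, onChunk, onDone, index, delayMs) {
  fetchStatus(index, function (resp) {
    if (resp.done) {
      onDone();
    } else if (resp.ready) {
      onChunk(resp.data);        // show it and move to the next chunk
      pollChunk(fetchStatus, onChunk, onDone, index + 1, delayMs);
    } else {
      setTimeout(function () {   // back off instead of hammering the server
        pollChunk(fetchStatus, onChunk, onDone, index, delayMs);
      }, delayMs);
    }
  });
}
```

The key difference from the busy-wait loop is that between retries neither the browser nor the CGI process is doing any work, so the server is free to keep writing chunks.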

That will be MUCH easier on your server. For the user, you can still show a loading graphic or something else that lets them know the process is still happening in the background.

Cfreak
Good idea, I'll implement this and see how much it improves performance. Thanks! Any further suggestions are still appreciated.
Razor Storm