views: 66

answers: 4

In PHP I have noticed that if a certain process or function doesn't complete, the entire application that includes that function gets delayed because of it.

Say there is a search function that returns lots of results and involves more than 20 functions. One function, "X", is taking too much time, so the result page is delayed because of it. My question is: how do I put a time limit on function "X", say 2 seconds, so that if it doesn't complete within that time, function "X" is skipped?

Is there any way to do that, or a better approach?

A: 

You may have a look at tick functions and declare. You could execute the tick function, say, every 10 ticks and check the execution time. But I have no idea how the tick function could abort further execution of the function...
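One possibility would be throwing an exception from the tick handler; a rough, untested sketch (slow_function() is a hypothetical stand-in for your "X", and note that ticks only fire between statements, so a single long blocking call won't be interrupted):

declare(ticks=10);

$start = microtime(true);

register_tick_function(function() use ($start) {
    // Every 10 ticks: check whether the 2-second budget is exceeded
    if (microtime(true) - $start > 2) {
        throw new RuntimeException('Timed out');
    }
});

try {
    slow_function(); // hypothetical long-running function "X"
} catch (RuntimeException $e) {
    // "X" took too long -- skip its result and carry on
}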

nikic
A: 

The canonical way to handle something like this without resorting to a thread pool is an alarm signal. Here's what I was able to find in a quick search of the PHP docs:

http://php.net/manual/en/function.pcntl-signal.php

http://php.net/manual/en/function.pcntl-alarm.php
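A rough sketch of how those could fit together (this needs the pcntl extension, so it is POSIX/CLI only and won't work under the usual Apache module; slow_function() is again a hypothetical stand-in for "X"):

declare(ticks=1); // so pending signals actually get dispatched

pcntl_signal(SIGALRM, function() {
    throw new RuntimeException('Timed out');
});

pcntl_alarm(2); // deliver SIGALRM after 2 seconds

try {
    slow_function();
    pcntl_alarm(0); // finished in time: cancel the pending alarm
} catch (RuntimeException $e) {
    // "X" was skipped
}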

Nicholas Knight
Hi, thanks for the reply. I just want to make sure we are on the same page; I feel you are getting into some complicated process, so I will elaborate. Say this function/process retrieves info from Alexa.com; how do I restrict this function to run for only, say, 3 seconds? $data = file_get_contents('http://www.alexa.com/siteinfo/google.com'); preg_match_all('/<a href="\/topsites\/countries\/(.*)">(.*)<\/a>/mU', $data, $result, PREG_SET_ORDER);
mathew
@mathew, I don't think anyone is trying to make it complicated for you, but from the sound of it you're asking for something that requires an intermediate or advanced solution.
jlafay
Please don't take it negatively. I hadn't seen this type of function before and thought my question might have been misunderstood, which is why I elaborated on it again.
mathew
+1  A: 

Not a direct answer to your question, but perhaps a possible solution to your problem:

Are the search results of the different sites mixed in one big result?

If not, I would use AJAX to load the different sections/results simultaneously, showing the sub-results as soon as they become available.

If you want to mix all sub-results, you could still do the same by storing the sub-results in a session and generating your final output when all 20 are known or when a certain time has passed (a rough sketch follows below).

This does depend heavily on JavaScript, however...
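A rough sketch of that session variant (all file, key, and function names here are illustrative):

// In each sub-result endpoint: store the result instead of printing it.
session_start();
$_SESSION['results']['source01'] = run_search_01(); // hypothetical sub-search

// In a separate "collect" endpoint, polled by the page: render the
// combined output once all 20 results are in or a deadline has passed
// ($_SESSION['start'] is assumed to be set when the search begins).
session_start();
$allDone = count($_SESSION['results']) >= 20;
$tooLate = (time() - $_SESSION['start']) > 10; // 10-second budget
if ($allDone || $tooLate) {
    echo render_mixed_results($_SESSION['results']); // hypothetical renderer
}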

Edit: jQuery example:

Using jQuery you can load the different results into different divs:

javascript

/* start loading results when the document is loaded */
$(document).ready(function() {
    $("#results01").load("http://www.mysite.com/page_results_01.php");
    $("#results02").load("http://www.mysite.com/page_results_02.php");
    ...
    $("#results20").load("http://www.mysite.com/page_results_20.php");
});

html

<div id="results01">Loading results from page 01 ...</div>
<div id="results02">Loading results from page 02 ...</div>
...
<div id="results20">Loading results from page 20 ...</div>

Empty divs don't show, so you can get rid of the text if you don't want it...
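For completeness, each page_results_XX.php would then just run one sub-search and print its HTML fragment; a minimal sketch (run_search_01() is a hypothetical stand-in for one of your 20 functions):

<?php
// page_results_01.php -- one independent sub-result endpoint.
// Each endpoint is a separate request, so one slow source no
// longer delays the rest of the page.
$rows = run_search_01();

foreach ($rows as $row) {
    echo '<p>' . htmlspecialchars($row) . '</p>';
}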

jeroen
Actually, yes, it is one big result, but it is under separate headings, so whatever finishes first can be shown first and the rest can be shown as soon as they finish. I don't mind if it is in JavaScript, but I need a sample as I am not very familiar with it.
mathew
Where do I place this JavaScript? Inside <head> or inside <body>?
mathew
You will have to read up on jQuery if you really want to use this: http://docs.jquery.com/Tutorials
jeroen
+2  A: 

So, if you're talking about file_get_contents (why didn't you say that right away?), you can specify a timeout using stream contexts:

$ctx = stream_context_create(array(
    'http' => array(
        'timeout' => 10 // seconds
    )
));
file_get_contents("http://example.org/", false, $ctx);

See HTTP context options.

If you are using cURL, use curl_setopt to set CURLOPT_TIMEOUT:

curl_setopt($curl, CURLOPT_TIMEOUT, 10);

This will set the timeout to ten seconds.
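Putting it together, a minimal cURL request with timeouts could look like this (the URL is just the one from your comment; CURLOPT_CONNECTTIMEOUT is an extra option that limits the connection phase separately):

$curl = curl_init('http://www.alexa.com/siteinfo/google.com');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 3);    // give up if connecting takes over 3 seconds
curl_setopt($curl, CURLOPT_TIMEOUT, 10);          // give up if the whole request takes over 10 seconds

$data = curl_exec($curl);
if ($data === false) {
    // timed out (or another error) -- skip this source
}
curl_close($curl);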

nikic
Nikic, actually that was a sample file. The final file won't use file_get_contents; it will use cURL.
mathew
Nikic, the reason why I won't use that is that many sites won't allow scraping with it, so cURL is safer.
mathew
Curl also has a timeout option...
Charles
How do I do it?
mathew
@curl: See my edit ;)
nikic
It is curl_setopt($curl, CURLOPT_TIMEOUT, 10); there is a typo.
mathew
Thanks, fixed it.
nikic