tags:
views: 33
answers: 2

Hi,

In a PHP script I am making a lot of different cURL GET requests (about a hundred) to different URLs.

Will reusing the same cURL handle from curl_init() improve performance, or is the gain negligible compared to the response time of the cURL requests themselves?

I am asking because in the current architecture it would not be easy to keep the same cURL handle.

Thanks,

Benjamin

+1  A: 

It depends on how many requests you will be making. The overhead of closing and reopening a handle is negligible for a single request, but over a thousand requests it could add up to a few seconds or more.

I believe curl_multi_init() would be the fastest method.
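A minimal sketch of the handle-reuse idea under discussion: open one handle, change only the URL between requests, and close it once at the end. The helper name `fetch_all` and the variable names are illustrative, not from the original post.

```php
<?php
// Sketch: reuse a single cURL handle for many sequential GET requests.
// fetch_all() is a hypothetical helper; $urls is any list of URLs.
function fetch_all(array $urls): array
{
    $ch = curl_init();                          // one handle for all requests
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $out = array();
    foreach ($urls as $url) {
        curl_setopt($ch, CURLOPT_URL, $url);    // only the URL changes per request
        $out[] = curl_exec($ch);
    }
    curl_close($ch);                            // close once, not per request
    return $out;
}
```

This keeps the requests synchronous while avoiding the per-request init/close overhead, and it also lets libcurl reuse connections to the same host between requests.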
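For reference, a rough sketch of the curl_multi approach mentioned above, which runs the transfers concurrently instead of one after another. The function name `fetch_parallel` is illustrative.

```php
<?php
// Sketch: run several GET requests concurrently with the curl_multi API.
// fetch_parallel() is a hypothetical helper; $urls is any list of URLs.
function fetch_parallel(array $urls): array
{
    $mh = curl_multi_init();
    $handles = array();
    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers until every one has finished.
    do {
        curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);  // wait for activity instead of busy-looping
        }
    } while ($running > 0);

    $out = array();
    foreach ($handles as $i => $ch) {
        $out[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
    return $out;
}
```

The trade-off is that the responses arrive in parallel rather than one at a time, so this only helps if the caller does not need each response before issuing the next request.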

adam
I cannot use curl_multi_init because my cURL requests need to be synchronous. I will have about a hundred requests each time.
benjisail
+1  A: 

Check this out too:


try {
    // Build a pool of requests and send them all at once.
    $pool = new HttpRequestPool(
        new HttpRequest($q1),
        new HttpRequest($qn)
    );
    $pool->send();

    $out = array();
    foreach ($pool as $request) {
        $out[] = $request->getResponseBody();
    }
} catch (HttpException $e) {
    echo $e;
}


sathia
I don't see how your answer relates to my question... Could you be more precise?
benjisail
Well, it's a different approach to the problem. If you need to make tons of cURL GET requests, you can use PHP's HttpRequestPool, which was designed for exactly this purpose: http://pecl.php.net/package/pecl_http
sathia