Hello all,

I have a site that has a simple API which can be used over HTTP. I want to use the API to submit data about 1000-1500 times in one go. Here is their API: http://api.jum.name/

I have constructed the URL to make a single submission, but now I am wondering what the best way is to make these 1000-1500 API GET requests. Here is the PHP cURL implementation I was thinking of:

$ch = curl_init();
// NOTE: query values containing spaces (title, content) must be URL-encoded
$add = 'http://www.mysite.com/3rdparty/API/api.php?fn=post&username=test&password=tester'
     . '&url=' . urlencode('http://google.com')
     . '&category=21&title=' . urlencode('story a')
     . '&content=' . urlencode('content text')
     . '&tags=Season,news';
curl_setopt($ch, CURLOPT_URL, $add);
curl_setopt($ch, CURLOPT_POST, 0);                 // plain GET request
curl_setopt($ch, CURLOPT_COOKIEFILE, 'files/cookie.txt');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);    // return the response as a string
$postdata = curl_exec($ch);

Should I close the cURL connection every time I make a submission? Can I rewrite the above in a better way that will make these 1000-1500 submissions quicker?

Thanks all

+1  A: 

PHP's curl, by default, reuses a connection for multiple calls to curl_exec().

So in this case you just reuse the curl handle you got from curl_init(), and if the URL matches between calls to curl_exec(), it will send a "Connection: keep-alive" header and reuse the connection.

Do not close the connection and do not set CURLOPT_FORBID_REUSE.
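
A rough sketch of that approach (the endpoint, parameters, and variable names below are assumptions, not taken from the question): build each URL, keep calling curl_exec() on the same handle, and only close it once at the end.

$items = array(/* ... 1000-1500 submissions, each an array of API parameters ... */);

$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, 'files/cookie.txt');

foreach ($items as $item) {
    // Same handle, new URL each time; curl reuses the connection when it can
    $url = 'http://www.mysite.com/3rdparty/API/api.php?fn=post&' . http_build_query($item);
    curl_setopt($ch, CURLOPT_URL, $url);
    $response = curl_exec($ch);
    // ... check $response / curl_errno($ch) here ...
}

curl_close($ch);  // close once, after all submissions are done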

also see here:

http://stackoverflow.com/questions/972925/persistent-keepalive-http-with-the-php-curl-library

pilif
+1  A: 

If you have access to PHP 5.2+, I would highly recommend PHP's curl_multi.

This allows you to process several curl requests in parallel, which in this case would definitely come in handy.

Related documentation: http://us3.php.net/manual/en/ref.curl.php
An example usage: http://www.somacon.com/p537.php
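
A minimal curl_multi sketch (the URL list, the batch size of 20, and the variable names are assumptions for illustration): run the submissions in small parallel batches instead of one at a time.

$urls = array(/* ... one constructed API GET URL per submission ... */);

foreach (array_chunk($urls, 20) as $batch) {         // e.g. 20 requests in flight at once
    $mh = curl_multi_init();
    $handles = array();

    foreach ($batch as $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_multi_add_handle($mh, $ch);
        $handles[] = $ch;
    }

    // Drive all handles in the batch until every request has finished
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh);
    } while ($running > 0);

    foreach ($handles as $ch) {
        $response = curl_multi_getcontent($ch);      // ... handle the response here ...
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);
}

Keeping the batch size moderate avoids opening 1500 simultaneous connections to the API host.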

Ian Elliott