tags:

views: 89

answers: 3

With Twitter being down today, I was thinking about how best to handle calls to an API when it is down. If I am using cURL to call their API, how do I make the script fail quickly and handle the errors so the application doesn't slow down?

+1  A: 

Use curl_setopt:

curl_setopt($yourCurlHandle, CURLOPT_CONNECTTIMEOUT, 1); // timeout in seconds

If you have curl >= 7.16.2 and PHP >= 5.2.3, there is also CURLOPT_CONNECTTIMEOUT_MS for millisecond precision.
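Putting that together, a minimal sketch of a fast-failing request might look like this (the URL and timeout values are placeholders; pick what suits your app):

```php
<?php
// Hypothetical endpoint; substitute the real API URL you are calling.
$ch = curl_init('https://api.twitter.com/statuses/user_timeline.json');

curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1);    // give up connecting after 1 second
curl_setopt($ch, CURLOPT_TIMEOUT, 3);           // abort the whole transfer after 3 seconds

$response = curl_exec($ch);

if ($response === false) {
    // CURLE_OPERATION_TIMEDOUT (28) covers both connect and transfer timeouts.
    $errno = curl_errno($ch);
    error_log('API call failed: ' . curl_error($ch) . " (errno $errno)");
    // Fall back gracefully here instead of blocking the page.
}

curl_close($ch);
```

With CURLOPT_RETURNTRANSFER set, curl_exec() returns false on failure, so a timeout surfaces as a normal error you can branch on rather than a hung request.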

jitter
The only issue with this is that it wouldn't be able to tell the difference between a slow connection and a downed API.
Ian Elliott
Would this return true if it timed out?
Tim
A fatally slow connection is just as bad as a downed API. A more generous timeout of 3+ seconds might be reasonable, though.
Kekoa
+2  A: 

Use curl_getinfo to get the HTTP response code or content length from the cURL handle, and check against those.

$HttpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
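For example, a sketch that treats anything other than a 200 (or no response at all) as a failure — again, the URL is a placeholder:

```php
<?php
$curl = curl_init('https://api.twitter.com/statuses/user_timeline.json');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
curl_setopt($curl, CURLOPT_CONNECTTIMEOUT, 2); // fail fast if the host is unreachable

$body     = curl_exec($curl);
$HttpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
curl_close($curl);

if ($body === false || $HttpCode !== 200) {
    // Non-200, or no response at all: treat the API as unavailable.
    // Serve cached data or skip the feature for this request.
}
```

Note that curl_getinfo() returns 0 for CURLINFO_HTTP_CODE when no response was received, so the single `!== 200` check covers the timeout case as well.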
jjclarkson
I've actually decided to go with a combination of checking returned codes and the other answer's suggestion of running a cron job to check site status.
Tim
+2  A: 

Perhaps use a sort of cache of whether Twitter is up or down. Log invalid responses from the API in a database or a server-side file. Once you get two, three, or some other number of invalid responses in a row, disable all requests to the API for x amount of time.

After x amount of time, attempt a request, if it's still down, disable for x minutes again.

If your server can run cron jobs, consider a script that checks the API for a valid response every few minutes. If it finds the API is down, it disables requests until it's back up. That way the server does the testing, and users don't have to be the guinea pigs.
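A minimal sketch of the failure-counting idea above, using a server-side flag file (the file path, failure threshold, and backoff window are arbitrary choices, and the function names are hypothetical):

```php
<?php
define('FAILURE_LOG', '/tmp/twitter_failures'); // "count timestamp" of consecutive failures
define('MAX_FAILURES', 3);                      // failures in a row before disabling
define('DISABLE_SECONDS', 300);                 // how long to back off (5 minutes)

// Decide whether the app should bother calling the API right now.
function api_is_enabled() {
    if (!file_exists(FAILURE_LOG)) {
        return true;
    }
    list($failures, $lastFailure) = explode(' ', file_get_contents(FAILURE_LOG));
    if ((int)$failures < MAX_FAILURES) {
        return true;
    }
    // Enough consecutive failures: stay disabled until the window elapses.
    return (time() - (int)$lastFailure) > DISABLE_SECONDS;
}

// Call after each API request with true on success, false on an invalid response.
function record_api_result($success) {
    if ($success) {
        @unlink(FAILURE_LOG); // any success resets the counter
        return;
    }
    $failures = 0;
    if (file_exists(FAILURE_LOG)) {
        list($failures) = explode(' ', file_get_contents(FAILURE_LOG));
    }
    file_put_contents(FAILURE_LOG, ((int)$failures + 1) . ' ' . time());
}
```

Your request code would then check api_is_enabled() before calling the API and pass the outcome to record_api_result(); a cron script could use the same two helpers to probe the API and clear or set the flag file on the users' behalf.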

Ian Elliott