I have a PHP client application that interfaces with a RESTful server. Each PHP Goat instance on the client needs to initialize itself from its corresponding /goat resource on the server (e.g. /goat/35, /goat/36, etc.). It does this by sending an HTTP request to that URL via cURL. Working with 30+ goat objects per page load means 30+ HTTP requests, and each one takes 0.25 seconds - that's baaaad, as my goats would say. Lazy-loading and caching the responses in memory helps, but not enough.

foreach ($goats as $goat) {
   $goat->getName(); // goat needs to hit the REST API
}
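
For context, each goat currently does something like this internally (a simplified sketch; the class layout, endpoint URL, and XML element names are placeholders):

class Goat {
   private $id;
   private $xml; // cached response, so each goat only pays the 0.25 s once

   public function __construct($id) {
      $this->id = $id;
   }

   public function getName() {
      if ($this->xml === null) { // lazy-load on first access
         $ch = curl_init('http://example.com/goat/' . $this->id); // placeholder URL
         curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
         $this->xml = simplexml_load_string(curl_exec($ch));
         curl_close($ch);
      }
      return (string) $this->xml->name; // assumes a <name> element in the response
   }
}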

The advantage of this technique is that my goats are all smart and encapsulated. The disadvantage is that the performance is horrible. The goats don't know how to queue their HTTP requests, one goat doesn't know if there are other goats that need to initiate a request, etc. I guess one alternative would be to build the goats externally:

$urls = array('http://', 'http://', ...);  // array of goat URLs
$results = fancy_pipelined_http_request_queue($urls);
foreach ($results as $i => $xml) {
   $goats[$i]->buildSelfFromXML($xml);
}
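
For what it's worth, I imagine that queue function could be built on PHP's curl_multi extension, which runs a set of cURL handles concurrently. A rough, untested sketch (the function name and option choices are just my guess at an implementation):

function fancy_pipelined_http_request_queue(array $urls) {
   $mh = curl_multi_init();
   $handles = array();
   foreach ($urls as $i => $url) {
      $ch = curl_init($url);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
      curl_multi_add_handle($mh, $ch);
      $handles[$i] = $ch;
   }
   // drive all transfers at once until every handle has finished
   $running = null;
   do {
      curl_multi_exec($mh, $running);
      curl_multi_select($mh); // sleep until there is activity on some handle
   } while ($running > 0);
   $results = array();
   foreach ($handles as $i => $ch) {
      $results[$i] = curl_multi_getcontent($ch); // the raw XML response
      curl_multi_remove_handle($mh, $ch);
      curl_close($ch);
   }
   curl_multi_close($mh);
   return $results;
}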

I'm sure this is a well-known OO/REST dilemma with more advanced ways of solving it; I just don't know where to look. Any ideas?

+1  A: 

You can use non-blocking sockets if you like. This involves a bit of coding, since you will need to set cURL aside, but it may improve performance because you will be able to perform the requests truly simultaneously.

See the socket_set_blocking / stream_set_blocking functions.
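
Here is a rough sketch of the idea (assumes plain HTTP on port 80; the URLs are placeholders, and error handling and HTTP response parsing are omitted):

$urls = array('http://example.com/goat/35', 'http://example.com/goat/36');

$pending = array();   // sockets still connecting, request not yet sent
$reading = array();   // sockets we are reading responses from
$responses = array();

foreach ($urls as $i => $url) {
   $parts = parse_url($url);
   // start the TCP connect without waiting for it to complete
   $fp = stream_socket_client('tcp://' . $parts['host'] . ':80',
      $errno, $errstr, 0, STREAM_CLIENT_CONNECT | STREAM_CLIENT_ASYNC_CONNECT);
   stream_set_blocking($fp, 0);
   $pending[$i] = $fp;
   $responses[$i] = '';
}

while ($pending || $reading) {
   $read = $reading;
   $write = $pending;
   $except = null;
   if (!stream_select($read, $write, $except, 5)) {
      break; // timeout or error
   }
   // writable means the async connect has finished: send the request
   foreach ($write as $i => $fp) {
      $parts = parse_url($urls[$i]);
      fwrite($fp, "GET {$parts['path']} HTTP/1.0\r\nHost: {$parts['host']}\r\n\r\n");
      $reading[$i] = $fp;
      unset($pending[$i]);
   }
   // readable sockets hand us response bytes as they arrive
   foreach ($read as $i => $fp) {
      $responses[$i] .= fread($fp, 8192);
      if (feof($fp)) { // server closed the connection: response complete
         fclose($fp);
         unset($reading[$i]);
      }
   }
}
// $responses now holds one raw HTTP response (headers + XML) per goat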

FractalizeR