I'm writing some code which processes a queue of items. The way it works is this:

  1. Get the next item flagged as needing to be processed from the MySQL database.
  2. Request some info from a Google API using cURL and wait until the info is returned.
  3. Do the remainder of the processing based on the info returned.
  4. Flag the item as processed in the DB and move on to the next item.

The problem is with step 2: Google sometimes takes 10-15 seconds to return the requested info, and during this time my script has to remain halted and wait.

I'm wondering if I could change the code to do the following instead:

  1. Get the next 5 items to be processed as usual.
  2. Request info for items 1-5 from Google, one after the other.
  3. When the info for item 1 is returned, a 'callback' should fire a function, or some other piece of code, which then does the remainder of the processing on items 1-5.
  4. The script then starts over until all pending items in the DB are marked processed.

How can something like this be achieved?
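
It looks like PHP's curl_multi functions might let me fire off several requests at once. Here is a rough, untested sketch of the kind of thing I have in mind (get_next_items(), build_google_url() and process_item() are just placeholders for my own code):

$items = get_next_items(5);
$mh = curl_multi_init();
$handles = array();

// Add one cURL handle per item to the multi handle
foreach ($items as $i => $item) {
    $ch = curl_init(build_google_url($item));
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$i] = $ch;
}

// Run all transfers in parallel until they are finished
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100); // avoid busy-waiting if select fails
    }
} while ($running > 0);

// Collect each response and finish processing that item
foreach ($handles as $i => $ch) {
    $info = curl_multi_getcontent($ch);
    process_item($items[$i], $info);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);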

+2  A: 

You can split this into two process types.

  1. Worker process (many of them run at once): knows which database row it is processing, makes the Google API call and waits for it, then does the job and saves the results to the database.

  2. Scheduler (one and only): periodically (say, every few seconds) checks whether there is work to do and makes sure that N workers (5, or whatever is optimal) are running. If fewer than N are running, it starts more workers (with exec) to keep the count at N, until all the work is done; see the sketch below.
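
A minimal sketch of the scheduler loop, assuming a hypothetical worker.php (which claims one row, calls the Google API, processes it and marks it done) plus count_pending_items() and count_running_workers() helpers of your own:

define('N_WORKERS', 5);

while (count_pending_items() > 0) {
    // Top the pool back up to N background workers
    $missing = N_WORKERS - count_running_workers();
    for ($i = 0; $i < $missing; $i++) {
        exec('php worker.php > /dev/null 2>&1 &');
    }
    sleep(2); // check again in a couple of seconds
}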

Ivan Krechetov
+1  A: 

I really don't know if this is an elegant approach, but in theory you could use pcntl_fork() to fork the PHP process for each item. This would allow all the code to stay in one file.

// Get the items that still need processing
$items = get_items_from_db();
foreach ($items as $item) {
    $pid = pcntl_fork();
    if ($pid == -1) die("Couldn't fork!");
    if (!$pid) {
        // Child process: handle this item, then exit
        process_item($item);
        exit();
    }
}
// Wait for every child process to finish
while (pcntl_wait($status) > 0);
// We're done!

But yes, this solution will most likely make some people scream ;)

TuomasR
+1  A: 

I think you can create a master script that calls a child script on the same machine for each item. The child script sends the request to the Google API and works on it accordingly.
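
A rough sketch of the master, assuming a hypothetical child.php that takes an item id, calls the Google API and does the processing (the id column on $item is also an assumption):

$items = get_items_from_db();
foreach ($items as $item) {
    // Launch the child in the background so the loop doesn't wait for it
    exec('php child.php ' . escapeshellarg($item['id']) . ' > /dev/null 2>&1 &');
}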

Acharya
A: 

Hello all,

The story:

  1. The mobile sends the phrase "algorithms and programming" (from J2ME, where I use a Wsoap client) to the web service.
  2. The web service sends it to the e-libraries that have been registered.
  3. Each e-library looks the phrase up in its database and sends the matching data back to the web service.
  4. The web service receives the results and sends them back to the mobile.

I'm still looking for a way to exchange data between the web service (the provider function) and two e-libraries of images, assuming each e-library has a SOAP client set up (NuSOAP included). My question is: can NuSOAP implement this architecture? Does anyone have a tutorial?
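
To make the question concrete, this is roughly how the web service side could query the two e-libraries with NuSOAP (the WSDL URLs and the searchBooks operation are made up and stand in for whatever the e-libraries actually expose):

require_once 'lib/nusoap.php';

$libraries = array(
    'http://library1.example.com/search.php?wsdl',
    'http://library2.example.com/search.php?wsdl',
);

$results = array();
foreach ($libraries as $wsdl) {
    $client = new nusoap_client($wsdl, true); // true = use the WSDL
    $response = $client->call('searchBooks', array('keyword' => 'algorithms and programming'));
    if ($client->fault || $client->getError()) {
        continue; // skip an e-library that fails or returns a fault
    }
    $results[] = $response;
}
// Merge $results and return them to the mobile client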

restroika