I am running a PHP script that (roughly sketched below):

  1. queries a local database to retrieve an amount
  2. executes a curl statement to update an external database with the above amount + x
  3. queries the local database again to insert a new row reflecting that the curl statement has been executed.
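
In rough outline the flow looks something like this (all table, column, and endpoint names are made up for illustration):

$x = 10; // the increment added to the retrieved amount
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// 1. retrieve the current amount from the local database
$amount = $pdo->query("SELECT amount FROM balances WHERE company_id = 1")->fetchColumn();

// 2. push amount + x to the external system via curl (takes 2-4 seconds)
$ch = curl_init('https://external.example.com/update');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, ['amount' => $amount + $x]);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);

// 3. record locally that the external update has been made
$pdo->prepare("INSERT INTO external_updates (company_id, amount) VALUES (1, ?)")
    ->execute([$amount + $x]);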

One of the problems I am having is that the curl statement takes 2-4 seconds to execute, so if two different users from the same company run the same script at the same time, the execution time of the curl command can cause a mismatch in what should be updated in the external database. This is because the curl statement from the first user has not yet returned, so the second user is working off incorrect figures.

I am not sure of the best option here, but basically I need to prevent two or more curl statements from running at the same time.

I thought of storing a flag in the database indicating that a curl statement is currently being executed, and preventing any other curl statements from running until it completes. Once the first curl statement has finished, the flag is cleared and the next one can run. If the flag is 'locked', I could loop, sleep for 5 seconds, and then check whether the flag has been reset. If it is still locked after 3 loops, I would reset the flag automatically (I've never seen the curl take longer than 5 seconds) and continue processing.
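
Roughly what I have in mind (table and column names are placeholders, and run_curl_update() stands in for the slow curl call):

// wait for any in-progress curl from another user, up to 3 x 5 seconds
$attempts = 0;
while ($attempts < 3) {
    $locked = $pdo->query("SELECT curl_locked FROM company_flags WHERE company_id = 1")->fetchColumn();
    if (!$locked) {
        break; // flag is clear, safe to proceed
    }
    sleep(5);
    $attempts++;
}

// either the flag cleared or we assume it is stale and take over anyway
$pdo->exec("UPDATE company_flags SET curl_locked = 1 WHERE company_id = 1");
run_curl_update(); // the slow 2-4 second curl call
$pdo->exec("UPDATE company_flags SET curl_locked = 0 WHERE company_id = 1");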

Are there any other (more elegant) ways of approaching this?

+2  A: 

You can use flock() on an arbitrary file. This way, the second script will block until it can acquire the lock.

$lockfile = 'foo.bar';
$fd = fopen($lockfile, 'w');       // create/open the lock file
if (flock($fd, LOCK_EX)) {         // blocks until the exclusive lock is acquired
    do_your_stuff();
    flock($fd, LOCK_UN);           // release the lock
} else {
    die("error"); // should not happen; flock blocks until the lock is acquired
}
fclose($fd);

EDIT:

PHP is not Java EE; there is no simple way to implement distributed transactions.

Artefacto
Unfortunately that will not work, since the lock has to happen on a per-company basis - it's fine to run concurrent curl operations, as long as there is only one per company at any one time.
JonoB
@JonoB: Use a unique file for each company, like md5($companyName).
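
For example (the lock file path is just an assumption; fopen() with mode 'w' creates the file automatically if it does not already exist):

// one lock file per company; fopen('w') creates it on first use
$lockfile = sys_get_temp_dir() . '/curl_lock_' . md5($companyName);
$fd = fopen($lockfile, 'w');
if ($fd && flock($fd, LOCK_EX)) { // blocks other scripts for the same company only
    do_your_stuff();
    flock($fd, LOCK_UN);
    fclose($fd);
}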
Tom
@Tom OK, makes sense, but then I still have to create a new file each time a new company is loaded onto the system. Why would this file-based method be preferred over creating a boolean value in the database?
JonoB
@JonoB The database procedure you describe only works if you use a SERIALIZABLE isolation level.
Artefacto
A: 

Hello,

curl supports parallel requests to multiple resources with curl_multi_exec(). If you want these calls to run sequentially and the three steps above to behave as one atomic operation, do not use curl_multi (in case you are using it).

If the database records that you access for updates cannot (or should not) be accessed by more than one user at the same time, then you should consider the locking/transaction facilities of your database server, if available.
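
For example, with MySQL/InnoDB (an assumption; table, column, and helper names are invented), a transaction with SELECT ... FOR UPDATE makes a second script wait on the locked row until the first one commits:

$pdo->beginTransaction();

// row-level lock: a second script issuing the same SELECT ... FOR UPDATE
// blocks here until this transaction commits or rolls back
$stmt = $pdo->prepare("SELECT amount FROM balances WHERE company_id = ? FOR UPDATE");
$stmt->execute([$companyId]);
$amount = $stmt->fetchColumn();

run_curl_update($amount); // hypothetical wrapper around the slow curl call

$pdo->prepare("INSERT INTO external_updates (company_id, amount) VALUES (?, ?)")
    ->execute([$companyId, $amount]);

$pdo->commit();

Note that the row lock is then held for the 2-4 seconds the curl call takes, which in this case is exactly the serialisation you are after.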

The use of a pseudo-transaction mechanism with a column marking a record as 'locked' might help you, as you say, but I cannot be certain (there is also a pseudo-transaction method using timestamps which you can google for more information).
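
One common variant of that idea (column names invented) claims the flag with a single atomic UPDATE, so two users cannot both see it as 'unlocked' at the same time:

// only one script will see an affected-row count of 1
$claim = $pdo->prepare(
    "UPDATE company_flags SET curl_locked = 1 WHERE company_id = ? AND curl_locked = 0"
);
$claim->execute([$companyId]);

if ($claim->rowCount() === 1) { // this script won the pseudo-lock
    run_curl_update(); // hypothetical slow curl call
    $pdo->prepare("UPDATE company_flags SET curl_locked = 0 WHERE company_id = ?")
        ->execute([$companyId]);
} else {
    // another user holds the pseudo-lock; retry later or report back
}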

andreas