views: 85
answers: 5
I have a PHP script that grabs data from an external service and saves it to my database. I need this script to run once every minute for every user in the system (of which I expect there to be thousands). My question is: what's the most efficient way to run this per user, per minute? At first I thought I would have a function that grabs all the user IDs from my database, iterates over them and performs the task for each one, but I think that as the number of users grows, this will take longer and no longer fit within 1-minute intervals. Perhaps I should queue the user IDs and perform the task individually for each one? In that case, I'm unsure of how to proceed.

Thanks in advance for any advice.

Edit

To answer Oddthinking's question:

I would like to start the processes for each user at the same time. When the process for each user completes, I want to wait 1 minute, then begin the process again. So I suppose each process for each user should be asynchronous - the process for user 1 shouldn't care about the process for user 2.

To answer sims' question:

I have no control over the external service, and the users of the external service are not the same as the users in my database. I'm afraid I don't know any other scripting languages, so I need to use PHP to do this.

+2  A: 

Am I summarising correctly?

You want to do thousands of tasks per minute, but you are not sure if you can finish them all in time?

You need to decide what to do when you start running over your schedule.

  • Do you keep going until you finish, and then immediately start over?
  • Do you keep going until you finish, then wait one minute, and then start over?
  • Do you abort the process, wherever it got to, and then start over?
  • Do you slow down the frequency (e.g. from now on, just every 2 minutes)?
  • Do you have two processes running at the same time, and hope that the next run will be faster (this might work if you are clearing up a backlog the first time, so the second run will run quickly)?

The answers to these questions depend on the application. Depending on the answers, cron might not be the right tool for you; you might be better off having a permanently running process that schedules itself.
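A rough sketch of what such a permanently running, self-scheduling process might look like in PHP - processAllUsers() and the 60-second interval are just placeholders for whatever work and schedule you settle on:

<?php
// Minimal sketch of a permanently running, self-scheduling process.
define('INTERVAL_SECONDS', 60);

function processAllUsers()
{
    // ... fetch the user IDs and kick off the per-user work here ...
}

while (true) {
    $start = time();
    processAllUsers();

    // If a pass overruns the interval, start the next one immediately;
    // otherwise sleep out the remainder of the minute.
    $elapsed = time() - $start;
    if ($elapsed < INTERVAL_SECONDS) {
        sleep(INTERVAL_SECONDS - $elapsed);
    }
}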

Oddthinking
I'm not sure what my options are for having a permanent process other than Cron. Actually, it'd be a scheduled task since it's a Windows machine. What other options would I have?
Arms
Some other options include: (1) Preferred: making it an automatically-started Windows service (= Unix daemon), so it runs in the background automatically when the machine starts, even before anyone logs in. (2) Just running an application that does the job and leaving it running. (3) Having a scheduled job that checks if another copy is already running, and shuts itself down if so (thus handling occasional crashes). Whatever the choice, the process would do its job and then sleep until it is due again.
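A rough sketch of option (3) using a lock file - the path is arbitrary, and the OS releases the lock if the process crashes, which is what makes this self-healing (note that on older PHP/Windows combinations LOCK_NB support may be unreliable):

<?php
// Option (3): the scheduled task exits immediately if another copy still
// holds the lock, so a crashed run is simply replaced on the next trigger.
$lock = fopen('C:/myapp/worker.lock', 'c'); // arbitrary path
if ($lock === false || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit(0); // another copy is already running
}

// ... do the actual work here ...

flock($lock, LOCK_UN);
fclose($lock);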
Oddthinking
+1  A: 

So, let me get this straight: You are querying an external service (what? SOAP? MySQL?) every minute for every user in the database and storing the results in the same database. Is that correct?

It seems like a design problem.

If the users on the external service are the same as the users in your database, perhaps the two should be more closely integrated. I don't know if PHP is the way to go for syncing this data. If you give more detail, we could think about another solution. If you are in control of the external service, you may want to have that service dump its data or even write directly to the database. Some other syncing mechanism might be better.

EDIT

It seems that you are making an application that stores data for a user that can then be viewed chronologically. Otherwise you may as well just fetch the data when the user requests it.

  1. Fetch all the user IDs in one go.

  2. Iterate over them one by one (assuming that the data being fetched is unique to each user) and spawn a separate process for each request (you'll have to be creative here, as PHP threads do not exist AFAIK), since you want them all to execute at the same time and not be delayed if one user does not return data.

  3. Said process should insert the data returned into the db as soon as it is returned.

As for cron being right for the job: As long as you have a powerful enough server that can handle thousands of the above cron jobs running simultaneously, you should be fine.

You could get creative with several PHP scripts. Every CLI call to PHP starts a new PHP process, so you could do it like this:

foreach ($users as $user)
{
    // shell_exec() waits for each child to finish; append "&" (or use
    // "start /B" on Windows) if the processes should run concurrently.
    shell_exec("php fetchdata.php " . escapeshellarg($user));
}
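
The worker script itself (fetchdata.php above) might be little more than a fetch followed by an insert. A rough sketch - the service URL, connection details, table and column names are all made up:

<?php
// fetchdata.php - run once per user by the loop above.
$userId = isset($argv[1]) ? (int) $argv[1] : 0;

// Pull this user's data from the external service.
$data = file_get_contents('http://external.example.com/data?user=' . $userId);

// Store the result as soon as it comes back.
$db = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');
$stmt = $db->prepare(
    'INSERT INTO user_data (user_id, payload, fetched_at) VALUES (?, ?, NOW())'
);
$stmt->execute(array($userId, $data));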

This is all very heavy, and you should not expect it to be snappy with PHP. Do some tests; don't take my word for it.

sims
A: 

Databases are made to process records in BULK. If you're processing them one by one, you're asking for trouble. You need to find a way to batch up your "every minute" task so that, by executing a SINGLE (complicated) query, all of the affected users' info is retrieved; then you do the PHP processing on the result; then, in another single query, you PUSH the results back into the DB.
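A rough sketch of that idea (table and column names are made up): one query out, PHP processing in the middle, one multi-row query back:

<?php
// Batch version: one query out, one query back, instead of thousands of
// per-user round trips.
$db = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');

// 1. Single query to pull everything that needs processing.
$rows = $db->query('SELECT id, last_value FROM users')->fetchAll(PDO::FETCH_ASSOC);

// 2. Do the PHP-side processing in memory.
$values = array();
$params = array();
foreach ($rows as $row) {
    $result = $row['last_value'] + 1; // placeholder for the real processing
    $values[] = '(?, ?, NOW())';
    $params[] = $row['id'];
    $params[] = $result;
}

// 3. Single multi-row INSERT to push the results back.
if ($values) {
    $sql = 'INSERT INTO user_results (user_id, result, created_at) VALUES '
         . implode(', ', $values);
    $db->prepare($sql)->execute($params);
}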

Alex
A: 

Based on your big-picture description it sounds like you have a dead-end design. If you are able to get it working right now, it'll most likely be very fragile and it won't scale at all.

I'm guessing that if you have no control over the external service, then that external service might not be happy about getting hammered by your script like this. Have you approached them with your general plan?

Do you really need to do all users every time? Is there any sort of timestamp you can use to be more selective about which users need "updates"? Perhaps if you could describe the goal a little better we might be able to give more specific advice.

UtahPhunk
+1  A: 

Given your clarification of wanting to run the processing of users simultaneously...

The simplest solution that jumps to mind is to have one thread per user. On Windows, threads are significantly cheaper than processes.

However, whether you use threads or processes, having thousands running at the same time is almost certainly unworkable.

Instead, have a pool of threads. The size of the pool is determined by how many threads your machine can comfortably handle at a time. I would expect numbers like 30-150 to be about as far as you might want to go, but it depends very much on the hardware's capacity, and I might be out by another order of magnitude.

Each thread would grab the next user due to be processed from a shared queue, process it, and put it back at the end of the queue, perhaps with a date before which it shouldn't be processed.

(Depending on the amount and type of processing, this might be done on a separate box to the database, to ensure the database isn't overloaded by non-database-related processing.)

This solution ensures that you are always processing as many users as you can, without overloading the machine. As the number of users increases, they are processed less frequently, but always as quickly as the hardware will allow.
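PHP has no native threads, so in practice the pool would likely be a fixed number of identical worker processes pulling from a database-backed queue. A rough sketch of one such worker - the table and column names (users, next_run) are made up, and it assumes an InnoDB-style database where FOR UPDATE locks the claimed row:

<?php
// One of N identical worker processes (start, say, 30-150 of them).
// Each claims the user that is most overdue, reschedules it, processes it.
$db = new PDO('mysql:host=localhost;dbname=myapp', 'dbuser', 'dbpass');

function processUser($userId)
{
    // ... call the external service and save the results for this user ...
}

while (true) {
    $db->beginTransaction();

    // Claim the next due user; FOR UPDATE stops two workers grabbing the same row.
    $row = $db->query(
        'SELECT id FROM users WHERE next_run <= NOW()
         ORDER BY next_run LIMIT 1 FOR UPDATE'
    )->fetch(PDO::FETCH_ASSOC);

    if ($row === false) {
        $db->commit();
        sleep(1); // nothing due yet
        continue;
    }

    // Push the user to the back of the queue before doing the work.
    $db->prepare('UPDATE users SET next_run = NOW() + INTERVAL 1 MINUTE WHERE id = ?')
       ->execute(array($row['id']));
    $db->commit();

    processUser($row['id']);
}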

Oddthinking