
I have a script that will be run via cron every night to take entries from a database and process them. I'm curious which approach would be best so that I don't use too much memory, error out, or lag the server. There could be up to maybe 6,000 entries that need to be processed...

What methods should be used to mitigate the load? I'm pretty sure just raising the memory limit could be inviting other issues once it passes a certain point. Thank you.

+1  A: 

Use a database cursor so you don't need to load everything into memory.

Assuming you're using PDO, just call PDOStatement->fetch() for each item of the result set.
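
For example, something along these lines (the DSN, table, and processEntry() function are just placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->query('SELECT id, data FROM entries');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    // fetch one row at a time instead of fetchAll(), which would load everything
    processEntry($row);
}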

Skilldrick
+2  A: 

I agree with @Skilldrick: make sure you process database query results one row at a time. For instance, if you use MySQL, run the query unbuffered so rows are streamed to the client as you fetch them, rather than the whole result set being held in memory.

See the documentation for more information.

In Doctrine 1.2 with PDO, you can do this:

$masterConn->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false); 

Note that PHP has no memory limit when you invoke it via the command line or cron. Apologies, this is wrong: PHP does respect memory_limit -- I was mistakenly thinking of max input time, which is disabled by default for the CLI.
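
You can check the limit the CLI is actually running with, for example:

php -r 'echo ini_get("memory_limit"), PHP_EOL;'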

Bill Karwin
Bill, memory_limit is certainly respected when invoked via command-line.
webbiedave
@webbiedave, thanks, I've edited the above.
Bill Karwin
A: 

If you fetch one item at a time, process it, save the result, then unset the variables and fetch the next portion of data, the script will not use a lot of memory.

Problems arise when you fetch all of the data first and then process it.
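
A rough sketch of that approach (assuming $pdo is a PDO connection; the entries table and processEntry() function are placeholders):

$offset = 0;
$batchSize = 500;
do {
    $sql = sprintf('SELECT id, data FROM entries ORDER BY id LIMIT %d OFFSET %d',
                   $batchSize, $offset);
    $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_ASSOC);
    foreach ($rows as $row) {
        processEntry($row);   // do the real work on one row
    }
    $fetched = count($rows);
    unset($rows);             // free this portion before fetching the next one
    $offset += $batchSize;
} while ($fetched === $batchSize);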

Aram
A: 

You can flag each row as it is processed and set your cron job to run more often, every 10 minutes for example, processing a fixed number of rows on each run. That way you will never exceed the time or memory limit.

You can first test how many rows your server can handle per call to the script without problems, and then set that as the batch size.
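
A sketch of that idea, assuming the table has a processed flag column (all names here are placeholders):

$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$batchSize = 500; // tune this to what the server handles comfortably per run

$rows = $pdo->query("SELECT id, data FROM entries WHERE processed = 0 LIMIT $batchSize")
            ->fetchAll(PDO::FETCH_ASSOC);

$mark = $pdo->prepare('UPDATE entries SET processed = 1 WHERE id = ?');
foreach ($rows as $row) {
    processEntry($row);                 // placeholder for the real work
    $mark->execute(array($row['id']));  // flag the row so the next run skips it
}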

Keyne