I am writing an application to allow users to schedule one-time, long-running tasks from a web application (Linux/Apache/CGI::Application). To do this I use the Schedule::At module, which is the Perl interface to the "at" command. Since the scheduled tasks do not repeat, I am not considering "cron". I have two issues with "at", though:

  1. Scheduling works fine when my CGI application runs under the suexec wrapper, but not when the job is scheduled as the owner of the Apache process. How can I get scheduling to work in both environments (suexec and non-suexec)?
  2. It appears that processes scheduled by "at" or Schedule::At have no failure reporting, and I sometimes find that scheduled tasks fail silently. Is there some way to log the fact that the scheduled task (not the scheduler itself) has failed to run? A sketch of the kind of wrapper I have in mind follows this list.
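
To make the second issue concrete, here is a minimal sketch of what I am doing now, with a shell-level wrapper around the task so that a non-zero exit status at least gets written to a log file. The script path, log location, TIME value, and TAG are placeholders, not my real code:

    #!/usr/bin/perl
    # Sketch only: schedule a one-time task via Schedule::At and wrap the
    # command so a failure is logged, since "at" itself reports nothing.
    # The paths, TIME value, and TAG below are placeholders.
    use strict;
    use warnings;
    use Schedule::At;

    my $task = '/path/to/long_running_task.pl';
    my $log  = '/path/to/task_failures.log';

    # "at" hands the command to the shell, so a shell fallback can record
    # the exit status if the task fails.
    my $command = qq{$task || echo "`date`: $task exited with status \$?" >> $log};

    Schedule::At::add(
        TIME    => '202512312330',   # YYYYMMDDHHmm, the format Schedule::At expects
        COMMAND => $command,
        TAG     => 'one_time_task',
    );

This only catches tasks that start and then fail; it does not tell me when a job never runs at all, which is why I am asking.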

I am not fixed on "at" and am open to using other, more robust scheduling methods if there are any.

Thank you for your attention.

+4  A: 

I've heard good things about The Schwartz. It doesn't have a delay-until feature, though; you'd still submit the jobs via at, but The Schwartz should solve both of the problems you list above, as long as your submit_job script is simple.

(As a caveat, I've only used Gearman. I think you'd want a reliable job queue for this rather than a "fire and forget" mechanism, so that you can keep your submit_job script dumb.)
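
Roughly the sort of "dumb" submit_job I have in mind, untested, with the DSN, credentials, worker class name, and arguments made up for illustration:

    #!/usr/bin/perl
    # Untested sketch of a minimal submit_job: all it does is push a job
    # into TheSchwartz's database; a separate worker process runs it later.
    # DSN, credentials, worker class name, and arguments are placeholders.
    use strict;
    use warnings;
    use TheSchwartz;

    my $client = TheSchwartz->new(
        databases => [{
            dsn  => 'dbi:mysql:theschwartz',
            user => 'schwartz_user',
            pass => 'secret',
        }],
    );

    # The worker class (a TheSchwartz::Worker subclass) marks each job
    # completed or failed, so failures end up recorded in the database
    # instead of disappearing silently.
    $client->insert('MyApp::Worker::LongTask', { task_id => $ARGV[0] });

The worker side is then just a long-running process that calls can_do('MyApp::Worker::LongTask') and work() on its own TheSchwartz client.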

Todd Gardner
Thanks for the Gearman tip. At first look, this seems to be exactly what I needed.
Gurunandan
After looking hard at Gearman and The Schwartz, I found Beanstalk to be the most appropriate solution to my problem. Thanks for the tip.
Gurunandan