I have a Perl CGI script that's fairly resource-intensive (it takes about 2 seconds to finish). This is fine as long as at most 4 or 5 instances are running at the same time, which is usually the case.

The problem is that every click on the link that calls this script spawns a new process to handle the request, so if a user clicks many times (because they're impatient), the server gets flooded with processes, most of them redundant.

How can I ensure that only one instance of this process is running per host?

This is an old system I'm maintaining that uses an old frontend framework, and I'd like to avoid using JavaScript to disable the button client-side if possible. Converting the script to FastCGI is also out of the question, because adding FastCGI to Apache might break many of the other things this system runs.

+1  A: 

You want to use a file lock. Read the documentation on the Fcntl module and the flock function: http://perldoc.perl.org/functions/flock.html

Edit in response to comment:

Example of using a lock file:

#!/usr/bin/perl

use strict;
use warnings;

use Fcntl qw(:flock);

# Open (or create) the lock file; it is the handle we lock, not the file's contents.
if (open(my $fh, '>', '/tmp/example_file.lck')) {
    print "Lock file was opened successfully\n";
    # LOCK_NB makes flock return immediately instead of blocking until the lock is free.
    if (flock($fh, LOCK_EX | LOCK_NB)) {
        do_stuff();
    } else {
        print "Failed to get lock (another process is running)\n";
    }
    close($fh);  # closing the handle releases the lock
} else {
    print "Failed to open lock file: $!\n";
}

sub do_stuff {
    print "Locked!\n";
    sleep 30; # Pretending to be busy for a long time
}
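
In the CGI context, the same flock gate can sit at the top of the script so that a second click returns immediately with a "busy" page instead of redoing the expensive work. The following is only a rough sketch of that idea; the lock-file path and generate_report() are hypothetical stand-ins for the real script:

#!/usr/bin/perl
# Sketch only: gate the expensive CGI work behind a non-blocking flock and
# answer the impatient client with a "busy" response. The lock path and
# generate_report() are made-up placeholders.

use strict;
use warnings;
use Fcntl qw(:flock);

my $lock_path = '/tmp/expensive_report.lck';   # assumed path

open(my $lock_fh, '>', $lock_path)
    or die "Cannot open lock file $lock_path: $!";

if (flock($lock_fh, LOCK_EX | LOCK_NB)) {
    # We are the only instance on this host; do the real work.
    print "Content-Type: text/html\n\n";
    generate_report();                         # placeholder for the 2-second job
} else {
    # Another instance already holds the lock; bail out cheaply.
    print "Status: 503 Service Unavailable\n";
    print "Retry-After: 5\n";
    print "Content-Type: text/html\n\n";
    print "<p>Your report is already being generated. Please wait.</p>\n";
}

close($lock_fh);    # lock is released when the handle is closed

sub generate_report {
    sleep 2;        # stand-in for the real resource-intensive work
    print "<p>Report finished.</p>\n";
}

The Status pseudo-header is the standard CGI way to set the HTTP status under Apache, so repeated clicks get a quick 503 rather than piling up more copies of the slow job.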
Benjamin Franz
How are you proposing doing that?
ysth
Still doesn't do quite what I want, but it does answer the original question, so I'll accept it.
Charles Ma