G'day,
We have a Perl script that processes geolocation requests from the head-end servers of a major web site. The script is a broker that applies additional business logic to the data returned by a COTS product, which provides details for a given IP address, e.g. country, connection type, routing type, carrier, etc.
This Geo service currently handles peak loads of around 1,000 requests per second at the COTS backend. (It actually serves about 5,000 requests per second from a dedicated load-balance/cache layer that sits directly in front of the broker layer.)
I have recently had to modify the behaviour of this broker to allow for a new category of connection that we've started seeing on the site, which was causing some problems.
The original version of the script (not my design, btw!) was built using a mixture of config items embedded in the script itself and others kept in separate Perl fragments. As was quite rightly pointed out during the peer review of my changes, we should migrate all of the config items out rather than continue with that mixture of embedded and separate config.
Now I want to take this further and gather all the config items, as Perl hashes, into a single config file.
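For what it's worth, the pattern I have in mind is a config file that is itself a Perl fragment whose last expression is a hashref, pulled in with `do`. A minimal, self-contained sketch (the file is a temp file here purely for the demo; the keys are made-up examples, not our real config):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Temp qw(tempfile);

# Write a sample config file: a Perl fragment whose last evaluated
# expression (a hashref) becomes the return value of do().
my ($fh, $conf_file) = tempfile();
print $fh <<'EOF';
{
    backend_timeout  => 2,                           # seconds
    connection_types => [ qw(dialup broadband mobile) ],
}
EOF
close $fh;

# do() compiles and runs the file; $@ is set on a compile/runtime
# error, and undef is returned (with $! set) if it can't be read.
my $config = do $conf_file;
die "Can't parse config: $@" if $@;
die "Can't read config: $!"  unless defined $config;

print "timeout = $config->{backend_timeout}\n";   # prints "timeout = 2"
```

The nice part is that the config stays plain Perl, so the existing hash fragments can be merged into the one file pretty much verbatim.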
At the moment we have to stop and restart the whole application to pick up new config items. Given the traffic levels that's a bit inconvenient, even though with four instances of the broker across two separate data centres we never actually lose the service.
I suspect I'm going to have to resort to keeping a timer, or maybe a request counter, and performing a stat on the config file in question. Or perhaps give the config file a configured TTL and just reload it every ten minutes or so.
But is there a way to make Perl automatically reload a newer version of a file that it has previously loaded? I'm thinking of behaviour like that provided by the Apache mod_perl module.
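In case it helps frame answers: outside mod_perl (where Apache2::Reload does this for modules), I believe there's no built-in auto-reload, but the stat-on-mtime approach is only a few lines. A sketch, with a hypothetical `get_config` helper the broker would call on each request (or every Nth request), which deliberately keeps the old config if the new file fails to parse:

```perl
use strict;
use warnings;
use File::Temp qw(tempfile);

my ($config, $last_mtime);

# Reload the config file only when its mtime has changed;
# on a parse/read failure, warn and keep the previous config.
sub get_config {
    my ($file) = @_;
    my $mtime = (stat $file)[9];
    if ( !defined $last_mtime or $mtime > $last_mtime ) {
        my $new = do $file;
        if (defined $new) {
            ($config, $last_mtime) = ($new, $mtime);
        }
        else {
            warn "Config reload failed, keeping old config: " . ($@ || $!);
        }
    }
    return $config;
}

# Demo: load a config, rewrite it, bump the mtime, reload.
my ($fh, $file) = tempfile();
print $fh "{ ttl => 600 }\n";
close $fh;
print get_config($file)->{ttl}, "\n";   # prints 600

open $fh, '>', $file or die $!;
print $fh "{ ttl => 300 }\n";
close $fh;
utime time, time + 10, $file;           # ensure mtime moves forward
print get_config($file)->{ttl}, "\n";   # prints 300
```

One stat per request at 1,000 req/s is usually negligible next to the backend call itself, but wrapping the check in a request counter as you describe would cut even that.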
cheers,