I wrote a Perl program that searches and manipulates one text file. This CGI process slurps the file directly into memory, manipulates it based on the user's input, then generates the HTML result.
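To make the question concrete, here is a minimal sketch of that pattern (the filename, the query handling, and the grep-based search are placeholders, not my actual code):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Slurp the whole file into memory on every request.
my $file = 'data.txt';    # placeholder path
open my $fh, '<', $file or die "Cannot open $file: $!";
my $content = do { local $/; <$fh> };
close $fh;

# Search based on the user's input (simplified).
my $query   = $ENV{QUERY_STRING} // '';
my @matches = grep { /\Q$query\E/ } split /\n/, $content;

# Emit the HTML result.
print "Content-Type: text/html\n\n";
print "<ul>", (map { "<li>$_</li>" } @matches), "</ul>\n";
```

The point is that every CGI request re-reads and re-parses the entire file.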
It works, functionally. However, I know that once I deploy it on a high-volume server, it will not be able to respond in time. I suspect memory is the bottleneck: every request slurps the whole file again. What is the best way to share that file, so that it is read into memory once when the server starts and never again?
The solution I am guessing at is a server daemon that loads the file into memory once and serves the data to the other processes/threads. If so, what is the best method to implement the IPC?
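For example, one IPC option I have considered is a Unix-domain socket. The following is only a rough sketch of that idea, with an assumed socket path and a made-up one-line-query protocol:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use IO::Socket::UNIX;

# Hypothetical daemon: slurp the file once at startup, then answer
# line-based queries over a Unix-domain socket. Path and protocol
# are assumptions for illustration.
my $sock_path = '/tmp/filecache.sock';
unlink $sock_path;

open my $fh, '<', 'data.txt' or die "Cannot open data.txt: $!";
my @lines = <$fh>;    # file is read into memory exactly once
close $fh;

my $server = IO::Socket::UNIX->new(
    Type   => SOCK_STREAM,
    Local  => $sock_path,
    Listen => 5,
) or die "Cannot create socket: $!";

while ( my $client = $server->accept ) {
    my $query = <$client>;    # one query per connection
    chomp $query if defined $query;
    print {$client} grep { /\Q$query\E/ } @lines;
    close $client;
}
```

Each CGI process would then connect to the socket and send its query instead of reading the file itself. Is this kind of design reasonable, or is there a better-established approach?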