I have this Python CGI script that checks whether it has been accessed too many times from the same IP, and if everything is OK, reads a big file (11 MB) from disk and returns it as a download.
It works, but performance sucks. The bottleneck seems to be reading this huge file over and over:
import os
import sys

def download_demo():
    """Returns the demo file"""
    # binary mode so Content-Length matches the bytes actually sent
    f = open(FILENAME, 'rb')  # 'file' shadows a builtin, so renamed
    buff = f.read()
    f.close()
    sys.stdout.write("Content-Type: application/x-download\n"
                     "Content-Disposition: attachment; filename=%s\n"
                     "Content-Length: %s\n\n"
                     % (os.path.split(FILENAME)[-1], len(buff)))
    sys.stdout.write(buff)
How can I make this faster? I thought of using a RAM disk to keep the file, but there must be a better solution. Would using mod_wsgi
instead of a CGI script help? Would I be able to keep the big file in Apache's memory space?
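What I had in mind for mod_wsgi is roughly this (a sketch only; the path is made up), where the file is read once at import time and then served from the daemon process's memory on every request, since mod_wsgi keeps the process alive between requests, unlike CGI:

import os

FILENAME = '/path/to/demo.bin'  # hypothetical path

# Read once when mod_wsgi loads the script; the buffer is then
# reused for every request instead of re-read from disk.
CACHED_FILE = open(FILENAME, 'rb').read()

def application(environ, start_response):
    start_response('200 OK', [
        ('Content-Type', 'application/x-download'),
        ('Content-Disposition',
         'attachment; filename=%s' % os.path.split(FILENAME)[-1]),
        ('Content-Length', str(len(CACHED_FILE))),
    ])
    return [CACHED_FILE]

Is that a sane way to do it, or is there something better?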
Any help is greatly appreciated.