Hi All,
About the Application
I am working on an e-commerce application in PHP. To keep URLs secure, product download links are kept behind PHP. There is a file, say download.php, which accepts a few parameters via GET and verifies them against a database. If all goes well, it serves the file using PHP's readfile() function.
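To make the setup concrete, here is a minimal sketch of the lookup step in download.php. The table and column names (downloads, product_id, token, file_path) are assumptions for illustration, not the real schema; an in-memory SQLite database stands in for the actual one.

```php
<?php
// Hypothetical lookup for download.php: verify the GET parameters
// against the database and return the file path, or null on failure.
// Schema and names are assumptions, not the application's real ones.
function lookupFilePath(PDO $pdo, $productId, $token)
{
    $stmt = $pdo->prepare(
        'SELECT file_path FROM downloads WHERE product_id = ? AND token = ?'
    );
    $stmt->execute(array((int) $productId, (string) $token));
    $path = $stmt->fetchColumn();
    return $path === false ? null : $path;
}

// In download.php itself the result would then be served roughly as:
// if ($path !== null && is_file($path)) {
//     header('Content-Type: application/octet-stream');
//     header('Content-Disposition: attachment; filename="' . basename($path) . '"');
//     header('Content-Length: ' . filesize($path));
//     readfile($path);
// }
```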
About the Problem
The problem arises when the file passed to readfile() is larger than the memory limit set in php.ini. As this application will be used by many users on shared hosting, we cannot rely on altering php.ini settings.
In our effort to find workarounds, I first thought we could use fread() calls in a while loop, but it seems that approach has problems of its own, as highlighted here: http://stackoverflow.com/questions/597159/sending-large-files-reliably-in-php
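For reference, the chunked fread() loop I have in mind would look something like the sketch below. This is only an outline, assuming output buffering is dealt with separately; the function name and chunk size are my own choices.

```php
<?php
// Fallback sketch: stream a file in fixed-size chunks so PHP never
// holds more than $chunkSize bytes in memory at once. In production,
// any active output buffers should be closed (ob_end_clean()) before
// calling this, or the buffer itself can grow to the full file size.
function streamFileChunked($path, $chunkSize = 8192)
{
    $handle = @fopen($path, 'rb');
    if ($handle === false) {
        return false; // file missing or unreadable
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunkSize);
        if ($buffer === false) {
            break;
        }
        echo $buffer;
        flush(); // push the chunk out to the client
    }
    fclose($handle);
    return true;
}
```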
So my best option seems to be to detect/check whether the server supports X-Accel-Redirect (in the case of Nginx) or X-Sendfile (in the case of Apache).
If the server supports X-Accel-Redirect / X-Sendfile, I can use them; otherwise, in the else block, I can make the system admin aware of the memory limit enforced by php.ini.
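A possible shape for that detection, written as a testable helper rather than a definitive implementation: it picks a header based on the server environment, and returns null when neither mechanism looks available. The /protected/ internal location prefix for Nginx is an assumption that would have to match the actual server configuration, and in real code the Apache module list would come from apache_get_modules() (which only exists when PHP runs as an Apache module).

```php
<?php
// Hypothetical helper: decide which hand-off header to send.
// Returns the full header line, or null if no server-side support
// is detected (the caller should then fall back or warn the admin).
// The '/protected/' nginx internal location is an assumed config detail.
function sendfileHeader($path, $serverSoftware, array $apacheModules = array())
{
    if (in_array('mod_xsendfile', $apacheModules)) {
        return 'X-Sendfile: ' . $path;                          // Apache + mod_xsendfile
    }
    if (stripos($serverSoftware, 'nginx') !== false) {
        return 'X-Accel-Redirect: /protected/' . basename($path); // Nginx
    }
    return null; // no server-side support detected
}
```

In download.php this would be called with $_SERVER['SERVER_SOFTWARE'] and, where available, apache_get_modules(), emitting the returned header via header() when it is non-null.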
Ideally, I want to use server-side support like X-Accel-Redirect / X-Sendfile wherever possible, and where that is not available, I would like to have fallback code that serves files without readfile().
I am not yet sure how readfile() and fread() in a while loop differ, but it seems the while loop will also cause problems, as suggested in http://stackoverflow.com/questions/597159/sending-large-files-reliably-in-php
Hope to get some help, suggestions, code, and guidance.
Thanks for reading.
-Rahul