After developing an awesome app on my local machine without any consideration of how it would perform on my host, I have run into a terrible issue. I am serving files (.pdf and .zip) through Rails' send_file so that I can log statistics. The only problem is that when two (or more) files are downloaded simultaneously, a new Ruby dispatch.fcgi process must be started to handle each one. I understand this could be avoided by using mod_xsendfile, but unfortunately my host doesn't support that Apache module.

So here's the weird part: the processes are being created as expected, but for some reason they never exit. As a test, I downloaded about 10 files simultaneously from a couple of different computers. About 10 processes were created, but none of them ever exited, even minutes after their invocation and long after the downloads had completed.
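A simplified sketch of the kind of action I'm describing (model and attribute names are illustrative, not my exact code):

    # downloads_controller.rb (illustrative)
    class DownloadsController < ApplicationController
      def show
        download = Download.find(params[:id])
        download.increment!(:hits)   # log the download statistic

        # send_file streams the .pdf/.zip through Rails, tying up this
        # dispatch.fcgi process for the entire transfer
        send_file download.path,
                  :type        => download.content_type,
                  :disposition => 'attachment'
      end
    end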

Why aren't these exiting? What can I do to avoid this problem other than switching to a real host that provides support for mod_xsendfile?

A: 

If you don't need access control on the files you're offering, you could always try placing the files somewhere under /public or some other URL outside of the Rails application.

When a user goes to download a file, the download link could point at a controller action that updates the download statistics and then redirects the user's browser to the path where the file is actually stored, using a meta refresh tag or a bit of JavaScript. This way, Apache handles the file transfer without Rails ever touching it... essentially what X-Sendfile would do.
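A minimal sketch of that idea, assuming a Download model with a hit counter and copies of the files kept under public/downloads (shown here with a plain HTTP redirect, though a meta refresh or JavaScript redirect as described above works the same way):

    # downloads_controller.rb (illustrative)
    class DownloadsController < ApplicationController
      def show
        download = Download.find(params[:id])
        download.increment!(:hits)   # record the statistic first

        # Hand the actual transfer back to Apache by sending the browser
        # to the static copy under /public, so no Rails process is held
        # open for the duration of the download
        redirect_to "/downloads/#{download.filename}"
      end
    end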

On the other hand, switching to another host is probably worth looking into if this is anything more than a toy project... FastCGI is a pretty antiquated way to be serving a Rails app at this point.

Cratchitimo