I'm wondering what is the best pattern to allow large files to be uploaded to a server using Ruby.

I've found "Rails and Large, Large file Uploads: Looking at the alternative", but it doesn't give any concrete solutions.

I don't want to use Rails since I'm working on a simple upload server that'll run in standalone mode. I'm guessing that Sinatra could be the key, but I don't know which web server I should use to run it without hitting a timeout.

I also need this web server to allow simultaneous uploads.

UPDATE: By "large files" I mean between 200 MB and 5 GB.

UPDATE2: Since these files are videos (in my case), I can live with a 2 GB maximum, like YouTube.
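For the Sinatra route itself, a minimal sketch is possible: Rack already spools a large multipart body to a `Tempfile`, so the handler only needs to copy it to its final location in chunks instead of reading it into memory. The helper name, field name, and paths below are illustrative, not a definitive implementation:

```ruby
require 'fileutils'

CHUNK_SIZE = 1024 * 1024 # copy 1 MiB at a time

# Stream an uploaded tempfile to dest_path without ever holding the
# whole file in memory; returns the number of bytes written.
def store_upload(tempfile, dest_path)
  FileUtils.mkdir_p(File.dirname(dest_path))
  bytes = 0
  File.open(dest_path, 'wb') do |out|
    while (chunk = tempfile.read(CHUNK_SIZE))
      bytes += out.write(chunk)
    end
  end
  bytes
end
```

In a Sinatra app this would be called from the route as something like `store_upload(params[:file][:tempfile], File.join(upload_dir, params[:file][:filename]))`, assuming the form field is named `file`; the timeout question is then a property of the Rack server in front of it rather than of this code.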

+1  A: 

OK, I'm taking a bit of a stretch here, but: if you use CouchDB as the target for your uploads, you get rid of the timeout problem. Consider CouchDB as some "temp" storage in this example: when an upload finishes, you can take the file out of CouchDB and do whatever you want with it. I've managed to upload files as big as 9 GB over a DSL line into CouchDB without any drama. It may take a bit of reading, but I think you could make it work.

CouchDB has many Rails gems, so it plays nice with others ;)

Let me know if you want to go down that rabbit hole so I can give you some more pointers.

elmac
I really like the idea. I'm giving it a try today :)
garno
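The CouchDB idea above boils down to streaming the file into a document attachment with a single HTTP PUT. A hedged sketch using only the standard library; the database name, document id, and revision in the URL are assumptions:

```ruby
require 'net/http'
require 'uri'

# Build a PUT request that streams io to a CouchDB attachment URL of the
# form http://host:5984/<db>/<doc_id>/<filename>?rev=<rev>, without
# loading the file into memory.
def couch_attachment_request(uri, io, content_type)
  req = Net::HTTP::Put.new(uri)
  req['Content-Type'] = content_type
  req['Content-Length'] = io.size.to_s # length must be known up front
  req.body_stream = io                 # Net::HTTP streams this in chunks
  req
end
```

Sending it would then look like `Net::HTTP.start(uri.hostname, uri.port) { |http| http.request(couch_attachment_request(uri, File.open(path, 'rb'), 'video/mp4')) }` against a running CouchDB instance.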
A: 

Passenger recommends using a separate Apache/nginx module to handle uploads.

rogerdpack
Do you have a clue how to implement this? I can't find any docs about nginx + Passenger handling large files.
garno
Hmm. It looks like there are options for Apache but not nginx? Maybe it's just a generic nginx directive for where to allow uploads... http://blog.schuerrer.org/post/571570048/file-upload-progress-done-right might help.
rogerdpack
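For what the nginx side of this could look like, here is a hedged sketch using the third-party `nginx-upload-module`, which writes the body to disk itself and only hands the file path to the backend app. The store path, backend location, and size limit are assumptions:

```nginx
location /upload {
    # Accept bodies up to the 2 GB cap mentioned in the question.
    client_max_body_size 2g;

    # The module saves the upload here, off the Ruby process's back.
    upload_store /var/tmp/nginx_uploads;

    # Tell the backend where the stored file landed.
    upload_set_form_field $upload_field_name.path "$upload_tmp_path";

    # Forward the (now tiny) request to the app once the upload is done.
    upload_pass /internal_done;
}
```

The point of the design is that the Ruby app never sees the multi-gigabyte body, so its request timeout no longer matters for the upload itself.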
A: 

> Passenger recommends using a separate Apache/nginx module to handle uploads.

@rogerdpack, can you point to some docs?

And how is that configured?

deepak