views: 406
answers: 5

I have a Rails app that accepts file uploads, and I want to know the best way to set up common storage between servers. Since we have a number of Windows applications, we have used Samba in the past, but as we build pure Linux apps I would like to do this the best possible way.

We are expecting large amounts of data, so would need to scale this across multiple file servers.

+1  A: 

One easy way to do it is to use attachment_fu with an S3 backend.
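A minimal sketch of what that looks like, assuming a hypothetical `UserFile` model: attachment_fu's `has_attachment` takes a `:storage => :s3` option, and the plugin reads the bucket name and AWS credentials from `config/amazon_s3.yml`.

```ruby
# Sketch: attachment_fu model with an S3 backend. Model name and limits
# are illustrative assumptions; credentials live in config/amazon_s3.yml.
class UserFile < ActiveRecord::Base
  has_attachment :storage      => :s3,
                 :max_size     => 100.megabytes,
                 :content_type => :all

  # Adds the standard size/content-type validations for the attachment.
  validates_as_attachment
end
```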

Dustin
+4  A: 

I've used paperclip with an S3 backend.
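As a sketch (the `Upload` model, attachment name, and bucket are assumptions), Paperclip's `has_attached_file` accepts `:storage => :s3` plus a credentials file; the table needs the usual `*_file_name`, `*_content_type`, and `*_file_size` columns.

```ruby
# Sketch: Paperclip attachment stored on S3. Names below are
# illustrative; :s3_credentials points at a YAML file with your keys.
class Upload < ActiveRecord::Base
  has_attached_file :data,
                    :storage        => :s3,
                    :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
                    :bucket         => "my-app-uploads",  # hypothetical bucket name
                    :path           => ":attachment/:id/:style/:filename"
end
```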

Mike Breen
+2  A: 

If you want to keep all the data in-house, then a networked file system might be the way to go. Try setting up AFS; it scales pretty well.

Honza
AFS isn't better than any other filesystem when you need read-write access everywhere. You can't have read/write replicas, and releasing a volume after every write would be quite expensive.
Dustin
+2  A: 

Another good alternative is MogileFS, from the creators of memcached:

http://www.danga.com/mogilefs/

TonyLa
A: 

People sometimes say "TMI", meaning "too much information". I wonder if anyone ever says the opposite: TLI?

You don't say what criteria you would apply to decide what is "best": fastest in operation or to develop, cheapest to build or to run, simplest in concept, most scalable, or something else.

Neither do you define "large amounts of data". Mega-, giga- or terabytes? Thousands, millions or billions of files? For how long do you need/plan to keep them?

You also say "expecting". What do you expect on live date? In a month? Six months? A year? With what certainty?

All these questions are intended to help you define your requirement so that you can evaluate options.

With no real information, I'd say build the simplest thing that could possibly work and improve it as necessary when you need it.

The simplest cross-server option I can think of is the database - it's really easy to store files in the DB using Rails. I asked a related question here that has some sample code.
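For illustration, a sketch of the DB approach under assumed names (`StoredFile` model, `stored_files` table): keep the bytes in a binary column and write the uploaded tempfile into it from the controller.

```ruby
# Sketch: store uploaded files as BLOBs. Table and column names are
# illustrative assumptions, not from the original question.
class CreateStoredFiles < ActiveRecord::Migration
  def self.up
    create_table :stored_files do |t|
      t.string :filename
      t.string :content_type
      t.binary :data        # the file bytes themselves
    end
  end

  def self.down
    drop_table :stored_files
  end
end

# In the controller, read the upload into a record:
#   upload = params[:upload]
#   StoredFile.create!(:filename     => upload.original_filename,
#                      :content_type => upload.content_type,
#                      :data         => upload.read)
```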

Mike Woodhouse