I've built a website system that caters to a small niche market. Over the last year I've been selling these websites by deploying a copy of the software to my web server with Capistrano.
It occurs to me that the only differences between these sites are the database, the CSS file, and a small set of images for each client's graphic design.
Everything else is exactly the same, or should be... Now that I have about 20 of these sites deployed, it is getting to be a hassle to keep them all updated with the same code. And this problem will only get worse.
I am thinking that I should refactor this system so that one deployed set of Ruby code serves every site, dynamically selecting the correct database, stylesheet, and images based on the hostname of the incoming request.
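Roughly, what I have in mind is a before_filter that switches the connection per request. This is just an untested sketch, and config/clients.yml is my own invention (a file mapping each client's hostname to ActiveRecord connection params):

```ruby
# app/controllers/application_controller.rb -- rough sketch, untested
class ApplicationController < ActionController::Base
  before_filter :switch_to_client_database

  private

  def switch_to_client_database
    # Hypothetical config/clients.yml: hostname => connection params
    configs = YAML.load_file('config/clients.yml')
    client  = configs[request.host] or raise "No client for #{request.host}"
    # Point the whole app at this client's database for the request
    ActiveRecord::Base.establish_connection(client)
  end
end
```

One thing I'm unsure about: establish_connection repoints the whole process at that database, which seems fine for one-request-at-a-time Mongrel or Passenger workers, but presumably unsafe under a threaded server.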
It seems that there are two ways of handling the database:
- using multiple databases, one for each client
- using one database, with a client_id field in each table, and an extra 'client' table
The multiple-database approach would be the simplest for me at the moment, since I wouldn't have to refactor every model in my application to scope all CRUD operations by client_id.
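That said, maybe the refactoring wouldn't mean touching every query. If each model belongs_to a client, going through the association scopes the SQL automatically. A hypothetical sketch (the models and the 'domain' column are placeholders):

```ruby
class Client < ActiveRecord::Base
  has_many :items
end

class Item < ActiveRecord::Base
  belongs_to :client
end

# Look the client up once per request, then go through the association:
client = Client.find_by_domain(request.host)  # assumes a 'domain' column on clients
client.items.find(params[:id])      # SELECT ... WHERE client_id = <client.id>
client.items.create(params[:item])  # INSERT with client_id filled in automatically
```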
However, it would be a hassle to run 'rake db:migrate' against tens or hundreds of databases every time I want to migrate. Obviously this could be scripted, but it doesn't smell very good.
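If I did go that route, I imagine the script would be a rake task that loops over the per-client entries in database.yml, something like this (the client_ naming convention is invented, and I haven't tested it):

```ruby
# lib/tasks/migrate_all.rake -- sketch, untested
namespace :db do
  desc 'Run pending migrations against every client database'
  task :migrate_all => :environment do
    # Assumes database.yml has one entry per client, named client_<name>
    ActiveRecord::Base.configurations.each do |name, config|
      next unless name.to_s =~ /^client_/
      puts "Migrating #{name}..."
      ActiveRecord::Base.establish_connection(config)
      ActiveRecord::Migrator.migrate('db/migrate')
    end
  end
end
```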
On the other hand, every client will have 20K-50K rows in an 'items' table. I am worried about the speed of fulltext searches when a shared items table holds half a million or a million rows. Even with an index on the client_id field, I suspect that searches would be faster if the items were separated into per-client databases.
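In the shared-table case I would at least add a composite index, though I'm not sure how much it would help a fulltext search; as far as I understand, MySQL's MATCH ... AGAINST uses only the FULLTEXT index and filters on client_id afterwards. A sketch of the migration (MySQL assumed, column names invented):

```ruby
class AddClientIndexesToItems < ActiveRecord::Migration
  def self.up
    add_index :items, [:client_id, :created_at]
    # add_index can't create fulltext indexes, so fall back to raw SQL
    execute 'CREATE FULLTEXT INDEX index_items_fulltext ON items (title, description)'
  end

  def self.down
    remove_index :items, :column => [:client_id, :created_at]
    execute 'DROP INDEX index_items_fulltext ON items'
  end
end
```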
If anyone has an informed opinion on the best way to approach this problem, I would very much like to hear it. Thanks much in advance...
-- John