I am looking at an application where I need to build out the user friend model as a graph structure. I need to go several degrees deep, so standard SQL in MySQL will not work because of the circular references. I have looked at the graph algorithms available, and they involve loading the entire record set into a Graph object and then doing operations on it. I can't afford to do that for every operation.
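For reference, here is a minimal sketch of the kind of traversal involved, assuming the friend graph is already held in memory as an adjacency list (a Hash mapping a user id to an Array of friend ids); the names are illustrative, not from the question:

```ruby
# Breadth-first walk of an in-memory friend graph, up to max_degree hops out.
# `graph` is assumed to be a Hash of user_id => Array of friend ids.
def friends_within(graph, user_id, max_degree)
  seen  = { user_id => 0 }   # id => degree at which it was reached
  queue = [user_id]
  until queue.empty?
    current = queue.shift
    degree  = seen[current]
    next if degree == max_degree       # don't expand past the requested depth
    (graph[current] || []).each do |friend_id|
      next if seen.key?(friend_id)     # skip visited nodes, which handles cycles
      seen[friend_id] = degree + 1
      queue << friend_id
    end
  end
  seen.keys - [user_id]
end
```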

I would like to store the Graph object as a global object in memory and just make calls and updates to it. However, since Rails scales by creating separate processes, and a single Rails process only handles a few simultaneous users, I am going to have an almost immediate synchronization problem.

Does anyone know a way to store an object in memory in Rails and keep it in sync, both between requests and across the multiple mongrel/whatever processes?

At this point I am looking at a Java service for the graph operations, since it scales using a thread model instead of a process model. That would let me scale far enough that I won't have to deal with the issue for a while.

I would prefer an all-Rails solution, though, because it will be easier to build and maintain.

A: 

It sounds like you may need a distributed hash table, or maybe something like CouchDB as an alternative to an RDBMS.

Daniel Auger
A: 

One option would be to build a small Rack application that you load the data into. The logic for querying and computing over your data would live in that Rack app. You can then use normal HTTP calls to localhost to transmit whatever you need (generated HTML? something else?) from the Rack app to your Rails app.
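A hedged sketch of that idea, assuming the graph fits in memory; the endpoint paths and file name are made up for illustration. One Rack process owns the graph, and the Rails processes query it over HTTP:

```ruby
# graph_service.ru -- a tiny Rack app that owns the friend graph in memory.
# Run it as its own process: `rackup graph_service.ru -p 9292`
require 'rack'
require 'json'

class GraphService
  def initialize
    # In a real app this would be loaded from the database at boot.
    @graph = Hash.new { |h, k| h[k] = [] }
  end

  def call(env)
    req = Rack::Request.new(env)
    case [req.request_method, req.path_info]
    when ['GET', '/friends']
      # Traversal logic (e.g. the breadth-first walk sketched earlier)
      # would live here; this just returns direct friends.
      ids = @graph[req.params['id'].to_i]
      [200, { 'Content-Type' => 'application/json' }, [ids.to_json]]
    when ['POST', '/link']
      a, b = req.params['a'].to_i, req.params['b'].to_i
      @graph[a] << b
      @graph[b] << a
      [200, { 'Content-Type' => 'text/plain' }, ['ok']]
    else
      [404, { 'Content-Type' => 'text/plain' }, ['not found']]
    end
  end
end

run GraphService.new
```

The Rails app would then talk to it with plain HTTP, e.g. `Net::HTTP.get(URI('http://localhost:9292/friends?id=1'))`. Since this single process owns the graph, there is nothing to keep in sync between the mongrel processes.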

This is basically a workaround for having multiple processes in your Rails application. I'm sure there are better solutions out there, such as memcached, NoSQL databases, and so on.
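If it is the computed results you need to share rather than the live graph itself, a minimal sketch of the memcached route, assuming `config.cache_store = :mem_cache_store` is configured and a hypothetical `friend_ids_within` method does the actual traversal:

```ruby
class Person < ActiveRecord::Base
  # Every Rails process reads and writes the same memcached entry,
  # so the computed result stays consistent across the mongrels.
  def cached_friend_ids_within(degree)
    Rails.cache.fetch("person/#{id}/friends/#{degree}", :expires_in => 10.minutes) do
      friend_ids_within(degree)  # hypothetical method doing the graph traversal
    end
  end
end
```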

August Lilleaas