I have a Rails app using MongoDB through MongoMapper and all is well. The problem is that I'm going to want to use Erlang to do some background processing, and I want to update the same Mongo/MongoMapper models with the results of that processing. What's the best way to share model definitions between the two apps (Rails and Erlang) and remain sane? It seems like it would be problematic to manage two separate sets of definitions when both apps are accessing the same records. If there's no "good" way, am I simply approaching this the wrong way?
Additional info: I was originally doing the background processing with Starling/Workling clients. That was nice since I could reuse all of the same Rails models and code, but performance was terrible, so I'm looking to Erlang to solve some of the efficiency problems. It's a large amount of data being processed, but it parallelizes easily.
I'm trying to avoid making Erlang the sole mediator between MongoDB and Rails through a REST or Thrift interface.
Edit: I wanted to shed a little more light on this. The Erlang processing will need to know a bit about the models beforehand. I'm basically using Erlang to pull data from other places and fill in details of the model objects. For example, there may be a description field that I'm scraping from an XML file using xmerl_xpath:string("//description/text()"). I need to be able to add methods like this that act upon fields of the model.
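For concreteness, here's roughly what that description scrape looks like on the Erlang side using stock xmerl (the module name and the XML shape are just placeholders for illustration; the real feeds are messier):

    -module(description_scraper).  %% hypothetical module name, for illustration only
    -include_lib("xmerl/include/xmerl.hrl").
    -export([description_from_xml/1]).

    %% Pull the text of the first <description> element out of an XML string,
    %% so it can be written back into the model's description field.
    description_from_xml(XmlString) ->
        {Doc, _Rest} = xmerl_scan:string(XmlString),
        case xmerl_xpath:string("//description/text()", Doc) of
            [#xmlText{value = Description} | _] -> {ok, Description};
            []                                  -> not_found
        end.

The question is where code like this should live so the field names it writes to stay in sync with the MongoMapper model definitions on the Rails side.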