With Django multidb, it's fairly easy to write a router for a master/slave setup. But is it possible to write a router that writes to multiple databases? My use case is a collection of projects, all running on the same domain. To save users from registering and logging in on every site, I'd like to synchronize the `contrib.auth` and `contrib.sessions` tables. Is that possible with Django multidb, or should I look into the replication features of the database system (MySQL in my case)?
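For context, the master/slave routing mentioned above can be sketched as a plain router class, following the pattern from the Django multi-database documentation (the aliases `default` and `replica` are assumptions standing in for entries in `settings.DATABASES`). Note that `db_for_write` returns a single alias, which is exactly why routing alone cannot fan one write out to several databases:

```python
class PrimaryReplicaRouter:
    """Sketch of a master/slave router: reads go to a replica,
    writes go to the primary. The aliases 'default' and 'replica'
    are placeholders for entries in settings.DATABASES."""

    def db_for_read(self, model, **hints):
        # All reads are served from the replica.
        return 'replica'

    def db_for_write(self, model, **hints):
        # A router may return only ONE alias for writes, so it
        # cannot duplicate a write to multiple databases.
        return 'default'

    def allow_relation(self, obj1, obj2, **hints):
        # Allow relations between objects from either database.
        return True
```
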
A:
I think you would be better off implementing an SSO or OAuth service. But if you want to synchronize your users table between two databases, and you are using your own user model, you can do something like this:
class MyUser(models.Model):
    name = models.CharField(max_length=100)
    user = models.ForeignKey(User, unique=True)

    def save(self, *args, **kwargs):
        # write the row to both databases
        super(MyUser, self).save(using='database_1')
        super(MyUser, self).save(using='database_2')
You can also wrap this up in a class decorator, so you can reuse it to synchronize other tables:
def save_multi_db(model_class):
    def save_wrapper(save_func):
        def new_save(self, *args, **kws):
            # bypass the patched save() and write to both databases
            super(model_class, self).save(using='database_1')
            super(model_class, self).save(using='database_2')
        return new_save
    func = getattr(model_class, 'save')
    setattr(model_class, 'save', save_wrapper(func))
    return model_class
# and use it like this:
@save_multi_db
class MyUser(models.Model):
....
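As a sanity check of the decorator wiring, here is a minimal standalone sketch that exercises it without Django: a stub base class stands in for `models.Model` and records the `using` argument instead of hitting real databases (the class names and aliases here are purely illustrative):

```python
# Stub standing in for django.db.models.Model, so the decorator
# wiring can be exercised without a configured database.
class FakeModel:
    writes = []

    def save(self, *args, using=None, **kwargs):
        FakeModel.writes.append(using)


def save_multi_db(model_class):
    def save_wrapper(save_func):
        def new_save(self, *args, **kws):
            # bypass the patched save() and write to both databases
            super(model_class, self).save(using='database_1')
            super(model_class, self).save(using='database_2')
        return new_save
    setattr(model_class, 'save', save_wrapper(getattr(model_class, 'save')))
    return model_class


@save_multi_db
class DemoUser(FakeModel):
    pass


DemoUser().save()
# FakeModel.writes is now ['database_1', 'database_2']
```
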
Hope this will help :)
singularity
2010-10-28 09:57:47
I would have to change Django's source code for this to work (as I said in the question, the tables I want to replicate are from `contrib.auth` and `contrib.sessions`). If at all possible, I'd like to avoid messing with Django itself.
piquadrat
2010-10-28 10:12:18