Assume we have a usual M-M (many-to-many) relationship between two tables, like:
users ---< user_tags >--- tags.
In this post I'm only concerned with the relation between user_tags and tags: I'd like to prevent tags that are still referenced from being deleted. Only tags which aren't referenced should be destroyable.
The stupid way to do this would be:
class Tag < ActiveRecord::Base
  has_many :user_tags

  # Refuse to destroy a tag that is still referenced from user_tags.
  def before_destroy
    raise "error" unless user_tags.empty?
  end
end
But I think there's a potential race condition between the check (user_tags.empty?) and the actual DELETE: another transaction could insert a new user_tags row after the check has passed but before the tag is deleted.
The second approach could be to lock the whole user_tags table before checking if there are any references left.
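For completeness, here is a minimal sketch of that table-lock variant (the method name destroy_if_unreferenced is made up, and the raw SQL assumes MySQL); note Edit 1 below, though, which explains why LOCK TABLES mixes badly with InnoDB transactions:

class Tag < ActiveRecord::Base
  has_many :user_tags

  def destroy_if_unreferenced
    conn = self.class.connection
    # MySQL requires every table touched while LOCK TABLES is active
    # to be part of the lock, hence tags is locked as well.
    conn.execute("LOCK TABLES user_tags WRITE, tags WRITE")
    begin
      raise "tag is still referenced" unless user_tags.empty?
      destroy
    ensure
      conn.execute("UNLOCK TABLES")
    end
  end
end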
And the third way I can think of would involve changes to the code that creates the actual reference.
Adding a reference into user_tags (a sketch follows this list):
- Fetch the Tag
- Lock it (to avoid concurrent destruction)
- Create the reference in user_tags
- Commit
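A minimal Rails sketch of that sequence (the User model, the add_tag method and the use of a FOR UPDATE row lock via the Rails 3 query interface are my assumptions, not part of the original code):

class User < ActiveRecord::Base
  has_many :user_tags

  def add_tag(tag_id)
    Tag.transaction do
      # Row-lock the tag (SELECT ... FOR UPDATE) so a concurrent destroy
      # has to wait until this transaction commits.
      tag = Tag.lock.find(tag_id)
      user_tags.create!(:tag_id => tag.id)
    end # commit
  end
end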
The before_destroy handler could then (sketched after this list):
- self.lock!
- Check if there are any references
- destroy self
- Commit
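Sketched out (again with a made-up method name; ActiveRecord already wraps destroy and its callbacks in a transaction, the explicit block just makes the commit point visible):

class Tag < ActiveRecord::Base
  has_many :user_tags

  def destroy_if_unreferenced
    transaction do
      lock!   # re-read this row with SELECT ... FOR UPDATE
      raise "tag is still referenced" unless user_tags.empty?
      destroy # the transaction commits at the end of the block
    end
  end
end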
Are there any better ways to do this? Which one is the most reliable? I personally lean towards the second one, as it only needs logic in the before_destroy handler, but at the cost of locking the whole table.
Edit 1:
While experimenting with LOCK TABLES I realized that it works against my transactions. With InnoDB you can either use transactions (and their locking features) or use LOCK/UNLOCK TABLES; mixing both worlds makes matters much, much worse (LOCK/UNLOCK TABLES causes implicit commits, a warning I had missed in the docs). But that's just for the record.
Edit 2 (a few weeks later): I fought with this issue again, so I want to stress it once more: do not use LOCK TABLES.
Right now I'm tending towards using a SHARE LOCK on the parent object (the tag in the example) when adding children, and a FOR UPDATE lock for deletions. But I'm still wondering whether that's the way it's meant to be done (locking a range in the child table for an update in the parent table).
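In ActiveRecord terms that could look roughly like this (tag_id and user are placeholders, and the MySQL-specific lock clause is passed straight through to the generated SELECT):

# Adding a child: a shared lock keeps the tag from being deleted
# concurrently, but does not block other sessions adding references.
Tag.transaction do
  tag = Tag.lock("LOCK IN SHARE MODE").find(tag_id)
  user.user_tags.create!(:tag_id => tag.id)
end

# Deleting the tag: take the exclusive lock, then re-check the references.
Tag.transaction do
  tag = Tag.lock.find(tag_id) # SELECT ... FOR UPDATE
  raise "tag is still referenced" unless tag.user_tags.empty?
  tag.destroy
end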
Btw., I also realise that this question has become completely independent of Rails :).