I'm implementing the first option discussed in "Marking for deletion in Django", that is, when an object is no longer active, I set a boolean to mark it as inactive.

The specific reason I'm using this method is that although the object is no longer in active use, it may still be referenced and displayed in various records and reporting outputs. I don't want Django's cascading delete to remove those old records.

How should I go about enforcing uniqueness of active objects?

Initially, I thought I should use unique_together to enforce my constraints at the database level. This works fine until I "delete" an object: because the inactive row still exists, adding a new active object with the same name violates the uniqueness constraint. I could simply reflag the old object as active, but I actually want a new object.

I'm looking for a way to say something like "unique together when active = True". I could enforce this in the model creation code, but enforcing it at the database level seems like a better idea.
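For what it's worth, here is a minimal sketch of the model-level enforcement mentioned above (the Widget model and its fields are invented for illustration). Because the check runs in Python rather than in the database, two concurrent saves can still race past it, which is exactly why a database-level constraint would be preferable:

from django.db import models

class Widget(models.Model):
    name = models.CharField(max_length=100)
    active = models.BooleanField(default=True)

    def save(self, *args, **kwargs):
        # Only active objects compete for the name; inactive
        # ("deleted") rows are ignored, so deactivating a Widget
        # frees its name for reuse.
        if self.active:
            clash = Widget.objects.filter(name=self.name, active=True)
            if self.pk:
                clash = clash.exclude(pk=self.pk)
            if clash.exists():
                raise ValueError(
                    'An active Widget named %r already exists.' % self.name)
        super(Widget, self).save(*args, **kwargs)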

Any advice about which of these is the best approach? Any better suggestions?

Note: django-reversion is cool, but totally does not work for my application since I DO need to access "deleted" objects from time to time.

+1  A: 

You could have a unique constraint:

class Meta:
    unique_together = (('name', 'active'),)

However, that means you can only have one active and one inactive object with the same name.

If you make active a NullBooleanField, then inactive objects can store NULL for active, and (IIRC) you can have a limitless number of inactive objects with the same name. PostgreSQL, at least, treats NULL values in a unique constraint as distinct, so rows containing NULL never violate it.
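A minimal sketch of that approach (the Widget model is invented for illustration); the convention is to store True for the one active object and NULL, never False, for inactive ones:

from django.db import models

class Widget(models.Model):
    name = models.CharField(max_length=100)
    # True while active; set to None (NULL) to deactivate.
    # NULL values never compare equal, so any number of
    # inactive rows may share the same name.
    active = models.NullBooleanField(default=True)

    class Meta:
        unique_together = (('name', 'active'),)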

Matthew Schinckel
The PostgreSQL docs say "...two null values are not considered equal in this comparison. That means even in the presence of a unique constraint it is possible to store duplicate rows that contain a null value in at least one of the constrained columns." http://www.postgresql.org/docs/8.2/static/ddl-constraints.html
Paul McMillan
+1  A: 

I think I understand where you're coming from when you say that uniqueness should stay at the database level. Ideally, the design would not depend on application code knowing how the database enforces unique_together; the constraint would simply be a permanent rule of that table.

Working from that idea, what if the distinction between active and inactive happened at the "table level" rather than through some fancy model hacking?

Consider the following:

from django.db import models


class BaseModel(models.Model):
    # all of your fields here
    active = models.BooleanField()


class ActiveModel(BaseModel):
    class Meta:
        unique_together = ('whatever', 'fields')

    def make_inactive(self):
        # create a new InactiveModel instance from this
        # instance's values, then delete this instance
        pass


class InactiveModel(BaseModel):
    def make_active(self):
        # create a new ActiveModel instance from this
        # instance's values, then delete this instance
        pass

In this way, whenever you create a new object, you do so via ActiveModel. The unique_together constraint is then enforced for active objects only. To mark an object inactive, you call model.make_inactive() instead of setting model.active = False, as sketched below. You can still query BaseModel to reach both active and inactive objects.
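As a rough illustration, make_inactive could be implemented along these lines (a hypothetical sketch; copying fields across the multi-table inheritance boundary may need adjusting for your real fields):

def make_inactive(self):
    # Copy every concrete BaseModel field except the pk, create
    # the InactiveModel counterpart, then delete this instance.
    values = dict(
        (f.name, getattr(self, f.name))
        for f in BaseModel._meta.fields
        if f.name != BaseModel._meta.pk.name
    )
    values['active'] = False
    replacement = InactiveModel.objects.create(**values)
    self.delete()
    return replacement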

T. Stone
The problem with this approach is the related values. If I have a ForeignKey to BaseModel, deleting an ActiveModel object and re-creating it as an InactiveModel will break those links unless I do extensive "fixing" all over the place.
Paul McMillan
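To make that concern concrete (the Report model is invented), consider a reporting row that references the base table:

class Report(models.Model):
    # Reporting rows reference the base table so they can
    # display both active and inactive objects.
    item = models.ForeignKey(BaseModel)

# make_inactive() deletes the old ActiveModel row together with
# its BaseModel parent row, and Django's cascading delete then
# removes every Report that points at it. Each such reference
# would have to be re-created against the new InactiveModel
# row's pk by hand.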