Say I have Object A, which has a member of type Object B, and Object B has a member of type Object C.
Object A VERY RARELY changes but is read very frequently, whereas Object C changes frequently. It makes sense to cache Object A, but when it's serialized to go into the cache, the whole graph obviously gets serialized with it.
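For reference, the shape I'm describing is roughly this (class and property names are just for illustration):

```
using System;

[Serializable]
public class ObjectA
{
    public string Name { get; set; }    // rarely changes, read very frequently
    public string Status { get; set; }  // likewise
    public ObjectB B { get; set; }      // serialized into the cache along with A
}

[Serializable]
public class ObjectB
{
    public ObjectC C { get; set; }      // lazy-loaded from the database on first access
}

[Serializable]
public class ObjectC
{
    public string Status { get; set; }  // changes frequently: "ACTIVE", "SUSPEND", ...
}
```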
Scenario: Object A is read from the database and populated. The member of type Object C is accessed and lazy-loaded. For the sake of example, C.Status is referenced in business logic and has the value ACTIVE. I change A.Name to "New Name" and commit to the database and my cache.
Object C (from somewhere else in the code) then has its status changed to SUSPEND.
- At this point, should I invalidate the cached copy of Object A?
If I do:
if (A.B.C.Status == ACTIVE) SendLotsOfMoney(A);
This check will still pass, because the whole graph was serialized into the cache, so the cached copy of C still says ACTIVE.
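To spell out the sequence (the cache.Set/Get and Serialize/Deserialize calls below are placeholders, not any particular API):

```
// 1. A is read from the DB; touching A.B.C lazy-loads C, whose Status is ACTIVE.
// 2. The whole graph (A -> B -> C) is serialized and written to the cache.
cache.Set("A:42", Serialize(a));

// 3. Elsewhere in the code, C.Status is changed to SUSPEND and committed to the DB.

// 4. Later, A is pulled back out of the cache. The deserialized graph still carries
//    the old snapshot of C, so the check passes and the money goes out anyway:
var cachedA = Deserialize<ObjectA>(cache.Get("A:42"));
if (cachedA.B.C.Status == ACTIVE) SendLotsOfMoney(cachedA);
```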
If C.Status changes frequently, I don't want to keep invalidating the cache of A, because I might be frequently referring to A.Name, A.Status, etc., and I don't want to keep hitting the database.
I guess my options are:
1) Have a flag on the object to say that it came from the cache, and if so, force a reload of all of its dependencies the first time they are referenced (lazy load them again). These might come from a cache anyway, but there is still a lot of pointlessly stored data in my cache.
2) Keep invalidating the cache. Obviously if I have A.B.C.D.E.F.G.Status, A.H.I.J.K.Status, etc., then aren't I going to be recreating A all the time?
3) Implement OnDeserialization and do what is in (1).
4) Hook into serialization (an OnSerializing callback) and set all the volatile references to null, so they are lazy-loaded on next access rather than stored (rough sketch after this list).
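Revisiting the ObjectB sketch, this is roughly what I mean by (3) and (4), assuming a serializer that honours [Serializable]/[NonSerialized] and the System.Runtime.Serialization callback attributes (BinaryFormatter-style); LoadC() stands in for whatever the existing lazy load does:

```
using System;
using System.Runtime.Serialization;

[Serializable]
public class ObjectB
{
    // Option 4, declarative flavour: never let C into the cached payload at all.
    [NonSerialized]
    private ObjectC c;

    public ObjectC C
    {
        // After deserialization 'c' is null, so the first access lazy-loads it again.
        get { return c ?? (c = LoadC()); }
    }

    // Option 4 as written: null the volatile reference just before serialization.
    // (Redundant once the field is [NonSerialized], and note it mutates the live
    // instance being serialized unless a clone is serialized instead.)
    [OnSerializing]
    private void DropVolatileReferences(StreamingContext context)
    {
        c = null;
    }

    // Option 3: after coming out of the cache, make sure the volatile parts are
    // re-fetched on first access instead of trusting the cached snapshot.
    [OnDeserialized]
    private void ForceReload(StreamingContext context)
    {
        c = null;
    }

    private ObjectC LoadC()
    {
        // Hypothetical data-access call.
        throw new NotImplementedException();
    }
}
```

The trade-off with (3)/(4) is that every cache hit on A turns the first touch of A.B.C into another load, but that still looks better than re-caching the whole of A every time C changes.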
I'm interested to hear the responses. I'm leaning towards (4).
Regards!