I have a list of entities which I want to store in memcache. The problem is that these entities reference large Models via ReferenceProperty, and the referenced Models are automatically loaded and stored in memcache along with them. As a result I'm exceeding the size limit for objects stored in memcache.

Is there any possibility to prevent the ReferenceProperties from loading the referenced Models while putting them in memcache?

I tried something like

def __getstate__(self):
    odict = self.__dict__.copy()
    odict['model'] = None
    return odict

in the class I want to store in memcache, but that doesn't seem to do the trick.

Any suggestions would be highly appreciated.

Edit: I verified with a logging statement that the __getstate__ method is executed.
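One thing worth checking: __getstate__ operates on whatever keys actually live in the instance's __dict__, and the cached, resolved entity may be stored under a name that differs from the public property name (App Engine's ReferenceProperty caches the fetched entity in a private instance attribute). A minimal, framework-free sketch of the pattern, with illustrative attribute names (the '_cached_model' name is hypothetical; print self.__dict__.keys() on your real instance to find the actual key):

```python
import pickle

class Item(object):
    """Stand-in for an entity that lazily caches a heavy referenced object."""
    def __init__(self, model_id):
        self.model_id = model_id
        # The cached heavy object may live under a private name that differs
        # from the property name ('_cached_model' is a hypothetical example).
        self._cached_model = ['lots', 'of', 'data']

    def __getstate__(self):
        # Copy the instance dict and drop the cached heavy object so it is
        # not pickled into memcache.
        odict = self.__dict__.copy()
        odict.pop('_cached_model', None)
        return odict

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._cached_model = None  # re-fetch lazily when needed

restored = pickle.loads(pickle.dumps(Item(42)))
print(restored.model_id)       # 42
print(restored._cached_model)  # None
```

If setting odict['model'] = None had no effect, it is likely because 'model' was not the key the resolved entity was cached under.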

A: 
odict = self.copy()
del odict.model

would probably be better than copying __dict__ (unless __getstate__ needs to return a dict - I'm not familiar with it). Not sure if this solves your problem, though... You could implement __del__ in Model to test whether it actually gets freed. To me it looks like you are still holding a reference somewhere.

Also check out the pickle module - you would have to store everything under a single key, but it automatically protects you from multiple references to the same object (it stores each object only once). Sorry, no link, mobile client ;)
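The point about pickle storing shared objects only once can be verified directly: pickle keeps a memo of objects it has already serialized, so a second reference becomes a short back-reference rather than a full copy, and object identity survives the round trip.

```python
import pickle

big = {'payload': 'x' * 1000}
pair = [big, big]  # two references to the same object

blob = pickle.dumps(pair)
# Two *independent* copies of the same data, for comparison:
two_copies = pickle.dumps([dict(big), dict(big)])

# The shared dict is serialized once, so the blob is much smaller.
print(len(blob) < len(two_copies))  # True

restored = pickle.loads(blob)
print(restored[0] is restored[1])   # True: identity is preserved
```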

Good luck!

Reef
+1  A: 

For large entities, you might want to handle loading of the related entities manually by storing the keys of the large entities as something other than a ReferenceProperty. That way you can choose when to load the large entity and when not to. Just use an integer property to store ids, or a string property to store key names.
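The idea above can be sketched in plain Python, with a dict standing in for the datastore since App Engine's db.get() is not available here (the 'datastore' dict, class, and key names are all hypothetical stand-ins):

```python
# Store only a raw key/id instead of a ReferenceProperty, so the large
# entity is fetched explicitly, on demand, and never pickled into memcache.
datastore = {
    'big-1': {'blob': 'x' * 10},  # pretend this is the large model
}

class SmallEntity(object):
    def __init__(self, big_model_key):
        # Keep just the key name; the referenced entity is never held here.
        self.big_model_key = big_model_key

    def get_big_model(self):
        # Explicit fetch: the caller decides when to pay the loading cost.
        return datastore[self.big_model_key]

e = SmallEntity('big-1')
# Pickling e (e.g. for memcache) serializes only the small entity + key.
print(e.big_model_key)             # big-1
print(e.get_big_model()['blob'])   # xxxxxxxxxx
```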

Peter Recore