I know what you're thinking: "Oh, not that again!" But here we are, since Google has not yet provided a simpler method.
I have been using a queue-based solution, which has worked fine:
import datetime
import logging

from google.appengine.ext import db
from google.appengine.ext import deferred

from models import *

DELETABLE_MODELS = [Alpha, Beta, AlphaBeta]

def initiate_purge():
    # Queue one deletion worker per model class.
    for e in DELETABLE_MODELS:
        deferred.defer(delete_entities, e, 'purging', _queue='purging')

class NotEmptyException(Exception):
    pass

def delete_entities(e, queue):
    try:
        # Delete a batch of up to 200 entities (keys only, to keep it cheap).
        q = e.all(keys_only=True)
        db.delete(q.fetch(200))
        ct = q.count(1)
        if ct > 0:
            raise NotEmptyException('there are still entities to be deleted')
        else:
            logging.info('processing %s completed' % queue)
    except Exception, err:
        # On failure, or if entities remain, re-queue this worker.
        deferred.defer(delete_entities, e, queue, _queue=queue)
        logging.info('processing %s deferred: %s' % (queue, err))
All this does is queue a request to delete a batch of entities (once for each class); if the queued task either fails or finds there is still data left to delete, it re-queues itself.
This beats the heck out of hitting refresh in a browser for 10 minutes.
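For what it's worth, the same re-queueing loop could also be written with query cursors instead of the count(1) check. This is just a sketch of that idea (the function name is made up, and it is not the code I actually run):

def delete_entities_by_cursor(e, queue, cursor=None):
    # Sketch only: same pattern as above, but carrying a query cursor
    # across deferred invocations instead of re-checking count().
    q = e.all(keys_only=True)
    if cursor:
        q.with_cursor(cursor)
    keys = q.fetch(200)
    if keys:
        db.delete(keys)
        # Continue from where this batch left off.
        deferred.defer(delete_entities_by_cursor, e, queue, q.cursor(),
                       _queue=queue)
    else:
        logging.info('processing %s completed' % queue)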
However, I'm having trouble deleting AlphaBeta entities: there are always a few left at the end, I think because the model contains reference properties:
class AlphaBeta(db.Model):
    alpha = db.ReferenceProperty(Alpha, required=True, collection_name='betas')
    beta = db.ReferenceProperty(Beta, required=True, collection_name='alphas')
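A quick way to check whether the leftovers are live entities or just stale index rows would be something like this (a throwaway diagnostic sketch, not code from my app):

# Fetch a few leftover keys and see whether they still resolve to live
# entities; db.get() returns None in place of any missing entity.
keys = AlphaBeta.all(keys_only=True).fetch(10)
entities = db.get(keys)
live = sum(1 for ent in entities if ent is not None)
logging.info('leftover keys: %d, live entities: %d' % (len(keys), live))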
I have tried deleting the indexes relating to these entity types, but that did not make any difference.
Any advice would be appreciated.