This is a bit of a poser, and I hope you'll find this challenging problem as interesting as I do... :)

I have a subclass of DataContext called MyDataContext, wherein I have overridden the SubmitChanges() method with some code of the form:

BeginTransaction(); // my own implementation
IList<object> Updates = GetChangeSet().Updates;
foreach (object obj in Updates) {
  MyClass mc = obj as MyClass;
  if (mc != null)
    mc.BeforeUpdate(); // virtual method in MyClass to allow pre-save processing
}
// This is followed by similar code for the Deletes and Inserts, then:
base.SubmitChanges();
// Then do post-save processing...
foreach (object obj in Updates) {
  MyClass mc = obj as MyClass;
  if (mc != null)
    mc.AfterUpdate(); // virtual method in MyClass to allow post-save processing
}
// similar code for Inserts and Deletes
// ...
CommitTransaction();
// obviously all enclosed in a try-catch block where the catch does a rollback

So far, so good. But there's a little problem, which surfaces if an implementation of MyClass invokes SubmitChanges() in its BeforeUpdate() or AfterUpdate() method. Now we have a recursion that is liable to lead to a stack overflow.

One way I thought of to work around this is to have a recursion-blocking variable at the beginning of SubmitChanges(). But what to do if the save is blocked? I can't spin it off into a new thread; the calling thread may require the SubmitChanges() call to be synchronous, e.g. if it needs access to an auto-number property immediately after the save.
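A minimal sketch of the recursion guard I have in mind (the `_InSubmit` field name and the silent `return` are illustrative only, not a working solution):

```csharp
private bool _InSubmit = false; // illustrative re-entrancy guard

public override void SubmitChanges(ConflictMode failureMode) {
    if (_InSubmit)
        return; // a nested call from BeforeUpdate()/AfterUpdate() is swallowed;
                // but the caller may have needed the save to actually happen!
    _InSubmit = true;
    try {
        // ... pre-save hooks, base.SubmitChanges(failureMode), post-save hooks ...
        base.SubmitChanges(failureMode);
    } finally {
        _InSubmit = false;
    }
}
```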

An additional factor to consider: if any objects are changed during the course of the pre- or post-processing, I also want their Before- and After- methods to be called.

Is there some clever textbook way of doing this all neatly and properly?

+1  A: 

My only idea would be to create a sort of buffer while you're working; store the objects you want to save.

Something like this:

class MyDataContext : DataContext
{
    private bool _WorkingFlag = false; // indicates whether we're currently saving

    private List<object> _UpdateBuffer = new List<object>();

    // ... other buffers here

    protected void BeginTransaction()
    {
        // implementation
    }

    protected void CommitTransaction()
    {
        // implementation
    }

    // note: the parameterless SubmitChanges() is not virtual;
    // override the ConflictMode overload instead
    public override void SubmitChanges(ConflictMode failureMode)
    {
        BeginTransaction();

        IList<object> updates = GetChangeSet().Updates;
        // also inserts and deletes
        if (_WorkingFlag)
        {
            _UpdateBuffer.AddRange(updates);
            // also inserts and deletes
            return;
        }

        _WorkingFlag = true;

        updates = updates.Concat(_UpdateBuffer).ToList(); // merge the updates with the buffer
        foreach (object obj in updates) // do the stuff here...
        {
            MyClass mc = obj as MyClass;
            if (mc != null)
                mc.BeforeUpdate(); // virtual method in MyClass to allow pre-save processing
        }
        _UpdateBuffer.Clear(); // clear the buffer

        // ... same for inserts and deletes ...
        base.SubmitChanges(failureMode);

        // ... after submit, simply foreach ...

        CommitTransaction();
        _WorkingFlag = false;

        // of course all in try...catch; make sure _WorkingFlag is set back to false in the finally block
    }
}

I hope that will work fine; I haven't tested it.

ShdNx
+1 - I think you're on the right track, but not quite there - this algorithm calls the BeforeUpdate() of the other class AFTER the Concat - which is already too late to catch any triggered changes. Remember, it's the BeforeUpdate() that triggers the recursive call to SubmitChanges. But I think if we can somehow create a collection of all modified objects and clear them out as we call BeforeUpdate(), we'll have a solution!
Shaul
+1  A: 

One way I thought of to work around this is to have a recursion-blocking variable at the beginning of SubmitChanges(). But what to do if the save is blocked?

Throw a NotSupportedException. You shouldn't support another SubmitChanges happening inside BeforeUpdate; the whole point of the hook is to let changes happen before SubmitChanges is called, not to trigger another save.
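In code, such a guard might look roughly like this (the `_InSubmit` flag is a hypothetical field name):

```csharp
public override void SubmitChanges(ConflictMode failureMode) {
    if (_InSubmit)
        throw new NotSupportedException(
            "SubmitChanges must not be called from within BeforeUpdate()/AfterUpdate().");
    _InSubmit = true;
    try {
        // pre-save hooks, then the real save
        base.SubmitChanges(failureMode);
    } finally {
        _InSubmit = false;
    }
}
```

Failing loudly like this turns an accidental nested save into an immediate, debuggable error instead of a stack overflow.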

As for updated objects getting their BeforeUpdate called: just before calling base.SubmitChanges(), check whether any new objects were modified while you were calling BeforeUpdate on the original list, and repeat until no further modified objects turn up.
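That loop could be sketched like this (untested; a hypothetical `processed` set tracks which objects have already had their hook called):

```csharp
var processed = new HashSet<MyClass>();
bool anyNew = true;
while (anyNew) {
    anyNew = false;
    // re-read the change set: earlier BeforeUpdate() calls may have dirtied new objects
    foreach (object obj in GetChangeSet().Updates) {
        MyClass mc = obj as MyClass;
        if (mc != null && processed.Add(mc)) { // Add returns false if already present
            mc.BeforeUpdate();
            anyNew = true; // a hook ran, so the change set may have grown
        }
    }
}
base.SubmitChanges();
```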

The same goes for AfterUpdate: its purpose is to make changes to the in-memory objects, not to save more data to the database.

Sprinkling SubmitChanges calls across the different entities in your system is bound to create performance issues.

eglasius
+1 What you're saying sounds correct, much like my suggested changes to ShdNx's answer.
Shaul
+Bounty - I debated whether answer credit should go to you or ShdNx, and I decided on this answer because of your second paragraph, which summarizes the solution I eventually used (and which ShdNx was close to, but narrowly missed!) I'll post my own answer showing how I did it.
Shaul
A: 

But there's a little problem, which surfaces if an implementation of MyClass invokes SubmitChanges()

Why does it need to do that?

leppie
Because the 2nd object might also need to have BeforeUpdate invoked for some pre-processing.
Shaul
But it still does not have to call SubmitChanges.
leppie
Agreed, and that's what the other two answerers thus far have observed.
Shaul
A: 

So now that answer credit has already been given, here's how I actually implemented the solution:

private bool _Busy = false;

public override void SubmitChanges(ConflictMode failureMode) {
  if (_Busy)
    return; // no action & no error; just let this SubmitChanges handle all nested submissions.
  try {
    _Busy = true;
    BeginTransaction();
    // value == true once the object's Before- method has been called
    Dictionary<MyClass, bool> myUpdates = new Dictionary<MyClass, bool>();
    Dictionary<MyClass, bool> myInserts = new Dictionary<MyClass, bool>();
    Dictionary<MyClass, bool> myDeletes = new Dictionary<MyClass, bool>();

    SynchronizeChanges(myUpdates, GetChangeSet().Updates);
    SynchronizeChanges(myInserts, GetChangeSet().Inserts);
    SynchronizeChanges(myDeletes, GetChangeSet().Deletes);

    while (myInserts.Any(i => i.Value == false) || myUpdates.Any(u => u.Value == false) || myDeletes.Any(d => d.Value == false)) {
      List<MyClass> tmp = myInserts.Where(i => i.Value == false).Select(i => i.Key).ToList();
      foreach (MyClass mc in tmp) {
        mc.BeforeInsert();
        myInserts[mc] = true;
      }
      tmp = myUpdates.Where(u => u.Value == false).Select(u => u.Key).ToList();
      foreach (MyClass mc in tmp) {
        mc.BeforeUpdate();
        myUpdates[mc] = true;
      }
      tmp = myDeletes.Where(d => d.Value == false).Select(d => d.Key).ToList();
      foreach (MyClass mc in tmp) {
        mc.BeforeDelete();
        myDeletes[mc] = true;
      }
      // before calling base.SubmitChanges(), make sure that nothing else got changed:
      SynchronizeChanges(myUpdates, GetChangeSet().Updates);
      SynchronizeChanges(myInserts, GetChangeSet().Inserts);
      SynchronizeChanges(myDeletes, GetChangeSet().Deletes);
    }
    base.SubmitChanges(failureMode);
    // now the After- methods
    foreach (MyClass mc in myInserts.Keys) {
      mc.AfterInsert();
    }
    foreach (MyClass mc in myUpdates.Keys) {
      mc.AfterUpdate();
    }
    foreach (MyClass mc in myDeletes.Keys) {
      mc.AfterDelete();
    }
    CommitTransaction();
  } catch {
    RollbackTransaction();
    throw;
  } finally {
    _Busy = false;
  }
  // now, just in case any of the After... functions triggered a change:
  if (GetChangeSet().Deletes.Count + GetChangeSet().Inserts.Count + GetChangeSet().Updates.Count > 0)
    SubmitChanges(failureMode);
}

private void SynchronizeChanges(Dictionary<MyClass, bool> mcDict, IList<object> iList) {
  // add any change-set entries not seen before, marked false (hooks not yet called)
  var q = iList.OfType<MyClass>().Where(i => !mcDict.ContainsKey(i));
  q.ToList().ForEach(i => mcDict[i] = false);
}
Shaul