Hi,
I've just started skimming 'Debugging Microsoft .NET 2.0 Applications' by John Robbins, and have become confused by his evangelism for Debug.Assert(...).
He points out that well-implemented Asserts capture some of the state of an error condition, e.g.:
Debug.Assert(i > 3, "i > 3", "This means I got a bad parameter");
Now, personally, it seems crazy to me that he's so keen on restating the test itself rather than giving an actual sensible 'business logic' message, e.g. "i <= 3 must never happen because of the flobittyjam widgitification process".
So, I think I get Asserts as a kind of low-level "let's protect my assumptions" thing... assuming that one feels this is a check that only needs to happen in debug builds - i.e. you are protecting yourself against colleagues and future programmers, and hoping that they actually test things.
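(As I understand it - and this is the one bit I'm fairly sure of - Debug.Assert is marked [Conditional("DEBUG")], so the whole call, condition evaluation included, disappears from a Release build. A minimal sketch, using a made-up Widgetify method, of what that debug-only protection means in practice:)

using System;
using System.Diagnostics;

static class AssertDemo
{
    static void Main()
    {
        Widgetify(2); // Debug build: the assert fires here. Release build: runs straight through.
    }

    // Made-up method, purely to illustrate the compile-out behaviour.
    static void Widgetify(int i)
    {
        // Debug.Assert is [Conditional("DEBUG")], so this whole statement
        // (including evaluating i > 3) is stripped from Release builds.
        Debug.Assert(i > 3, "i must be > 3 because of the flibbity widgit status");

        Console.WriteLine("widgitifying with i = " + i);
    }
}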
But what I don't get is, he then goes on to say that you should use assertions in addition to normal error handling; now what I envisage is something like this:
Debug.Assert(i > 3, "i must be greater than 3 because of the flibbity widgit status");
if (i <= 3)
{
    throw new ArgumentOutOfRangeException("i", "i must be > 3 because... i=" + i.ToString());
}
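(To make the duplication concrete: presumably every parameter check ends up funnelled through something like this hypothetical Guard helper - which is exactly the two-checks-for-one-condition repetition that bothers me:)

using System;
using System.Diagnostics;

static class Guard
{
    // Hypothetical helper combining both of the checks above.
    public static void GreaterThan(int value, int min, string paramName, string why)
    {
        // Debug builds only: halts in the debugger with the business reason.
        Debug.Assert(value > min, why);

        // All builds: the real error handling.
        if (value <= min)
        {
            throw new ArgumentOutOfRangeException(paramName, why + " (actual value: " + value + ")");
        }
    }
}

// Usage: Guard.GreaterThan(i, 3, "i", "i must be > 3 because of the flibbity widgit status");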
What have I gained by the Debug.Assert repetition of the error condition test? I think I'd get it if we were talking about debug-only double-checking of a very important calculation...
double interestAmount = loan.GetInterest();
// (Comparing doubles with == is fragile, so allow a small tolerance.)
Debug.Assert(Math.Abs(debugInterestDoubleCheck(loan) - interestAmount) < 0.0001, "Mismatch on interest calc");
...but I don't get it for parameter tests which are surely worth checking (in both DEBUG and Release builds)... or not. What am I missing?