I've read plenty of articles (and a couple of similar questions posted on StackOverflow) about how and when to use assertions, and I understand them well. But I still don't understand what kind of motivation should drive me to use Debug.Assert instead of throwing a plain exception. What I mean is: in .NET, the default response to a failed assertion is to "stop the world" and display a message box to the user. Although this behavior can be modified, I find it highly annoying and redundant, when I could instead just throw a suitable exception. That way, I could easily write the error to the application's log just before throwing the exception, and my application wouldn't necessarily freeze.
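To make the comparison concrete, here's a minimal sketch of the two options as I understand them (the OrderProcessor/Order types and the trace source name are made up purely for illustration):

```
using System;
using System.Diagnostics;

public class Order { }

public class OrderProcessor
{
    // Trace source name is made up for this example.
    private static readonly TraceSource Log = new TraceSource("OrderProcessor");

    public void Process(Order order)
    {
        // Option 1: assertion. In a Debug build a failed check pops up the
        // "stop the world" dialog; in a Release build the call is compiled
        // away entirely, so nothing happens.
        Debug.Assert(order != null, "order must not be null");

        // Option 2: exception. Log first, then throw; this runs in both
        // Debug and Release builds, and the application keeps control of
        // how the failure is reported.
        if (order == null)
        {
            Log.TraceEvent(TraceEventType.Error, 0, "Process was called with a null order");
            throw new ArgumentNullException(nameof(order));
        }

        // ... actual processing ...
    }
}
```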
So why should I, if at all, use Debug.Assert instead of a plain exception? Placing an assertion where it shouldn't be could cause all kinds of unwanted behavior, so from my point of view, I really don't gain anything by using an assertion instead of throwing an exception. Do you agree with me, or am I missing something here?
Note: I fully understand the difference "in theory" (Debug vs. Release, usage patterns, etc.), but as I see it, I would be better off throwing an exception instead of performing an assert. If a bug is discovered in a production release, I would still want the "assertion" to fail (after all, the overhead is ridiculously small), so I'm better off throwing an exception instead.
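For reference, this is the Debug-vs-Release behavior I'm referring to. The Checks class and its method names below are hypothetical, but the [Conditional("DEBUG")] mechanism is what makes Debug.Assert calls disappear from Release builds:

```
using System;
using System.Diagnostics;

public static class Checks
{
    // Debug.Assert itself carries [Conditional("DEBUG")], so calls to it
    // (and to a wrapper like this one) are removed by the C# compiler when
    // the DEBUG symbol is not defined, i.e. in a typical Release build this
    // check simply never runs.
    [Conditional("DEBUG")]
    public static void MustBePositive(int value)
    {
        Debug.Assert(value > 0, "value must be positive");
    }

    // An exception-based check, by contrast, survives into Release builds,
    // which is exactly the behavior I say I want in production.
    public static void MustBePositiveOrThrow(int value)
    {
        if (value <= 0)
            throw new ArgumentOutOfRangeException(nameof(value), "value must be positive");
    }
}
```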
Edit: The way I see it, if an assert fails, the application has entered some kind of corrupted, unexpected state. So why would I want to continue execution? It doesn't matter whether the application is running a debug or a release build; the same goes for both.