It seems like it's not incredibly expensive as long as the stack isn't too deep, but I've read conflicting reports. Is there a definitive report out there that hasn't been rebutted?
I guess I'm in the camp that if the performance of exceptions impacts your application, then you're throwing WAY too many of them. Exceptions should be for exceptional conditions, not routine error handling.
That said, my recollection is that exception handling essentially walks up the stack looking for a catch clause whose type matches the type of the exception thrown. So performance will be impacted most by how deep you are from the catch and by how many catch clauses you have to test against.
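To make that concrete, here's a minimal sketch (the method names are hypothetical) of what the runtime does conceptually: the exception propagates up through frames with no matching handler until a catch clause of a compatible type is found.

```csharp
using System;

class UnwindDemo
{
    // Thrown three frames below the handler; the runtime must unwind
    // through Level3 and Level2 before finding a matching catch.
    static void Level3() => throw new InvalidOperationException("thrown three frames down");

    static void Level2() => Level3();

    static void Level1()
    {
        try
        {
            Level2();
        }
        catch (ArgumentException)
        {
            // Not matched: InvalidOperationException is not an ArgumentException,
            // so this clause is tested and skipped.
            Console.WriteLine("caught ArgumentException");
        }
        catch (InvalidOperationException ex)
        {
            // Matched here, two frames above the throw site.
            Console.WriteLine($"caught: {ex.Message}");
        }
    }

    static void Main() => Level1();
}
```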
Jon Skeet wrote Exceptions and Performance in .NET in Jan 2006, which he later updated in Exceptions and Performance Redux (thanks @Gulzar).
To which Rico Mariani chimed in with The True Cost of .NET Exceptions -- Solution.
Also reference: Krzysztof Cwalina - Design Guidelines Update: Exception Throwing.
Barebones exception objects in C# are fairly lightweight; it's usually the ability to encapsulate an InnerException that makes one heavy, once the object tree becomes too deep.
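For illustration, a short sketch of how wrapping builds that InnerException chain: each layer adds its own message and stack trace to the object graph, which is where the weight comes from.

```csharp
using System;

class InnerExceptionDemo
{
    static void Main()
    {
        try
        {
            try
            {
                // The original, low-level failure.
                throw new FormatException("bad input");
            }
            catch (FormatException ex)
            {
                // Wrapping preserves the original as InnerException,
                // growing the exception object tree by one level.
                throw new InvalidOperationException("operation failed", ex);
            }
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine(ex.Message);                 // outer message
            Console.WriteLine(ex.InnerException?.Message); // wrapped original
        }
    }
}
```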
As for a definitive report, I'm not aware of one, although a cursory profile with dotTrace (or any other profiler) for memory consumption and speed would be fairly easy to do.
The performance hit with exceptions seems to be at the point of generating the exception object (albeit too small to cause any concern 90% of the time). The recommendation, therefore, is to profile your code: if exceptions are causing a performance hit, write a new high-perf method that does not use them. (An example that comes to mind is TryParse, introduced to overcome the perf issues with Parse, which uses exceptions.)
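A quick sketch of that Parse/TryParse contrast: on invalid input, Parse throws (paying for exception construction and unwinding), while TryParse just returns false, which is why it suits hot paths.

```csharp
using System;

class TryParseDemo
{
    static void Main()
    {
        string input = "not a number"; // hypothetical invalid input

        // Exception-based API: failure is reported by throwing.
        try
        {
            int value = int.Parse(input);
            Console.WriteLine(value);
        }
        catch (FormatException)
        {
            Console.WriteLine("Parse threw FormatException");
        }

        // Exception-free API: failure is a plain boolean result.
        if (int.TryParse(input, out int parsed))
            Console.WriteLine(parsed);
        else
            Console.WriteLine("TryParse returned false");
    }
}
```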
That said, exceptions do not cause significant performance hits in most situations, so the MS Design Guideline is to report failures by throwing exceptions.