Exceptions have two costs: a warm-up cost to page the exception infrastructure in (if not into memory, then into the CPU cache), and a per-throw cost to gather the exception stack trace, search for an exception handler, possibly call exception filters, unwind the stack, and run finalization blocks. All of these are operations that the runtime, by design, does not optimize for.
Thus, measuring the cost of throwing exceptions can be misleading. If you write a loop that iteratively throws and catches an exception, without much work between the throw site and the catch site, the cost won't look that large. However, that's because such a loop amortizes the warm-up cost of exceptions over every iteration, and that warm-up cost is the part that's harder to measure.
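To make that concrete, here is a minimal sketch of the kind of tight throw/catch microbenchmark described above, written in Java since the text names no particular language; the class name and iteration count are arbitrary. Because everything stays warm in the cache and the handler is found immediately, the number it prints understates what a throw costs in a program that raises exceptions only occasionally, from cold code.

```java
// Tight throw/catch loop: keeps the exception machinery warm, so the
// measured per-throw cost looks far smaller than it would in real use.
public class ThrowLoopBenchmark {
    public static void main(String[] args) {
        int iterations = 1_000_000;
        long start = System.nanoTime();
        for (int i = 0; i < iterations; i++) {
            try {
                throw new RuntimeException("benchmark");
            } catch (RuntimeException e) {
                // Swallow immediately; almost no work between throw and catch.
            }
        }
        long elapsed = System.nanoTime() - start;
        System.out.printf("%.0f ns per throw/catch (warm)%n",
                (double) elapsed / iterations);
    }
}
```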
Certainly, exceptions don't cost anything like what they appear to cost if one's main experience is exceptions thrown by programs running under the debugger. But they do have a cost, and it's advisable to design libraries in particular so that callers can avoid exceptions where performance demands it.
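As one illustration of that design advice, here is a hypothetical sketch, again in Java, of a library that pairs a throwing convenience method with a Try-style variant whose failure path constructs no exception at all, so hot-path callers can check the return value instead of paying for a throw.

```java
import java.util.OptionalInt;

// Hypothetical API sketch: failure is reported through the return value of
// tryDivide, and the throwing overload is layered on top for convenience.
public final class SafeDivide {
    // Non-throwing variant: no exception is constructed or thrown on failure.
    public static OptionalInt tryDivide(int dividend, int divisor) {
        return divisor == 0 ? OptionalInt.empty()
                            : OptionalInt.of(dividend / divisor);
    }

    // Throwing variant for callers who prefer an exception on bad input.
    public static int divide(int dividend, int divisor) {
        return tryDivide(dividend, divisor)
                .orElseThrow(() -> new ArithmeticException("division by zero"));
    }

    private SafeDivide() {}
}
```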