IMHO no. Checking for nulls is almost never going to be the performance bottleneck in your application. (And in the one-in-a-million case where it is significant, you'll find it with your profiler and remove that one check.)
The other question that should form in your mind is "is throw new NullReferenceException() really the best way to handle the error?" Often you can handle things better than that (even if only to provide a better error report to the user and/or yourself for debugging purposes). In many cases code can handle nulls gracefully, making it unnecessary for this to be an error at all.
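Here's a minimal sketch of what I mean (the `Customer` type and method names are invented for illustration, not taken from your question): a guard clause that throws `ArgumentNullException` reports the problem more clearly than letting a `NullReferenceException` surface later, and a null-tolerant branch shows a case where null isn't an error at all.

```csharp
using System;

public class Customer
{
    public string FirstName { get; set; }
    public string MiddleName { get; set; }   // may legitimately be null
    public string LastName { get; set; }
}

public static class NameFormatter
{
    public static string Format(Customer customer)
    {
        if (customer == null)
            throw new ArgumentNullException(nameof(customer));  // names the offending argument for the caller

        // Graceful handling: a missing middle name is simply omitted, not treated as an error.
        return string.IsNullOrEmpty(customer.MiddleName)
            ? $"{customer.FirstName} {customer.LastName}"
            : $"{customer.FirstName} {customer.MiddleName} {customer.LastName}";
    }
}
```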
Edit:
To answer your edit: Null checks really don't take long. The overhead for simply calling a method will be tens if not hundreds of times more than a null check. The only place where a null check will make any significant difference is in a large, tight loop where you are doing very little else. This situation doesn't happen very often - usually you will check for a null and then do something relatively expensive with that reference.
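To put that in perspective, here's an illustrative sketch (the method is invented, not from your code): the null check is a single comparison and branch, while the loop that follows does the real work, so the check's cost disappears into the noise.

```csharp
using System;
using System.Collections.Generic;

public static class OrderTotals
{
    public static decimal SumTotals(IEnumerable<decimal> amounts)
    {
        if (amounts == null)                     // roughly a nanosecond
            throw new ArgumentNullException(nameof(amounts));

        decimal total = 0m;
        foreach (var amount in amounts)          // enumeration and addition dominate the cost
            total += amount;
        return total;
    }
}
```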
There is no situation where a crash or failure is a good thing. It is always better to "slow your application down" with null checks than to crash and lose your customer's data.
So don't prematurely optimise your code. Write it well, to be maintainable and robust, then profile it to see where the bottlenecks are. I've been programming for 28 years, have always been very liberal with null checks, and have never found a null check to be the cause of a performance problem. Usually the culprit is something like doing lots of unnecessary work in a loop, using an O(n^3) algorithm where an O(n^2) approach is possible, or failing to cache expensive-to-compute values.
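For instance, the caching point looks something like this sketch (the class and method names are hypothetical): reusing an expensive result instead of recomputing it is the kind of change that actually moves the needle, unlike removing null checks.

```csharp
using System.Collections.Generic;

public class ReportService
{
    private readonly Dictionary<string, string> _cache = new Dictionary<string, string>();

    public string GetReport(string key)
    {
        // Compute the report only once per key; subsequent calls hit the cache.
        if (!_cache.TryGetValue(key, out var report))
        {
            report = BuildReport(key);   // the genuinely expensive part
            _cache[key] = report;
        }
        return report;
    }

    private static string BuildReport(string key)
    {
        // Stand-in for real, costly work (I/O, heavy computation, etc.).
        return $"Report for {key}";
    }
}
```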