My psychic powers are not great, and it is hard to tell what is going on without actually debugging it. But here's a guess. The issue I discuss here:
http://stackoverflow.com/questions/2342396/why-does-the-calculation-give-different-results-on-different-machines/2343351#2343351
applies not just to "cross machine" but also to "debug vs. release". It is not only possible but likely that the release version of your program is using higher-precision math than your debug version. If you have floating-point bugs in there, then it is entirely possible that, just by sheer bad luck, you are hitting the bugs only in the higher-precision release version and not in the lower-precision debug version.
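For instance, here is a minimal sketch (not your code; the numbers are purely illustrative) of the kind of computation whose outcome can depend on whether an intermediate result is held at the register's higher precision or rounded to a true 64-bit double along the way:

    using System;

    class PrecisionDemo
    {
        static void Main()
        {
            double oneThird = 1.0 / 3.0;   // nearest double to 1/3, slightly off

            // Ideally this is zero. Whether it actually compares equal to zero
            // can depend on whether the product (oneThird * 3.0) is rounded to
            // 64 bits before the subtraction or kept at higher precision in a
            // register, which is exactly the sort of thing that can differ
            // between debug and release builds.
            double error = oneThird * 3.0 - 1.0;

            if (error == 0.0)
                Console.WriteLine("looks exact");
            else
                Console.WriteLine("off by " + error);
        }
    }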
Why the difference? Because in the unoptimized version the C# compiler frequently generates code for temporary values as though they were local variables; the jitter then actually allocates those temporary locals on the stack and writes the temporary values from the registers to the locals. When it needs them again, it reads them back into registers from the temporaries. That round trip can cause a value that was sitting in a high-precision floating-point register (which on x86 hardware can hold more bits than a 64-bit double) to be truncated to mere 64-bit precision, losing the extra bits.
In the optimized version the C# compiler and the jitter work harder to keep everything in registers all the time, because that is obviously faster; it also happens to be higher precision, though harder to debug.
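If that is what is happening, two things usually help. As I recall, the C# specification notes that an explicit cast to float or double can be used to force a value of a floating-point type to the exact precision of its type (check the spec for your compiler version), and the sturdier fix is to stop comparing floating-point results for exact equality in the first place. A sketch, reusing the illustrative computation above; the AreClose helper is just something I made up for the example:

    using System;

    class Mitigations
    {
        // Hypothetical helper, not from any library: compare with a tolerance
        // instead of relying on exact equality.
        static bool AreClose(double x, double y, double tolerance = 1e-12)
        {
            return Math.Abs(x - y) <= tolerance;
        }

        static void Main()
        {
            double oneThird = 1.0 / 3.0;

            // The explicit cast asks for the intermediate product to be rounded
            // to true 64-bit double precision before the subtraction, so the
            // expression should behave the same way in debug and release.
            double error = (double)(oneThird * 3.0) - 1.0;
            Console.WriteLine(error);

            // The more robust fix: never compare floating-point results with ==.
            Console.WriteLine(AreClose(oneThird * 3.0, 1.0));
        }
    }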
Good luck. Bugs that only repro in release mode are a total pain.