Once again I was in a design review and encountered the claim that the probability of a particular failure scenario was "less than the risk of cosmic rays" affecting the program, and it occurred to me that I didn't have the faintest idea what that probability actually is.
"Since 1/2^128 is 1 out of 340282366920938463463374607431768211456, I think we're justified in taking our chances here, even if these computations are off by a factor of a few billion... We're way more at risk for cosmic rays to screw us up, I believe."
Is this programmer correct? What is the probability of a cosmic ray hitting a computer and affecting the execution of the program?
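For scale, here is a minimal back-of-envelope sketch (Python) comparing 1/2^128 with an assumed DRAM soft-error rate. The FIT figure is an illustrative assumption (FIT = failures per 10^9 device-hours), not a measured value for any particular hardware, and the 4 GB machine size is likewise just a placeholder:

```python
# Back-of-envelope comparison: 2^-128 vs. an assumed DRAM soft-error rate.
# FIT_PER_MBIT below is an illustrative assumption, not a measurement.

FIT_PER_MBIT = 100           # assumed cosmic-ray soft-error rate per Mbit
RAM_MBIT = 4 * 8 * 1024      # 4 GB of DRAM expressed in megabits

flips_per_hour = FIT_PER_MBIT * RAM_MBIT / 1e9
print(f"expected bit flips per machine-hour: {flips_per_hour:.2e}")
print(f"expected bit flips per machine-day:  {flips_per_hour * 24:.2e}")

collision_prob = 1 / 2**128
print(f"1 / 2^128 = {collision_prob:.2e}")
print(f"ratio (flips/hour vs 2^-128): {flips_per_hour / collision_prob:.2e}")
```

Under that assumption a single 4 GB machine would see an expected bit flip every couple of weeks, roughly 10^36 times more often than a 2^-128 event, which is the comparison the quoted claim is implicitly making.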
Update: it seems that non-error-corrected memory is quite likely to be hit if you have a reasonably large number of servers. How much does error-corrected (ECC) memory reduce the effective error rate?
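Regarding the update, here is a minimal sketch of the usual SECDED model (single-error-correct, double-error-detect), under which an uncorrectable event requires two or more flips in the same word between scrubs. It assumes independent, uniformly distributed bit flips, ignores check bits, and reuses the illustrative FIT rate from above; the word size and scrub interval are assumptions too:

```python
# Rough SECDED model: ECC corrects any single flipped bit per 64-bit word,
# so an uncorrectable event needs >= 2 flips in the same word between
# scrubs. All constants below are illustrative assumptions.

FIT_PER_MBIT = 100       # assumed raw soft-error rate (FIT = failures/10^9 h)
RAM_MBIT = 4 * 8 * 1024  # 4 GB of DRAM in megabits
WORD_BITS = 64           # data bits protected per ECC word
SCRUB_HOURS = 24.0       # assumed memory-scrub interval

lam_bit = FIT_PER_MBIT / 1e6 / 1e9      # flips per bit per hour
mu = lam_bit * WORD_BITS * SCRUB_HOURS  # expected flips per word per interval

# P(>= 2 flips in one word) for a Poisson count with tiny mean mu:
# 1 - e^-mu * (1 + mu) ~= mu^2 / 2 (leading term; avoids float cancellation)
p_word_uncorrectable = mu**2 / 2

words = RAM_MBIT * 1e6 / WORD_BITS
print(f"raw flips per machine-day, no ECC:       {lam_bit * RAM_MBIT * 1e6 * 24:.2e}")
print(f"uncorrectable events per scrub interval: {p_word_uncorrectable * words:.2e}")
```

Under these assumptions ECC changes the scaling from linear to quadratic in the raw flip rate: roughly 8e-2 raw flips per machine-day become ~6e-12 uncorrectable events per day, a reduction of about ten orders of magnitude, yet still vastly more likely than a 2^-128 event.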
Note regarding closing: this is a real question affecting a real software development project. The claim has been made that "the probability is less than x." What is x?