Numerical analysis provides methods for bounding the precision error of floating-point operations, independent of the machine they run on. Because floating-point numbers have only finite precision, there are always cases where operations lose accuracy, even on the most advanced processor.
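To make this concrete, here is a minimal sketch (in Python, my choice of language for the demo, and the specific values are just illustrative) of two classic sources of error: accumulated rounding and catastrophic cancellation. Any IEEE 754 double-precision hardware will behave the same way.

# 0.1 has no exact binary representation, so each addition
# rounds, and the rounding errors accumulate:
total = sum(0.1 for _ in range(10))
print(total)         # 0.9999999999999999, not 1.0
print(total == 1.0)  # False

# Catastrophic cancellation: subtracting nearly equal numbers
# cancels the leading digits, leaving mostly rounding error.
a = 1.0 + 1e-15
b = 1.0
print(a - b)         # 1.1102230246251565e-15, about 11% relative error

Note that both results are predictable from the theory (an error of at most half a unit in the last place per operation), which is exactly the kind of analysis the books below do rigorously.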
Some books to read about it:
Accuracy and Stability of Numerical Algorithms by N.J. Higham
An Introduction to Numerical Analysis by E. Süli and D. Mayers
If you can't find them, or would rather not read them, tell me and I will try to explain some things to you. (Well, I'm no expert in this because I'm a computer scientist rather than a numerical analyst, but I think I can explain the basics.)
I hope you understand what I wrote (my English is not the best).