I would like to compute the number of digits to the right of the decimal point, not counting trailing zeros or a repeating tail (i.e. 1.999999999...). For example:
x.xx = 2
x.xxx = 3
x.xxx000 = 3
I can do this by converting the number to a string, but I was wondering if there is a faster way using math. Any ideas?
Thanks.
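
For reference, the string-based version I have in mind is something like the sketch below (Python; the name decimal_places and the .10f cutoff are just for illustration). It formats the number, strips trailing zeros, and counts what is left after the decimal point:

    def decimal_places(x):
        # Format to a fixed number of decimals, drop trailing zeros, and
        # count the digits remaining after the decimal point.
        # The .10f width is an arbitrary cutoff for this sketch.
        s = f"{x:.10f}".rstrip("0")
        dot = s.find(".")
        return 0 if dot == -1 else len(s) - dot - 1

    print(decimal_places(1.25))   # 2
    print(decimal_places(1.253))  # 3
    print(decimal_places(2.0))    # 0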
EDIT: Many people seem to think this is a fool's errand because of the way numbers are represented in computers. However, allow me to explain the context of the problem. Suppose you are writing a method to randomly generate a floating point number. The number you generate must have a certain precision, so you must round the randomly generated number to that precision. For example, if the precision is 2, your random number cannot be 0.012; it must be rounded to 0.01. The problem is that you are not given the precision directly; instead you are given the increment, which in the case above would be 0.01. Given 0.01, or 0.002, or any other such increment less than 1, you must find the precision.
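
One math-based sketch, assuming the increment only has a handful of decimal places, is to keep scaling it by 10 until it is (within a tolerance) a whole number and count the scalings (Python; precision_from_increment, the tolerance, and max_places are illustrative choices, not a definitive implementation):

    import math

    def precision_from_increment(inc, max_places=15):
        # Scale by 10 until the value is (nearly) an integer; the number of
        # scalings is the precision. The tolerance absorbs binary
        # floating-point noise, and max_places keeps the loop bounded.
        places = 0
        while not math.isclose(inc, round(inc), rel_tol=0, abs_tol=1e-9) and places < max_places:
            inc *= 10
            places += 1
        return places

    print(precision_from_increment(0.01))   # 2
    print(precision_from_increment(0.002))  # 3

Note that a naive round(-log10(increment)) would not be enough, since an increment like 0.05 needs precision 2 but -log10(0.05) rounds to 1.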
Edit: Removed my incorrect usage of the term significant digits.