+2  Q: 

Computational cost

Does anyone know the computational cost of these two pieces of code?

while(n>2)
   n = sqrt(n);

while(n>2)
   n = log(n);
+9  A: 

The second would be O(log* n), where log* is the iterated logarithm.

Analysing the first one yields something like this:

sqrt(n) = n ^ (1/2)
sqrt(sqrt(n)) = n ^ (1/4)
sqrt(sqrt(sqrt(n))) = n ^ (1/8)
...
sqrt applied k times = n ^ (1/2^k)

Suppose the first loop executes k times (i.e., k is the number of times we must apply sqrt until n <= 2).

Consider this reasoning:

n ^ (1/2^k) = p (p <= 2) | ^ (2^k)
n = p ^ (2^k) | log
log n = (2^k) log p | log
log log n = log (2 ^ k) + log log p
log log n = k log 2 + log log p
=> k ~= log log n

So the first algorithm is O(log log n).

IVlad
I also thought the same thing (something like log2(log2(n))), but I was not sure. What about the second one?
BlackShadow
@BlackShadow - simply use these formulas: `1. sqrt(n) = n ^ (1/2); 2. (a^b)^c = a^(b*c); 3. log (a*b) = log a + log b; 4. log (a^b) = b*log a`. This is all you need to prove it.
IVlad
@IVlad, I guess you also used properties of big-O to remove "1/log 2" and "log log p / log 2".
ony
@ony - yes, I did.
IVlad
+4  A: 

The answer to the first one should become obvious if one recasts it in the log domain:

n = log2(n);
while (n > 1)
    n = n / 2;

How many times do you need to halve a number in order to reach 1? O(log n).

Oli Charlesworth