views: 250
answers: 6
A: 

Yes, there is such a thing, but it's not the right complexity for your function.

Paul Hankin
+9  A: 

That's not right. `a` can't be part of the big-O formula since it's just a temporary variable.

Off the top of my head, if we take multiplication to be a constant-time operation, then the number of multiplications performed will be O(log log n). If you were multiplying by a constant every iteration, it would be O(log n). Because you're multiplying by a growing number each iteration, there's another log.

Think of it as the number of digits doubling each iteration. How many times can you double the number of digits before you exceed the limit? The limit n has about log n digits, and you can double the starting digit count about log2(log n) times before passing that.


As for the other aspect of the question: yes, O(a-th root of n) could be a valid complexity class for some constant a. It would more familiarly be written as O(n^(1/a)).

John Kugelman
I assume the `a` in the formula is supposed to be the initial value of `a` (i.e. 3 in the shown code), which is a constant and thus makes sense to appear in the formula.
sepp2k
@John: `a` could be considered an input into the algorithm (it’s simply not clear from the description). And the running times *do* differ with changing initial values of `a`. Though whether they do so asymptotically I can’t say at a glance.
Konrad Rudolph
The starting value of *a* is not relevant for the big-O complexity. If *n* were a googolplex then whether you start with a=3 or a=100 or a=999999 would shave off only a fixed constant number of multiplications.
John Kugelman
If you were to include a as a variable then it would be Theta(log log n - log log a), which makes sense intuitively since as a gets larger, fewer iterations are performed.
Il-Bhima
Whether a is a constant or an input, it's still valid for it to appear in the complexity.
Paul Hankin
+2  A: 
Jörg W Mittag
@Jorg: This looks right. If multiplication is f(k) for k digits, then the total is Sigma f(2^r), r = 1 to log log n. Considering only the terms with 2^r >= log log n, we get that it is Theta of what you have (not just big-O).
Moron
+1  A: 

After the i-th iteration (i = 0, 1, ...) the value of a is 3^(2^i). There will be O(log log n) iterations, and assuming arithmetic operations are O(1), this is the time complexity.

sdcvvc
+2  A: 

Well, you could actually go into an infinite loop!

Assume 32-bit integers:

Try this:

int a = 3;
int n = 2099150850;
while (a <= n)
{
    a = a*a;  /* overflows once a reaches 43046721; signed overflow is
                 undefined behavior in C, and with typical wraparound
                 a never exceeds n, so the loop never terminates */
}

But assuming there are no integer overflows, the other posters are right: it is O(log log n) if you assume O(1) multiplication.

An easy way to see this is:

x_{n+1} = x_n^2.

Take x_1 = a.

Taking logs, let t_n = log x_n.

Then t_{n+1} = 2 t_n.

I will leave the rest to you.
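Carrying the recurrence through (a sketch of the step left to the reader):

```latex
t_n = 2^{n-1} t_1 = 2^{n-1} \log a .
% The loop exits when x_n > n, i.e. when t_n > \log n:
2^{n-1} \log a > \log n
\;\Longrightarrow\;
n - 1 > \log_2 \frac{\log n}{\log a} = O(\log \log n).
```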

It becomes more interesting if you consider the complexity of multiplying two k-digit numbers.

Moron
A: 

Thanks everyone.

timeNomad