Yes there is such a thing, but it's not the right complexity for your function.
That's not right. a can't be a part of the big-O formula since it's just a temporary variable.
Off the top of my head, if we take multiplication to be a constant-time operation, then the number of multiplications performed will be O(log log n). If you were multiplying by a constant every iteration it would be O(log n); because you're multiplying by a growing number each iteration, there's another log.
Think of it as the number of digits doubling each iteration. How many times can you double the number of digits before you exceed the limit? n has about log n digits, and starting from a constant number of digits you can only double them about log_2(log n) times before you pass that.
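If you want to see the doubly-logarithmic growth empirically, here is a quick sketch (my own illustration; the helper name count_squarings, the starting value 3 and the sample values of n are all just assumptions for the demo):

#include <math.h>
#include <stdio.h>

/* Count how many times a = a*a runs before a exceeds n (64-bit to dodge overflow). */
static int count_squarings(unsigned long long n)
{
    unsigned long long a = 3;
    int steps = 0;
    while (a <= n) {
        a = a * a;
        ++steps;
    }
    return steps;
}

int main(void)
{
    for (unsigned long long n = 10; n <= 10000000000ULL; n *= 1000)
        printf("n = %llu: %d squarings, log2(log3 n) ~ %.2f\n",
               n, count_squarings(n), log2(log(n) / log(3)));
    return 0;
}

The step count barely moves even as n grows by several orders of magnitude, which is exactly the log log n behaviour.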
As for the other aspect of the question, yes, O(a-th root of n) could be a valid complexity class for some constant a. It would more familiarly be written as O(n^(1/a)).
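A familiar member of that family (my own example, not from the question) is trial division, which only has to test divisors up to the square root of n and therefore performs O(n^(1/2)) divisions:

#include <stdbool.h>
#include <stdio.h>

/* O(n^(1/2)): only candidate divisors up to sqrt(n) need to be tested. */
static bool is_prime(unsigned long long n)
{
    if (n < 2)
        return false;
    for (unsigned long long d = 2; d * d <= n; ++d)
        if (n % d == 0)
            return false;
    return true;
}

int main(void)
{
    printf("%d\n", is_prime(1000003ULL));
    return 0;
}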
After the i-th iteration (i = 0, 1, ...) the value of a is 3^(2^i). There will be O(log log n) iterations, and assuming arithmetic operations take O(1), that is the time complexity.
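To make the pattern concrete (a throwaway sketch of my own; the bound 1000000000 is arbitrary), printing a before each squaring shows the sequence 3, 9, 81, 6561, 43046721, ...:

#include <stdio.h>

int main(void)
{
    unsigned long long a = 3, n = 1000000000ULL;  /* wide type so the squaring can't overflow here */
    int i = 0;
    while (a <= n) {
        printf("i = %d: a = %llu\n", i, a);  /* a == 3^(2^i) at this point */
        a = a * a;
        ++i;
    }
    return 0;
}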
Well, you could actually go into an infinite loop!
Assume 32-bit integers:
Try this:
int a = 3;
int n = 2099150850;
while (a <= n)
{
    a = a * a;   /* 43046721 * 43046721 overflows a 32-bit int */
}
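For comparison, here is a 64-bit variant of the same loop (my own sketch, assuming long long is at least 64 bits); it terminates for this n because the squaring no longer wraps around:

#include <stdio.h>

int main(void)
{
    long long a = 3;
    long long n = 2099150850;
    int steps = 0;
    while (a <= n) {
        a = a * a;   /* largest value reached is 43046721 * 43046721, which fits in 64 bits */
        ++steps;
    }
    printf("terminated after %d squarings, a = %lld\n", steps, a);
    return 0;
}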
But assuming there are no integer overflows involved, the other posters are right: it is O(log log n) if you assume O(1) multiplication.
An easy way to see this is:
x_{n+1} = x_n^2, with x_1 = a.
Taking logs, let t_n = log x_n.
Then t_{n+1} = 2 t_n.
I will leave the rest to you.
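(If you do want the rest spelled out, here is my own working, writing N for the loop's upper bound to avoid clashing with the index n:
t_n = 2^(n-1) t_1 = 2^(n-1) log a.
The loop stops once x_n > N, i.e. once 2^(n-1) log a > log N, which happens after n = O(log_2(log N / log a)) = O(log log N) iterations.)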
It becomes more interesting if you consider the complexity of multiplying two k-digit numbers.
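A back-of-the-envelope version of that (my own estimate, assuming schoolbook multiplication, where multiplying two k-digit numbers costs about k^2 digit operations): the i-th squaring handles numbers with roughly 2^i digits, so the total cost is about
sum over i of (2^i)^2 = O(4^T) = O((log n)^2),
where 2^T ~ log n is the digit count of the last value of a below n. The costs grow geometrically, so the final squaring dominates; with a faster multiplication routine of cost M(k), the same argument gives O(M(log n)) overall.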