views: 140

answers: 3

Hello, I'm still learning about complexity measurement using Big O notation, and I was wondering if I'm correct to say that the following method's complexity is O(n*log4(n)), where the "4" is a subscript (i.e. the base of the logarithm).

public static void f(int n)
{
    for (int i=n; i>0; i--)
    {
        int j = n;
        while (j>0)
            j = j/4;
    }
}
+6  A: 

Yes, you are correct that the complexity of the function is O(n*log_4(n)).

log_4(n) = ln(n) / ln(4), and ln(4) is a constant, so if your function has complexity O(n*log_4(n)), it is also true that it has complexity O(n*ln(n)).
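
Written out as a single worked step (just the standard change-of-base identity, nothing specific to this code):

$$
n \cdot \log_4 n \;=\; n \cdot \frac{\ln n}{\ln 4} \;=\; \frac{1}{\ln 4}\, n \ln n ,
$$

and since 1/ln(4) is a positive constant, O(n*log_4(n)) and O(n*ln(n)) describe exactly the same set of functions.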

Maciej Hehl
+3  A: 

Did you mean

public static void f(int n) 
{ 
    for (int i=n; i>0; i--) 
    { 
        int j = i;  // Not j = n.
        while (j>0) 
            j = j/4; 
    } 
} 

?

In that case, too, you are correct: it is O(n log n). Using the 4 as a subscript is correct as well, but it only makes the notation more cumbersome to write.

Even with the j = n line, it is O(n log n).

In fact, to be more precise, you can say it is Theta(n log n).
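
If you want a quick empirical sanity check, a sketch like the following (the `ComplexityCheck` wrapper, the counter, and the printout are mine, not part of the question's code) counts the inner-loop iterations and compares them with n*log_4(n):

public class ComplexityCheck
{
    // Total number of inner-loop iterations of the original f(n) (the j = n version).
    static long countInnerIterations(int n)
    {
        long count = 0;
        for (int i = n; i > 0; i--)
        {
            int j = n;
            while (j > 0)
            {
                j = j / 4;   // same update as in the question
                count++;
            }
        }
        return count;
    }

    public static void main(String[] args)
    {
        // The counts should grow roughly like n * log_4(n).
        for (int n = 1000; n <= 1000000; n *= 10)
        {
            double expected = n * (Math.log(n) / Math.log(4)); // n * log_4(n)
            System.out.printf("n=%d  actual=%d  n*log4(n)=%.0f%n",
                    n, countInnerIterations(n), expected);
        }
    }
}

For n = 1000, 10000, ..., the two columns stay within a small constant factor of each other, which is what O(n*log_4(n)) predicts.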

Moron
I don't think using the 4 as a subscript is correct though, since it's the same as writing a constant: `log4(n) = log(n)/log(4)`.
IVlad
@IVlad: It is still correct, just as it is correct to say O(2n) or O(n^2 + 3n + 45). It is just another function, and we usually avoid constants to make it simpler to read/write/understand. Having constants in there is annoying, but not incorrect.
Moron
@Moron - every text I've seen says that you do not put constants or lower-degree terms in big-O notation, so `O(2n)` is not correct and should be `O(n)`, and your second example should be `O(n^2)`. Both Wikipedia and http://www.perlmonks.org/?node_id=227909 seem to suggest this: `when you see O(2N) or O(10 + 5N), someone is blending implementation details into the conceptual ones`. Can you provide a reference that discusses this issue in more detail?
IVlad
It follows from the definition of Big O. Assume n > 0 and f is positive-valued. If f <= c*n^2 for some c > 0, then f <= c*(n^2 + 3n + 45), i.e. f = O(n^2) implies that f = O(n^2 + 3n + 45). Fortunately, f = O(n^2 + 3n + 45) also implies that f = O(n^2), so the two are interchangeable, and you can drop the lower-degree terms to make it easier to talk about. I agree with you that using constants/unnecessary terms usually indicates inexperience with Big O, but technically it is still correct.
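
Spelling out both directions with an explicit constant (the 49 below is just one convenient choice, valid for n >= 1):

$$
n \ge 1 \;\Longrightarrow\; n^2 \;\le\; n^2 + 3n + 45 \;\le\; n^2 + 3n^2 + 45n^2 \;=\; 49\,n^2 ,
$$

so f <= c*n^2 implies f <= c*(n^2 + 3n + 45), and conversely f <= c*(n^2 + 3n + 45) implies f <= 49c*n^2; the two O-classes therefore contain exactly the same functions.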
Moron
Sorry, I don't have a reference, but any sufficiently formal book on Big O should cover it.
Moron
It also follows directly from the definition that the people who answered `O(n²)` and got downvoted were actually right.
Jörg W Mittag
@Jorg: Yes, technically, just saying it is O(n^2) was right. But they also said other things that were wrong: one of them said it would take n^2/2 steps, and another said that the inner loop was n/4.
Moron
I agree. I just wanted to point out that the definitions of the different Bachmann-Landau symbols may have some surprising results if you're not careful and for example mix up `Ο` and `Θ`. The fact that any function which is in `Ο(n×log n)` is also in `Ο(n²)` as well as `Ο(n×log₄ n)` and `Ο(n² + 3×n + 45)` is blatantly obvious from the definition, but often overlooked. In particular, many people often ask for `Ο` when they actually want `Θ` and the difference can be pretty significant.
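
For reference, one common formulation of the two definitions (quantifier details vary slightly between textbooks):

$$
f \in O(g) \iff \exists\, c > 0,\ n_0 :\ \forall n \ge n_0,\ f(n) \le c\, g(n)
$$
$$
f \in \Theta(g) \iff \exists\, c_1, c_2 > 0,\ n_0 :\ \forall n \ge n_0,\ c_1\, g(n) \le f(n) \le c_2\, g(n)
$$

Θ adds the matching lower bound, which is why a function in Ο(n·log n) is also in Ο(n²) but can never be in Θ(n²).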
Jörg W Mittag
@Jorg: Agree. +1 :) It is a common problem. But fortunately, it is usually clear from the context.
Moron
+1  A: 

Yes, you are right, the complexity is O(n*log4(n)).

svlada