My question arises from the post "Plain English Explanation of Big O". I don't know the exact meaning of logarithmic complexity. I know that I can run a regression between the time and the number of operations, calculate the chi-squared value, and so determine the complexity empirically. However, I want to know a method to determine it quickly on paper.

How do you determine logarithmic complexity? Are there some good benchmarks?
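One quick empirical check, as a sketch of the regression idea above (my addition; `work(n)` is a hypothetical stand-in for the algorithm under test): if the running time is O(log n), then doubling n should add a roughly constant increment to the time, since log(2n) = log(n) + log(2).

for (long n = 1_000; n <= 1_000_000_000L; n *= 2) {
    long t0 = System.nanoTime();
    work(n);                      // hypothetical algorithm under test
    long t1 = System.nanoTime();
    // For an O(log n) algorithm, each doubling of n should add a
    // roughly constant amount to (t1 - t0).
    System.out.println("n=" + n + "  time=" + (t1 - t0) + " ns");
}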

+6  A: 

The Master theorem usually works.
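For reference (my addition; the answer itself is just the one line above), the Master theorem solves divide-and-conquer recurrences of the form

T(n) = a*T(n/b) + f(n),   with a >= 1, b > 1.

Binary search, for instance, satisfies T(n) = T(n/2) + O(1), which the theorem resolves to T(n) = Θ(log n).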

Mehrdad Afshari
Somewhat difficult to think about, but very good once you master it.
samoz
+8  A: 

Not rigorous, but if you have an algorithm that halves the work remaining to be done on each iteration, then you have logarithmic complexity. The classic example is binary search.
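To make that concrete, here is a minimal binary-search sketch (my illustration, not part of the original answer): each pass discards half of the remaining range, so a sorted array of n elements takes at most about log2(n) + 1 passes.

static int binarySearch(int[] sorted, int target) {
    int lo = 0, hi = sorted.length - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;            // midpoint without overflow
        if (sorted[mid] == target) return mid;
        if (sorted[mid] < target) lo = mid + 1;  // discard the lower half
        else hi = mid - 1;                       // discard the upper half
    }
    return -1;                                   // not found
}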

Mitch Wheat
Not necessarily. I understand what you are trying to imply, but just because you divide the work in half doesn't mean you get logarithmic complexity; you could even have exponential time, for that matter. You have to note how the divided subproblems are solved and how the solutions are recombined, too. There are a lot of cases where the recombination step dominates. See the Master theorem, or better, solve the recurrence without the theorem. There are a lot of surprises underneath a simple recurrence.
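For instance (my example, not from the comment): merge sort also divides its input in half, T(n) = 2*T(n/2) + Θ(n), but the Θ(n) merge (recombination) step makes the total Θ(n log n), not Θ(log n).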
kunjaan
@kunjaan: I think you are misunderstanding me. I didn't just say divide the work in half, I said "the work needed to be done is halved on each iteration". What I mean is, if at each step there is half the work remaining to be done as compared to the previous step, then you have logarithmic complexity (for work, read computations).
Mitch Wheat
+4  A: 

If you just want to know about logarithmic Big Oh, be on the lookout for when your data is cut in half each step of the recurrence.

This is because if the data you process at each step is half the size it was the step before, the problem size shrinks geometrically, and it only takes about log2(n) steps to get from n down to a constant size.
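In symbols (my addition): after k halving steps, n/2^k of the data remains, so reaching a single element means n/2^k = 1, i.e. k = log_2(n) steps.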

samoz
Not necessarily 1/2. 1/c works, as long as `c` is constant.
Mehrdad Afshari
but 1/2 is more 'intuitive'
Mitch Wheat
Well, usually when talking about Big O, log means log base 2.
samoz
@samoz, logarithmic is independent of the base. log_a(x) = log_b(x)/log_b(a) to convert from base a to base b.
George V. Reilly
@George agreed, but it is commonly base 2.
samoz
+3  A: 

Not sure if this is what you mean, but... logarithmic complexity usually arises when you're working with a spread-out data structure like a balanced binary tree, which contains 1 node at the root, 2 children, 4 grandchildren, 8 great-grandchildren, etc. At each level the number of nodes gets multiplied by some factor (here 2), but still only one of them is involved in the iteration. Or, as another example, a loop in which the index doubles at each step:

for (int i = 1; i < N; i *= 2) { ... }
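// i takes the values 1, 2, 4, 8, ..., so the body executes about log2(N) times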

Things like that are the signatures of logarithmic complexity.
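To make the tree case concrete, here is a lookup sketch (my illustration, assuming a bare-bones Node class; not part of the original answer). On a balanced tree of n nodes the height is about log2(n), and each pass of the loop descends one level:

class Node {
    int key;
    Node left, right;
}

static boolean contains(Node root, int key) {
    Node cur = root;
    while (cur != null) {
        if (key == cur.key) return true;
        cur = (key < cur.key) ? cur.left : cur.right; // descend one level
    }
    return false;
}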

David Zaslavsky
+1, very interesting. I am looking for more examples like yours. Is an algorithm like this logarithmic: for (int i = BIG_number; i > N; i *= 1/2) { ... }
Masi
1/2 is zero in integer division, but if you use "i /= 2" instead, yes it is. (If that's the particular algorithm you're wondering about, it might have been a good idea to include it in your question.)
David Zaslavsky
+2  A: 

Here is another way of saying it.

Suppose your algorithm is linear in the number of digits in the size of the problem. Since a number n has about log10(n) digits, linear in the digit count means logarithmic in n. So perhaps you have a new algorithm to factor a large number that you can show to be linear in the number of digits: a 20-digit number would then take only twice as long to factor as a 10-digit number. That would be log complexity. (And it would be worth something to the inventor.)

Bisection has the same behavior: roughly 10 bisection steps cut the interval length by a factor of 1024 = 2^10, while only 20 steps cut it by a factor of 2^20, about a million.
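A bisection sketch (my illustration; f and tol are assumptions, not from the answer). The interval halves on every pass, so shrinking it from an initial width of (hi - lo) down to tol takes about log2((hi - lo) / tol) iterations, matching the 2^10 and 2^20 arithmetic above:

// Find a root of f in [lo, hi], assuming f(lo) and f(hi) have opposite signs.
static double bisect(java.util.function.DoubleUnaryOperator f,
                     double lo, double hi, double tol) {
    while (hi - lo > tol) {
        double mid = (lo + hi) / 2;
        if (f.applyAsDouble(lo) * f.applyAsDouble(mid) <= 0) {
            hi = mid;   // the sign change, hence the root, is in the lower half
        } else {
            lo = mid;   // otherwise it is in the upper half
        }
    }
    return (lo + hi) / 2;
}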

Log complexity does not always mean an algorithm is fast on all problems. The constant factor in front of the O(log(n)) may be large, so your algorithm may be terrible on small problems, not becoming useful until the problem size is so large that other algorithms die an exponential (or polynomial) death.

woodchips
Well explained, especially the point about large problem sizes.
Masi