I've been watching some video lectures from MIT's OpenCourseWare website, and in the third lecture the lecturer goes over recursive matrix multiplication and arrives at the time complexity:

T(n) = Θ(n^3).

It's obvious that I need to review some of my math, but I don't see the connection between that answer and any of the cases of the Master Theorem. I know the recurrence has the form T(n) = a*T(n/b) + f(n) for n > 1, and for this recurrence a = 8, b = 2, and f(n) = Θ(n^2).
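
To make sure I'm reading the recurrence off the algorithm correctly, here's a rough Python sketch of the divide-and-conquer multiplication as I understand it (the helper names split/add/join are just mine, not from the lecture):

    # Rough sketch of recursive matrix multiplication, assuming n is a power of 2.
    # The point is the 8 half-size products (a = 8, b = 2) plus the
    # Theta(n^2) splitting/adding/joining work (f(n) = Theta(n^2)).

    def split(M):
        """Split an n x n matrix (list of lists) into four n/2 x n/2 quadrants."""
        n = len(M)
        h = n // 2
        A = [row[:h] for row in M[:h]]
        B = [row[h:] for row in M[:h]]
        C = [row[:h] for row in M[h:]]
        D = [row[h:] for row in M[h:]]
        return A, B, C, D

    def add(X, Y):
        """Elementwise addition: Theta(n^2) work per call."""
        return [[x + y for x, y in zip(rx, ry)] for rx, ry in zip(X, Y)]

    def join(TL, TR, BL, BR):
        """Reassemble four quadrants into one matrix."""
        top = [l + r for l, r in zip(TL, TR)]
        bottom = [l + r for l, r in zip(BL, BR)]
        return top + bottom

    def matmul(X, Y):
        n = len(X)
        if n == 1:                      # base case: 1 x 1 product
            return [[X[0][0] * Y[0][0]]]
        A, B, C, D = split(X)
        E, F, G, H = split(Y)
        # 8 recursive multiplications of n/2 x n/2 blocks -> a = 8, b = 2
        TL = add(matmul(A, E), matmul(B, G))
        TR = add(matmul(A, F), matmul(B, H))
        BL = add(matmul(C, E), matmul(D, G))
        BR = add(matmul(C, F), matmul(D, H))
        # splitting, adding, and joining are all Theta(n^2) -> f(n) = Theta(n^2)
        return join(TL, TR, BL, BR)

    if __name__ == "__main__":
        print(matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # [[19, 22], [43, 50]]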

So, how did they get Θ(n^3)?

He mentioned that log_2(8) = 3, which makes sense. But I just can't figure out which of the three cases this recurrence corresponds to, given the value of f(n).

Since Case 2 is the only one where f(n) = Θ(anything), I'm guessing that's it. I suspect my problem comes down to properties of logarithms or exponents.

But Case 2 requires f(n) = Θ(n^(log_2(8)) * (log_2(n))^k), which is Θ(n^3) up to a log factor, so how do you reconcile that with our f(n) being Θ(n^2)? What is it that I might be missing here?
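
For reference, here are the three cases as I understand them (written out in LaTeX), for T(n) = a*T(n/b) + f(n) with critical exponent log_b(a):

    \text{Case 1: } f(n) = O(n^{\log_b a - \epsilon}) \text{ for some } \epsilon > 0 \;\Rightarrow\; T(n) = \Theta(n^{\log_b a})
    \text{Case 2: } f(n) = \Theta(n^{\log_b a} (\log n)^k) \text{ for some } k \ge 0 \;\Rightarrow\; T(n) = \Theta(n^{\log_b a} (\log n)^{k+1})
    \text{Case 3: } f(n) = \Omega(n^{\log_b a + \epsilon}) \text{ for some } \epsilon > 0, \text{ and } a\, f(n/b) \le c\, f(n) \text{ for some } c < 1 \;\Rightarrow\; T(n) = \Theta(f(n))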

A: 

Check out the Wikipedia page on the Master Theorem: http://en.wikipedia.org/wiki/Master_theorem

It discusses this exact situation (a = 8, b = 2, f(n) = O(n^2)) under Case 1, not Case 2. Note that if f(n) = Θ(n^2) then f(n) = O(n^2), so Case 1 can be applied.
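
Spelling out the Case 1 check for this recurrence (a quick sketch in LaTeX):

    f(n) = \Theta(n^2) \;\Rightarrow\; f(n) = O(n^2) = O(n^{3 - 1}) = O(n^{\log_2 8 - \epsilon}) \quad\text{with } \epsilon = 1
    \Rightarrow\; T(n) = \Theta(n^{\log_2 8}) = \Theta(n^3)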

Hope that helps.

Moron
Thanks. Before I read your answer, I let the video play on and found that it was indeed Case 1. Similar to your answer, I figured that the Θ bound still fits within big O (big Omicron), since f(n) = O(g(n)) only requires the f(n) <= c*g(n) condition, and Θ(g(n)) says f(n) stays within constant factors of g(n), so it still fits within O(g(n)).
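In symbols, roughly what I mean (for all sufficiently large n):

    f(n) = \Theta(g(n)) \;\Rightarrow\; \exists\, c_1, c_2 > 0:\; c_1\, g(n) \le f(n) \le c_2\, g(n)
    \;\Rightarrow\; f(n) \le c_2\, g(n) \;\Rightarrow\; f(n) = O(g(n))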
Liars_Paradox