What is the time complexity of the following code?

for (i = 0; i < m; i++)
{
    for (j = i + 1; j < m; j++)
    {
        for (k = 0; k < n; k++)
        {
            for (l = 0; l < n; l++)
            {
                if (condition) do something
            }
        }
    }
}
A: 

O(m^2*n^2*(complexity of something)). If the condition and the "something" are executed in constant time, then it is O(m^2*n^2).

agsamek
What about k and l?
ziggystar
@ziggystar, what about them? They're bound by `n`, so no need to include them in the big-O notation.
Bart Kiers
The -1 is not mine, but the spoon-fed answer without any explanation makes it understandable.
Bart Kiers
+1  A: 

Looks like O(m^2 n^2) to me, assuming the "something" is constant-time.

Although the j loop starts from a different point with each step, the combined effect of the i and j loops is still an m^2 factor.

Evaluating the unstated condition itself would normally be (at least) a constant-time operation, so the loop certainly cannot be faster than O(m^2 n^2) - unless, of course, the "something" includes a break, goto, exception throw or anything else that exits out of one or more of the loops early.
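
To make that last point concrete, here is a minimal sketch; `condition()` and `do_something()` are hypothetical stand-ins for the question's "condition" and "do something". A single goto leaves all four loops the first time the condition holds, so the best case can be far fewer than m^2*n^2 iterations even though the worst case bound is unchanged.

#include <stdbool.h>

/* Hypothetical stand-ins for the question's "condition" and "do something". */
bool condition(void);
void do_something(void);

void loop_with_early_exit(int m, int n)
{
    for (int i = 0; i < m; i++)
        for (int j = i + 1; j < m; j++)
            for (int k = 0; k < n; k++)
                for (int l = 0; l < n; l++)
                    if (condition()) {
                        do_something();
                        goto done;   /* leaves all four loops at once */
                    }
done:
    return;
}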

All bets are off if, for any reason, either n or m isn't constant throughout.

Steve314
+3  A: 

In detail:

The first two loops result in (m-1) + (m-2) + (m-3) + ... + 1 repetitions, which equals m*(m-1)/2. As for the two inner loops, they both run from 0 to n-1, so they need n^2 iterations.
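
If you want to sanity-check that count, a minimal C snippet like the following (with an arbitrary example value of m) just runs the outer two loops and compares the tally against m*(m-1)/2:

#include <stdio.h>

int main(void)
{
    int m = 7;          /* arbitrary example value */
    long count = 0;

    /* Same bounds as the outer two loops in the question. */
    for (int i = 0; i < m; i++)
        for (int j = i + 1; j < m; j++)
            count++;

    printf("counted = %ld, m*(m-1)/2 = %d\n", count, m * (m - 1) / 2);
    return 0;
}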

As you have no clue whether the condition will be fulfilled or not, you take the worst case, which is that it is always fulfilled.

Then the number of iterations is:

m*(m-1)/2 * n^2 * NumberOfIterations(Something)

In O notation, constant factors and lower-order terms are dropped, so the complexity is:

O(m^2*n^2)*O(Something)

Rafid K. Abdullah
The second two loops are different - both k and l always start from zero and end at n. Not relevant to big O - just being pedantic.
Steve314
Oh, sorry, I didn't notice that. Then the second two loops require n^2 iterations. So the process goes as follows: Number of Iterations = m*(m+1)*n^2*NumberOfIterations(Something). But the result will still be the same. I will edit my original answer.
Rafid K. Abdullah
It is `m * (m - 1) / 2`, as I have stated in my answer.
Ani
Ani, thanks for the comment. You are right. Let me correct it.
Rafid K. Abdullah
Thanks for such good explanations. And what change would a break statement inside the if block make to the time complexity?
yashgos
@yashgos, like @snakile said above, it doesn't have any effect. O notation is mainly used to get an idea of how the computation cost grows when the number of iterations is large, which is why constant factors and lower-order terms are dropped. But if you want to know the number of iterations when there is a break statement inside the if block, here is how it goes: first, you need the probability that the if condition is satisfied. Let's suppose that probability is p; that is, the probability of entering the if block is p, so the probability of not entering it is (1-p).
Rafid K. Abdullah
Now, if the if condition is satisfied, "Something" will be done, so you add its iterations to the total number of iterations; otherwise you add nothing and just continue. So, iteration by iteration:

1st iteration: p * NoOfIterations(Something)
2nd iteration: (1-p) * p * NoOfIterations(Something)

We multiplied by (1-p) because we only reach the 2nd iteration if the condition was not satisfied in the first one.
Rafid K. Abdullah
ith iteration: (1-p)^(i-1) * p * NoOfIterations(Something)

Now you need to sum that up :-) For this, use the geometric series formula: http://en.wikipedia.org/wiki/Geometric_series#Formula

The final result, if I am not mistaken, is [1-(1-p)^(n+1)] * NoOfIterations(Something). You should check the calculation though, as such calculations are usually error-prone. Finally, you multiply this result by m*(m-1)/2 * n^2 to get the "expected" number of iterations.

Hope that helps, and sorry for the complications!
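
For reference, the geometric-series step written out (a sketch; N stands for the total number of iterations and S for NoOfIterations(Something), names chosen only for this formula):

\sum_{i=1}^{N} (1-p)^{i-1} \, p \, S
  = p \, S \cdot \frac{1 - (1-p)^{N}}{1 - (1-p)}
  = \bigl(1 - (1-p)^{N}\bigr) \, S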
Rafid K. Abdullah
Sorry for the many comments, but the explanation is quite long.
Rafid K. Abdullah
Thanks again, now I am able to comprehend complexity-related problems nicely.
yashgos
+1  A: 

I assume the time complexity of "do something" is O(S).

Let's start with the innermost loop: its time complexity is O(n*S), because it does something n times. The loop that wraps the innermost loop has a time complexity of O(n)*O(n*S) = O(n^2*S), because it runs the innermost loop n times.

The loop which wraps the second-innermost loop has a time complexity of O(m-i-1)*O(n^2*S), because it performs an O(n^2*S) operation m-i-1 times (j runs from i+1 to m-1).

Now for the harder part: for each i in the range 0...m-1 we do an (m-i-1)*O(n^2*S) operation. How long does that take in total? ((m-1) + (m-2) + ... + 1 + 0)*O(n^2*S). But (m-1) + (m-2) + ... + 1 is the sum of an arithmetic sequence, so it equals m*(m-1)/2 = O(m^2).
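
Written out, that summation over i is just an arithmetic series:

\sum_{i=0}^{m-1} (m-1-i) = (m-1) + (m-2) + \dots + 1 + 0 = \frac{m(m-1)}{2} = O(m^2)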

Conclusion: We do an O(n^2*S) operation about m^2 times. The time complexity of the whole thing is therefore O(m^2*n^2*S).

snakile
+2  A: 
for (i = 0; i < m; i++)
{
    for (j = i + 1; j < m; j++)
    {

The inner loop will run ((m - 1) + (m - 2) + ... + 1) times

= 1 + 2 + 3 + ... + (m - 1)
= m * (m - 1) / 2

for (k = 0; k < n; k++)
{
    for (l = 0; l < n; l++)
    {

In this case, the inner loop clearly runs n * n times


So clearly, the number of iterations is exactly

  (m * (m - 1) / 2) * (n * n)
= O(m^2 * n^2)

Obviously, this assumes that `if(condition) do something` runs in constant time.

Ani
A: 

O(m^2 * n^2) * (complexity of "something")