views:

1198

answers:

5

I'm trying to figure out how to give a worst-case time complexity, and I'm not sure about my analysis. I have read that nested for loops are O(n^2); is this also correct for a for loop with a while loop inside?

// A is an array of real numbers.
// The size of A is n. i,j are of type int, key is
// of type real. 
Procedure IS(A)    
for j = 2 to length[A]     
{                
   key = A[ j ]               
   i = j-1   
   while i>0 and A[i]>key    
   {       
      A[i+1] = A[i]                         
      i=i-1                    
   }                
   A[i+1] = key      
}
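For reference, here is a direct Python translation of the pseudocode (a sketch of my own: it uses 0-based indexing instead of the pseudocode's 1-based indexing, and the function name `insertion_sort` is mine):

```python
def insertion_sort(a):
    a = list(a)                        # work on a copy of A
    for j in range(1, len(a)):         # pseudocode's j = 2 .. length[A]
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:   # pseudocode's i > 0 becomes i >= 0
            a[i + 1] = a[i]            # shift the larger element one slot right
            i -= 1
        a[i + 1] = key                 # drop key into its sorted position
    return a
```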

so far I have:

j=2 (+1 op)  

i>0 (+n ops)   
A[i] > key (+n ops)

so T(n) = 2n+1?

But I'm not sure if I have to go inside the while and for loops to analyze the worst-case time complexity...

Now I have to prove that it is tightly bound, that is, Big Theta.

I've read that nested for loops have Big O of n^2. Is this also true for Big Theta? If not, how would I go about finding Big Theta?

**Notation: C1 = C sub 1, C2 = C sub 2, and n0 = n-naught; all are elements of the positive real numbers.

To find T(n) I looked at the values of j and counted how many times the while loop's condition is tested in the worst case (A sorted in decreasing order). For a given j, the condition succeeds j-1 times (one shift each) and then fails once, so it is tested j times:

values of j:      2, 3, 4, ... n  
condition tests:  2, 3, 4, ... n

Analysis:
Take the summation of the condition tests: 2 + 3 + ... + n = (n(n+1))/2 - 1 ≤ (n(n+1))/2. I will assign T(n) = (n(n+1))/2 and prove it is tightly bounded by n^2. That is, n(n+1)/2 = Θ(n^2).
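To double-check the count empirically, here is a small sketch (the function name and the instrumentation are my own) that tallies condition tests on a reversed array, the worst case:

```python
def insertion_sort_counting(a):
    a = list(a)
    tests = 0
    for j in range(1, len(a)):          # 0-based version of j = 2 .. n
        key = a[j]
        i = j - 1
        while True:
            tests += 1                  # one evaluation of the while condition
            if not (i >= 0 and a[i] > key):
                break
            a[i + 1] = a[i]             # shift right
            i -= 1
        a[i + 1] = key
    return a, tests
```

On a reversed array of length n, `tests` comes out to n(n+1)/2 - 1, matching the summation above.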

Scratch Work:

Find C1, C2, n0 є R+ such that 0 ≤ C1(n^2) ≤ (n(n+1))/2 ≤ C2(n^2) for all n ≥ n0

To make 0 ≤ C1(n^2) true, C1 and n0 can be any positive reals   
To make C1(n^2) ≤ (n(n+1))/2 hold for all n ≥ 1, C1 must be ≤ 1/2 (since (n(n+1))/2 ≥ n^2/2)  
To make (n(n+1))/2 ≤ C2(n^2) hold for all n ≥ 1, C2 = 1 suffices (since n+1 ≤ 2n for n ≥ 1)

PF:

Find C1, C2, n0 є R+ such that 0 ≤ C1(n^2) ≤ (n(n+1))/2 ≤ C2(n^2) for all n ≥ n0
Let C1 = 1/2, C2 = 1 and n0 = 1.
  1. Show that 0 ≤ C1(n^2) is true.
    C1(n^2) = n^2/2
    Since n ≥ n0 = 1, we have n^2/2 ≥ 1/2 > 0
    Therefore 0 ≤ C1(n^2) is proven true!

  2. Show that C1(n^2) ≤ (n(n+1))/2 is true.
    n^2/2 ≤ (n(n+1))/2
    n^2 ≤ n(n+1)
    n^2 ≤ n^2+n
    0 ≤ n

    This we know is true since n ≥ n0 = 1
    Therefore C1(n^2) ≤ (n(n+1))/2 is proven true!

  3. Show that (n(n+1))/2 ≤ C2(n^2) is true.
    (n(n+1))/2 ≤ C2(n^2)
    (n+1)/2 ≤ C2(n)        (dividing both sides by n > 0)
    n+1 ≤ 2C2(n)
    n+1 ≤ 2n               (substituting C2 = 1)
    1 ≤ n
    Also, we know this to be true since n ≥ n0 = 1

    Hence by 1, 2 and 3, (n(n+1))/2 = Θ(n^2) is true since
    0 ≤ C1(n^2) ≤ (n(n+1))/2 ≤ C2(n^2) for all n ≥ n0
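The chosen constants can also be sanity-checked numerically. This quick sketch (mine, not part of the proof) verifies the sandwich inequality over a range of n:

```python
# Check 0 <= C1*n^2 <= n(n+1)/2 <= C2*n^2 for C1 = 1/2, C2 = 1, n0 = 1.
C1, C2, n0 = 0.5, 1.0, 1
for n in range(n0, 2001):
    t = n * (n + 1) / 2
    assert 0 <= C1 * n * n <= t <= C2 * n * n
```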

Tell me what you think, guys... I'm trying to understand this material and would like y'all's input!

+4  A: 

You seem to be implementing the insertion sort algorithm, which Wikipedia claims is O(N^2).

Generally, when dealing with Big-O you break the code down in terms of your variable N rather than your constants. In your case, all you need to do is look at the loops.

Your two loops are (worst case):

for j=2 to length[A]
    i=j-1
    while i > 0
        /*action*/
        i=i-1

The outer loop is O(N), because it directly relates to the number of elements.

Notice how your inner loop depends on the progress of the outer loop. That means that (ignoring off-by-one issues) the inner and outer loops are related as follows:

 j's     inner
value    loops
-----    -----
  2        1
  3        2
  4        3
  .        .
  N       N-1
-----    -----
total    (N-1)*N/2

So the total number of times that /*action*/ is encountered is (N^2 - N)/2, which is O(N^2).
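This count can be confirmed with a small sketch (the function name is mine) that literally runs the loop skeleton above and tallies /*action*/:

```python
def action_count(n):
    # Worst case: j runs 2..n, and for each j the inner loop body runs j-1 times.
    count = 0
    for j in range(2, n + 1):
        i = j - 1
        while i > 0:
            count += 1   # /*action*/
            i -= 1
    return count
```

For any n, `action_count(n)` equals (n^2 - n)/2.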

strager
A: 

Big O is (basically) about how many times the elements in your loop will be looked at in order to complete a task.

For example, an O(n) algorithm will iterate through every element just once. An O(1) algorithm will not have to iterate through the elements at all; it will know exactly where in the array to look because it has an index. An example of this is an array or hash table lookup.

The reason a loop inside of a loop is O(n^2) is that for each of the n iterations of the outer loop, the inner loop iterates over the elements again, up to n times. Changing the type of the loop has nothing to do with it, since it's essentially about the number of iterations.

There are approaches to algorithms that will allow you to reduce the number of iterations you need. An example of these are "divide & conquer" algorithms like Quicksort, which if I recall correctly is O(n log(n)) on average.
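To illustrate the divide-and-conquer idea, here is a minimal (not in-place, not optimized) quicksort sketch of my own:

```python
def quicksort(a):
    # Base case: lists of length 0 or 1 are already sorted.
    if len(a) <= 1:
        return list(a)
    pivot = a[len(a) // 2]
    # Partition into three parts, then recurse on the two sides.
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```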

It's tough to come up with a better alternative to your example without knowing more specifically what you're trying to accomplish.

Dan Herbert
I think he's trying to find the complexity of his algorithm, not seeking an alternative algorithm.
strager
That's what I assumed. I gave the suggestion of an alternative algorithm in hopes that it might help the asker have a better scope of understanding of what they're doing. Hopefully it doesn't do the opposite and just create more confusion...
Dan Herbert
After looking at his code, it seems he's implementing an insertion sort (which is O(n^2), making my answer wrong!).
strager
Also "his" name is Ellen -- I know programmers are mostly male, but we should give people a chance to be female if that's what they are :P
mikeh
I AM A FEMALE PROGRAMMER... one of the only females in my computer classes! haha... but no offense taken... I'm trying to find T(n) and prove that it is tightly bound!
Ellen
+1  A: 

Looking at the number of nested loops isn't the best way to go about getting a solution. It's better to look at the "work" that's being done in the code, under a heavy load N. For example,

for(int i = 0; i < a.size(); i++)
{
  for(int j = 0; j < a.size(); j++)
  {
    // Do stuff
    i++; // note: the outer loop's counter advances here too
  }
}

is O(N).
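To see why that snippet is O(N), here is my own Python translation of it with a counter added; because `i` also advances inside the inner loop, the body runs only about N times total:

```python
def count_inner(n):
    count = 0
    i = 0
    while i < n:          # outer for loop over a.size() == n
        j = 0
        while j < n:      # inner for loop over a.size() == n
            count += 1    # "Do stuff"
            i += 1        # mirrors the i++ inside the inner loop
            j += 1
        i += 1            # the outer loop's own i++
    return count
```

For n >= 1 the inner body executes exactly n times, not n^2.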

A function f is in Big-Theta of g if it is both in Big-Oh of g and Big-Omega of g.

The worst case happens when the data A is a monotonically decreasing sequence. Then, for every iteration of the outer loop, the while loop runs all the way down. If each statement contributed a time value of 1, the total time would be about 5*(1 + 2 + ... + (n - 2)) = 5*(n - 2)*(n - 1)/2. This gives a quadratic dependence on the data.

However, if the data A is a monotonically increasing sequence, the condition A[i] > key always fails immediately, so each iteration of the outer loop takes constant time and the procedure as a whole runs in linear time. The best case of f then has a linear dependence on the data.

For the average case, we take the next number in A and find its place among the elements already sorted. On average, this number will land in the middle of that range, which implies the inner while loop will run half as often as in the worst case, still giving a quadratic dependence on the data.
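The best-case/worst-case gap described here can be illustrated with a sketch of my own that counts element shifts performed by the inner loop:

```python
def shift_count(a):
    # Insertion sort (0-based) that returns how many element shifts occur.
    a = list(a)
    shifts = 0
    for j in range(1, len(a)):
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:
            a[i + 1] = a[i]
            shifts += 1     # one shift per successful comparison
            i -= 1
        a[i + 1] = key
    return shifts
```

An already-sorted (increasing) input needs 0 shifts, while a reversed input of length n needs n(n-1)/2.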

Derek E
Can you please "solve" for O(N^2) yourself? I want to delete my answer. I don't feel it deserves to be here, because parts of it are wrong.
strager