Hi, I'm rather new to programming and have recently been introduced to the topic of asymptotic complexity. What I'm curious about is how to figure out the asymptotic complexity of a sorting method, given the number of elements and the time taken to sort them.

Here is an example of what I mean.

  • Time for 'sortArray' to sort sorted array with 400 elements: 4
  • Time for 'sortArray' to sort sorted array with 800 elements: 8
  • Time for 'sortArray' to sort sorted array with 1600 elements: 16
  • Time for 'sortArray' to sort sorted array with 3200 elements: 26

  • Time for 'sortArray' to sort random array with 400 elements: 255
  • Time for 'sortArray' to sort random array with 800 elements: 958
  • Time for 'sortArray' to sort random array with 1600 elements: 4059
  • Time for 'sortArray' to sort random array with 3200 elements: 16585

Any pointers on how to go about calculating the Big O notation of something like this? Thanks!

+2  A: 

By definition, the asymptotic complexity of an algorithm describes its rate of growth (in time, space, or some other resource) as the size of the input increases. It's best to analyze the algorithm itself to determine this rate of growth. However, if all you have is a "black box" sorting algorithm, and you only know the size of the input and the resulting time, the best you can do is check whether a graph of input size vs. time appears to grow in the pattern that a particular function would.
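
Before graphing anything, a quick numeric version of the same check is to look at how the measured time changes each time the input size doubles. Here's a minimal Python sketch over the timings posted above (assuming they are all in the same unit, e.g. milliseconds):

```python
# Ratio check: when n doubles, by what factor does the measured time grow?
# Times copied from the question; units assumed consistent (e.g. milliseconds).
sizes        = [400, 800, 1600, 3200]
sorted_times = [4, 8, 16, 26]
random_times = [255, 958, 4059, 16585]

def doubling_ratios(times):
    """Return t(2n) / t(n) for each consecutive pair of measurements."""
    return [round(b / a, 2) for a, b in zip(times, times[1:])]

print("sorted input:", doubling_ratios(sorted_times))   # ~[2.0, 2.0, 1.62]
print("random input:", doubling_ratios(random_times))   # ~[3.76, 4.24, 4.09]

# A ratio near 2 is what O(n) predicts (a bit above 2 for O(n log n));
# a ratio near 4 is what O(n^2) predicts, since (2n)^2 = 4 * n^2.
```

On these numbers, the sorted-array times roughly double and the random-array times roughly quadruple each time n doubles, which already hints at roughly linear versus roughly quadratic behaviour. Graphing the measurements against candidate functions makes the same comparison more general.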

To test this idea, graph functions like:

  • f(n) = n
  • f(n) = n ln n
  • f(n) = n^2

etc., and see which most resembles the graph you made of the running times. Remember that asymptotic analysis omits both the constant factors on each term and any lower-order terms, so if your timings grow like f(n) = 2n, the factor of 2 is dropped and you still have an O(n) algorithm.
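
One way to make that comparison concrete, without eyeballing a plot, is to divide each measured time by each candidate f(n) and see which candidate gives a roughly constant ratio. A small Python sketch using the random-array timings from the question (the candidate set here is just one reasonable choice):

```python
import math

# Candidate growth functions to compare against the measurements.
candidates = {
    "n":       lambda n: n,
    "n log n": lambda n: n * math.log(n),
    "n^2":     lambda n: n * n,
}

sizes        = [400, 800, 1600, 3200]
random_times = [255, 958, 4059, 16585]   # timings from the question

for name, f in candidates.items():
    # If time(n) ~ c * f(n), then time(n) / f(n) should stay roughly constant.
    ratios = [t / f(n) for n, t in zip(sizes, random_times)]
    spread = max(ratios) / min(ratios)
    print(f"{name:8s} time/f(n) = {[round(r, 5) for r in ratios]}  (spread ~ {spread:.2f})")

# The candidate whose time/f(n) column stays flattest (spread closest to 1)
# is the best guess for the asymptotic growth; the value it settles on is
# exactly the constant factor that Big O notation throws away.
```

On these numbers the n^2 column is nearly flat while the n and n log n columns keep climbing, so n^2 is the best fit for the random-array case. The same check can be repeated on the sorted-array timings.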

Andrew