I'm trying to find out when a quadratic selection algorithm is faster than a linear selection algorithm. Running some experiments, I generated two 3D plots that show each algorithm's run time as a function of the input array size and the desired order statistic. Drawing the plots with gnuplot, I confirmed that there are cases where the quadratic algorithm is faster. I then used gnuplot's fitting algorithms to find two functions that model my observed run times (a, b, c, d, e, f are constants I've already determined but omit here):

lin_alg_runtime(x,y) = a*x + b*y +c

quad_alg_runtime(x,y) = (d*x * e*y) + f

where x is the size of the input array and y is the order statistic.

Now I'm kind of lost on how to use these models to decide when to switch between the quadratic implementation and the linear implementation. I suspect I have to find where these two functions intersect, but I'm not quite sure how to do that. How does one find where these two functions intersect?

+1  A: 

Basically you want to use the algorithm that has the lowest runtime estimate.

You can just calculate the value of each of the estimated runtimes, and use the algorithm with the lowest value. You can simplify this very slightly.

You want to use the quad algorithm when:

quad_alg_runtime(x,y) < lin_alg_runtime(x,y)
(d*x * e*y) + f < a*x + b*y + c
a*x + b*y - d*e*x*y + c - f > 0

Therefore you can calculate a*x + b*y - d*e*x*y + c - f, and if it's greater than zero, use the quadratic algorithm.
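A minimal sketch of that decision in Python. The coefficient values and the `linear_select` / `quadratic_select` names are made-up placeholders; substitute the constants gnuplot's fit produced for your machine and your own implementations:

```python
# Placeholder coefficients -- replace with your fitted values.
A, B, C = 3.0e-6, 1.0e-7, 2.0e-4   # lin_alg_runtime(x, y)  = A*x + B*y + C
D, E, F = 1.0e-4, 1.0e-4, 1.0e-4   # quad_alg_runtime(x, y) = D*x * E*y + F

def use_quadratic(x, y):
    """True when the quadratic model predicts a lower run time.

    Equivalent to D*E*x*y + F < A*x + B*y + C, rearranged to one side.
    """
    return A*x + B*y - D*E*x*y + (C - F) > 0

def select(arr, k):
    # Dispatch to whichever algorithm the models predict is faster.
    if use_quadratic(len(arr), k):
        return quadratic_select(arr, k)   # hypothetical implementation
    return linear_select(arr, k)          # hypothetical implementation
```

With these placeholder coefficients the quadratic algorithm wins on small inputs (its constant term F is smaller) and loses as x*y grows, which matches the crossover behaviour seen in the plots.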

Alex Deem
Doh! I'm embarrassed to have missed that. Thanks
Frank