I'm trying to find out when a quadratic selection algorithm is faster than a linear selection algorithm. From some experiments I generated two 3D plots showing each algorithm's run time as a function of the input array size and the desired order statistic. Drawing the plots with gnuplot confirmed that there are cases where the quadratic algorithm is faster. I then used gnuplot's fitting functionality to find two functions that model my observed run times (a, b, c, d, e, f are constants I've already fitted but omit here):
lin_alg_runtime(x,y) = a*x + b*y + c
quad_alg_runtime(x,y) = (d*x * e*y) + f
where x is the size of the input array and y is the order statistic.
Now I'm somewhat lost on how to use these models to decide when to switch between the quadratic implementation and the linear implementation. I suspect I have to find where these two functions intersect, but I'm not quite sure how to do that. How does one find where these two functions intersect?
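For reference, what I'd like to end up with is a decision rule like the sketch below. It compares the two fitted models directly, and also evaluates the break-even curve you get by setting lin_alg_runtime(x,y) = quad_alg_runtime(x,y), i.e. a*x + b*y + c = d*e*x*y + f, and solving for y. The constants here are placeholders purely for illustration, not my actual fits:

```python
def break_even_y(x, a, b, c, d, e, f):
    """Order statistic y at which both models predict the same runtime
    for array size x: a*x + b*y + c = d*e*x*y + f  =>
    y = (a*x + c - f) / (d*e*x - b). Returns None where undefined."""
    denom = d * e * x - b
    if denom == 0:
        return None
    return (a * x + c - f) / denom

def pick_algorithm(x, y, a, b, c, d, e, f):
    """Choose whichever model predicts the smaller runtime for (x, y)."""
    lin = a * x + b * y + c
    quad = (d * x) * (e * y) + f
    return "quadratic" if quad < lin else "linear"

# Placeholder constants (NOT my fitted values):
a, b, c, d, e, f = 2.0, 1.0, 5.0, 0.01, 1.0, 3.0
y_star = break_even_y(1000, a, b, c, d, e, f)
```

Here y_star would be the crossover order statistic for an array of size 1000; on one side of it the quadratic model is cheaper, on the other the linear one. But I'm unsure whether this per-point comparison is the right way to use the fits, or whether there's a more principled way to derive the switch-over.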