I missed the class where big-O was introduced, thinking it was pretty straightforward. It still seems to be; however, the teacher said something about O(n) deviating from the actual function when n gets very small? I couldn't find this anywhere in the book. Could someone enlighten me? Our exploration of O(n) has been in the context of sorting algorithms, if that is of any significance.
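If it helps to show where my head is at, here is my own guess at what was meant (the exact cost function n^2 + 50n + 100 is completely made up for illustration): big-O keeps only the dominant term, so for small n the dropped terms and constants can swamp it.

```
def exact_cost(n):
    # Made-up exact operation count for some hypothetical sort.
    return n**2 + 50 * n + 100

def dominant_term(n):
    # The only term big-O keeps.
    return n**2

for n in (1, 2, 5, 10, 100, 1000):
    t, g = exact_cost(n), dominant_term(n)
    print(f"n={n:5d}  exact={t:10d}  n^2={g:10d}  ratio={t / g:.2f}")
```

At n = 1 the n^2 term is only a tiny fraction of the made-up cost, while at n = 1000 it accounts for nearly all of it. Is that the sort of deviation the teacher meant?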
Thanks Gene
edit: Thanks for the help, guys; it has been illuminating. I have a follow-up question: is there a relatively simple mathematical way to figure out the point where n is too small for the big-O estimate to be a useful guide?
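To make that concrete, here is the kind of thing I have in mind, assuming you can write down (or measure) rough cost functions for two competing sorts. The constants 0.5 and 8.0 below are invented purely for illustration; I gather the real ones depend on the implementation and the machine.

```
import math

def insertion_cost(n):
    # Hypothetical cost model: ~0.5 * n^2 (small constant, bad growth).
    return 0.5 * n * n

def merge_cost(n):
    # Hypothetical cost model: ~8.0 * n * log2(n) (bigger constant, good growth).
    return 8.0 * n * math.log2(n) if n > 1 else 8.0

# Smallest n at which the asymptotically better algorithm actually wins.
crossover = next(n for n in range(2, 10_000) if merge_cost(n) < insertion_cost(n))
print("crossover at n =", crossover)
```

Is there a cleaner analytic way to get that crossover point, or does it always come down to measuring the constants?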