views: 138
answers: 7
Possible Duplicate:
Are there any O(1/n) algorithms?

I've been reading up on various algorithms recently, and have gotten very used to seeing running times like O(some combination of n, n^2, and log n). It seems pretty normal for algorithms to take longer as the input grows, so this doesn't really bother me, but are there many well-known algorithms whose running time decreases as the input grows? Or are there algorithms with, say, a running time that is periodic in the input length?

A: 

An algorithm, to be useful, usually has to process all of the input that it's given. Thus, running time is proportional to the input in some way. You can get different answers depending on how you look at it (length of the input vs. value of the input, sometimes), but I can't think of any algorithm that actually gets faster as the input gets longer.

Donnie
+1  A: 

You can never go below O(N) assuming you read the whole input, so it very much depends on the algorithm. Many algorithms need to read all of the input; others don't. If you only do a binary search (or your data is stored in a B-tree, as in databases), you can go as low as O(log N).
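For illustration, a minimal binary-search sketch (assuming the data is already sorted; the names are made up):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it is absent."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # halve the remaining range each step
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # -> 4
```

Each iteration discards half of the remaining range, so only O(log N) elements are ever examined, even though the list holds N of them.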

Alexandru
There are O(1) algorithms, such as insertion into a linked list or retrieving an item from an array.
outis
A: 

The closest an algorithm can come to decreasing in run time with a growing data set is to increase at a decreasing rate, as in O(log N). If we have data sets A and B, and all we know is that A is smaller than B, then B will take at least as long as A to run.

oltman
A: 

If you are talking about the empirical, measured time of an algorithm, e.g. the time to process each input element, that may be true (caching and other environmental conditions can influence it). But it does not hold for the asymptotic running time (big-O notation), since that specifies an upper bound on the running time over N elements.

Hernán
A: 

The Boyer-Moore string-search algorithm has a shorter runtime for longer search patterns, because a longer pattern allows bigger skips through the text.
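As a rough illustration, here is the simpler Boyer-Moore-Horspool variant (not the full Boyer-Moore algorithm); the skip table is what lets a longer pattern jump further through the text on each mismatch:

```python
def horspool_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0:
        return 0
    if m > n:
        return -1
    # On a mismatch the window shifts by up to m characters, so a longer
    # pattern allows bigger jumps, which is why the search can get faster
    # as the pattern grows.
    shift = {ch: m - 1 - i for i, ch in enumerate(pattern[:-1])}
    i = m - 1                              # index of the window's last character
    while i < n:
        k = 0
        while k < m and text[i - k] == pattern[m - 1 - k]:
            k += 1
        if k == m:
            return i - m + 1
        i += shift.get(text[i], m)
    return -1

print(horspool_search("the quick brown fox", "brown"))  # -> 10
```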

Dan Lorenc
Interesting example, but note that the complexity is still *O(n)*. Is the runtime for longer inputs actually shorter, or *relatively* shorter?
Stephan202
Actually shorter. Searching a sentence for the string 'abc' will, on average, take longer than searching it for 'abcdef'.
Dan Lorenc
+1  A: 

Though we usually think of the n in O(n) as the input size, it would be more accurate to think of it as the problem size. Larger problems are harder. Observe furthermore that big-O complexity describes the worst case.

Example: consider a sudoku solver. Assuming a regular 9x9 grid, the input to this solver is a set of (position, number) pairs. How much work the solver needs to do depends on the input: if no pairs or all pairs are given, a solution is easily found. Still, the complexity of the sudoku solver will be expressed as a monotonically increasing function, based on the most difficult input one can find.
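To make that concrete, here is a minimal backtracking-solver sketch (assuming the grid is passed as a flat list of 81 ints, 0 for an empty cell; this is just one possible layout). The more cells the input fills in, the smaller the search tree, yet the stated complexity is still based on the hardest possible input:

```python
def solve(grid):
    """Fill a 9x9 sudoku in place; grid is a list of 81 ints, 0 = empty.
    Returns True if a solution was found."""
    try:
        empty = grid.index(0)        # next empty cell; more givens -> fewer of these
    except ValueError:
        return True                  # no empty cells left: solved
    row, col = divmod(empty, 9)
    box = (row // 3) * 27 + (col // 3) * 3         # index of the cell's 3x3 box corner
    used = {grid[row * 9 + c] for c in range(9)}                          # same row
    used |= {grid[r * 9 + col] for r in range(9)}                         # same column
    used |= {grid[box + r * 9 + c] for r in range(3) for c in range(3)}   # same box
    for digit in range(1, 10):
        if digit not in used:
            grid[empty] = digit
            if solve(grid):
                return True
            grid[empty] = 0          # undo and try the next digit
    return False
```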

Stephan202
I guess I was thinking about the sudoku example when asking this. With more information, the time to solve would be smaller. But then again, one could say that as the amount of information increases, the actual problem size decreases.
Dasuraga
A: 

This is a useless example, but it at least answers your question. Consider a program that completes the alphabet a-z based on its input: if you give it one letter, it needs to generate 25 more letters; if you give it a-y as input, it only needs to generate one more. That way the program runs longer for smaller input and shorter for larger input.
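A trivial sketch of that idea (the function name is made up):

```python
import string

def complete_alphabet(prefix):
    """Given the first letters of the alphabet, return the missing rest."""
    assert string.ascii_lowercase.startswith(prefix)
    return string.ascii_lowercase[len(prefix):]

print(complete_alphabet("a"))                           # 25 letters of work
print(complete_alphabet("abcdefghijklmnopqrstuvwxy"))   # just 'z' left to do
```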

Peter Stuifzand