+6  A: 

Nothing can be worse than infinity.

Ink-Jet
Infinity + 1. Jinx, no returns.
zombat
Infinity + 1 = Infinity
armandino
Not for extremely large values of 1 ;)
zombat
Infinity + Infinity = Infinity.
KennyTM
What really blows my mind about the concept of infinity, is that you can have different "sizes" of infinity. For example, consider the set of all integers - it is infinite in size. Now consider the set of all *even* integers - it is also infinite in size, but it is also clearly half the size of the first set. Both infinite, but different sizes. So awesome. The concept of "size" simply fails to work in the context of infinity.
zombat
@zombat: You're talking about cardinality, not infinity as a symbol indicating a trend on the real line / complex plane.
KennyTM
@zombat. The size of the set of even integers is the same as the size of the set of the integers, as shown by the fact that you can place them in one-to-one correspondence. Now, there are more real numbers than integers, as first shown by Cantor.
David Thornley
And if you *really* want to blow your mind about infinity: http://blog.xkcd.com/2010/02/09/math-puzzle
BlueRaja - Danny Pflughoeft
Whether ω + 1 = ω depends on whether you're doing cardinal or ordinal arithmetic.
uckelman
+9  A: 

If you keep the algorithm meaningful in any way, O(n!) is the worst upper bound you can achieve.

Since checking every permutation of a set for sortedness takes n! steps, you can't get any worse than that.

If you're doing more steps than that, then the algorithm has no real useful purpose. Not to mention the following simple sorting algorithm with a running time of O(∞):

list = someList
while (list not sorted):
    doNothing
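
For concreteness, a minimal C sketch of that O(∞) "sort" (the helper names are illustrative, not part of the original pseudocode):

    #include <stdbool.h>
    #include <stddef.h>

    /* Returns true iff a[0..n-1] is in non-decreasing order. */
    static bool is_sorted(const long *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            if (a[i - 1] > a[i]) return false;
        return true;
    }

    /* Spins forever unless the input happens to arrive already sorted. */
    void do_nothing_sort(const long *a, size_t n) {
        while (!is_sorted(a, n)) {
            /* do nothing */
        }
    }
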
Yuval A
But it takes O(n) to check whether it's sorted, so you can get O(n*n!)
erikkallen
@erikkallen: Certainly we can come up with an algorithm to verify sortedness that's worse than O(n). For example, for each element in the array, verify that it's greater than all previous ones, much like insertion sort works. That's an O(n^2) algorithm, and I'm sure I could come up with worse given a little thought.
David Thornley
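A minimal C sketch of such an O(n^2) sortedness check, as a rough illustration of the comment above (the name is illustrative):

    #include <stdbool.h>
    #include <stddef.h>

    /* For each element, verify that no earlier element is greater than it,
       insertion-sort style; O(n^2) comparisons in total. */
    static bool is_sorted_quadratic(const long *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            for (size_t j = 0; j < i; j++)
                if (a[j] > a[i]) return false;
        return true;
    }
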
@David Thornley: the following checking algorithm would perhaps show the same spirit as the bogosort: pick two random elements, check that the one with the smaller index is smaller than or equal to the one with the larger index, then repeat. Keep a square bit matrix to see which combinations have already been checked. Of course, checking this matrix could also be done in a random walk...
Svante
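A rough C sketch of that randomized pair-checking idea (names are illustrative; "repeat" is read here as continuing until every pair has been checked):

    #include <stdbool.h>
    #include <stdlib.h>

    /* Pick random pairs, verify the element with the smaller index is <=
       the one with the larger index, and tick each pair off in an n x n
       "already checked" matrix until all pairs have been seen. */
    static bool is_sorted_random_pairs(const long *a, size_t n) {
        if (n < 2) return true;
        bool *checked = calloc(n * n, sizeof *checked);
        size_t remaining = n * (n - 1) / 2;      /* unordered pairs still unchecked */
        while (remaining > 0) {
            size_t i = (size_t)rand() % n, j = (size_t)rand() % n;
            if (i == j) continue;
            if (i > j) { size_t t = i; i = j; j = t; }
            if (a[i] > a[j]) { free(checked); return false; }
            if (!checked[i * n + j]) { checked[i * n + j] = true; remaining--; }
        }
        free(checked);
        return true;
    }
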
+3  A: 

1. Put your items to be sorted on index cards.
2. Throw them into a bonfire and confirm they are completely destroyed. (Originally: throw them into the air on a windy day, a mile from your house.)
3. Check your kitchen floor for the correct ordering.
4. Repeat if it's not the correct order.

Best case scenario is O(∞).

Edit above based on astute observation by KennyTM.

Patrick Karcher
Isn't this BogoSort?
Steve
No, this is worse because there's no chance of it succeeding. How would the index cards get into your kitchen? They're blowing around outside. It's called, uh, buttheadsort.
Patrick Karcher
I think he means throw the cards up in the air *outside*, and then check your floor *inside*, where there are guaranteed to be no cards. Although not a "named" algorithm... it is certainly worse!
womp
@Patrick Quantum tunneling.
KennyTM
@KennyTM. That had actually occurred to me. *There is an extremely small but non-zero chance that any object might disappear and reappear at any other point in the universe.* I guess it *could* happen to a thousand index cards . . . Oi. Dangit, my algorithm is **flawed**. I'll fix it . . .
Patrick Karcher
It's kind of like having tea and no tea at the same time. Or space travel using an infinite improbability drive.
Barry Brown
+3  A: 

With a worst-case performance of O(∞), it might not even qualify as an algorithm, according to some definitions.

An algorithm is just a series of steps, and you can always do worse by tweaking it a little bit to get the desired output in more steps than it was previously taking. One could deliberately build knowledge of the number of steps taken into the algorithm and make it terminate and produce the correct output only after X steps have been performed. That X could very well be of the order of O(n^2) or O(n^(n!)) or whatever you want it to be. That would effectively increase its best-case as well as average-case bounds.

But your worst-case scenario cannot be topped :)
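
As a rough C sketch of that padding idea (the function and the choice of X = n^2 are illustrative only):

    #include <stdlib.h>

    static int compare_longs(const void *a, const void *b) {
        long x = *(const long *)a, y = *(const long *)b;
        return (x > y) - (x < y);
    }

    /* Sort normally, then deliberately burn cycles until a step budget of
       X = n*n has been spent, inflating the best and average cases. */
    void padded_sort(long *a, size_t n) {
        volatile unsigned long long steps = 0;
        qsort(a, n, sizeof *a, compare_longs);       /* the actual useful work */
        while (steps < (unsigned long long)n * n)    /* wasted steps up to X */
            steps++;
    }
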

Anurag
+1  A: 

Bozo sort is a related algorithm that checks if the list is sorted and, if not, swaps two items at random. It has the same best- and worst-case performances, but I would intuitively expect the average case to be longer than Bogosort's. It's hard to find (or produce) any data on the performance of this algorithm.
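
A minimal C sketch of bozo sort as described (helper names are illustrative):

    #include <stdbool.h>
    #include <stdlib.h>

    static bool is_sorted(const long *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            if (a[i - 1] > a[i]) return false;
        return true;
    }

    /* While the list is unsorted, swap two positions chosen at random. */
    void bozo_sort(long *a, size_t n) {
        while (!is_sorted(a, n)) {
            size_t i = (size_t)rand() % n;
            size_t j = (size_t)rand() % n;
            long t = a[i]; a[i] = a[j]; a[j] = t;
        }
    }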

tloflin
+3  A: 

You should do some research into the exciting field of Pessimal Algorithms and Simplexity Analysis. These authors work on the problem of developing a sort with a pessimal best-case (your bogosort's best case is Omega(n), while slowsort (see paper) has a non-polynomial best-case time complexity).

Derrick Turk
A: 

You could make any sort algorithm slower by running your "is it sorted" step randomly. Something like:

  1. Create an array of booleans the same size as the array you're sorting. Set them all to false.
  2. Run an iteration of bogosort
  3. Pick two random elements.
  4. If the two elements are sorted in relation to each other (i < j && array[i] < array[j]), mark the indexes of both as true in the boolean array. Otherwise, start over.
  5. Check if all of the booleans in the array are true. If not, go back to 3.
  6. Done.
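
A rough C sketch of those steps (helper names are illustrative; "start over" is read here as reshuffling and clearing the flags, and <= is used so duplicate values don't make it run forever):

    #include <stdbool.h>
    #include <stdlib.h>

    static void shuffle(long *a, size_t n) {              /* one bogosort iteration */
        for (size_t i = n; i > 1; i--) {
            size_t j = (size_t)rand() % i;
            long t = a[i - 1]; a[i - 1] = a[j]; a[j] = t;
        }
    }

    void randomized_check_bogosort(long *a, size_t n) {
        if (n < 2) return;
        bool *ok = calloc(n, sizeof *ok);                  /* step 1: all false */
        for (;;) {
            shuffle(a, n);                                 /* step 2 */
            for (size_t k = 0; k < n; k++) ok[k] = false;
            for (;;) {
                size_t i = (size_t)rand() % n;             /* step 3 */
                size_t j = (size_t)rand() % n;
                if (i < j && a[i] <= a[j]) {               /* step 4 */
                    ok[i] = ok[j] = true;
                } else {
                    break;                                 /* otherwise, start over */
                }
                bool all = true;                           /* step 5 */
                for (size_t k = 0; k < n; k++) all = all && ok[k];
                if (all) { free(ok); return; }             /* step 6: done */
            }
        }
    }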
Brendan Long
+12  A: 

Intelligent Design Sort

Introduction

Intelligent design sort is a sorting algorithm based on the theory of intelligent design.

Algorithm Description

The probability of the original input list being in the exact order it's in is 1/(n!). There is such a small likelihood of this that it's clearly absurd to say that this happened by chance, so it must have been consciously put in that order by an intelligent Sorter. Therefore it's safe to assume that it's already optimally Sorted in some way that transcends our naïve mortal understanding of "ascending order". Any attempt to change that order to conform to our own preconceptions would actually make it less sorted.

Analysis

This algorithm is constant in time, and sorts the list in-place, requiring no additional memory at all. In fact, it doesn't even require any of that suspicious technological computer stuff. Praise the Sorter!
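
A minimal sketch of that constant-time, in-place "sort" (the function name is illustrative):

    #include <stddef.h>

    /* The list is already optimally Sorted; touching it would only make
       things worse. O(1) time, no extra memory. */
    void intelligent_design_sort(long *a, size_t n) {
        (void)a;
        (void)n;
    }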

Feedback

Gary Rogers writes:

Making the sort constant in time denies the power of The Sorter. The Sorter exists outside of time, thus the sort is timeless. To require time to validate the sort diminishes the role of the Sorter. Thus... this particular sort is flawed, and cannot be attributed to 'The Sorter'.

Heresy!

BioGeek
Also known as "Assumption Sort": Assume the list is sorted, return!
BioGeek
+100 - this answer is made out of 100% pure win.
womp
@BioGeek: Hey! Don't forget "Indecisive Sort" (also known as "Schrödinger's Sort" or "Quantum Sort"), where the list may or may not be sorted, however checking it will reveal whether or not it is. Here is my sample implementation: `void quantum_sort (void *b, size_t n, size_t s, int (*c)(const void *, const void*)) { if (rand () % 2) qsort (b, n, s, c); }`.
Joe D
+5  A: 

Quantum Bogosort

A sorting algorithm that assumes that the many-worlds interpretation of quantum mechanics is correct:

  1. Check that the list is sorted. If not, destroy the universe.

At the conclusion of the algorithm, the list will be sorted in the only universe left standing. This algorithm takes worst-case O(N) and average-case O(1) time. In fact, the average number of comparisons performed is 2: there's a 50% chance that the universe will be destroyed on the second element, a 25% chance that it'll be destroyed on the third, and so on.
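
A sketch of the idea, assuming a purely hypothetical destroy_universe() primitive (not a real API):

    #include <stddef.h>

    /* Hypothetical: ends this branch of the wavefunction. */
    extern void destroy_universe(void);

    void quantum_bogosort(const long *a, size_t n) {
        for (size_t i = 1; i < n; i++)       /* check that the list is sorted */
            if (a[i - 1] > a[i])
                destroy_universe();          /* if not, this universe ends here */
        /* only universes that observe a sorted list reach this point: O(n) */
    }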

BioGeek
But time ceases to exist in the universe you just destroyed. So an observer in a universe that you have not yet checked will not be able to tell how much of the algorithm has been executed. Thus, this algorithm always takes O(1) time, since previous universe-destructions don't exist anymore.
Barry Brown
To quote the author of the above algorithm: "On average, the universe is destroyed. It takes only O(1) time to decide to destroy the universe, and (assumption) O(1) time to destroy the universe. (destroyUniverse has all the time in the world, which should be a constant as the universe has no inputs, or else something is really screwed up with our understanding of the universe.) Therefore the average case is O(1)." http://www.reddit.com/r/programming/comments/thtx/my_new_favorite_sorting_algorithm/ctl3h?context=2
BioGeek
Isn't it always O(n), since you have to check every element?
Brendan Long
Yes, in the only universe that observes the list sorted, it took O(n) time to execute - how long it took in other universes is irrelevant.
Nick Johnson
There will be many universes left standing. In some of them the list is sorted and Obama gets the Nobel Peace Prize for keeping Guantanamo open. In some of them the list is sorted and Obama gets the Nobel Peace Prize for closing Guantanamo.
Windows programmer
+1  A: 

I had a lecturer who once suggested generating a random array, checking if it was sorted and then checking if the data was the same as the array to be sorted.

Best case: O(N) (first time, baby!). Worst case: O(Never).
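
A rough C sketch of the lecturer's suggestion, assuming "the same data" means the random array must be a permutation of the input (helper names are illustrative):

    #include <stdbool.h>
    #include <stdlib.h>

    static bool is_sorted(const long *a, size_t n) {
        for (size_t i = 1; i < n; i++)
            if (a[i - 1] > a[i]) return false;
        return true;
    }

    /* Naive O(n^2) check that b holds exactly the same elements as a. */
    static bool same_elements(const long *a, const long *b, size_t n) {
        bool *used = calloc(n, sizeof *used);
        for (size_t i = 0; i < n; i++) {
            size_t j = 0;
            while (j < n && (used[j] || b[j] != a[i])) j++;
            if (j == n) { free(used); return false; }
            used[j] = true;
        }
        free(used);
        return true;
    }

    void random_guess_sort(long *a, size_t n) {
        long *guess = malloc(n * sizeof *guess);
        do {
            for (size_t i = 0; i < n; i++)
                guess[i] = rand();                   /* a completely random array */
        } while (!is_sorted(guess, n) || !same_elements(a, guess, n));
        for (size_t i = 0; i < n; i++) a[i] = guess[i];
        free(guess);
    }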

Daniel
A: 

My favorite slow sorting algorithm is the stooge sort:

    /* swap() exchanges the two longs it is given. */
    static void swap(long *a, long *b) { long t = *a; *a = *b; *b = t; }

    void stooges(long *begin, long *end) {
        if( (end-begin) <= 1 ) return;
        if( begin[0] > end[-1] ) swap(begin, end-1);   /* put the smaller of the two ends first */
        if( (end-begin) > 2 ) {                        /* recurse only on 3 or more elements */
            int one_third = (end-begin)/3;
            stooges(begin, end-one_third);             /* first two thirds */
            stooges(begin+one_third, end);             /* last two thirds */
            stooges(begin, end-one_third);             /* first two thirds again */
        }
    }

The worst case complexity is O(n^(log(3) / log(1.5))) = O(n^2.7095...).

Another slow sorting algorithm is actually named slowsort!

    void slow(long *start, long *end) {
        if( (end-start) <= 1 ) return;
        long *middle = start + (end-start)/2;
        slow(start, middle);                              /* sort the first half */
        slow(middle, end);                                /* sort the second half */
        if( middle[-1] > end[-1] ) swap(middle-1, end-1); /* move the larger maximum to the end */
        slow(start, end-1);                               /* sort everything but the last element */
    }

This one takes O(n^(log n)) in the best case... even slower than stoogesort.

Ben Goldberg