views: 192

answers: 5

How would you sort a singly linked list? (The problem here is the "singly" property, plus the fact that sorting a LinkedList is harder than sorting an array.) I'd like to see pseudocode.

Try to make it as efficient as possible in both time and space.

Thanks!

A: 

Insertion sort and quicksort can both be done on a singly linked list with the same efficiency as on an array.
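
For illustration (not from the answer itself), a minimal Python sketch of insertion sort on a singly linked list, using an assumed bare-bones Node class; it runs in O(n^2) time with O(1) extra space, the same as on an array:

class Node:
    # Minimal node type assumed for this sketch (not from the thread).
    def __init__(self, val, next=None):
        self.val, self.next = val, next

def insertion_sort(head):
    sorted_head = None                       # head of the growing sorted list
    while head is not None:
        node, head = head, head.next         # detach the next unsorted node
        if sorted_head is None or node.val <= sorted_head.val:
            node.next = sorted_head          # new smallest: insert at the front
            sorted_head = node
        else:
            cur = sorted_head                # walk forward to the insertion point
            while cur.next is not None and cur.next.val < node.val:
                cur = cur.next
            node.next, cur.next = cur.next, node
    return sorted_head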

James Curran
Thanks for the quick answer. Can you please explain how? It seems like those sorts rely on random access to achieve their efficiency. E.g., in merge sort, how would you access the middle of the list? You have to walk n/2 steps.
Quicksort can't be done on a singly linked list with the same efficiency as on an array.
Cam
@incrediman: Why not? Pluck the first node off the list as the pivot, then iterate through the rest, building two new lists. Sort these two lists. Splice them together. It's actually easier with linked lists than with arrays. (I admit I was wrong about mergesort.)
James Curran
Isn't splicing O(n), though? If not, you have to keep track of the lengths of each sub-list, which certainly won't be efficient either. It would be useful if you could write out a quicksort implementation for singly linked lists in pseudocode that demonstrates how its efficiency is equal to that of quicksort on arrays.
Cam
@James: *You have to keep track of the tails of each sub-list.
Cam
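
For illustration, a rough Python sketch (not from the thread) of the partition-into-two-lists quicksort described in the comments above, with an assumed bare-bones Node class. Each call returns both the head and the tail of its sorted sublist, which is the bookkeeping Cam mentions; with the tails known, the final splice is O(1), so each level of recursion costs only the O(n) partitioning pass, as with array quicksort:

class Node:
    # Minimal node type assumed for this sketch (not from the thread).
    def __init__(self, val, next=None):
        self.val, self.next = val, next

def quicksort(head):
    # Returns (head, tail) of the sorted list.
    if head is None or head.next is None:
        return head, head

    pivot, rest = head, head.next            # pluck the first node as the pivot
    pivot.next = None
    less = greater = None                    # heads of the two partitions
    while rest is not None:
        node, rest = rest, rest.next
        if node.val < pivot.val:
            node.next, less = less, node     # prepend to the "less" list
        else:
            node.next, greater = greater, node

    less_head, less_tail = quicksort(less)
    greater_head, greater_tail = quicksort(greater)

    # Splice [less] -> pivot -> [greater] in O(1) using the known tails.
    pivot.next = greater_head
    tail = greater_tail if greater_tail is not None else pivot
    if less_tail is None:
        return pivot, tail
    less_tail.next = pivot
    return less_head, tail

As with array quicksort, already-sorted or reverse-sorted input still degrades this to O(n^2) time (and O(n) recursion depth).
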
+5  A: 

Since a linked list is simply a chain of nodes, each pointing to the next, you can construct an array of pointers to the nodes in O(n) time and O(n) space, sort that using any of the excellent in-place sorting algorithms with O(n log n) time and O(1) space, then reconstruct the linked list from scratch in O(n) time and O(1) space.

Overall, that's O(n log n) time and O(n) space.
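
For illustration (not part of the original answer), a short Python sketch of that approach with an assumed bare-bones Node class: collect references to the nodes into an array, sort the array by value, then rewrite the next pointers in sorted order:

class Node:
    # Minimal node type assumed for this sketch (not from the thread).
    def __init__(self, val, next=None):
        self.val, self.next = val, next

def sort_via_array(head):
    nodes = []                               # O(n) time, O(n) space
    while head is not None:
        nodes.append(head)
        head = head.next

    # Sort the array of node references by value. Python's built-in sort is
    # used here for brevity; an in-place heapsort would keep this step at
    # O(n log n) time and O(1) extra space, as the answer assumes.
    nodes.sort(key=lambda node: node.val)

    for current, following in zip(nodes, nodes[1:]):   # relink in O(n)
        current.next = following
    if not nodes:
        return None
    nodes[-1].next = None
    return nodes[0]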

paxdiablo
Sounds great. Is it possible to sort it using O(1) space and O(n log n) time?
No, not to my knowledge. The fact that you can't access arbitrary list nodes in O(1) makes that unlikely. In fact, it's the O(n) space which _allows_ the O(1) individual access time which in turn allows an O(n log n) overall sort time.
paxdiablo
+2  A: 

I believe you can do this with an in-place quicksort.
I was wrong. This doesn't work with a singly linked list because it requires being able to step backwards through the list.

So the real question becomes how to do an in-place Quick sort.

Here we go...

1) Take a pointer to the first, second, and last terms of the linked list.
2) Step the second pointer through the list until you hit a term that is larger than the first term.
3) Step the third pointer backwards through the list until you hit a term that is smaller than the first term. This step doesn't work with a singly linked list.
4) Swap the values of the second and third pointers.
5) Repeat steps 2) through 4) until the second and third pointers equal each other.
6) Insert the first term after the second pointer.

At this point, where x is the value of the original first term, the linked list is divided into:
[terms less than x] x [terms greater than x]

7) Repeat steps 2) through 6) for the [terms less than x] and [terms greater than x] blocks until the size of each block is one.

Space: 3 pointers per layer.
O(log(n))

Time: Same as Quick sort
O(n log(n)) in general
O(n * n) in the worst case (if the list is either already sorted or in reverse order)


Edited for formatting and stupidity

T.K.
I loved reading this, and I think the algorithm is sound after a brief look. It is not, I believe, O(1) space though. I think it's O(log n) space. Remember you have to have 3 pointers *for each* recursive "stack frame" (even if you don't use actual recursion). You can't just abandon their old value when dividing and conquering.
Mark Peters
Described here: http://en.wikipedia.org/wiki/In-place_algorithm. *Quicksort is commonly described as an in-place algorithm, but is not in fact one. Most implementations require O(log n) space to support its divide and conquer recursion.* Still better than O(n) though so +1.
Mark Peters
@Mark Peters - Ah, you're right. I didn't think about the pointers on the stack frame, which is pretty obviously a needed consideration given that it's pseudo-recursive. My mistake. I'll crunch my head a bit more for some in-place sorting algorithm that can function in O(n log(n)).
T.K.
Doesn't the question stipulate a singly-linked list? And doesn't this algorithm require a doubly-linked list?
Jeffrey L Whitledge
@Jeffrey L Whitledge - Well that's awfully embarrassing. I can't exactly step backwards through a singly linked list. That was dumb of me.
T.K.
+4  A: 

I've been thinking about this a bit more, and I think you can achieve O(n log(n)) time and O(1) space with a merge sort.

Let's see...
Take the list:
3 -> 2 -> 1 -> 5 -> 6 -> 4

First pass:
Set a pointer to each of the 1st, 2nd, and 3rd terms.
Set the smaller of the first two terms to point to the larger term.
Set the larger of the two terms to point to the rest of the original list (marked by the 3rd pointer).
Repeat until the end of the list.
2 -> 3 -> 1 -> 5 -> 4 -> 6

At this point, every pair of terms is ordered.

Nth pass:
Set a pointer to each of the 1st, (2^(N-1) + 1)th, and (2^N + 1)th terms (the heads of the two blocks being merged, and the rest of the list).
Take the smaller-valued of the two nodes and advance its pointer.
Repeat the process until both blocks (each of length 2^(N-1)) are exhausted, appending the smaller-valued node each time to the previously appended node.
Set the last appended node to point to the rest of the original list.
Repeat until the end of the list.

0th pass: 3 -> 2 -> 1 -> 5 -> 6 -> 4 (every block of 1 term is ordered) (trivial)
1st pass: 2 -> 3 -> 1 -> 5 -> 4 -> 6 (every block of 2 terms is ordered)
2nd pass: 1 -> 2 -> 3 -> 5 -> 4 -> 6 (every block of 4 terms is ordered)
3rd pass: 1 -> 2 -> 3 -> 4 -> 5 -> 6 (every block of 8 terms is ordered)

Time: log(n) passes, with n comparisons for each pass.
O(n log(n))

Space: Pointer and integer variables
O(1)
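
For illustration (not part of the original answer), a Python sketch of this bottom-up merge sort with an assumed bare-bones Node class. It merges runs of width 1, 2, 4, ... in repeated passes over the list, keeping only a fixed handful of pointers and counters between passes:

class Node:
    # Minimal node type assumed for this sketch (not from the thread).
    def __init__(self, val, next=None):
        self.val, self.next = val, next

def split_off(head, k):
    # Detach the first k nodes; return (block_head, remainder).
    node = head
    for _ in range(k - 1):
        if node is None:
            break
        node = node.next
    if node is None:
        return head, None
    rest, node.next = node.next, None
    return head, rest

def merge_onto(tail, a, b):
    # Merge two sorted blocks onto tail; return the new tail.
    while a is not None and b is not None:
        if a.val <= b.val:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a if a is not None else b
    while tail.next is not None:             # advance to the end of the block
        tail = tail.next
    return tail

def merge_sort_bottom_up(head):
    n, node = 0, head                        # count the nodes once
    while node is not None:
        n, node = n + 1, node.next

    width = 1                                # blocks of this size are already sorted
    while width < n:
        dummy = Node(None)                   # collects this pass's output
        tail, rest = dummy, head
        while rest is not None:
            left, rest = split_off(rest, width)
            right, rest = split_off(rest, width)
            tail = merge_onto(tail, left, right)
        head = dummy.next
        width *= 2
    return head
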

T.K.
+2  A: 

Mergesorting involves just a few simple steps:

  1. Check if the list is empty or has a single element. If so, return the list unchanged.

  2. Split the list in half. Mergesort both parts.

  3. Merge the lists by repeatedly taking the smaller element out of the two lists. Since the sub-lists are already sorted, this is just a matter of looking at the first element of each list and picking the smaller.

  4. Return the merged list.

As a result, you get a linked list sorted with the smallest element first.

Code example in Haskell:

import Data.List(splitAt)

--Mergesorts a list by smallest element first.
sort :: Ord a => [a] -> [a]
sort = sortBy (<)

--Generic sort that can sort in any order.
sortBy :: (a -> a -> Bool) -> [a] -> [a]
sortBy _ [] = []
sortBy _ [x] = [x]
sortBy f xs = merge f (sortBy f first) (sortBy f second)
    where
        (first,second) = split xs

--Splits a list in half.
split :: [a] -> ([a],[a])
split xs = splitAt (halfLength xs) xs

--Returns a list's length divided by two, rounded to a whole number.
halfLength :: [a] -> Int
halfLength = round . (/2) . fromIntegral . length

--Merges two lists in the provided order.
merge :: (a -> a -> Bool) -> [a] -> [a] -> [a]
merge _ [] [] = []
merge _ [] (y:ys) = y:ys
merge _ (x:xs) [] = x:xs
merge f (x:xs) (y:ys) =
    if f x y
        then x : merge f xs (y:ys)
        else y : merge f (x:xs) ys
voxcogitatio
I believe this implementation of mergesort will use more than constant space. Let's take a 64-term list. If I split the list into 2 sub-lists, I've got to remember the location in memory of their heads. First I split once, and remember the 1st and 33rd indices. Then I split the first half again, and have to remember the 1st, 17th, and 33rd indices. As the number of terms increases, the number of head nodes I need to remember increases as well. I explained how to do an in-place mergesort [here](http://stackoverflow.com/questions/3315936/sort-a-single-linked-list/3322097#3322097).
T.K.