The fact that it's a commonly taught sorting algorithm is just an accident of history and of imperative languages being popular. It's actually quite a lot more complex than merge sort or heap sort, and it's much harder to analyze. (Proving that merge sort runs in O(n log n) is relatively easy; proving that quicksort runs in O(n log n) in expectation is quite hard.)
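For contrast, here's roughly what the easy half of that claim looks like (a sketch of my own, not anything posted in this thread): merge sort always splits in half, so the cost is just T(n) = 2T(n/2) + O(n) for the merge, which gives O(n log n) with no randomness to reason about.

msort :: Ord a => [a] -> [a]
msort []  = []
msort [x] = [x]
msort xs  = merge (msort l) (msort r)
  where
    -- every level of recursion does O(n) merging work, and there are O(log n) levels
    (l, r) = splitAt (length xs `div` 2) xs
    merge as [] = as
    merge [] bs = bs
    merge (a:as) (b:bs)
      | a <= b    = a : merge as (b:bs)
      | otherwise = b : merge (a:as) bs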
Name: Anonymous 2016-08-15 17:28
quicksort [] = []
quicksort (p:xs) = quicksort less ++ [ p ] ++ quicksort more
  where less = [ x | x <- xs, x < p ]
        more = [ x | x <- xs, x >= p ]
Name: Anonymous 2016-08-15 17:44
(defun my-quicksort (L pred)
  (if (null (cdr L))
      L
      (append (my-quicksort (remove-if-not #'(lambda (x) (funcall pred x (car L)))
                                           (cdr L))
                            pred)
              (list (car L))
              (my-quicksort (remove-if-not #'(lambda (x) (funcall (complement pred) x (car L)))
                                           (cdr L))
                            pred))))
>>10 Wrong. Quicksort is O(n) when the vector is almost sorted, otherwise it's O(n^2). That's why nobody uses qsort() from libc.
Name: Anonymous 2016-08-15 21:12
>>12 Sorry, you're right, but yeah, quicksort is shit.
Name: Anonymous 2016-08-15 21:25
>>5
(defun partition (L p q)
  (let ((i (1- p))
        (pivot (nth q L)))
    (loop for j from p to (1- q)
          when (> pivot (nth j L))
            do (progn (incf i)
                      (rotatef (nth i L) (nth j L))))
    (incf i)
    (rotatef (nth q L) (nth i L))
    i))
(defun my-quicksort (L &optional (p 0) (q (1- (length L))))
  (if (>= p q)
      L
      (let ((q* (partition L p q)))
        (my-quicksort L p (1- q*))
        (my-quicksort L (1+ q*) q))))
Name: Anonymous 2016-08-15 21:29
tree sort
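Presumably something like this (a sketch of my own with made-up names, not from the post): insert everything into an unbalanced binary search tree, then read it back in order.

data Tree a = Leaf | Node (Tree a) a (Tree a)

insert :: Ord a => a -> Tree a -> Tree a
insert x Leaf = Node Leaf x Leaf
insert x (Node l v r)
  | x < v     = Node (insert x l) v r
  | otherwise = Node l v (insert x r)  -- duplicates go to the right

treesort :: Ord a => [a] -> [a]
treesort = toList . foldr insert Leaf
  where toList Leaf = []
        toList (Node l v r) = toList l ++ [v] ++ toList r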
Name: Anonymous 2016-08-15 23:47
make it faster:
data QT a = QNil | QT (QT a) a (QT a)
flat QNil end = end
flat (QT x m y) end = flat x . (m:) . flat y $ end
quicksortT [] = QNil
quicksortT (p:xs) = QT (quicksortT less) p (quicksortT more)
  where less = [ x | x <- xs, x < p ]
        more = [ x | x <- xs, x >= p ]
quicksort x = flat (quicksortT x) []
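The speedup, as I read it (the post doesn't spell it out), is that flat threads an accumulator through the tree instead of concatenating, so every element is consed exactly once. Compare a naive flatten, which rebuilds the left spine with ++ at every level and can go quadratic on lopsided trees:

flatNaive :: QT a -> [a]
flatNaive QNil = []
flatNaive (QT x m y) = flatNaive x ++ [m] ++ flatNaive y  -- repeated ++ on the left is the slow part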
Name: Anonymous 2016-08-16 0:45
The only real reason to sort is to compare structures more easily (checking all objects against all other objects for interactions can be optimized in certain ways when the data is sorted, for example), and that kind of comparison is usually O(n^2) anyway. Searching is always going to be O(n) in the worst case, sorted or not. And if order is actually important, you should be using something like AVL or Red-Black trees. Sorting shouldn't be done often enough for the time complexity of the algorithm used to matter much.
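Loose illustration of the tree point (a sketch of my own, not the poster's code): Data.Set in Haskell is a balanced binary tree, so inserts are O(log n) and an ordered listing is available at any time without a separate sorting pass. Caveat: a Set drops duplicates, unlike a sort.

import qualified Data.Set as Set

ordered :: Ord a => [a] -> [a]
ordered = Set.toAscList . Set.fromList  -- build the balanced tree, then walk it in order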
Name: Anonymous 2016-08-16 2:19
>>17 AVL and Red-Black trees are sorting algorithms.
Name: Anonymous 2016-08-16 2:26
>>18 They aren't; they're data structures where the insertion/deletion algorithm rebalances the tree as it goes. You can use them to sort, but that's just a side effect.
Name: Anonymous 2016-08-16 2:36
>>19 The balancing only exists to make the SORTING faster, which is their primary purpose.