Patience sorting
Patience sorting is a sorting algorithm, based on a solitaire card game, that has the property of being able to efficiently compute the length of the longest increasing subsequence in a given array.
The card game
The game begins with a shuffled deck of cards, labeled 1, 2, …, n.
The cards are dealt one by one into a sequence of piles on the table, according to the following rules.
- Initially, there are no piles. The first card dealt forms a new pile consisting of the single card.
- Each new card may be placed either on an existing pile whose top card has a higher value (if any such pile exists), increasing the number of cards in that pile, or to the right of all existing piles, forming a new pile.
- When there are no more cards remaining to deal, the game ends.
The object of the game is to finish with as few piles as possible. D. Aldous and P. Diaconis[1] suggest defining 9 or fewer piles as a winning outcome for n = 52, which happens with probability of approximately 5%.
Algorithm for sorting
Given an n-element array whose elements have an ordering relation as input for the sorting, treat it as a collection of cards, with the (initially unknown) rank of each element in the sorted order serving as its label. Note that the game never uses the actual value of a card except to compare it with another card, and the relative order of any two array elements is known.
Now simulate the patience sorting game, played with the greedy strategy, i.e., placing each new card on the leftmost pile on which it can legally be placed. Under this strategy, at every stage of the game the top cards of the piles increase from left to right. To recover the sorted sequence, repeatedly remove the minimum card from among the pile tops.
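The following is a minimal sketch in Python of this greedy simulation; the function and variable names are illustrative rather than taken from any reference, distinct card values are assumed, and the final collection step uses a simple scan over the pile tops (a faster merge is discussed under Complexity below).

    import bisect

    def patience_sort(sequence):
        # Greedy patience play: place each card on the leftmost legal pile.
        # Assumes distinct values, as with physical cards.
        piles = []   # each pile is a list; its last element is the top card
        tops = []    # tops[i] mirrors piles[i][-1]; increasing left to right
        for card in sequence:
            i = bisect.bisect_left(tops, card)   # leftmost pile with top >= card
            if i == len(piles):
                piles.append([card])             # start a new pile on the right
                tops.append(card)
            else:
                piles[i].append(card)            # place on an existing pile
                tops[i] = card
        # Collect the sorted output by repeatedly taking the smallest top card.
        # This simple scan costs O(p) per card removed.
        result = []
        while piles:
            i = min(range(len(piles)), key=lambda j: piles[j][-1])
            result.append(piles[i].pop())
            if not piles[i]:
                del piles[i]
        return result

    print(patience_sort([6, 3, 5, 10, 11, 2, 7]))   # [2, 3, 5, 6, 7, 10, 11]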
Complexity
If the values of the cards are in the range 1, …, n, there is an efficient implementation with O(n log log n) worst-case running time for putting the cards into piles (which makes the worst case for a complete sort O(n log n)) and O(n) space for supporting structures, relying on a van Emde Boas tree. A description is given in the work by S. Bespamyatnikh and M. Segal.[2]
When no assumption is made about the values, the greedy strategy can be implemented with O(n log n) comparisons in the worst case. In fact, one can implement it with an array of stacks (the piles) ordered by the values of their top cards and, for inserting a new card, use binary search, which costs O(log p) comparisons in the worst case, where p is the number of piles. Completing the sort from the resulting structure requires some care: the card with the least value is on top of the leftmost pile, but once that card is removed this is no longer true of the new configuration. Finding the next card by searching among all pile tops, as in the Wikibooks implementation, gives an O(n²) worst case, and re-inserting the first pile into the remaining list of piles, ordered by their top cards, can cost O(n) per step and therefore also gives an O(n²) worst case (for instance, an input such as n, 1, n−1, 2, n−2, 3, … forces this behavior). Maintaining the pile tops in a priority queue, however, makes each extraction cost O(log p), so the whole sort can be completed with O(n log n) comparisons in the worst case.
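As a sketch of the priority-queue merge (assuming piles produced by greedy play, so that each pile decreases from bottom to top and its top card is its minimum), the following hypothetical Python helper completes the sort with a binary heap; it is illustrative only and not taken from the cited references.

    import heapq

    def merge_piles(piles):
        # Each pile, read from the top down, is a sorted run, so this is a
        # standard p-way merge: a heap over the current pile tops yields the
        # next output card in O(log p) time.
        heap = [(pile[-1], i) for i, pile in enumerate(piles)]
        heapq.heapify(heap)
        out = []
        while heap:
            _, i = heapq.heappop(heap)
            out.append(piles[i].pop())            # remove the top card of pile i
            if piles[i]:                          # expose the next card, if any
                heapq.heappush(heap, (piles[i][-1], i))
        return out

    # Piles produced by greedy play on 6, 3, 5, 10, 11, 2, 7:
    print(merge_piles([[6, 3, 2], [5], [10, 7], [11]]))   # [2, 3, 5, 6, 7, 10, 11]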
Algorithm for finding the longest increasing subsequence
First, execute the sorting algorithm as described above. The resulting number of piles equals the length of the longest increasing subsequence. Whenever a card is placed on a pile, add a back-pointer from it to the card currently on top of the pile immediately to the left (which, by construction, has a lower value than the new card). At the end, follow the back-pointers from the top card of the last pile to recover a decreasing subsequence of maximum length; its reverse is a longest increasing subsequence.
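A minimal Python sketch of this back-pointer construction follows (the names are illustrative, not from the article); it keeps only the pile tops and their positions rather than whole piles.

    import bisect

    def longest_increasing_subsequence(sequence):
        tops = []        # tops[i]: value currently on top of pile i
        top_index = []   # top_index[i]: position in `sequence` of tops[i]
        prev = [None] * len(sequence)   # back-pointer to top of pile to the left
        for k, card in enumerate(sequence):
            i = bisect.bisect_left(tops, card)   # leftmost legal pile (greedy)
            if i > 0:
                prev[k] = top_index[i - 1]
            if i == len(tops):
                tops.append(card)
                top_index.append(k)
            else:
                tops[i] = card
                top_index[i] = k
        # Follow back-pointers from the top card of the last pile.
        lis, k = [], (top_index[-1] if top_index else None)
        while k is not None:
            lis.append(sequence[k])
            k = prev[k]
        return lis[::-1]   # reverse the decreasing chain

    print(longest_increasing_subsequence([6, 3, 5, 10, 11, 2, 7]))   # [3, 5, 10, 11]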
S. Bespamyatnikh and M. Segal[2] give a description of an efficient implementation of the algorithm, incurring no additional asymptotic cost over the sorting one, since creating, storing, and traversing the back-pointers requires only linear time and space. They further show how to report all of the longest increasing subsequences from the same resulting data structures.
History
According to D. Aldous and P. Diaconis,[1] patience sorting was first recognized as an algorithm to compute the length of the longest increasing subsequence by Hammersley,[3] and as a sorting algorithm by A.S.C. Ross and, independently, Bob Floyd. Initial analysis was done by Mallows.[4]
References
- ^ a b David Aldous and Persi Diaconis. Longest increasing subsequences: from patience sorting to the Baik–Deift–Johansson theorem. Bulletin (New Series) of the American Mathematical Society, Volume 36, Number 4, pages 413–432; p. 414.
- ^ a b Sergei Bespamyatnikh and Michael Segal. Enumerating Longest Increasing Subsequences and Patience Sorting. Pacific Institute for the Mathematical Sciences Preprints, PIMS-99-3; pp. 7–8.
- ^ J. M. Hammersley. A few seedlings of research. In Proc. Sixth Berkeley Symp. Math. Statist. and Probability, Volume 1, pages 345–394. University of California Press, 1972. MR 53:9457; p. 362.
- ^ C. L. Mallows. Patience sorting. Bull. Inst. Math. Appl., 9:216–224, 1973.