The Art of Ordering Chaos
Every algorithm has a personality. Bubble sort plods along, swapping neighbors endlessly. Quicksort divides and conquers with elegant recursion. Merge sort is methodical and reliable. Heap sort builds a pyramid, then dismantles it. Radix sort doesn’t even compare elements. Watch 20 sorting algorithms race, dance, and transform chaos into order—each with its own strategy, speed, and visual signature. The bars tell the story.
The O(n²) algorithms—slow but intuitive. Every beginner starts here, and their simplicity makes them beautiful to watch.
The simplest sort: compare neighbors and swap. Watch large elements “bubble” to the top. Slow (O(n²)) but mesmerizing.
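The whole idea fits in a few lines. A minimal Python sketch (function name ours), with the classic early-exit when a pass makes no swaps:

```python
def bubble_sort(arr):
    """Repeatedly swap adjacent out-of-order pairs; large values bubble up."""
    a = list(arr)
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):  # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # a clean pass means we're done early
            break
    return a
```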
Find the minimum, put it first. Find the next minimum, put it second. Repeat. The sorted region grows one element at a time.
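In code, the growing sorted region is just the prefix before index `i`. A minimal sketch:

```python
def selection_sort(arr):
    """Repeatedly select the minimum of the unsorted tail and swap it into place."""
    a = list(arr)
    for i in range(len(a)):
        m = min(range(i, len(a)), key=a.__getitem__)  # index of the tail's minimum
        a[i], a[m] = a[m], a[i]
    return a
```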
Like sorting playing cards: take each element and insert it into its correct position in the sorted portion. Fast on nearly-sorted data!
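The card-sorting idea in a few lines of Python — shift larger cards right, then drop the new card into the gap:

```python
def insertion_sort(arr):
    """Insert each element into its place within the sorted prefix."""
    a = list(arr)
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

On nearly-sorted input the inner loop barely runs, which is why this is the finisher of choice inside hybrid sorts.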
Bubble sort that goes both ways—left to right, then right to left. Avoids the “turtle” problem, where small values near the end crawl toward the front only one step per pass.
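A minimal sketch of the two-way passes — each forward pass parks the largest remaining value on the right, each backward pass parks the smallest on the left:

```python
def cocktail_sort(arr):
    """Bidirectional bubble sort: alternate forward and backward passes."""
    a = list(arr)
    lo, hi = 0, len(a) - 1
    swapped = True
    while swapped and lo < hi:
        swapped = False
        for i in range(lo, hi):            # forward: big values bubble right
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        hi -= 1
        for i in range(hi, lo, -1):        # backward: small values bubble left
            if a[i - 1] > a[i]:
                a[i - 1], a[i] = a[i], a[i - 1]
                swapped = True
        lo += 1
    return a
```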
A garden gnome sorting flower pots: move forward if in order, swap and step back if not. Charmingly simple, surprisingly effective on small inputs.
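The gnome's entire decision procedure is one if/else. A minimal sketch:

```python
def gnome_sort(arr):
    """Step forward while in order; swap and step back when not."""
    a = list(arr)
    i = 0
    while i < len(a):
        if i == 0 or a[i - 1] <= a[i]:
            i += 1                          # pots in order: step forward
        else:
            a[i - 1], a[i] = a[i], a[i - 1]
            i -= 1                          # out of order: swap, step back
    return a
```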
The O(n log n) powerhouses that actually get used in production. Divide-and-conquer and clever data structures make them fast.
Divide in half, sort each half, merge them back. Guaranteed O(n log n), beautifully recursive, and stable. The gold standard for linked lists.
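The recursion mirrors the description exactly. A minimal sketch; note the `<=` in the merge, which is what makes the sort stable:

```python
def merge_sort(arr):
    """Split in half, sort each half recursively, merge the sorted halves."""
    if len(arr) <= 1:
        return list(arr)
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps equal elements in order (stability)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # one side is empty; append the rest
```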
Pick a pivot, partition around it, recurse. The fastest general-purpose sort in practice. Watch the partitions cascade like falling dominoes.
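A minimal sketch of the pivot-and-partition idea (production quicksorts partition in place; this copy-based version trades speed for clarity):

```python
def quick_sort(arr):
    """Partition around a pivot, then recurse on each side."""
    if len(arr) <= 1:
        return list(arr)
    pivot = arr[len(arr) // 2]                 # middle element as pivot
    less    = [x for x in arr if x < pivot]
    equal   = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)
```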
Build a max-heap, then extract the maximum repeatedly. Shows both the array view and the tree view simultaneously. In-place and O(n log n).
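The array *is* the tree: node `i` has children `2i+1` and `2i+2`. A minimal in-place sketch:

```python
def heap_sort(arr):
    """Build a max-heap in the array, then repeatedly swap the max to the end."""
    a = list(arr)

    def sift_down(root, end):
        # push a[root] down until the max-heap property holds within a[:end]
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1                     # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    n = len(a)
    for i in range(n // 2 - 1, -1, -1):        # heapify bottom-up
        sift_down(i, n)
    for end in range(n - 1, 0, -1):            # extract the max repeatedly
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```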
Insertion sort with a twist: start with big gaps, shrink them. The gap sequence matters enormously—try different ones and see the difference.
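A minimal sketch using the classic halving gap sequence (the `gaps` parameter is ours, so you can experiment — Ciura's sequence, for example, performs noticeably better):

```python
def shell_sort(arr, gaps=None):
    """Gapped insertion sort with a shrinking gap sequence."""
    a = list(arr)
    if gaps is None:
        gaps, g = [], len(a) // 2
        while g > 0:                 # classic halving sequence
            gaps.append(g)
            g //= 2
    for gap in gaps:
        for i in range(gap, len(a)):  # insertion sort, but comparing gap apart
            key, j = a[i], i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
    return a
```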
Python and Java’s default sort: find natural runs, extend them with insertion sort, merge them intelligently. The real-world champion.
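Real Timsort adds minrun sizing, galloping merges, and a careful merge-stack invariant; a greatly simplified sketch of just the runs-then-merge idea (names ours):

```python
def merge_runs(left, right):
    """Stable merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def natural_merge_sort(arr):
    """Find natural ascending runs, then merge them pairwise until one remains."""
    if not arr:
        return []
    runs, run = [], [arr[0]]
    for x in arr[1:]:
        if x >= run[-1]:
            run.append(x)            # extend the current ascending run
        else:
            runs.append(run)
            run = [x]
    runs.append(run)
    while len(runs) > 1:             # merge runs pairwise
        merged = []
        for i in range(0, len(runs), 2):
            if i + 1 < len(runs):
                merged.append(merge_runs(runs[i], runs[i + 1]))
            else:
                merged.append(runs[i])
        runs = merged
    return runs[0]
```

On data with long pre-sorted stretches, the run detection does most of the work before any merging starts — exactly why Timsort shines on real-world input.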
Breaking the O(n log n) barrier by not comparing elements at all. These sorts exploit the structure of the data itself.
Count occurrences of each value, then place them. O(n + k) where k is the range. Watch the histogram build and then flatten into sorted order.
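The histogram-then-flatten step in a minimal sketch (offsetting by the minimum so negative values work too):

```python
def counting_sort(arr):
    """Count occurrences per value, then emit values in order of the counts."""
    if not arr:
        return []
    lo, hi = min(arr), max(arr)
    counts = [0] * (hi - lo + 1)     # one slot per value in the range
    for x in arr:
        counts[x - lo] += 1          # build the histogram
    out = []
    for v, c in enumerate(counts):   # flatten it into sorted order
        out.extend([v + lo] * c)
    return out
```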
Sort by ones digit, then tens, then hundreds. Never compares two elements! Watch numbers organize themselves digit by digit.
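A minimal least-significant-digit sketch for non-negative integers — note the bucketing is stable, which is what makes later digit passes preserve earlier ones:

```python
def radix_sort(arr, base=10):
    """LSD radix sort: bucket by each digit, least significant first."""
    a = list(arr)
    digit = 1
    while a and digit <= max(a):
        buckets = [[] for _ in range(base)]
        for x in a:
            buckets[(x // digit) % base].append(x)  # stable: preserves prior order
        a = [x for b in buckets for x in b]
        digit *= base
    return a
```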
Distribute elements into buckets by range, sort each bucket, concatenate. Watch elements fall into their buckets like a ball-sorting machine.
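A minimal sketch, assuming values in [0, 1) so a value maps straight to its bucket index (each bucket is finished with Python's built-in sort for brevity):

```python
def bucket_sort(arr, n_buckets=10):
    """Scatter values into range-based buckets, sort each, concatenate."""
    buckets = [[] for _ in range(n_buckets)]
    for x in arr:
        buckets[int(x * n_buckets)].append(x)       # value -> its bucket
    return [x for b in buckets for x in sorted(b)]  # sort buckets, concatenate
```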
Like counting sort for integers: put each element in its pigeonhole, then collect them in order. Simple when the range is small.
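A minimal sketch — unlike counting sort, the pigeonholes hold the elements themselves rather than just counts:

```python
def pigeonhole_sort(arr):
    """One hole per possible value; drop each element in, then collect in order."""
    if not arr:
        return []
    lo = min(arr)
    holes = [[] for _ in range(max(arr) - lo + 1)]  # one hole per value
    for x in arr:
        holes[x - lo].append(x)
    return [x for hole in holes for x in hole]      # collect holes left to right
```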
Estimate each element’s position using the distribution, then refine with insertion sort. Near-linear time for uniformly distributed data.
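One way to sketch the idea (a flashsort-style simplification, names ours): linearly interpolate each value's rank between the min and max, scatter into classes by that estimate, then let insertion sort fix the small remaining disorder.

```python
def flash_like_sort(arr):
    """Classify by interpolated position estimate, then finish with insertion sort."""
    if len(arr) <= 1:
        return list(arr)
    lo, hi = min(arr), max(arr)
    if lo == hi:
        return list(arr)
    m = len(arr)
    classes = [[] for _ in range(m)]
    for x in arr:
        k = int((m - 1) * (x - lo) / (hi - lo))  # estimated rank from value's position
        classes[k].append(x)
    a = [x for c in classes for x in c]          # nearly sorted already
    for i in range(1, len(a)):                   # insertion sort cleans up
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```

On uniformly distributed data, the rank estimates are nearly exact, so the cleanup pass is almost free.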
Unusual, creative, and sometimes deliberately terrible sorting algorithms that are too entertaining not to include.
Bubble sort with shrinking gaps (factor 1.3). Kills “turtles” quickly and approaches O(n log n) in practice. An underrated gem.
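A minimal sketch — exactly bubble sort, except the comparison distance starts large and shrinks by 1.3 each pass:

```python
def comb_sort(arr):
    """Bubble sort with a shrinking gap (shrink factor 1.3)."""
    a = list(arr)
    gap, swapped = len(a), True
    while gap > 1 or swapped:
        gap = max(1, int(gap / 1.3))   # shrink the gap each pass
        swapped = False
        for i in range(len(a) - gap):
            if a[i] > a[i + gap]:
                a[i], a[i + gap] = a[i + gap], a[i]
                swapped = True
    return a
```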
Shuffle randomly. Check if sorted. Repeat. Expected time: O(n × n!). The worst sort ever invented—but hilarious to watch trying.
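The entire algorithm, faithfully — only run it on tiny inputs unless you have a while:

```python
import random

def bogo_sort(arr):
    """Shuffle until sorted. Expected O(n * n!) shuffles."""
    a = list(arr)
    while any(a[i] > a[i + 1] for i in range(len(a) - 1)):  # not sorted yet?
        random.shuffle(a)                                    # shuffle and hope
    return a
```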
A parallel-friendly sorting network: create bitonic sequences (up then down), then merge them. The butterfly pattern is gorgeous.
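A minimal recursive sketch, assuming the input length is a power of two (as sorting networks require). The fixed compare-exchange pattern is what makes it parallel-friendly — every comparison happens at a predetermined position regardless of the data:

```python
def bitonic_sort(arr):
    """Bitonic sorting network; assumes len(arr) is a power of two."""
    def bsort(a, ascending):
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        up = bsort(a[:mid], True)      # first half ascending...
        down = bsort(a[mid:], False)   # ...second half descending: bitonic
        return bmerge(up + down, ascending)

    def bmerge(a, ascending):
        if len(a) <= 1:
            return a
        a, mid = list(a), len(a) // 2
        for i in range(mid):           # compare-exchange across the halves
            if (a[i] > a[i + mid]) == ascending:
                a[i], a[i + mid] = a[i + mid], a[i]
        return bmerge(a[:mid], ascending) + bmerge(a[mid:], ascending)

    return bsort(list(arr), True)
```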
Represent numbers as rows of beads on an abacus, then let gravity pull them down. The beads fall into sorted order. Physical computing!
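Simulating the abacus without actually animating beads: count how many beads land in each column after gravity, then read off the row sums from the bottom up. A minimal sketch for non-negative integers:

```python
def bead_sort(arr):
    """Gravity (bead) sort: numbers are rows of beads that fall into columns."""
    if not arr:
        return []
    # after falling, column c holds one bead for every number greater than c
    cols = [sum(1 for x in arr if x > c) for c in range(max(arr))]
    # row i then holds one bead per column taller than i; row sums (bottom-up)
    # are the values in descending order
    rows = [sum(1 for h in cols if h > i) for i in range(len(arr))]
    return rows[::-1]
```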
Race all algorithms head-to-head on the same data! Pick your contenders, set the array size, and watch them compete in real time.