Some infinities are bigger than others — and we can prove it
Georg Cantor proved something that seems impossible: you cannot list all real numbers between 0 and 1. Even with an infinitely long list, you'll always miss some.
This means the infinity of real numbers is strictly larger than the infinity of counting numbers. There are different sizes of infinity! The proof is elegant and visual — watch the diagonal argument unfold below.
Suppose we could list ALL real numbers...
Assume we have a complete list of ALL real numbers between 0 and 1.
Each number is written as an infinite decimal: 0.d₁d₂d₃d₄...
The diagonal argument constructs a new number that differs from every number on the list: it disagrees with the 1st number in the 1st digit, with the 2nd number in the 2nd digit, and so on. (Choosing replacement digits other than 0 and 9 avoids the ambiguity of numbers with two decimal expansions, such as 0.4999... = 0.5000....)
This new number is a valid real number between 0 and 1, but it cannot appear anywhere in our "complete" list: it differs from the nth entry in at least its nth digit. Contradiction! Therefore, no such complete list can exist.
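The construction above can be sketched in a few lines of Python. This is a minimal illustration, assuming each listed number is given as a decimal string like "0.1415926"; the digit-replacement rule (5, or 4 if the diagonal digit is already 5) is one common choice, not the only one.

```python
def diagonal_number(listed):
    """Build a decimal in (0, 1) whose n-th digit differs from the
    n-th digit of the n-th listed number (decimal strings like "0.141...")."""
    digits = []
    for n, x in enumerate(listed):
        frac = x.split(".", 1)[1] if "." in x else ""
        d = int(frac[n]) if n < len(frac) else 0  # missing digits count as 0
        # Pick a different digit, avoiding 0 and 9 so the result has a
        # unique decimal expansion (no 0.4999... == 0.5000... ambiguity).
        digits.append("5" if d != 5 else "4")
    return "0." + "".join(digits)

sample = ["0.1415926", "0.7182818", "0.5000000", "0.3333333"]
print(diagonal_number(sample))  # -> 0.5555, which differs from sample[n] at digit n
```

However long the list, the output disagrees with its nth entry in the nth digit, so it can never appear on the list.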
Conclusion: The real numbers are uncountably infinite — a larger infinity than the natural numbers.
"Aleph-null" (ℵ₀): the smallest infinity
"Continuum" (𝔠 = 2^ℵ₀): strictly larger!
Visualizing why some infinities are larger than others
Cantor proved there are infinitely many sizes of infinity: ℵ₀ < 2^ℵ₀ < 2^(2^ℵ₀) < ...
Turing used diagonalization to prove that some problems, such as the halting problem, cannot be decided by any algorithm.
Gödel used similar self-referential techniques to prove that any sufficiently powerful formal system is incomplete.
Is there an infinity strictly between ℵ₀ and 𝔠? This question, the Continuum Hypothesis, is provably independent of the standard axioms of set theory (ZFC)!
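The endless tower ℵ₀ < 2^ℵ₀ < 2^(2^ℵ₀) < ... rests on Cantor's theorem: no function f from a set S to its power set can be surjective, because the "diagonal set" {x ∈ S : x ∉ f(x)} disagrees with every f(x) about x itself. A minimal finite sketch in Python (the set S and map f below are illustrative choices):

```python
def diagonal_set(S, f):
    # D disagrees with f(x) about the element x itself:
    # x is in D exactly when x is NOT in f(x).
    return {x for x in S if x not in f(x)}

S = {0, 1, 2}
f = {0: {1, 2}, 1: {1}, 2: set()}  # one attempted map from S into its power set
D = diagonal_set(S, f.get)         # -> {0, 2}
assert all(D != f[x] for x in S)   # D is missed by f, so f is not surjective
```

Swap in any other f and the assertion still holds, which is exactly why each power set is a strictly bigger infinity.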
Before Cantor, mathematicians assumed all infinities were the same. His diagonal argument shattered that intuition and revealed a rich structure in the infinite.
The technique itself — constructing an object that differs from everything in a list — became one of the most powerful tools in mathematics and computer science, used to prove limitations on computation, logic, and definability.
Cantor's work was controversial in his time; some mathematicians called it a "disease." Today, it's fundamental to our understanding of mathematics itself.