You should ALWAYS switch... but then you should switch back...
and back... and back... forever?
You pick an envelope and see it contains $X.
The other envelope contains either $2X or $X/2, each with 50% probability.
So the expected value of switching seems to be 0.5 × 2X + 0.5 × (X/2) = 1.25X. Since 1.25X > X, you should ALWAYS switch!
But wait... if switching is always better, you should switch back. And back again. Forever!
Something is wrong with this reasoning...
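To see why the argument feels so compelling, here's a minimal sketch of the naive calculation exactly as the paradox states it, treating X as a fixed amount (the function name and the $100 example are illustrative choices, not part of the original argument):

```python
def naive_switch_value(x):
    """The "expected value" of the other envelope under the flawed
    premise: for a FIXED x, it holds 2x or x/2 with equal probability."""
    return 0.5 * (2 * x) + 0.5 * (x / 2)

print(naive_switch_value(100))  # 125.0, i.e. 1.25X -- switching "looks" better
```

The arithmetic itself is correct; as the rest of the article explains, the problem is that the same symbol X is silently standing for two different amounts.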
The fallacy lies in using X inconsistently. When you say "the other envelope has 2X or X/2," you're treating X as a fixed value. But:

- If the other envelope holds 2X, then yours is the smaller one, so X is the smaller amount.
- If the other envelope holds X/2, then yours is the larger one, so X is the larger amount.

These are different values of X! You cannot combine them in one expected value calculation because they come from mutually exclusive realities.
Let's call the smaller amount S and the larger amount 2S. The total in both envelopes is always 3S.
| Scenario | Your Envelope | Other Envelope | Gain from Switching |
|---|---|---|---|
| You have smaller (50%) | S | 2S | +S |
| You have larger (50%) | 2S | S | -S |
Switching has zero expected advantage: 0.5 × (+S) + 0.5 × (−S) = 0. This matches our intuition: you picked randomly, so switching is just picking the other one randomly!
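The table above is easy to verify empirically. Here's a small Monte Carlo sketch: fix the smaller amount S, pick an envelope uniformly at random, and measure the average gain from switching (S = 100, the trial count, and the function name are all arbitrary illustrative choices):

```python
import random

def average_switch_gain(s=100, trials=100_000, seed=0):
    """Average gain from always switching, with envelopes holding s and 2s."""
    rng = random.Random(seed)
    total_gain = 0
    for _ in range(trials):
        envelopes = [s, 2 * s]
        rng.shuffle(envelopes)       # you pick one of the two at random
        mine, other = envelopes
        total_gain += other - mine   # gain is +s or -s, each with prob 1/2
    return total_gain / trials

print(average_switch_gain())  # hovers near 0, not near the "predicted" 0.25 * X
```

If the naive 1.25X argument were right, the average gain would be around 25 with these numbers; instead it stays near zero, just as the table predicts.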
The error is subtle because both "2X" and "X/2" are valid descriptions of what the other envelope could contain. The problem is treating them as belonging to the same probability space.
It's like saying: "I have a sibling. They're either my older brother (50%) or my younger sister (50%). Expected number of brothers = 0.5. Expected number of sisters = 0.5." But you can't have half a brother AND half a sister as one sibling!
The two envelopes paradox is a variant of the necktie paradox and the wallet game. It was popularized in the 1980s and has generated hundreds of academic papers with proposed resolutions.
The paradox reveals how easily our probabilistic intuitions can be led astray, especially when dealing with conditional expectations and self-referential reasoning. It remains a valuable teaching tool in probability theory and decision science.