
The Ellsberg Paradox

Why we fear the unknown more than the risky

The Setup

Imagine an urn containing 90 balls. You know exactly 30 are red. The remaining 60 are some mix of black and yellow—but you don't know the ratio. It could be 60 black and 0 yellow, or 0 black and 60 yellow, or anything in between.

You'll be paid $100 if the ball you draw matches your bet. Which gambles would you prefer?

Daniel Ellsberg (yes, the Pentagon Papers whistleblower) discovered in 1961 that people's choices in this scenario violate the fundamental axioms of rational decision theory—yet feel completely reasonable.

🎱 The Ellsberg Experiment 🎱

The urn holds 90 balls total.

Choice 1: Which bet do you prefer?
- Gamble A: win if RED is drawn. Known odds: 30/90 = 33.3%.
- Gamble B: win if BLACK is drawn. Unknown odds: 0 to 60 out of 90 = 0–66.7%.

Choice 2: Which bet do you prefer?
- Gamble C: win if RED or YELLOW is drawn. Unknown odds: 30 to 90 out of 90 = 33.3–100%.
- Gamble D: win if BLACK or YELLOW is drawn. Known odds: 60/90 = 66.7%.

The Paradox

Most people prefer A over B (a known 33.3% chance beats an unknown chance of black) and D over C (a known 66.7% chance beats an unknown chance of red or yellow). This seems reasonable—we prefer known odds!

But here's the contradiction:

🧠 The Logical Trap
If you prefer A over B, you must believe P(red) > P(black).
Since P(red) = 30/90, that means you believe there are fewer than 30 black balls.
Because Black + Yellow = 60, fewer than 30 black balls means more than 30 yellow balls.
Then Red + Yellow > 30 + 30 = 60 = Black + Yellow, so P(C) > P(D).
By your own beliefs, you should prefer C over D!

But you preferred D! Your preferences are logically inconsistent. You can't simultaneously believe black is less likely than red AND that black+yellow is more likely than red+yellow.
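The inconsistency can be checked by brute force. This minimal Python sketch (function names are mine, not from the article) enumerates every possible urn composition and asks whether any single belief about the number of black balls makes both majority choices expected-value maximizing:

```python
# The urn: 30 red balls, black ranges over 0..60, yellow = 60 - black.
# "A over B" is optimal only if red outnumbers black;
# "D over C" is optimal only if black+yellow outnumbers red+yellow.

def prefers_A_over_B(black):
    return 30 > black                 # red balls vs. black balls

def prefers_D_over_C(black):
    yellow = 60 - black
    return 60 > 30 + yellow           # black+yellow vs. red+yellow

consistent = [b for b in range(61)
              if prefers_A_over_B(b) and prefers_D_over_C(b)]
print(consistent)  # -> [] : no single belief supports both choices
```

The first preference requires fewer than 30 black balls, the second requires more than 30, so the list of consistent beliefs is empty.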

"People are not simply risk-averse; they are ambiguity-averse. They prefer the devil they know to the devil they don't."
— Daniel Ellsberg, 1961

Ambiguity Aversion

The Ellsberg Paradox reveals that humans don't just dislike risk—we have a separate, powerful aversion to ambiguity (not knowing the probabilities at all).

Risk vs. Ambiguity

Risk means the probabilities are known (a 33.3% chance of red); ambiguity means even the probabilities are unknown (anywhere from 0% to 66.7% for black). Standard expected utility theory treats the two the same—a rational agent should assign subjective probabilities to ambiguous events and maximize expected value. But humans systematically avoid ambiguity, even when doing so means accepting objectively worse odds.

Why Does This Violate Rationality?

The Sure-Thing Principle (one of the axioms of rational choice) says: if you prefer A to B when some event E happens, and you also prefer A to B when E doesn't happen, then you should prefer A to B regardless of E.

In the Ellsberg case: gambles C and D are exactly gambles A and B with the same $100 payoff added when yellow is drawn. If yellow is drawn, A and B both pay nothing and C and D both pay $100—the choice doesn't matter. If yellow is not drawn, A vs. B and C vs. D are identical choices. So the Sure-Thing Principle says that preferring A over B commits you to preferring C over D. The majority pattern (A and D) violates it.
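The sure-thing structure is easiest to see when the four gambles are written out as payoff tables. A minimal sketch (gamble names as in the article):

```python
# Payoff (in dollars) of each gamble, by the color of the drawn ball.
A = {"red": 100, "black": 0,   "yellow": 0}
B = {"red": 0,   "black": 100, "yellow": 0}
C = {"red": 100, "black": 0,   "yellow": 100}
D = {"red": 0,   "black": 100, "yellow": 100}

# C and D are exactly A and B plus the same $100 bonus on yellow.
bonus = {"red": 0, "black": 0, "yellow": 100}
assert all(C[color] == A[color] + bonus[color] for color in A)
assert all(D[color] == B[color] + bonus[color] for color in B)
print("C = A + bonus and D = B + bonus")
```

Adding an identical bonus to both options in a choice cannot change which option is better, which is why preferring A over B but D over C is inconsistent.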

Real-World Applications

📈
Home Bias in Investing
Investors prefer domestic stocks over foreign ones with identical returns—because foreign markets feel more "unknown."
🏥
Medical Decisions
Patients prefer treatments with known side effects over newer treatments with uncertain (but potentially better) outcomes.
🛡️
Insurance Purchasing
People over-insure against ambiguous risks and under-insure against well-quantified ones.
💼
Career Choices
People stay in unsatisfying jobs with known downsides rather than risk switching to opportunities with unclear outcomes.

The Man Behind the Paradox

Daniel Ellsberg is better known for leaking the Pentagon Papers in 1971—classified documents revealing government deception about the Vietnam War. But before becoming a whistleblower, he was a brilliant decision theorist at RAND Corporation.

His 1961 paper "Risk, Ambiguity, and the Savage Axioms" challenged the foundations of rational choice theory. Though initially controversial, his insights are now fundamental to behavioral economics, earning him recognition as a pioneer in understanding how humans actually make decisions under uncertainty.

Interestingly, John Maynard Keynes described a version of this paradox as early as 1921, but it was Ellsberg who formalized it and explored its implications for economic theory.

Theoretical Resolutions

Maxmin Expected Utility

Instead of assuming a single probability distribution, assume the decision-maker considers the worst-case scenario across all distributions they regard as possible. This "pessimistic" approach, axiomatized by Itzhak Gilboa and David Schmeidler in 1989, can rationalize ambiguity aversion.
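A minimal sketch of maxmin evaluation for the four Ellsberg gambles, assuming the only uncertainty is the black-ball count ranging over 0..60:

```python
# Score each gamble by its worst-case winning probability across
# every urn composition the decision-maker considers possible.
def maxmin_win_prob(winning_balls):
    """winning_balls(black) -> number of balls that win the gamble."""
    return min(winning_balls(black) for black in range(61)) / 90

A = maxmin_win_prob(lambda black: 30)                 # red only
B = maxmin_win_prob(lambda black: black)              # black only
C = maxmin_win_prob(lambda black: 30 + (60 - black))  # red or yellow
D = maxmin_win_prob(lambda black: 60)                 # black or yellow

print(f"A={A:.3f}  B={B:.3f}  C={C:.3f}  D={D:.3f}")
```

The worst case gives A ≈ 0.333 > B = 0 and D ≈ 0.667 > C ≈ 0.333, so this criterion reproduces exactly the majority pattern (A over B, D over C).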

Choquet Expected Utility

French mathematician Gustave Choquet developed a generalized integral that allows for non-additive probabilities ("capacities"). David Schmeidler later built this into a decision theory—Choquet expected utility—that accommodates the kind of behavior seen in the Ellsberg experiment.
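A sketch of the idea with a made-up ambiguity-averse capacity (the numbers are illustrative, not from any of the papers): unambiguous events keep their true probability, while ambiguous events are discounted, and the weights need not add up like probabilities.

```python
# A capacity assigns a weight to every event.  Unlike a probability,
# disjoint events need not add up: here
# capacity({black}) + capacity({yellow}) = 0.40 < capacity({black, yellow}).
capacity = {
    frozenset(): 0.0,
    frozenset({"red"}): 30 / 90,              # unambiguous: true probability
    frozenset({"black"}): 0.20,               # ambiguous: discounted
    frozenset({"yellow"}): 0.20,
    frozenset({"red", "black"}): 0.55,
    frozenset({"red", "yellow"}): 0.55,       # ambiguous: discounted
    frozenset({"black", "yellow"}): 60 / 90,  # unambiguous: true probability
    frozenset({"red", "black", "yellow"}): 1.0,
}

# For a bet paying $100 on event E and $0 otherwise, the Choquet
# expected value reduces to 100 * capacity(E).
def choquet_value(event):
    return 100 * capacity[frozenset(event)]

print(choquet_value({"red"}), ">", choquet_value({"black"}))        # A beats B
print(choquet_value({"black", "yellow"}), ">",
      choquet_value({"red", "yellow"}))                             # D beats C
```

Because the ambiguous events carry less weight than their "central" probabilities, both majority preferences come out as value-maximizing under one consistent capacity.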

The Lesson

The Ellsberg Paradox doesn't mean humans are irrational—it means our intuitive definition of "rational" was incomplete. We've since developed richer theories of decision-making that account for ambiguity, leading to better models in economics, AI, and policy design.

Sources & Further Reading