
Newcomb's Problem

One Box or Two? A War Between Decision Theories

Philosophers' poll (Bourget & Chalmers 2014): roughly 40% one-box, roughly 60% two-box.


The Setup

William Newcomb, a physicist at Lawrence Livermore Laboratory, devised this problem in 1960. It was later analyzed by philosopher Robert Nozick (1969) and popularized by Martin Gardner in Scientific American (1973).

Before you are two boxes. Box A is transparent and contains $1,000. Box B is opaque and contains either $1,000,000 or $0.

A Predictor—a being, superintelligence, or perfect simulation—has already examined your brain, psychology, and decision-making tendencies. It has made its prediction and filled the boxes accordingly:

  • If the Predictor predicted you would take only Box B, it put $1,000,000 in Box B.
  • If the Predictor predicted you would take both boxes, it left Box B empty.

The Predictor has been right in 99.9% of all previous cases. The boxes are already filled. Your choice cannot change what's in them. What do you do?

The Two Arguments

📦 The One-Boxer's Argument

Evidential reasoning: One-boxers almost always get $1,000,000. Two-boxers almost always get $1,000. The evidence is overwhelming—one-boxing leads to riches.

  • The Predictor is nearly perfect—99.9% accuracy
  • If you're the type who one-boxes, Box B has the million
  • If you're the type who two-boxes, Box B is empty
  • Be the type who one-boxes, and walk away rich

Decision theory: This aligns with Evidential Decision Theory—choose the action that is evidence of good outcomes.
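The evidential argument above is just an expected-value calculation. A minimal sketch in Python, using the 99.9% accuracy figure from the setup (variable names are my own):

```python
# Expected value under Evidential Decision Theory (EDT):
# condition on your own choice, using the Predictor's 99.9% accuracy.
P = 0.999  # Predictor's accuracy, from the problem statement

# If you one-box, Box B holds $1,000,000 with probability P.
ev_one_box = P * 1_000_000 + (1 - P) * 0

# If you two-box, Box B is empty with probability P;
# with probability 1 - P the Predictor erred and you get both prizes.
ev_two_box = P * 1_000 + (1 - P) * 1_001_000

print(f"E[one-box] = ${ev_one_box:,.0f}")  # about $999,000
print(f"E[two-box] = ${ev_two_box:,.0f}")  # about $2,000
```

On these numbers, one-boxing is worth roughly 500 times more in expectation.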

📦📦 The Two-Boxer's Argument

Causal reasoning: The Predictor has already made its decision. The boxes are already filled. Your choice now cannot change the past.

  • If Box B has $1M: taking both gets $1,001,000 (vs $1M)
  • If Box B has $0: taking both gets $1,000 (vs $0)
  • Either way, taking both is strictly better
  • This is the dominance principle—always choose the dominant option

Decision theory: This aligns with Causal Decision Theory—choose the action that causally produces better outcomes.
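The dominance argument can likewise be checked mechanically: hold the contents of Box B fixed as a state of the world and compare the two actions in each state (an illustrative sketch, not from the source):

```python
# Dominance under Causal Decision Theory (CDT): the boxes are already
# filled, so treat Box B's contents as a fixed state and compare actions.
payoff = {  # payoff[state][action] in dollars
    "B_has_million": {"one_box": 1_000_000, "two_box": 1_001_000},
    "B_empty":       {"one_box": 0,         "two_box": 1_000},
}

for state, acts in payoff.items():
    gain = acts["two_box"] - acts["one_box"]
    print(f"{state}: two-boxing gains ${gain:,}")  # $1,000 in both states
```

Because two-boxing beats one-boxing by exactly $1,000 in every state, it dominates regardless of what the Predictor did.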

"I have put this problem to a large number of people... To almost everyone it is perfectly clear and obvious what should be done. The difficulty is that these people seem to divide almost evenly on the problem, with large numbers thinking that the opposing half is just being silly." — Robert Nozick, "Newcomb's Problem and Two Principles of Choice" (1969)

Why It Matters

Newcomb's Problem isn't just a puzzle—it exposes a fundamental divide in how we reason about decisions. Causal Decision Theory says you should choose actions that cause good outcomes. Evidential Decision Theory says you should choose actions that are evidence of good outcomes.

In most cases, these agree. But Newcomb's Problem pits them against each other. And the stakes extend beyond philosophy—questions about AI alignment, precommitment, and strategic interaction all touch on similar issues.
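To see why the two theories usually agree, note that when your choice carries no evidence about the state of the world, conditioning on your action changes nothing, so EDT's conditional expectation and CDT's causal expectation coincide. A small sketch under that assumption (the 50/50 state distribution is hypothetical, purely for illustration):

```python
# When action and state are probabilistically independent (no Predictor),
# EDT and CDT compute the same expectation and pick the same action.
payoff = {  # same payoff matrix as Newcomb's Problem
    "B_has_million": {"one_box": 1_000_000, "two_box": 1_001_000},
    "B_empty":       {"one_box": 0,         "two_box": 1_000},
}
p_state = {"B_has_million": 0.5, "B_empty": 0.5}  # independent of action

def expected_value(action):
    # With the correlation gone, one formula serves both theories.
    return sum(p_state[s] * payoff[s][action] for s in p_state)

print(expected_value("one_box"), expected_value("two_box"))
```

Here both theories recommend two-boxing; it is only the Predictor's correlation between choice and state that drives them apart.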

If you're building an AI that will face predictors (other AIs, game-theoretic opponents, or even its own future self), which decision theory should it use? The debate continues.

Source: Stanford Encyclopedia of Philosophy — Causal Decision Theory | LessWrong — Newcomb's Problem