A Thought Experiment About Identity and Meaning
Imagine the philosopher Donald Davidson is walking through a swamp when he is suddenly struck by lightning and completely disintegrated. At that exact moment, by pure cosmic coincidence, a second lightning bolt strikes a nearby dead tree and rearranges its molecules into an exact physical replica of Davidson—down to the last atom. This creature, the Swampman, walks out of the swamp, goes to Davidson's home, writes philosophy papers, and lives Davidson's life in every observable way.
Davidson drew a provocative conclusion from this scenario: meaning and mental content are externally determined. They depend on causal connections between a person and their environment, not just on what is inside the skull.
When Swampman says "tree," his utterance is meaningless on Davidson's view, because the word has no causal history connecting it to actual trees. Swampman merely has brain states that coincidentally match Davidson's.
Swampman's apparent "memories" of Davidson's mother are not real memories—he never met her. They're just physical brain configurations that happen to match Davidson's.
Therefore, despite being physically identical to Davidson, Swampman lacks genuine thoughts and meaningful language: he is behaviorally indistinguishable but mentally empty.
"My replica can't recognize my friends... He can't recognize anything, since he never cognized anything in the first place." — Donald Davidson, "Knowing One's Own Mind" (1987)
Not everyone accepts this conclusion. Many philosophers argue that since Swampman's brain is physically identical to Davidson's, his mind must be identical too: brain states just are mental states. If Davidson was conscious before the lightning strike, Swampman must be conscious after it, and any theory that denies this is counterintuitive and probably wrong.
The thought experiment is meant to support Davidson's view that meaning depends on causal-historical relations to the external world, not just on internal brain states.
What your thoughts are "about" depends on your actual history of interactions with things—not just your current neural configuration.
If Swampman lacks genuine mental content, what about subjective experience? Can there be qualia without intentionality?
If we copied a brain atom by atom, would the copy have genuine thoughts? Swampman suggests the answer might be no, at least not at first.
If teleportation destroys the original and creates a copy, is the copy really "you"? Swampman raises doubts about whether continuity of pattern is enough.
Swampman's words might become genuinely meaningful after the fact, through his own future interactions with the world. How long would it take before they do?