Falsification and the scientific method
Yesterday we went for a walk around a beautiful lake near our house. With us was my nephew Noah, 17 years old and tall as a maypole. He is affectionate and intelligent, burning with questions. This time it was about his tentative plan to study philosophy at university after he finishes school. He wanted to know what I thought of that, and how I assessed my own philosophical career.
I told Noah about my traumatic experience with classical and metaphysical philosophy, and how, just when I was on the verge of abandoning this course of studies, I discovered (the heavens be thanked) the modern branches: linguistic philosophy, analytic ethics, evolutionary epistemology, the philosophy of science. The latter especially interested the lad. He wanted to know how scientific discovery works, how you gain empirical evidence, how it influences belief and behaviour, and how it contrasts with mystical and religious belief. Well, I gave him a first impression, with a simple lesson I have been using a lot lately.
“I will give you a series of numbers, Noah,” I said. “You must tell me the rule I am using to derive them. You can test it by asking me for the next number in the sequence, or the number after that, or any alternative number. You can do this as often as you like. At some point you must stop and tell me the rule you have discovered.”
Then I gave him the numbers: two, four, six…
“That’s easy,” Noah said, “I know the rule already.” — “That’s very hasty,” I said. “Don’t you want to run some tests before you jump to a conclusion?” Okay, he said, and started to guess: Eight? Yes. Ten? Yes. Twelve? Yes. Fourteen? Yes. “Come on, it is quite clear. The rule is that each number is two more than the previous one.” — “Wrong!” I said. “Now you have two tasks: find the correct rule I am using, and tell me what you did wrong.”
Poor Noah. His next try was to ask: “Must the numbers be positive, rational numbers?” That’s not how you explore, discover and deduce, I told him. You cannot ask such questions of the natural world. When exploring gravity, for example, you cannot ask nature whether it diminishes with the square of the distance, or whether there is a gravitational constant. You have to conduct tests or observations and infer the laws of gravity from those. In our case you guess numbers and draw conclusions from the results you get.
But wasn’t that what he was doing? “Yes, but your error was caused by something known as ‘confirmation bias’. It is the tendency to search for information that confirms pre-existing theories or hypotheses. Like most people you jumped to the eminently plausible conclusion that the numbers increased by two, and you tested this theory with positive examples all the way to fourteen. That was enough to convince you that you had found the rule.”
So what had he done wrong? The proper empirical method would have been to try to disprove the initial theory, not confirm it. Try that, Noah, don’t look for positive instances, look for things that can disprove your hypothesis. And modify your theory accordingly. He did: seven? Yes. Nine? Yes. Ten and a half? Yes. “I think I know,” Noah said, “the next number must always be higher!” — “Are you sure?” I asked. “Haven’t you left out a critical test?” Yes, of course, two, four, six, and then five? No! That confirmed his new hypothesis further. Of course, it does not do so definitively, since we may be missing some parameters. For instance the rule could be higher numbers that are not multiples of the previous two. Further tests would confirm or refute that and similar theories.
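The game above can be sketched in a few lines of code. The hidden rule here, “each number is higher than the last”, matches the story; the function names and the particular probe sequences are my own illustration, not part of the original exchange.

```python
# A sketch of the number game: the examiner holds a secret rule,
# and the player probes it with candidate sequences.

def hidden_rule(seq):
    """The examiner's secret rule: the sequence must be strictly increasing."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def ask(seq):
    """Ask the examiner whether a candidate sequence fits the rule."""
    return hidden_rule(seq)

# Noah's first hypothesis: each number is two more than the previous one.
def add_two(seq):
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

# Confirming probes: every test is chosen to fit the hypothesis,
# so both rules agree and no information is gained.
confirming = [2, 4, 6, 8, 10, 12, 14]
assert ask(confirming) and add_two(confirming)

# A falsifying probe: a sequence the hypothesis forbids.
probe = [2, 4, 6, 7]
print(ask(probe))        # True  -> "add two" is refuted
print(add_two(probe))    # False

# The critical test Noah almost missed: a non-increasing probe.
print(ask([2, 4, 6, 5]))  # False -> consistent with "strictly increasing"
```

The point of the sketch: probes that fit the hypothesis cannot distinguish it from the true rule, while a probe the hypothesis forbids settles the matter in one question.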
Noah was enjoying all this immensely. But he was also a little alarmed: “How come I fell into the confirmation bias trap?” he said. For evolutionary reasons, I told him. Sapiens — but also most higher animals — have developed this bias in order to survive. Animals who wanted to critically test conjectures before accepting them — e.g. that a rustling in the undergrowth meant a predator was approaching — were at a disadvantage compared to those who drew hasty conclusions after a first observation. The latter often fled up a tree, quite unnecessarily, when the wind or a hare had caused the rustling. That was inconvenient, but not deadly. The former, on the other hand, often got eaten when critically testing their hypothesis.
So the conclusion is: empirical science is a strategy for overcoming our confirmation bias, and falsification is one of the most important tools of the scientific method. I gave Noah a simple example:
Say you have a theory that says all swans are white (there were dozens of swans on the lake, watching us for signs of food we might have for them). How would we confirm the hypothesis? Would we search for more white swans, take pictures, record where we had seen them, and present that as proof? No, we would contact naturalists and observers all over the world, asking them whether they had ever seen black swans, or swans of any other colour than white. A single counterexample, one black swan, would completely refute the theory, while thousands of new pictures of white swans would not significantly contribute to its confirmation.
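The asymmetry between confirmation and refutation can be made concrete in a toy sketch (the observation data here are invented for illustration): a universal claim survives any number of confirming sightings, yet a single counterexample flips it.

```python
# A universal claim over observed data: "all swans are white".
observations = ["white"] * 10_000            # thousands of confirming sightings
print(all(c == "white" for c in observations))   # True -- consistent, but not proof

observations.append("black")                 # a single contrary report
print(all(c == "white" for c in observations))   # False -- the theory is refuted
```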
One more point I made to Noah, who is very interested in formal logic: the statement “All swans are white” is logically equivalent to the statement “All non-white things are not swans.” I pointed to a beautiful early autumn tree, and then to a dark rock, a yellow blossom, the clear blue sky — see, with each of these non-white non-swans I am, from a purely logical point of view, confirming that all swans are white. He liked this example.
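The equivalence Noah liked is contraposition, and it can be checked mechanically over all truth values. A minimal sketch, with P standing for “x is a swan” and Q for “x is white”; the helper name `implies` is my own:

```python
# Contraposition: P -> Q is logically equivalent to (not Q) -> (not P).
from itertools import product

def implies(p, q):
    """Material implication: false only when p is true and q is false."""
    return (not p) or q

for p, q in product([False, True], repeat=2):
    assert implies(p, q) == implies(not q, not p)
print("P -> Q is equivalent to (not Q) -> (not P) for all truth values")
```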
I was introduced to the concept of falsifiability through Karl Popper, one of the pioneers of the philosophy of science. It is a cornerstone of scientific epistemology: according to Popper, statements that cannot in principle be falsified are unscientific, and theories immunized against every possible refutation are pseudoscience. I studied Popper and even visited the great man at his home in England. He and the philosophers of science who followed him, Kuhn, Lakatos and others, had a profound influence on my intellectual development. In fact, they launched me on a sceptical career.