WORLDVIEW: When facts stop being facts

Network scientist and mathematician Samuel Arbesman explains why everything we know has an expiration date.

Scientists measure the “half-life” of substances in order to determine how quickly they break down. Arbesman argues that facts, as we know them, also decay in a predictable way.

The way most people understand it, a fact is an immutable truth. How can a fact have a half-life?

The way I’m using it, a fact doesn’t necessarily have to be true – it doesn’t need to be the way the world actually is.
Unlike truth, which we’re getting closer and closer to, there’s a certain amount of flux in all the different facts that we know. So, what’s in our textbooks might no longer be true, facts about the state of the world can change, and even scientific and technological facts can change.
This is not surprising – these types of facts are of course prone to change. But by looking at how they change, we find that underlying all the flux and change of facts, there are actually regularities to how knowledge grows, how it gets overturned, how it spreads from person to person, and how it changes over time.
How do you calculate the half-life of a fact?
There are many ways to do this. Medicine, to give an example, is a field that changes quite rapidly. Many students are actually taught in medical school that a large fraction of what they learn is going to become obsolete within only a few years of their graduation. So, about 10 years ago, a team of physicians set out to measure this. They gave a series of papers from two medical fields to a panel of experts and asked, “Which of these papers are still true, and which ones have been overturned or otherwise rendered obsolete?”
What they found is that you can graph it, and see a clear decay: As the age of a paper increases, the likelihood that the paper is no longer true increases as well. Using this method of analysis, they found that it takes about 45 years for half of the medical knowledge in these two fields to be overturned or otherwise rendered obsolete.
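[If the decay is roughly exponential, which is what the half-life framing implies, the 45-year figure determines the whole curve: the fraction of papers still standing after t years is (1/2)^(t/45), or roughly 1.5 per cent of the literature overturned each year. Here is a minimal sketch of that arithmetic in Python – the exponential form and the function name are illustrative assumptions, not the study’s fitted model:

    HALF_LIFE_YEARS = 45.0  # half-life reported for the two medical fields

    def surviving_fraction(age_years, half_life=HALF_LIFE_YEARS):
        # Exponential-decay model: share of papers still considered
        # true at a given age, f(t) = (1/2) ** (t / half_life).
        return 0.5 ** (age_years / half_life)

    # Equivalent per-year decay rate: 1 - (1/2) ** (1/45), about 1.5% per year
    annual_decay = 1 - 0.5 ** (1 / HALF_LIFE_YEARS)

    for age in (10, 45, 90):
        print(f"after {age} years: {surviving_fraction(age):.0%} of papers still hold")

Run as written, this prints roughly 86, 50 and 25 per cent for papers 10, 45 and 90 years old. – Ed.]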
How does a fact become a non-fact?
When my grandfather was in dental school back in the 1930s, he learned that there were 48 chromosomes in every human cell. We now know that there are 46 – in the mid-’50s, someone went back and recounted. So, some facts are just wrong.
There are also facts that subsume other knowledge. When Einstein came along and provided a more general theory than Newton’s theory of gravitation, for example, it wasn’t that Newton was wrong, it was that his theory could be subsumed within Einstein’s.
Don’t we have a natural resistance to changing facts that we’ve already accepted?
A professor of mine told me that he once gave a lecture on a Tuesday, and then on the Wednesday read a paper that invalidated everything he had taught the day before. So he went in on the Thursday and said, “Remember everything I taught you? It’s wrong. But if that bothers you, get out of science.” So, scientists are pretty well adapted to this.
For the rest of us, though, it’s a lot harder. We learn a lot of things when we’re young that we use to create a framework for understanding the world, and this provides a certain sense of control. So, when we’re told that things we thought were true are no longer true, it can be a bit alarming.
There are a lot of biases in how we assimilate knowledge: If a change in a fact confirms something we already held dear, we’re of course more likely to accept it than if it goes against everything we know.
More than 150 years ago, the physician Ignaz Semmelweis recommended that doctors who were going directly from performing autopsies to delivering babies wash their hands in between. At that point, the miasma theory of disease – the idea that low-lying clouds of bad air carry sickness – was ascendant, and under the miasma theory, there’s no reason to wash your hands. So people ignored him.
One gets the impression from reading your book that in your view, facts are more like Wikipedia entries – things that change through collaboration – than like entries in Encyclopedia Britannica, which were written by experts and then printed and bound. Does that ring true to you in any way?
Science and knowledge are always in draft form. We’re never totally sure about everything. At the same time, there’s a distinction between the core of what we know and the frontier. The frontier is where there’s this constant churn and flux, and where a lot of the new papers are being published, whereas the core is more the Encyclopedia Britannica approach.
There are some cases in which the Britannica approach is more appropriate than the Wikipedia approach, but oftentimes, we have to recognize that a lot of what we know is more in a draft form, and more Wikipedia-like, than we might want.
That said, we’re still constantly approaching truth, which I think is similar to Wikipedia’s goal: Through this constant collaboration and redrafting, we’re getting closer and closer to a true understanding of what the answers should be.
If facts change all the time, how can we make intelligent decisions about things like climate change? Is there ever a point where we can reach a consensus and say, “These are the facts”?

Certainly. I think if it’s pretty well established – if there’s a consensus, or if many studies have supported a certain fact – we should not be overly skeptical. A certain amount of skepticism is healthy, but an overwhelming amount can be detrimental, and prevent people from accepting new ideas.
Isaac Asimov once said, “When people thought the Earth was flat, they were wrong. When people thought the Earth was spherical, they were wrong. But if you think that thinking the Earth is spherical is just as wrong as thinking the Earth is flat, then your view is wronger than both of them put together.” In other words, even though a perfectly spherical view of the world is incorrect – the Earth is actually an oblate spheroid – a flat-Earth worldview is even more wrong.
So, people should not say, “Knowledge is being overturned, therefore everything is unknowable and we should just throw our hands up in the air.” As additional knowledge is gained, we get closer and closer to an understanding of the universe.
In your book, you said we should “Stop memorizing and just give up.” What did you mean by that?
[Laughs] I think that phrase by itself might be a little extreme.
A number of people have said that in the age of Google and the Internet, people’s memories are not as good as they used to be, because people now use the Internet as a crutch, and constantly look things up.
To a certain degree, that can actually be a good thing. People are going to have a more up-to-date view of the world if they’re constantly looking up things that they’re not entirely sure of. It’s a lot better than quoting, at a dinner party, facts that you read five or 10 years ago in some magazine, not realizing that they’re now entirely incorrect.
It’s certainly good to have a decent understanding of the world – I’m not saying we shouldn’t bother learning anything – but if we look things up more often, we’re more likely to have a truer view of the world around us.
Samuel Arbesman is an applied mathematician and network scientist, and the author of The Half-Life of Facts.