According to Karl Popper - the superstar of epistemology (the critical study of science) - science never proves that theories are true. Instead, science constructs so-called refutable (falsifiable) theories. A refutable theory can be put to the test - and potentially disproved - by other researchers. Surviving those tests does not mean it is true. A well-known example: I can theorize that all crows are black. When other researchers find black crows, they corroborate the theory. But if, by misfortune, a researcher finds a white crow, then the theory that all crows are black is false (more details in this article by Pierre Thierry).
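To make the asymmetry concrete, here is a minimal sketch in Python (with made-up crow observations, obviously): any number of black crows never proves the theory, while a single white one kills it.

```python
# Hypothetical observations: the colour of each crow seen so far.
observations = ["black"] * 10_000

# Ten thousand corroborating cases still do not prove the universal claim...
print(all(colour == "black" for colour in observations))   # True, yet not proven

# ...but a single counterexample is enough to refute it.
observations.append("white")
print(all(colour == "black" for colour in observations))   # False: the theory is dead
```

Corroboration only ever says "not refuted yet"; refutation is final.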
Empiricism, what's that?
Scientists do not normally claim to produce immutable theories set in stone; on the contrary, they are (normally) ready to question their knowledge at any time. This way of thinking is what I call empiricism.
Thinking empirically, for me, means building one's worldview on verifiable facts. As a corollary, I must also be aware of the theories I believe in despite the absence of proof, and remain very open-minded about them.
For example, I am firmly convinced that a (much) more left-wing economic policy would make it possible to better combat inequalities and the ecological crisis. On the other hand, I have little empirical data that would allow me to confirm this. I do have some clues, but I know that they have to be treated with caution.
Let's take another theory - at random: making the labour market more flexible creates jobs. It is not a constructed theory; it is an empirical one. We can hope, magically, that it will work. But, objectively, it is hard to find empirical evidence for this phenomenon. The best way to confirm it empirically would be to set up a scientific protocol: for example, a group of companies in which dismissals are simplified, and a control group of companies that keep the current dismissal legislation. If the firms with lighter legislation create more jobs, then I would be willing to believe the theory (such an experiment is difficult to carry out, but it is, for me, the only way to confirm the theory empirically).
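As a rough sketch of what evaluating such a protocol could look like - with entirely invented job-creation figures and an ordinary two-sample t-test, since I have no real data - it might boil down to something like this:

```python
from statistics import mean
from scipy import stats

# Entirely hypothetical figures: net jobs created over one year, per company.
flexible_group = [12, 5, -3, 8, 7, 0, 15, 4, 9, 2]   # simplified dismissals
control_group  = [10, 6, -1, 7, 3, 1, 11, 5, 8, 4]   # current legislation

# A two-sample t-test asks whether the difference between the group means is
# larger than what ordinary company-to-company variation would easily produce.
t_stat, p_value = stats.ttest_ind(flexible_group, control_group)
print(f"difference in mean jobs created: {mean(flexible_group) - mean(control_group):+.1f}")
print(f"p-value: {p_value:.2f}")   # a large p-value = no convincing evidence either way
```

With numbers like these the result itself does not matter; what matters is the shape of the reasoning: compare a treatment group with a control group, and only raise your credence if the difference survives the comparison.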
As a good Bayesian, I try to adjust the credence I give to my theories about the world (credence = my level of confidence in a theory) according to the empirical data I encounter.
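Concretely, this adjustment is just Bayes' rule. Here is a minimal sketch, with made-up likelihoods, of how one piece of evidence moves a credence up or down:

```python
def update_credence(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Bayes' rule: posterior credence in a theory after seeing one piece of evidence."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Made-up example: I give a theory 50% credence. Then a study reports a result that is
# three times more likely if the theory is true than if it is false.
posterior = update_credence(prior=0.5, p_evidence_if_true=0.6, p_evidence_if_false=0.2)
print(f"credence after the evidence: {posterior:.0%}")   # 75%
```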
In my opinion, empiricism is the best way to build a correct vision of the world: a world where one is aware of what one does not know, and a world where - by accumulating diversified evidence - one knows how to evaluate the certainty with which one believes in a theory.
And in practice?
Condescending politicians like to lament the lack of critical thinking of their very (not so) dear fellow citizens by invoking the famous scourge of fake news (the "infox" spread by the Russians, for example), as if it were the biggest barrier to an enlightened democracy. It's easy to beat up on the Russian bad guys. Rocky was already doing it in '85.
In reality, the lack of empiricism is an epidemic that affects every stratum of society indiscriminately. That is because empiricism is slow, boring, and can even prove us wrong (if even researchers find it hard to accept that their theories are undermined by facts, it is understandable that ordinary humans are not very receptive to the cause).
When I want to refine my belief in a theory, I can rely on several categories of clues (from the most convincing to the least convincing):
- large scientific studies
- the opinion of experts that I trust
- isolated scientific studies
- anecdotes, stories and other miscellaneous facts...
Large scientific studies (meta-analyses, double-blind trials, samples of several thousand...) are generally rare, but when one exists, we can rely on it (incidentally, we are still waiting for the meta-analysis showing that homeopathy works better than a placebo).
The experts I trust absolutely are quite few in number (a few dozen at most). They have written books that I have read, or they have notable achievements in their field behind them. I take their opinion into account on issues close to their specialty. For example, Nassim Nicholas Taleb has said that the world is on the verge of an economic crisis worse than that of 2007. He is a successful former trader, now a university professor, who has written several fundamental books on economics, who is close to many intellectuals (Mandelbrot, for example), who speaks many languages, and who spends most of his time reading hundreds of books by authors he considers wiser than himself. He is very, very critical of the financial world, even though he was an insider early in his life. His prediction of another impending crisis increases my credence in that theory. I am not certain there will be a crisis, but it is possible. I therefore estimate the odds of a major economic crisis in the next 3 years at 2:1 (note that 2:1, about 67%, is not a huge certainty either). And I will adjust my confidence in Taleb if the crisis does not happen.
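For readers not used to odds notation, here is how 2:1 converts into that 67%:

```python
# 2:1 odds in favour of a crisis means 2 chances it happens for every 1 chance it doesn't.
odds_for, odds_against = 2, 1
probability = odds_for / (odds_for + odds_against)
print(f"{probability:.0%}")   # 67%
```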
You will notice that I place more confidence in these experts than in isolated studies. I have learned to be wary of isolated studies for several reasons:
- They may have been misinterpreted
- many studies are wrong (maybe even the majority! - a back-of-the-envelope sketch of how that is possible follows this list)
- publication biases mean that 'surprising' studies are more likely to get wide media coverage, even though they are more likely to be wrong.
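To see how "the majority of studies are wrong" is even arithmetically possible, here is the promised back-of-the-envelope sketch. Every number in it is an assumption I made up for illustration, not a measurement: suppose only a small fraction of tested hypotheses are true, that studies are underpowered, and that the usual 5% false-positive threshold applies.

```python
# Back-of-the-envelope sketch with assumed numbers (none of them are measurements):
# out of 1000 tested hypotheses, how many "positive" results are actually true?
n_hypotheses = 1000
share_true   = 0.05   # assumed: only 5% of tested hypotheses are actually true
power        = 0.50   # assumed: chance that a study detects a true effect
alpha        = 0.05   # conventional false-positive rate

true_positives  = n_hypotheses * share_true * power          # 25
false_positives = n_hypotheses * (1 - share_true) * alpha    # 47.5
share_correct   = true_positives / (true_positives + false_positives)
print(f"{share_correct:.0%} of positive results are true")   # ~34%, before publication bias
```

Under these assumptions, roughly two out of three positive findings would be false, and publication bias in favour of surprising results only makes the picture worse.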
The anecdote: the cancer of society
News items and personal anecdotes are the worst arguments that can come out of a human being's mouth. And yet they are used, in all their forms, daily and in every social stratum...
In my next article, I will come back to them specifically, and to the treatments to apply when faced with them.
(to be continued)
Edit : thanks to Pierre Lavie for the corrections 🙂