We all have probably heard people say things like the following:
- “I don’t know anyone who got sick / died from this.”
- “As far as I can tell, summers have always been warm.”
- “I can get anywhere with a bike, I don’t know why people need cars.”
- “I cannot imagine anyone voting for x.”
- “I have heard about this, but not seen it myself / in my social circles, so it cannot be real.”
- “Everyone I know agrees with me on this.”
And so it goes. There are many variations on these, and they all point to a pattern: human beings are deeply relational. We are highly social (as Aristotle already observed when calling us ζῷα πολιτικά, “political animals”), and we tend to value information originating from our known circles (and our own observations) much more than information from people we do not personally know. We may make exceptions for celebrities, but that is typically about it. We thrive, it seems, on anecdotal knowledge – knowledge not gained by systematic, comparative, tested, let alone peer-reviewed research. We see our social circle as an extension of ourselves, and just as we tend to trust our own perception more than that of others, we trust the perception of those we know more than that of those we don’t.
This is understandable: it is an artifact of our evolutionary history. According to Robin Dunbar, we can know and maintain relationships with only about 150 people; beyond that, everything becomes abstract. When it comes to our senses, barring disability, we are audiovisual creatures. We tend to believe our eyes and ears, even though such confidence is overrated. This may explain why fake news delivered as video manipulates people so easily – if we see something rather than merely hear or read about it, we tend to believe it to be real.
As Bobby Duffy has pointed out in The Perils of Perception, we do not really know most of the things we believe we know. Science and statistics are deeply counter-intuitive. They focus on the big picture, and we are limited by nature to our small picture. Our limited focus may have made sense when we lived in small groups as just another ape species roaming the planet – this is what evolution shaped us for, and it is all evolution could have “planned” for – but in larger societies, on a globalized planet, or even as we set out to roam across the Solar System, such a myopic view of the world becomes deeply problematic, counterproductive, even dangerous. For us, as a species, to survive and succeed, we need to grow out of our narrow beginnings.
Science has allowed us to grow beyond that, but we seem to have taken its effects for granted and have largely forgotten how we got here: it was the ability to overcome the short-sightedness of our anecdotal approach to knowledge that made it possible to effectively push back smallpox, measles, polio, cholera and a whole host of other plagues. Now we seem to have unlearned that lesson. Hygiene is something we embarrassingly have to be reminded about. People refuse to vaccinate against measles, Covid-19, and other avoidable pests. Some of us listen to long-disproven anti-vaccine hysteria or believe that homeopathic treatments are anything other than heavily diluted quackery; some even believe in global conspiracies, chemtrails, flat-eartherism or other strange things. Obvious truths like global warming and climate change are ignored, and a world view steeped in the solipsistic insistence on one’s own self-centered beliefs is celebrated as a stance in favor of whatever is mistaken for “freedom.” Yet real freedom, real benefits to ourselves and those around us, can only come with the realization that we are not alone in the world, that our perception is limited, that we need science to overcome our own biases, and that we can only be free if we take care of each other.
To step out of the cave of our own perceptions, to use Plato’s metaphor, we need something to show us the path, to light the way – and as Carl Sagan has pointed out, that can only be science as our candle in the dark.
We know that, and yet human beings do not seem to have changed much. Take away everything modern, everything enlightened, everything we have discovered as a species since the Middle Ages (and yes, the Middle Ages were a fantastic time of discovery, laying the groundwork for the Renaissance and the Enlightenment – after all, they couldn’t have come out of nothing!), and we would still be the same old superstitious, tribally thinking people.
(Those familiar with classic Doctor Who will remember “The Face of Evil,” the Fourth Doctor story that introduces his companion Leela. Ostensibly a pre-modern character from the “Sevateem” tribe, she is actually descended from the survivors – the “Survey Team” – of a crashed spacecraft, whose descendants reverted to a pre-literate, pre-scientific and irrational way of life, illustrating the thin line that distinguishes us from our distant ancestors – even if, somehow, their shaman knows it’s all nonsense.)
But once we realize – and the Pandemic has reminded us of this quite specifically – that even though we live in (Post-)Modernity, we have “never been modern,” as Bruno Latour insists, what does this mean?
Two things. First, we need to accept that humans are what they are, and that only education can make a difference. This means we should not grow complacent in the assumption that people should simply know these things, that “by now,” “in this century,” we should have “moved on.” Nonsense. We are what we are, and we have – mostly for better rather than worse – outwitted evolution. It is a good thing that those less adapted to current society do not have to die to remove their genes from the “gene pool.” It is an advance in morality that we at least pretend to honor all human life – even though we still have to live up to this more seriously. There is no “unworthy” life, nobody is irredeemable, and we are all in this together. But that, again, means that education is the key, and we must not let up on it. Our children do not magically know better just because they are born in our times; they need to be taught science and scientific thinking, coupled with an ethic of care.
Second, we probably need to figure out how to turn anecdotal thinking from a weakness into a strength. Bill Clinton knew this very well: always have both the data (science) and the personal-interest story (anecdotal knowledge); always combine facts with narrative. If we focus too much on “cold,” hard facts, we miss the “warm” sense of belonging, of trust, of care. One of the lessons that can be drawn from Theodor Adorno and Max Horkheimer’s Dialectic of Enlightenment is that “instrumental reason” – the use of reason as a mere means to an end – can lead to the opposite of what Immanuel Kant had called for: to treat humanity never merely as a means, but always also as an end. Could the very ideas of the Enlightenment – the call for reason and individuality – have indeed contributed to the rise of totalitarianism? Were not Italian fascism and Soviet socialism deeply tied to a futurist aesthetic?
If we only use reason as a yardstick, we are ignoring that human beings – through both their nature and their culture – are also irrational beings, stubbornly and proudly so. This is what makes us what we are. Only in accepting this fact can we move on to adding science to our repertoire.
What would be the consequences of such a two-pronged approach? It pains me to admit it, but it seems that Stephen Jay Gould was right all along vis-à-vis Richard Dawkins. Gould famously called religion and science non-overlapping magisteria, meaning that both have a role to play and do not necessarily stand in opposition. Of course, this stands in stark contrast to Dawkins’s stance against religion (an otherwise very logical stance with which I have largely agreed so far) – but here is the kicker: Dawkins and his colleagues – including Christopher Hitchens and others – criticize a particular form of religion, which almost amounts to a strawman argument. If religion – as is arguably the case for many – is indeed deeply irrational, if it is a belief in a literal deity, in miracles, in anti-science, then of course Dawkins would be correct. But is that all that religion is? Just as there can be a blind belief in science as a mere collection of “facts” (rather than as a process of continuous discovery and occasional revision and self-correction), there is a form of religion that could be characterized as blind or child-like belief. But there is also such a thing as serious theology, theological criticism, and the like. If we speak only to the mind and not to the “soul,” we will not succeed in making the transition to modernity. Certainly, Dawkins himself has realized this – witness his many publications insisting that there are plenty of wonders in the natural world that need no supernatural exaggeration. But if we see religion not as something necessarily about the supernatural but rather about psychology, the crass contrast between religion and science becomes nothing other than the contrast between the “inner” world of human dreams, desires and fears, and the “outer” world of measurable existence.
Now, that does not mean we should accept anecdotal thinking as equal to scientific thinking – certainly not. But it does mean that if we want to be inclusive, to reach everyone, to counter the clear and present dangers in our world, we need to find a way to address the hopes and fears of those who believe that pure reason is never enough. If we cannot convince others of what we believe and know to be true, and of what can be scientifically demonstrated, then something is amiss. I am not saying that this is easy, or that I have the answers on how to achieve it (I don’t), but I am not yet willing to give up.