by Ben Bowles
What happens when two experts give you completely opposing advice?
This, for me, is the question at the heart of what many have called the confused, reckless, or even criminally negligent approach to the Coronavirus outbreak taken by the UK Government, especially in the early days, during its dalliance with herd immunity strategies. It reveals assumptions about science and its role in society that I think are useful for the anthropologist to unpack.
There is an assumption, born out of the Enlightenment, positivism, empiricism and all of the other troubling “isms”, that science is a repository of perfect and pure knowledge about the real world, out there, as it actually exists, represented perfectly through abstract models and ideas. When I say assumption, I do not mean that very many scientists hold this view. Indeed, most scientists will speak at length about all the things they do not know, the fragments of knowledge that they grasp as best they can, the models that they understand to work only as well as they do, and only for now. They will talk about how science always tries to prove itself wrong, never takes anything for granted, and frees itself from ideology and assumptions. Penny Harvey and Hannah Knox, in their book “Roads: An Anthropology of Infrastructure and Expertise,” make the distinction between the models (completely abstract and unrealistic) that engineers build in order to convince people to let them build roads in Peru, and the actual work that they do – messy, full of workarounds, rules of thumb, best guesses and imperfections, assuming nothing beforehand – in interaction with flattened road surfaces, to actually build the thing and stop it from collapsing. The engineers know that their models are just rhetorical devices; what they actually do in the end is not a form of expertise based on modelling or abstract formulae, but one more akin to muddling through.
This may well be true, but ideologies are sticky, especially when communities begin to assume that they do not have them. The view of scientific thinking with which I opened the paragraph above is most commonly held by decision-makers (often in government), much of the public, and many of the people who write about science. This makes it float around in the discourse (if I may borrow from Foucault) and means that when things go wrong, there is an understandable tendency to call the experts in and expect accurate, realistic versions of what the world looks like. Models that we can then use for transformative purposes.
A detour into economics
This can be seen most clearly, for example, in the aftermath of the financial crisis of 2007/2008, when politicians in a number of states were replaced by “technocrats”: economists and administrators assumed to be more rational, less biased and to carry less ideology than the politicians. Now, anyone who knows any neoclassical economists knows that they are steeped in a very particular ideology of how the world works, based on the validity of particular models called “Computable General Equilibrium” (CGE) models. These models are used to predict the economy, but they do not work, not least because they assume rational actors (people who always want to maximise economic returns) and an economic system that returns itself to a harmonious equilibrium if it is ever upset. Adam Smith’s invisible hand informs these models, manipulating them into perfectly working order. Indeed, the world of economics is so divided that the whole of mainstream economics is concerned with tinkering with these models and applying them, whereas everything else that deals with… anything else, is sidelined entirely. This runs the range from Marxist political economists who want to consider how power and coercion work in the system, to the behavioural economists who want to look at people’s actual actions in economies (including the irrational stuff, coining for us such useful ideas as “confirmation bias” and “peak-end bias”). These thinkers, often called “heterodox” or “pluralist” economists, are relegated to the edges of the discipline, from where they are unable to publish in highly ranked economics journals, get jobs in most economics departments, or take their work out into the public sector. So much for “technocrats” being without ideology.
“People in this country have had enough of experts” – Michael Gove
What I have described so far is of course just another way into French post-structuralism. There is, as Foucault, Derrida, Deleuze and others tell us, no objective point from which to observe the world. Everything and everyone has a subject position from which they look out into reality. Ideas about how the world actually exists change and mutate. What is certain now will look strange or foolish in twenty years’ time, and was unthinkable twenty years ago. Foucault’s examples of how this works in the world(s) of science are especially revealing, and I would recommend his archaeology of how the idea of madness has changed through European intellectual history to any reader (Madness and Civilisation).
So, is there no truth then? Have we had enough of experts because they just give us their own positioned and biased truth? That sounds a lot like one of the main devices of populism: the grand and worrying denial of facts in the post-truth world. And this is one of the major mistakes of populism: to slide from 1) the fact that truth claims are contingent and positioned (yes, sure) to 2) the claim that they are all as useful, for all purposes, as one another (absolutely not). I would rather cross a dangerous river on a bridge made by an engineer with an experience-led, best-guess application of some set of models, even if they are slightly incorrect abstractions of reality, than a bridge made by, for instance, Michael Gove, or most anthropologists for that matter. Ideas and models are good enough to get jobs done; maps are not the territory, but can be useful enough to get you down the road. On the one hand, there are people convinced that the model is a perfect representation of reality (living in what Lenny Smith calls “model land”, a kind of utopia of mistaken thinking inhabited by people who take their abstractions of the world to be the world); on the other hand, some people are trying to tear down the ivory towers of the experts by ripping up those same models. Which gets us on to the bind that the UK Government found itself in…
“[We will] turn the tide” – Boris Johnson
The UK Government, many of whose members harbour populist and anti-expert tendencies, is currently led by figures such as Boris Johnson, Michael Gove and Jacob Rees-Mogg. What unites them is the way they seem to run their lives and political careers on narratives. Specifically, they are weavers of stories about the UK’s “glory days,” the days of Empire and power beyond the nation’s small size. This was of course the narrative content of Brexit. These ideas are behind the confused utterances we have been seeing from Boris Johnson, who seems focused on not wanting to impinge on the “freedoms” of the British people by acting like a (not so entrepreneurial, less stridently self-directing, continental! big state!) nation like France. It leads to the kind of absurd anti-action from the leadership that claimed the “tide will be turned” on Coronavirus within 12 weeks (as if viruses respond to rhetoric, and as if he were his hero Churchill making speeches against the Nazis).
At the same time, the Government has also expressed the opposite tendency that I described first: to call in the grown-ups. To call experts and to expect their abstracted models of the world to be accurate representations not just of what the world looks like now, but also of what it will look like in the future. This led to the Government first following a strategy based on a very new form of modelling drawn from “behavioural science”, which attempts to predict how large populations will react when they are scared and in danger. As an anthropologist, I am obviously sceptical about the use of these kinds of models, as was the Editor-in-Chief of The Lancet when he said “we are perhaps placing too much emphasis on behavioural science.” It led, it appears, to the Government following, for a time, a “herd immunity” strategy (although the Government do not now accept that this is what they were doing): the plan was to allow 60% of the population to become infected with Covid-19. The Government distanced itself from this plan after another paper, by another set of experts, demonstrated the shocking number of anticipated fatalities that such a strategy would cause.
What happens when two experts give you completely opposing advice? One result appears to be mixed messages: Boris Johnson telling the public not to go home for Mother’s Day, then saying he hoped to see his own mum; encouraging us not to go for a walk, but also remarking that fresh air is good for you. The Government seemed unable to recognise that models are anything other than perfect representations of reality, or that not all experts are the same. Some experts give you material from the new and evolving Wild West frontier of behavioural science; others will give you their best guess of the course of a disease that everyone is trying to learn more about, given a wealth of experience, epidemiological data, models that are constantly being tested and updated, and educated best guesses. Science is not just simply science, an unvariegated vat of knowledge, and it certainly does not, in a simple equation, equal truth.
I worked on a project with some of the UK’s Cabinet Office last year, funded through a group called CRUISSE (Confronting Radical Uncertainty in Society, Science and the Environment) based at UCL. This group wants to enable decision-makers to accept that some decisions cannot be based on probabilities and modelling; some decisions involve radical uncertainty (also called “Knightian uncertainty”). This goes against, CRUISSE say, the tendencies of many sciences, and especially regimes of knowledge like economics, to put a probability on everything and to try to construct beautiful and useless models of a future that will never exist. David Tuckett, one of the leaders of CRUISSE, has helped to develop “Conviction Narrative Theory”, which says, basically, that under conditions of genuine uncertainty, decision-makers and modellers act based on how they think the world works (convictions) and stories (narratives). Are any anthropologists surprised by this conclusion?
The Government in the UK is caught between an un-nuanced and sometimes, paradoxically, uncritical understanding of expertise on one hand, and a set of conviction narratives around the strength, durability and free-determining spirit of the nation on the other. This is because ultimately we don’t know very much about Covid-19, and that lack of certainty and knowledge goes against how people in power (and probably most people) hope that the world works. Living in radical uncertainty is not comforting. It is possible that the herd immunity strategy would have been better for most people in the long run (arguing from a perspective that implies a very specific Utilitarian way of thinking for a start, and one that is entirely blind to social inequalities); but no-one knows, and under those conditions, such a strategy becomes unthinkable.
“This is my truth, now tell me yours” – Aneurin Bevan
So, if decisions are actually more uncertain than we like to think, and if experts can be (very, catastrophically) wrong, what on earth do we do? By we, I mean us, and I mean Government decision-makers, too. This is an argument for what I hope will be a new kind of knowledge worker. I’m not sure whether they need to be called anthropologists or not, but we urgently need people who can tell decision-makers what their own conviction narratives and half-hidden ideologies are so that they can try to think outside of them.
We also need to encourage the right kind of deference to expertise: one that does not uncritically consume the advice of experts, but rather asks what kind of experts, with what experience and ideas about the world, are giving the advice. We need to make a place for those experts with experience, with adaptable models that are not rigid and stubborn when new information comes in that invalidates them, and with practical workarounds and rules of thumb that provide us with useful tools for an uncertain world.
But most importantly, these knowledge workers need to stop expertise from becoming a place of monoculture like mainstream economics has become. When decisions are made by particularly positioned people (people who can think about “spending” lives during a crisis without thinking about whose lives, for example) it demonstrates the need for more experts with more perspectives and more experience, rather than fewer experts. These knowledge workers are needed in order to bring in perspectives from outside of what may look like the normal places that house expertise (for example, from anthropology, sociology, history, people’s own communities) that may be absent from the rooms where people receive expert advice and make decisions. With Nye Bevan, the architect of the NHS, we need to say “this is my truth, now tell me yours” and widen the number of (imperfect, contingent) truths we have at our disposal.
Decision-makers need to be empowered to make difficult decisions under uncertainty, and that means a lot of carefully measured, and yes, probably conflicting, statements by experts. Which is scary; but when we have been led to believe in either the infallibility or the complete redundancy of experts, it is unthinkable. We need it to be thinkable again. This type of decision-making is not what we have in the UK Government right now, and as a citizen reacting to an incoherent response to a genuinely unpredictable crisis, this scares me more than most other aspects of the pandemic.