Brighton – Why do we seem to be witnessing an increasing number of nasty technological surprises? Indeed, this year’s Fukushima nuclear disaster in Japan and last year’s BP oil spill in the Gulf of Mexico have taken their place alongside older problems, such as ozone depletion. We believe that the way in which scientific advice is developed and communicated lies at the heart of the question.
Science is increasingly used to support what are essentially public-policy decisions, particularly concerning new and complex technologies like genetically modified (GM) foods, novel chemicals, and competing energy infrastructures. Decisions about options and how to implement them are difficult, owing to uncertainties over hazards, benefits, and potential side effects. Doubts surround not just likelihoods, but extend to the outcomes themselves, and what they might mean. Powerful economic interests are often at stake, raising the pressure even further.
Too often, expert opinion is thought most useful to policymakers when presented as a single “definitive” interpretation. As a result, experts typically understate uncertainty. And, to the extent that they acknowledge uncertainty, they tend to reduce unknowns to measurable “risk.”
Yet risk is just one – relatively tractable – aspect of uncertainty. Beyond familiar notions of risk lie deeper predicaments of ambiguity and ignorance.
Such insights are not new, but they are often neglected in policymaking. They can be traced back to the economist Frank Knight’s 1921 book Risk, Uncertainty, and Profit. Knight recognized that a significant distinction should be drawn between outcomes whose probabilities are well characterized (“risks”) and those where this is not possible (“uncertainties”).
Examples of risk arise when individuals or groups are confident in their accumulated knowledge or experience. This is the case with many established consumer products, routine transport safety, or the incidence of familiar diseases.
Where we face uncertainty, though, we are confident in our knowledge of possible outcomes, but not of their likelihood – owing either to difficulties in prediction or to lack of information. And yet scientific advisers face strong temptations and pressures to treat every uncertainty as a risk.
Unfortunately, there are two further and even more problematic aspects of uncertainty. These arise where we are unsure not just about how likely different outcomes are, but also about which outcomes are relevant. What options should be considered? What might these mean given differing views and interests? How should we classify and prioritize costs and benefits?
Examples of such ambiguity arise in fields as diverse as nuclear power, GM food, and the Iraq War. Each is definitely happening (so “probability” is not the problem), but what do they mean? Do they leave the world better or worse? In what senses? What are the alternatives, if any?
Different experts, studies, and organizations adopt contrasting but often equally legitimate and scientifically founded perspectives on these questions. To try to enforce a single definitive interpretation is deeply misleading, and thus unhelpful – and potentially dangerous – in policymaking. Indeed, there can be no guarantee under ambiguity that even the best scientific analysis will lead to a definitive policy answer. Consequently, fully “science-based decisions” are not just difficult to achieve; they are a contradiction in terms.
The final, most intractable aspect of incomplete knowledge is ignorance. Here, our understanding of both likelihoods and the possibilities themselves is problematic.
Of course, no one can reliably foresee the unpredictable, but we can learn from past mistakes. One example is the belated recognition that seemingly inert and benign halogenated hydrocarbons were interfering with the ozone layer. Another is the slowness to acknowledge the possibility of novel transmission mechanisms for spongiform encephalopathies (“mad cow disease”).
In their early stages, these sources of harm were not formally recognized, even as possibilities. Instead, they were “early warnings” offered by dissenting voices. Policy recommendations that miss such warnings court over-confidence and error.
The key question is how to move from the narrow focus on risk toward broader and deeper understandings of incomplete knowledge – and thus to better scientific policy advice.
One high-stakes – and thus particularly politicized – context for expert policy advice is the setting of interest rates. In the United Kingdom, the Bank of England’s Monetary Policy Committee describes its expert advisory process as a “two-way dialogue” – with a priority placed on public accountability. Officials take great care to inform the Committee not just of the results of formal analysis by the sponsoring bodies, but also of complex real-world conditions and perspectives. Reports detail contrasting recommendations by individual members and explain the reasons for their differences. Why is such practice not the norm in scientific advising?
When confronted with immeasurable uncertainties, a scientific committee far more commonly spends hours negotiating a single interpretation of the risks, even when faced with a range of contending but equally well-founded analyses and judgments, often from different (but equally scientific) fields and disciplines. As we know from the work of Thomas Kuhn and other philosophers of science, dominant paradigms do not always turn out to be the most accurate. Knowledge is constantly evolving, and it thrives on skepticism and diversity.
Experience with standard-setting for toxic substances, and policy processes concerning the safety of various GM and energy technologies, shows that it would often be more accurate and useful to accept divergent expert interpretations, and focus instead on documenting the reasons underlying disagreement. Concrete policy decisions could still be made – possibly more efficiently. Moreover, the decision’s relationship with the available science would be clearer, and the inherently political dimensions more transparent.
Instead of seeking definitive global judgments about the risks of particular choices, it is wiser to consider the assumptions behind such advice – since these are central in determining the conditions under which the advice is relevant. Above all, there is a constant need for humility about “science-based decisions.”
Andy Stirling is Research Director at SPRU (Science and Technology Policy Research) at the University of Sussex. Alister Scott is a leadership consultant and Visiting Fellow at SPRU.
Copyright: Project Syndicate, 2011.