Superforecasting: essential for good government or the latest fad?


Dominic Cummings, the government’s most senior adviser, ordered his fellow special advisers (SPADs) to do some homework prior to their ‘away-day’ in July. The book they are required to read is Superforecasting: The Art and Science of Prediction by Philip Tetlock and Dan Gardner. This article explores what lies behind this drive to influence change through Tetlock and Gardner’s work and its implications for government.

Cummings has made no secret of his desire for more SPADs and civil servants to be recruited from maths, science and forecasting backgrounds. His quest for a more numerate civil service, comfortable with ‘Bayesian methods’ (ways of assessing probability), is shared by Michael Gove, with whom he worked in the Department for Education, on the Vote Leave campaign and now in the Cabinet Office.

In his speech last weekend heralding the reform of the civil service, Gove lamented that the civil service was “lagging behind many others in terms of numerical proficiency. But so many policy and implementation decisions depend on understanding mathematical reasoning … We need to ensure more policy makers and decision makers feel comfortable discussing the Monte Carlo method or Bayesian statistics.” Superforecasting has become required reading for those who want to understand where Cummings and Gove are coming from and what the future might have in store.
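For readers unfamiliar with the jargon Gove uses, Bayesian statistics is simply a way of updating the probability attached to a claim as new evidence arrives. A minimal illustrative sketch (my own example, not Gove’s or the book’s; the numbers are invented):

```python
# Bayes' theorem: revise a probability estimate in the light of evidence.
# prior                -- P(hypothesis) before seeing the evidence
# p_evidence_if_true   -- P(evidence | hypothesis true)
# p_evidence_if_false  -- P(evidence | hypothesis false)

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

# A forecaster starts at 30% that an event will occur, then sees a
# signal that is 80% likely if the event is coming and 20% likely if not.
posterior = bayes_update(0.30, 0.80, 0.20)
print(round(posterior, 3))  # the estimate rises to 0.632
```

The point of the exercise is the habit, not the arithmetic: each new piece of information nudges the forecast up or down by a defensible amount, rather than being ignored or treated as decisive.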


Superforecasting is the sort of book you might find in the departure lounge of an airport. The authors’ most famous quote, “the average expert was roughly as accurate as a dart-throwing chimpanzee”, sets the tone for what follows. It is an easy read with lots of examples and anecdotes to keep you engaged. It is well written and rests on a sound theoretical premise, without bullying the reader into submission by excessive use of tables and complex equations. It falls into the trap of many (very successful) management books of identifying who or what was successful and retro-fitting its own theories onto the examples. But this also makes for an entertaining read and helps illustrate the concepts.

Essentially, Tetlock and Gardner argue for greater specificity in predictions and assessments, and they provide a road map for those who want to achieve this. The book is the result of several years’ work on the Good Judgment Project, initially in partnership with IARPA (the Intelligence Advanced Research Projects Activity), part of the US Office of the Director of National Intelligence.

Under competitive tournament rules the researchers found that ‘lay’ forecasters had a better success rate than ‘experts in the field’. All were highly intelligent and highly numerate, but not at the top of the intelligence scale or with a background in data analysis or modelling. The authors maintain the forecasters’ success was down to attitude, approach and learned skills, and it is these skills that Cummings seems to want to instil in civil servants and advisers.

The research was run under experimental conditions and, under those conditions, the superforecasters performed well. The questions they were required to answer were very specific such as “will North Korea attempt to launch a multi-stage rocket between 7 January 2013 and 1 September 2013?” The skills, as the book describes them, are highly analytical. This is not ‘blue-sky’ thinking about what might happen in the future, or a search for ‘black swan’ events that are unforeseen but often change the course of history.
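Questions this specific can be scored. The book’s tournaments used Brier scores – the squared error between the probability a forecaster stated and what actually happened, averaged over their forecasts. A short sketch of the idea (my illustration; the forecast numbers are invented):

```python
# Brier score: mean squared error between stated probabilities and
# 0/1 outcomes. Lower is better; 0 is perfect, and always hedging
# at 50% earns exactly 0.25.

def brier_score(forecasts, outcomes):
    """Average squared gap between probabilities and what occurred."""
    pairs = list(zip(forecasts, outcomes))
    return sum((f - o) ** 2 for f, o in pairs) / len(pairs)

# Three events, of which the first and third happened (1) and the
# second did not (0). A well-calibrated, confident forecaster beats
# someone who always says 50%.
confident = brier_score([0.9, 0.1, 0.8], [1, 0, 1])  # 0.02
hedger = brier_score([0.5, 0.5, 0.5], [1, 0, 1])     # 0.25
print(confident < hedger)  # True
```

This is why the precise wording and end date of each question matter: without them there is nothing to score, and – as discussed later in this article – nothing to hold a forecaster to.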

Tetlock and Gardner divide forecasters into two types – those whose analysis is seen through an ideological or ‘expert’ lens, and those who are pragmatic experts. The former are not that good at forecasting, while the latter are likely to be much better but may also (300 pages later) turn out to be dangerous.


Those who are more ideologically inclined, or whose expertise has given them a singular perspective, may be less likely to change their minds as new information becomes available. Their views often reflect who they are, and changing their mind can feel like a threat to their sense of self. For example, some believed that without unilateral nuclear disarmament the cold war would erupt into mass destruction, while others believed that only a fully stocked arsenal of nuclear weapons would prevent it. Those whose predictions are heavily influenced by ideology or expertise are often overconfident in their forecasts – which makes them all the more favoured on TV shows.

Successful forecasters get into the thinking of a range of other interests, including the so-called enemy, and avoid groupthink by challenging others and welcoming challenges to their own hypotheses or predictions. They too are confident but “intellectually humble” and are inclined to “learning not gloating”. Superforecasters are open minded, curious, seek mental work (cognition) and have a learning mindset. The basic process is to “try, fail, analyse, adjust, try again”.

There is much that is useful in this book. For anyone who has had to predict events as part of their job – and I do not mean professional forecasters, I mean ordinary occupations such as the civil service, social work, business management, policing, university admissions, the parole board, health or local government policy, almost any job – there are concepts and tools to aid better decision making.

The development of an analytical, open and sceptical mindset is, surely, an advantage in most work settings. Who would not want an approach that enables one to “flush ignorance into the open”? The authors also suggest a “pre-mortem” process, in which a team assumes a draft decision has failed and members must then find explanations for the failure, as a double check on the soundness of the decision.

The influence of the book on the current government can be seen – in Gove’s criticisms of the government’s failure to evaluate policy and test if it works; in Dominic Cummings’ search for more mathematicians and modellers in the Downing St team; and in some of the questions that were asked of Sage during the early days of managing the pandemic. But, on the whole, there is little evidence for it having much effect.

Tetlock and Gardner would, I think, be horrified at the politicisation of the civil service and the search for a Brexiter to be cabinet secretary: “for superforecasters, beliefs are hypotheses to be tested, not treasures to be guarded”. The government’s management of coronavirus is an example of where a superforecaster would insist on not just one review at some time in the future, when the final death toll after a second or third wave may be known, but ongoing reviews as the figures and new information emerge over the course of the pandemic.

The government’s continued assertion that the pandemic has been managed well fits exactly Tetlock and Gardner’s examples of how ideologically bound forecasters are unable to see any folly in their predictions. Instead, they marshal the arguments to fit their preferred view – and can always hide behind the lack of specificity about the end date to avoid their predictions being found wanting.

The government’s lack of precision about what it is trying to achieve, its bluster about the data on which decisions are based, and the absence of any feedback into its approach would be anathema to superforecasters. The lack of humility or preparedness to change direction when the circumstances demand it would also shock them, as would the favouring of militaristic approaches to decision-making and revising plans.

An indication of the success of the UK strategy to date

Superforecasting gives the example of the initially brilliant Wehrmacht (the German armed forces) in Germany’s expansionist campaigns of the second world war. The authors attribute its success to decentralisation and professional autonomy on the battlefield, in contrast with the highly centralised control of the British and allied armies. Tetlock and Gardner argue that once military action has been decided, it should be expedited quickly and with determination, with local commanders free to adjust plans and take the initiative as circumstances on the ground demand. They do not recommend minute-by-minute recalibrations from the centre once an attack has started, and they maintain that strategies should be adhered to until and unless defeat is certain.

The government treated coronavirus as a war and commenced battle on a number of fronts. It pursued policies with the determination of battle generals but without the required conditions for success being in place. Successful battle strategy requires wise generals, clear lines of communication, devolved and competent authority and sufficient troops on the ground to give the strategy a fighting chance of success. Consequently, the government was forced to make hasty and disorganised retreats on lockdown, testing, schools reopening and so on. A less militaristic mindset might have understood better the early signals received from the external environment that its strategy would not work and adjusted plans sooner.

I suspect that Cummings would like his team to have the capabilities of superforecasters but not to use them unless they suit his aims, and certainly not as a challenge to his decisions. Evidence suggests that those who challenge his way of thinking are likely to get drummed out of the civil service, or even be sacked as ministers. His disdain for civil servants may also lead him to use the type of questioning Superforecasting encourages to expose personal weakness, rather than to achieve better policy making; forecasting as a weapon not a tool.

I suspect too that politicians, selected as candidates and then elected on the basis of their ideological orthodoxy, will tire of SPADs who try to enumerate the precise risks attached to policies they believe in. In criminal justice, for example, it is rare for a home secretary to care much about the impact of their policies so long as they meet their ‘tough on crime’ election promises. In some areas of policy making, politicians may prefer not to know, or want to feign ignorance, about the outcomes of their decisions – for example, the immigration ‘hostile environment’ – and would resist measures that would expose the reality.

Superforecasting has the potential to support better government, and many civil servants would welcome it as an addition to their skills repertoire. Sadly, I think politics will intervene and it will be a seven-day wonder.
