
Policy Analysis in 750 words: Using Statistics and Explaining Risk (Spiegelhalter and Gigerenzer)

Please see the Policy Analysis in 750 words series overview before reading the summary. This post is close to 750 words if you divide it by 2.

David Spiegelhalter (2018) The Art of Statistics: Learning from Data (Pelican, hardback)

Gerd Gigerenzer (2015) Risk Savvy (Penguin)

[Image: cover of The Art of Statistics]

Policy analysis: the skilful consumption and communication of information

Some use the phrase ‘lies, damned lies, and statistics’ to suggest that people can manipulate the presentation of information to reinforce whatever case they want to make. Common examples include the highly selective sharing of data, and the use of misleading images to distort the size of an effect or strength of a relationship between ‘variables’ (when we try to find out if a change in one thing causes a change in another).

In that context, your first aim is to become a skilled consumer of information.

Or, you may be asked to gather and present data as part of your policy analysis, without seeking to mislead people (see Mintrom and compare with Riker).

Your second aim is to become an ethical and skilled communicator of information.

In each case, a good rule of thumb is to assume that the analysts who help policymakers learn how to consume and interpret evidence are more influential than the researchers who produce it.

Research and policy analysis are not so different

Although research is not identical to policy analysis, it highlights similar ambitions and issues. Indeed, Spiegelhalter’s (2018: 6-7) description of ‘using data to help us understand the world and make better judgements’ sounds like Mintrom, and the PPDAC approach – identify a problem, plan how to study it, collect and manage data, analyse, and draw/ communicate conclusions (2018: 14) – is not so different from the ‘steps’ to policy analysis that you will find in Bardach or Weimer and Vining.

PPDAC requires us to understand what people need to do to ‘turn the world into data’, such as producing precise definitions of things and using observation and modelling to estimate their number or the likelihood of their occurrence (2018: 6-7).

More importantly, consider our inability to define things precisely – e.g. economic activity and unemployment, or wellbeing and happiness – and our need to accept that our estimates (a) come with often high levels of uncertainty, and (b) are ‘only the starting point to real understanding of the world’ (2018: 8-9).

In that context, the technical skill to gather and analyse information is necessary for research, while the skill to communicate findings is necessary to avoid misleading your audience.

The pitfalls of information communication

Spiegelhalter’s initial discussion highlights the great potential to mislead, via:

  1. deliberate manipulation,
  2. a poor grasp of statistics, and/ or
  3. insufficient appreciation of (a) your non-specialist audience’s potential reaction to (b) different ways to frame the same information (2018: 354-62), perhaps based on
  4. the unscientific belief that scientists are objective and can communicate the truth in a neutral way, rather than storytellers with imperfect data (2018: 68-9; 307; 338; 342-53).

Potentially influential communications include (2018: 19-38):

  1. The type of visual, with bar or line-based charts often more useful than pie charts (and dynamic often better than static – 2018: 71)
  2. The point at which you cut off the chart’s axis to downplay or accentuate the difference between results
  3. Framing the results positively (e.g. survival rate) versus negatively (e.g. death rate)
  4. Describing a higher relative risk (e.g. 18%) or absolute risk (e.g. from 6 in 100 to 7 in 100 cases) – see the sketch after this list
  5. Describing risk in relation to decimal places, percentages, or numbers out of 100
  6. Using the wrong way to describe an average (mode, median, or mean – 2018: 46)
  7. Using a language familiar to specialists but confusing to – and subject to misinterpretation by – non-specialists (e.g. odds ratios)
  8. Translating numbers into words (e.g. what does ‘very likely’ mean?) to describe probability (2018: 320).
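
To make points 3 to 5 concrete, here is a minimal Python sketch, using hypothetical numbers that roughly match the example above, of how the same underlying change in risk looks when expressed as an absolute increase, a relative increase, or a positively/negatively framed rate.

```python
# Hypothetical figures: the same change in risk, presented in different ways.
baseline_risk = 6 / 100   # 6 in 100 cases before
new_risk = 7 / 100        # 7 in 100 cases after

absolute_increase = new_risk - baseline_risk            # 0.01
relative_increase = absolute_increase / baseline_risk   # ~0.167

print(f"Absolute terms: from {baseline_risk:.0%} to {new_risk:.0%} "
      f"(one extra case per 100 people)")
print(f"Relative terms: a {relative_increase:.0%} increase in risk")

# Positive versus negative framing of the same (hypothetical) number.
survival_rate = 0.95
print(f"Framed positively: {survival_rate:.0%} survive")
print(f"Framed negatively: {1 - survival_rate:.0%} die")
```

Both descriptions are arithmetically correct; the choice between them is a framing decision, which is why Spiegelhalter treats it as a communication issue rather than a purely technical one.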

These problems with the supply of information combine with the ways that citizens and policymakers consume it.

People use cognitive shortcuts, such as emotions and heuristics, to process information (see p60 of Understanding Public Policy, reproduced below).

This reliance can make them vulnerable to framing and manipulation, and prompt them to change their behaviour after misinterpreting evidence in relation to risk: e.g. eating certain foods (2018: 33), anticipating the weather, taking medicines, or refusing to fly after a vivid event (Gigerenzer, 2015: 2-13).

[Image: p. 60 of Understanding Public Policy, 2nd edition, on heuristics]

Dealing with scientific uncertainty

Communication is important, but the underlying problem may be actual scientific uncertainty about the ability of our data to give us accurate knowledge of the world, such as when:

  1. We use a survey of a sample population, in the hope that (a) respondents provide accurate answers, and (b) their responses provide a representative picture of the population we seek to understand. In such cases, professional standards and practices exist to minimise, but not remove, biases associated with questions and sampling (2018: 74).
  2. Some people ignore (and others underestimate) the ‘margin of error’ in surveys, even though it could be larger than the reported change in the data (2018: 189-92; 247) – see the sketch after this list.
  3. Alternatives to surveys have major unintended consequences, such as when government statistics are collected unsystematically or otherwise misrepresent outcomes (2018: 84-5).
  4. ‘Correlation does not equal causation’ (see also The Book of Why).
  • The cause of an association between two things could be either of those things, or another thing (2018: 95-9; 110-15).
  • It is usually prohibitively expensive to conduct and analyse research – such as multiple ‘randomised controlled trials’ to establish cause and effect in the same way as medicines trials (2018: 104) – to minimise doubt.
  • Further, our complex and uncontrolled world is not as conducive to experimental trials of social and economic policies.
  5. The misleading appearance of a short-term trend often relates to ‘chance variation’ rather than a long-term trend (e.g. in PISA education tables or murder rates – 2018: 131; 249).
  6. The algorithms used to process huge amounts of data may contain unhelpful rules and misplaced assumptions that bias the results, and this problem is worse if the rules are kept secret (2018: 178-87).
  7. Calculating the probability of events is difficult to do, to agree how to do, and to understand (2018: 216-20; 226; 239; 304-7).
  8. The likelihood of identifying ‘false positive’ results in research is high (2018: 278-80). Note the comparison to finding someone guilty when innocent, or innocent when guilty (2018: 284 and compare with Gigerenzer, 2015: 33-7; 161-8). However, the professional incentive to minimise these outcomes or admit the research’s limitations is low (2018: 278; 287; 294-302).
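
As a rough illustration of points 2 and 5, here is a minimal Python sketch of the standard margin-of-error calculation for a survey proportion. It assumes a simple random sample, which real polls rarely achieve, so treat it as an indicative lower bound rather than the method any particular pollster uses; the poll figures are hypothetical.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# Hypothetical poll: 52% support, 1,000 respondents.
p, n = 0.52, 1000
moe = margin_of_error(p, n)
print(f"Estimate: {p:.0%} +/- {moe:.1%}")  # roughly +/- 3.1 percentage points

# A reported two-point 'swing' since last week sits inside this margin,
# so it may reflect chance variation rather than a real change.
```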

Developing statistical and risk ‘literacy’

In that context, Spiegelhalter (2018: 369-71) summarises key ways to consume data effectively, asking: how rigorous is the study, how much uncertainty remains, whether the measures are chosen and communicated well, whether you can trust the source to do the work well and not spin the results, whether the claim fits with other evidence and has a good explanation, and whether the effect is important and relevant to key populations. However:

  1. Many such texts describe how they would like the world to work, and give advice to people to help foster that world. The logical conclusion is that this world does not exist, and most people do not have the training, or use these tips, to describe or consume statistics and their implications well.
  2. Policymaking is about making choices, often under immense time and political pressure, in the face of uncertainty versus ambiguity, and despite our inability to understand policy problems or the likely impact of solutions.

I’m not suggesting that, as a result, you should go full Riker. Rather, as with most of the posts in this series, reflect on how you would act – and expect others to act – during the (very long/ not very likely) transition from your world to this better one. What if your collective task is to make just enough sense of the available information, and your options, to make good enough choices?

[Image: cover of Risk Savvy]

In that context, Gigerenzer (2015) identifies the steps we can take to become ‘risk savvy’.

Begin by rejecting (a) the psychological drive to seek the ‘safety blanket’ of certainty (and avoid the ‘fear of doing something wrong and being blamed’), which causes people to (b) place too much faith in necessarily-flawed technologies or tests to reduce uncertainty, instead of (c) learning some basic tools to assess risk while accepting the inevitability of uncertainty and ambiguity (2015: 18-20; 43; 32-40).

Then, employ simple ‘mind tools’ to assess and communicate risk in each case:

  1. Communicate risk using appropriate visuals, categories, and descriptions (e.g. with reference to absolute risk and ‘natural frequencies’, expressed as a proportion of 100 rather than a percentage or decimal), and be sceptical if others do not (2015: 25-7; 168).
  • E.g. do not confuse calculations of risk based on (a) known frequencies (such as the coin toss), and (b) unknown frequencies (such as outcomes of complex systems) (2015: 21-6).
  • E.g. be aware of the difference between (a) the accuracy of a test for a problem (its ability to minimise false positive/ negative results), (b) the likelihood that you have the problem it is testing for, and (c) the extent to which you will benefit from an intervention for that problem (2015: 33-7; 161-8; 194-7) – see the sketch after this list.
  2. Use heuristics that are shown to be efficient and reliable in particular situations.
  • rely frequently on ‘gut feeling’ (‘a judgment 1. that appears quickly in consciousness, 2. whose underlying reasons we are not fully aware of, yet 3. it is strong enough to act upon’)
  • accept the counterintuitive sense that ‘ignoring information can lead to better, faster, and safer decisions’ (2015: 30-1)
  • equate intuition with ‘unconscious intelligence based on personal experience and smart rules of thumb. You need both intuition and reasoning to be rational’ (2015: 123-4)
  • find efficient ways to trust in other people and practices (2015: 99-103)
  • ‘satisfice’ (choose the first option that satisfies an adequate threshold, rather than consider every option) (2015: 148-9)
  3. Value ‘good errors’ that allow us to learn efficiently (via ‘trial and error’) (2015: 47-51).
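
The difference between a test’s accuracy and the chance that a positive result means you have the condition (the second bullet under point 1) is easiest to see with Gigerenzer-style natural frequencies. The sketch below uses hypothetical numbers, not figures from the book, purely to show the arithmetic.

```python
# Hypothetical screening example, expressed as natural frequencies
# (counts per 10,000 people) rather than conditional percentages.
population = 10_000
prevalence = 0.01            # 1 in 100 people actually have the condition
sensitivity = 0.90           # the test detects 90% of true cases
false_positive_rate = 0.09   # the test wrongly flags 9% of people without it

have_it = population * prevalence                                # 100 people
true_positives = have_it * sensitivity                           # 90 people
false_positives = (population - have_it) * false_positive_rate   # 891 people

share_true = true_positives / (true_positives + false_positives)
print(f"{true_positives + false_positives:.0f} people test positive, "
      f"but only {true_positives:.0f} have the condition "
      f"(about {share_true:.0%}).")
```

Even a fairly accurate test can produce mostly false positives when the condition is rare, which is why Gigerenzer recommends reasoning in counts out of 100 or 10,000 rather than in percentages or decimals.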

Wait a minute

I like these Gigerenzer-style messages a lot and, on reflection, I seem to make most of my choices using my gut and trial-and-error (and I only electrocuted myself that one time; I’m told that I barked).

Some of his examples – e.g. ask if a hospital uses airline-style checklists (2015: 53; see also Radin’s checklist), or ask your doctor how they would treat their relative, not yours (2015: 63) – are intuitively appealing. The explainers on risk are profoundly important.

However, note that Gigerenzer devotes a lot of his book to describing the defensive nature of sectors such as business, medicine, and government, linked strongly to the absence of the right ‘culture’ to allow learning through error.

Trial and error is a big feature of complexity theory and Lindblom’s incrementalism, but also be aware that you are surrounded by people whose heuristic may be ‘make sure you don’t get the blame’ (or ‘procedure over performance’; 2015: 65). To recommend trial-and-error policy analysis may be a hard sell.

Further reading

This post is part of the Policy Analysis in 750 words series

The 500 and 1000 words series describe how people act under conditions of bounded rationality and policymaking complexity

Winners and losers: communicating the potential impacts of policies (by Cameron Brick, Alexandra Freeman, Steven Wooding, William Skylark, Theresa Marteau & David Spiegelhalter)

See Policy in 500 Words: Social Construction and Policy Design and ask yourself if Gigerenzer’s (2015: 69) ‘fear whatever your social group fears’ is OK when you are running from a lion, but not if you are cooperating with many target populations.

The study of punctuated equilibrium theory is particularly relevant, since its results reject the sense that policy change follows a ‘normal distribution’. See the chart below (from Theories of the Policy Process 2; also found in 5 Images of the Policy Process) and visit the Comparative Agendas Project to see how they gather the data.

[Image: True et al., Figure 6.2, from Theories of the Policy Process 2]

