
Emotion and reason in politics: the rational/irrational distinction

In ‘How to communicate effectively with policymakers’, Richard Kwiatkowski and I use the distinction between ‘rational’ and ‘irrational’ cognitive shortcuts ‘provocatively’. I sort of wish we had been more direct, because I have come to realise that:

  1. My attempts to communicate with sarcasm and facial gestures may only ever appeal to a niche audience, and
  2. even if I use scare quotes – around a word like ‘irrational’ – to flag the word’s questionable use, it’s not always clear what I’m questioning, because
  3. you need to know the story behind someone’s discussion to know what they are questioning.*

So, here are some of the reference points I’m using when I tell a story about ‘irrationality’:

1. I’m often invited to be the type of guest speaker who challenges the audience; the audience is usually scientific, and the topic is usually evidence-based policymaking.

So, when I say ‘irrational’, I’m speaking to (some) scientists who think of themselves as rational and policymakers as irrational, and use this problematic distinction to complain about policy-based evidence, post-truth politics, and perhaps even the irrationality of voters for Brexit. Action based on this way of thinking would be counterproductive. In that context, I use the word ‘irrational’ as a way into some more nuanced discussions including:

  • all humans combine cognition and emotion to make choices; and,
  • emotions are one of many sources of ‘fast and frugal heuristics’ that help us make some decisions very quickly and often very well.

In other words, it is silly to complain that some people are irrational, when we are all making choices this way, and such decision-making is often a good thing.

2. This focus on scientific rationality is part of a wider discussion of what counts as good evidence or valuable knowledge. Examples include:

  • Policy debates on the value of bringing together many people with different knowledge claims – such as through user and practitioner experience – to ‘co-produce’ evidence.
  • Wider debates on the ‘decolonization of knowledge’ in which narrow ‘Western’ scientific principles help exclude the voices of many populations by undermining their claims to knowledge.

3. A focus on rationality versus irrationality is still used to maintain sexist and racist caricatures or stereotypes, and therefore dismiss people based on a misrepresentation of their behaviour.

I thought that, by now, we’d be done with dismissing women as emotional or hysterical, but apparently not. Indeed, as some recent racist and sexist coverage of Serena Williams demonstrates, the idea that black women are not rational is still tolerated in mainstream discussion.

4. Part of the reason that we can only conclude that people combine cognition and emotion, without being able to separate their effects in a satisfying way, is that the distinction is problematic.

It is difficult to demonstrate empirically. It is also difficult to assign some behaviours to one camp or the other, such as when we consider moral reasoning based on values and logic.

To sum up, I’ve been using the rational/irrational distinction explicitly to make a simple point that is relevant to the study of politics and policymaking:

  • All people use cognitive shortcuts to help them ignore almost all information about the world, to help them make decisions efficiently.
  • If you don’t understand and act on this simple insight, you’ll waste your time by trying to argue someone into submission or giving them a 500-page irrelevant report when they are looking for one page written in a way that makes sense to them.

Most of the rest has been implicit, and communicated non-verbally, which is great when you want to keep a presentation brief and light, but not if you want to acknowledge nuance and more serious issues.

*which is why I’m increasingly interested in Riker’s idea of heresthetics, in which the starting point of a story is crucial. We can come to very different conclusions about a problem and its solution by choosing different starting points, to accentuate one aspect of a problem and downplay another, even when our beliefs and preferences remain basically the same.


Three ways to communicate more effectively with policymakers

By Paul Cairney and Richard Kwiatkowski

Use psychological insights to inform communication strategies

Policymakers cannot pay attention to all of the things for which they are responsible, or understand all of the information they use to make decisions. Like all people, there are limits on what information they can process (Baddeley, 2003; Cowan, 2001, 2010; Miller, 1956; Rock, 2008).

They must use shortcuts to gather enough information to make decisions quickly: the ‘rational’, by pursuing clear goals and prioritizing certain kinds of information, and the ‘irrational’, by drawing on emotions, gut feelings, values, beliefs, habits, schemata, scripts, and what is familiar, to make decisions quickly. Unlike most people, they face unusually strong pressures on their cognition and emotion.

Policymakers need to gather information quickly and effectively, often in highly charged political atmospheres, so they develop heuristics to allow them to make what they believe to be good choices. Perhaps their solutions seem to be driven more by their values and emotions than a ‘rational’ analysis of the evidence, often because we hold them to a standard that no human can reach.

If so, and if they have high confidence in their heuristics, they will dismiss criticism from researchers as biased and naïve. Under those circumstances, we suggest that restating the need for ‘rational’ and ‘evidence-based policymaking’ is futile, naively ‘speaking truth to power’ counterproductive, and declaring ‘policy based evidence’ defeatist.

We use psychological insights to recommend a shift in strategy for advocates of the greater use of evidence in policy. The simple recommendation, to adapt to policymakers’ ‘fast thinking’ (Kahneman, 2011) rather than bombard them with evidence in the hope that they will get round to ‘slow thinking’, is already becoming established in evidence-policy studies. However, we provide a more sophisticated understanding of policymaker psychology, to help understand how people think and make decisions as individuals and as part of collective processes. It allows us to (a) combine many relevant psychological principles with policy studies to (b) provide several recommendations for actors seeking to maximise the impact of their evidence.

To ‘show our work’, we first summarise insights from policy studies already drawing on psychology to explain policy process dynamics, and identify key aspects of the psychology literature which show promising areas for future development.

Then, we emphasise the benefit of pragmatic strategies, to develop ways to respond positively to ‘irrational’ policymaking while recognising that the biases we ascribe to policymakers are present in ourselves and our own groups. Instead of bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond effectively. Instead of identifying only the biases in our competitors, and masking academic examples of group-think, let’s reject our own imagined standards of high-information-led action. This more self-aware and humble approach will help us work more successfully with other actors.

On that basis, we provide three recommendations for actors trying to engage skilfully in the policy process:

  1. Tailor framing strategies to policymaker bias. If people are cognitive misers, minimise the cognitive burden of your presentation. If policymakers combine cognitive and emotive processes, combine facts with emotional appeals. If policymakers make quick choices based on their values and simple moral judgements, tell simple stories with a hero and moral. If policymakers reflect a ‘group emotion’, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with those beliefs.
  2. Identify ‘windows of opportunity’ to influence individuals and processes. ‘Timing’ can refer to the right time to influence an individual, depending on their current way of thinking, or to act while political conditions are aligned.
  3. Adapt to real-world ‘dysfunctional’ organisations rather than waiting for an orderly process to appear. Form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

These tips are designed to produce effective, not manipulative, communicators. They help foster the clearer communication of important policy-relevant evidence, rather than imply that we should bend evidence to manipulate or trick politicians. We argue that it is pragmatic to work on the assumption that people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves. To persuade them to change course requires showing simple respect and seeking ways to secure their trust, rather than simply ‘speaking truth to power’. Effective engagement requires skilful communication and good judgement as much as good evidence.


This is the introduction to our revised and resubmitted paper to the special issue of Palgrave Communications The politics of evidence-based policymaking: how can we maximise the use of evidence in policy? Please get in touch if you are interested in submitting a paper to the series.

Full paper: Cairney Kwiatkowski Palgrave Comms resubmission CLEAN 14.7.17


Writing for Impact: what you need to know, and 5 ways to know it

This is a post for my talk at the ‘Politheor: European Policy Network’ event Write For Impact: Training In Op-Ed Writing For Policy Advocacy. There are other speakers with more experience of, and advice on, ‘op-ed’ writing. My aim is to describe key aspects of politics and policymaking to help the audience learn why they should write op-eds in a particular way for particular audiences.

A key rule in writing is to ‘know your audience’, but it’s easier said than done if you seek many sympathetic audiences in many parts of a complex policy process. Two simple rules should help make this process somewhat clearer:

  1. Learn how policymakers simplify their world, and
  2. Learn how policy environments influence their attention and choices.

We can use the same broad concepts to help explain both processes, in which many policymakers and influencers interact across many levels and types of government to produce what we call ‘policy’:

  1. Policymaker psychology: tell an evidence-informed story

Policymakers receive too much information, and seek ways to ignore most of it while making decisions. To do so, they use ‘rational’ and ‘irrational’ means: selecting a limited number of regular sources of information, and relying on emotion, gut instinct, habit, and familiarity with information. In other words, your audience combines cognition and emotion to deal with information, and they can ignore information for long periods then quickly shift their attention towards it, even if that information has not really changed.

Consequently, an op-ed focusing solely on ‘the facts’ can be relatively ineffective compared to an evidence-informed story, perhaps with a notional setting, plot, hero, and moral. Your aim shifts from providing more and more evidence to reduce uncertainty about a problem, to providing a persuasive reason to reduce ambiguity. Ambiguity relates to the fact that policymakers can understand a policy problem in many different ways – such as tobacco as an economic good, an issue of civil liberties, or a public health epidemic – but often pay exclusive attention to one.

So, your aim may be to influence the simple ways in which people understand the world, to influence their demand for more information. An emotional appeal can transform a factual case, but only if you know how people engage emotionally with information. Sometimes, the same story can succeed with one audience but fail with another.

  2. Institutions: learn the ‘rules of the game’

Institutions are the rules people use in policymaking, including the formal, written down, and well understood rules setting out who is responsible for certain issues, and the informal, unwritten, and unclear rules informing action. The rules used by policymakers can help define the nature of a policy problem, who is best placed to solve it, who should be consulted routinely, and who can safely be ignored. These rules can endure for long periods and become like habits, particularly if policymakers pay little attention to a problem or why they define it in a particular way.

  3. Networks and coalitions: build coalitions and establish trust

Such informal rules, about how to understand a problem and who to speak with about it, can be reinforced in networks of policymakers and influencers.

‘Policy community’ partly describes a sense that most policymaking is processed out of the public spotlight, often despite minimal high level policymaker interest. Senior policymakers delegate responsibility for policymaking to bureaucrats, who seek information and advice from groups. Groups exchange information for access to, and potential influence within, government, and policymakers have ‘standard operating procedures’ that favour particular sources of evidence and some participants over others.

‘Policy community’ also describes a sense that the network seems fairly stable, built on high levels of trust between participants, based on factors such as reliability (the participant was a good source of information, and did not complain too much in public about decisions), a common aim or shared understanding of the problem, or the sense that influencers represent important groups.

So, the same policy case can have a greater impact if told by a well trusted actor in a policy community. Or, that community member may use networks to build key coalitions behind a case, use information from the network to understand which cases will have most impact, or know which audiences to seek.

  4. Ideas: learn the ‘currency’ of policy argument

This use of networks relates partly to learning the language of policy debate in particular ‘venues’, to learn what makes a convincing case. This language partly reflects a well-established ‘world view’ or the ‘core beliefs’ shared by participants. For example, a very specific ‘evidence-based’ language is used frequently in public health, while treasury departments look for some recognition of ‘value for money’ (according to a particular understanding of how you determine VFM). So, knowing your audience means knowing the terms of debate that are often so central to their worldview that they take them for granted and, in contrast, the forms of argument that are more difficult to pursue because they are challenging or unfamiliar to some audiences. Imagine a case that completely challenges someone’s world view, or one which is entirely consistent with it.

  5. Socioeconomic factors and events: influence how policymakers see the outside world

Some worldviews can be shattered by external events or crises, but this is a rare occurrence. It may be possible to generate a sense of crisis with reference to socioeconomic changes or events, but people will interpret these developments through the ‘lens’ of their own beliefs. In some cases, events seem impossible to ignore, but we may not agree on their implications for action. In others, an external event only matters if policymakers pay attention to it. Indeed, we began this discussion with the insight that policymakers have to ignore almost all such information available to them.

Know your audience revisited: practical lessons from policy theories

To take into account all of these factors, while trying to make a very short and persuasive case, may seem impossible. Instead, we might pick up some basic rules of thumb from particular theories or approaches. We can discuss a few examples from ongoing work on ‘practical lessons from policy theories’.

Storytelling for policy impact

If you are telling a story with a setting, plot, hero, and moral, it may be more effective to focus on a hero than a villain. More importantly, imagine two contrasting audiences: one is moved by a personal story told to highlight some structural barriers to the wellbeing of key populations; another is unmoved, judges that person harshly, and thinks they would have done better in their shoes (perhaps they prefer to build policy on stereotypes of target populations). ‘Knowing your audience’ may involve some trial-and-error to determine which stories work under which circumstances.

Appealing to coalitions

Or, you may decide that it is impossible to write anything to appeal to all relevant audiences. Instead, you might tailor it to one, to reinforce its beliefs and encourage people to act. The ‘advocacy coalition framework’ describes such activities as routine: people go into politics to translate their beliefs into policy, they interpret the world through those beliefs, and they romanticise their own cause while demonising their opponents. If so, would a bland op-ed have much effect on any audience?

Learning from entrepreneurs

Policy entrepreneurs draw on three rules, two of which seem counterintuitive:

  1. Don’t focus on bombarding policymakers with evidence. Scientists focus on producing more evidence to reduce uncertainty, but too much information puts people off. Entrepreneurs tell a good story, grab the audience’s interest, and the audience demands the information.
  2. By the time people pay attention to a problem it’s too late to produce a solution. So, you produce your solution then chase problems.
  3. When your environment changes, your strategy changes. For example, at the US federal level, you’re in the sea: you’re a surfer waiting for the big wave. At the smaller subnational level, on a low-attention and low-budget issue, you can be Poseidon moving the ‘streams’. At the US federal level, you need to ‘soften up’ solutions over a long time to generate support. In subnational venues or other countries, you have more opportunity to import and adapt ready-made solutions.

It all adds up to one simple piece of advice – timing and luck matter when making a policy case – but policy entrepreneurs know how to influence timing and help create their own luck.

On the day, we can use such concepts to help us think through the factors that you might think about while writing op-eds, even though it is very unlikely that you would mention them in your written work.


Psychology Based Policy Studies: 5 heuristics to maximise the use of evidence in policymaking

Richard Kwiatkowski and I combine policy studies and psychology to (a) take forward ‘Psychology Based Policy Studies’, and (b) produce practical advice for actors engaged in the policy process.

Cairney Kwiatkowski abstract

Most policy studies, built on policy theory, explain policy processes without identifying practical lessons. They identify how and why people make decisions, and situate this process of choice within complex systems of environments in which there are many actors at multiple levels of government, subject to rules, norms, and group influences, forming networks, and responding to socioeconomic dynamics. This approach helps generate demand for more evidence of the role of psychology in these areas:

  1. To do more than ‘psychoanalyse’ a small number of key actors at the ‘centre’ of government.
  2. To consider how and why actors identify, understand, follow, reproduce, or seek to shape or challenge, rules within their organisations or networks.
  3. To identify the role of network formation and maintenance, and the extent to which it is built on heuristics to establish trust and the regular flow of information and advice.
  4. To examine the extent to which persuasion can be used to prompt actors to rethink their beliefs – such as when new evidence or a proposed new solution challenges the way that a problem is framed, how much attention it receives, and how it is solved.
  5. To consider (a) the effect of events such as elections on the ways in which policymakers process evidence (e.g. does it encourage short-term and vote-driven calculations?), and (b) what prompts them to pay attention to some contextual factors and not others.

This literature highlights the use of evidence by actors who anticipate or respond to lurches of attention, moral choices, and coalition formation built on bolstering one’s own position, demonising competitors, and discrediting (some) evidence. Although this aspect of choice should not be caricatured – it is not useful simply to bemoan ‘post-truth’ politics and policymaking ‘irrationality’ – it provides a useful corrective to the fantasy of a linear policy process in which evidence can be directed to a single moment of authoritative and ‘comprehensively rational’ choice based only on cognition. Political systems and human psychology combine to create a policy process characterised by many actors competing to influence continuous policy choice built on cognition and emotion.

What are the practical implications?

Few studies consider how those seeking to influence policy should act in such environments or give advice about how they can engage effectively in the policy process. Of course context is important, and advice needs to be tailored and nuanced, but that is not necessarily a reason to side-step the issue of moving beyond description. Further, policymakers and influencers do not have this luxury. They need to gather information quickly and effectively to make good choices. They have to take the risk of action.

To influence this process we need to understand it, and to understand it more we need to study how scientists try to influence it. Psychology-based policy studies can provide important insights to help actors begin to measure and improve the effectiveness of their engagement in policy by: taking into account cognitive and emotional factors and the effect of identity on possible thought; and, considering how political actors are ‘embodied’ and situated in time, place, and social systems.

5 tentative suggestions

However, few psychological insights have been developed from direct studies of policymaking, and there is a limited evidence base. So, we provide preliminary advice by identifying the most relevant avenues of conceptual research and deriving some helpful ‘tools’ for those seeking to influence policy.

Our working assumption is that policymakers need to gather information quickly and effectively, so they develop heuristics to allow them to make what they believe to be good choices. Their solutions often seem to be driven more by their emotions than a ‘rational’ analysis of the evidence, partly because we hold them to a standard that no human can reach. If so, and if they have high confidence in their heuristics, they will dismiss our criticism as biased and naïve. Under those circumstances, restating the need for ‘evidence-based policymaking’ is futile, and naively ‘speaking truth to power’ counterproductive.

For us, heuristics represent simple alternative strategies, built on psychological insights, for applying those insights in policy practice. They are broad prompts towards certain ways of thinking and acting, not specific blueprints for action in all circumstances:

  1. Develop ways to respond positively to ‘irrational’ policymaking

Instead of automatically bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond in a ‘fast and frugal’ way, to pursue the kinds of evidence informed policymaking that is realistic in a complex and constantly changing policymaking environment.

  2. Tailor framing strategies to policymaker bias

The usual advice is to minimise the cognitive burden of your presentation, and to use strategies tailored to the ways in which people pay attention to, and remember, information (at the beginning and end of statements, with repetition, and using concrete and immediate reference points).

What is the less usual advice? If policymakers are combining cognitive and emotive processes, combine facts with emotional appeals. If policymakers are making quick choices based on their values and simple moral judgements, tell simple stories with a hero and a clear moral. If policymakers are reflecting a group emotion, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with the ‘lens’ through which actors in those coalitions understand the world.

  3. Identify the right time to influence individuals and processes

Understand what it means to find the right time to exploit ‘windows of opportunity’. ‘Timing’ can refer to the right time to influence an individual, which is relatively difficult to identify but with the possibility of direct influence, or to act while several political conditions are aligned, which presents less chance for you to make a direct impact.

  4. Adapt to real-world dysfunctional organisations rather than waiting for an orderly process to appear

Politicians may appear confident in their policies and their grasp of facts and details, but they are often (a) vulnerable, defensive, and closed to challenging information, and/or (b) inept at organisational politics, or unable to change the rules of their organisations. In the absence of institutional reforms, and in the presence of ‘dysfunctional’ processes, develop pragmatic strategies: form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

  5. Recognise that the biases we ascribe to policymakers are present in ourselves and our own groups

Identifying only the biases in our competitors may help mask academic/scientific examples of group-think, and it may be counterproductive to use euphemistic terms like ‘low information’ to describe actors whose views we do not respect. This is a particular problem for scholars if they assume that most people do not live up to their own imagined standards of high-information-led action.

It may be more effective to recognise that: (a) people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves; and (b) a fundamental aspect of evolutionary psychology is that people need to get on with each other, so showing simple respect – or, going further, ‘mirroring’ that person’s non-verbal signals – can be useful even if it looks facile.

This leaves open the ethical question of how far we should go to identify our own biases, to accept the need to work with people whose ways of thinking we do not share, and to secure their trust without lying about our own beliefs.


Using psychological insights in politics: can we do it without calling our opponents mental, hysterical, or stupid?

One of the most dispiriting parts of fierce political debate is the casual use of mental illness or old and new psychiatric terms to undermine an opponent: she is mad, he is crazy, she is a nutter, they are wearing tin foil hats, get this guy a straitjacket and the men in white coats because he needs to lie down in a dark room, she is hysterical, his position is bipolar, and so on. This kind of statement reflects badly on the campaigner rather than their opponent.

I say this because, while doing some research on a paper on the psychology of politics and policymaking (this time with Richard Kwiatkowski, as part of this special collection), there are potentially useful concepts that seem difficult to insulate from such political posturing. There is great potential to use them cynically against opponents rather than benefit from their insights.

The obvious ‘live’ examples relate to ‘rational’ versus ‘irrational’ policymaking. For example, one might argue that, while scientists develop facts and evidence rationally, using tried and trusted and systematic methods, politicians act irrationally, based on their emotions, ideologies, and groupthink. So, we as scientists are the arbiters of good sense and they are part of a pathological political process that contributes to ‘post truth’ politics.

The obvious problem with such accounts is that we all combine cognitive and emotional processes to think and act. We are all subject to bias in the gathering and interpretation of evidence. So, the more positive, but less tempting, option is to consider how this process works – when both competing sides act ‘rationally’ and emotionally – and what we can realistically do to mitigate the worst excesses of such exchanges. Otherwise, we will not get beyond demonising our opponents and romanticising our own cause. It gives us the warm and fuzzies on twitter and in academic conferences but contributes little to political conversations.

A less obvious example comes from modern work on the links between genes and attitudes. There is now a research agenda which uses surveys of adult twins to compare the effect of genes and environment on political attitudes. For example, Oskarsson et al (2015: 650) argue that existing studies ‘report that genetic factors account for 30–50% of the variation in issue orientations, ideology, and party identification’. One potential mechanism is cognitive ability: put simply, and rather cautiously and speculatively, with a million caveats, people with lower cognitive ability are more likely to see ‘complexity, novelty, and ambiguity’ as threatening and to respond with fear, risk aversion, and conservatism (2015: 652).
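To make the twin-study logic a little more concrete: a standard (and much-simplified) starting point for splitting variance between genes and environment is Falconer’s formula, which compares trait correlations between identical (MZ) and fraternal (DZ) twins. This is a generic sketch of the method, not necessarily the exact model Oskarsson et al estimate:

  h² = 2(r_MZ − r_DZ)   (heritability: variance attributed to genes)
  c² = 2r_DZ − r_MZ     (shared environment)
  e² = 1 − r_MZ         (unique environment, including measurement error)

For example, an MZ correlation of 0.5 and a DZ correlation of 0.3 on an ideology scale would imply h² = 0.4, within the 30–50% range quoted above.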

My immediate thought, when reading this stuff, is about how people would use it cynically, even at this relatively speculative stage in testing and evidence gathering: my opponent’s genes make him stupid, which makes him fearful of uncertainty and ambiguity, and therefore anxious about change and conservative in politics (in other words, the Yoda hypothesis applied only to stupid people). It’s not his fault, but his stupidity is an obstacle to progressive politics. If you add in some psychological biases, in which people inflate their own sense of intelligence and underestimate that of their opponents, you have evidence-informed, really shit political debate! ‘My opponent is stupid’ seems a bit better than ‘my opponent is mental’ but only in the sense that eating a cup of cold sick is preferable to eating shit.

I say this as we try to produce some practical recommendations (for scientists and advocates of EBPM) to engage with politicians to improve the use of evidence in policy. I’ll let you know if it goes beyond a simple maxim: adapt to their emotional and cognitive biases, but don’t simply assume they’re stupid.

See also: the many commentaries on how stupid it is to treat your political opponents as stupid

Stop Calling People “Low Information Voters”


Is there any hope for evidence in emotional debates and chaotic government?

Two recent news features sum up the role of emotion and ‘chaos’ in policymaking.

The first is ‘Irritation and anger’ may lead to Brexit, says influential psychologist in The Telegraph. The headline suggests that people may vote Leave for emotional reasons, rather than with reference to a more ‘rational’ process in which people identify the best evidence and use it to weigh up the short- and long-term consequences of action.

Yet, the article confirms that we’re all at it! To be human is to use emotional, gut-level, and habitual thinking to turn a complex world, with too much information, into a good enough decision in a necessarily short amount of time.

In debate, evidence is mentioned a lot, but only to praise the evidence backing my decision and reject yours. Or, you only trust the evidence from people you trust. If you trust the evidence from certain scientists, you stress their scientific credentials. If not, you find some from other experts. Or, if all else is lost, you reject experts as condescending elites with a hidden agenda. Or, you say simply that they can’t be that clever if they agree with smarmy Cameron/ Johnson.

Lesson 1: you can see these emotional and manipulative approaches to policymaking play out in the EU referendum. Don’t assume that policymaking behind closed doors, on other issues, is any different.

The second feature is Lost in Transit: chaos in government research in The Economist.

It describes a Sense about Science report which (a) was commissioned ‘following a spate of media stories about government research being suppressed or delayed’, and (b) finds that ‘The UK government spends around £2.5 billion a year on research for policy, but does not know how many studies it has commissioned or which of them have been published’.

The Economist reports the perhaps-unexpected result of the inquiry:

But the main gripe is the sheer disorganisation of it all. The report’s afterword states that “Sir Stephen looked for suppression and found chaos”.

Such accounts reflect the two contradictory stories that we often tell about government. The first relates to the Westminster model of democratic accountability which helps concentrate power at the centre of government: if you know who is in charge, you know who to blame.

The second, regarding complex government, describes a complicated world of public policy in which no-one seems to be in control. For example, we make reference to: the huge size and reach of government; the potential for ministerial ‘overload’ and need to simplify decision-making; the blurry boundaries between the actors who make and influence policy; the multi-level nature of policymaking; and, the proliferation of rules and regulations, many of which may undermine each other.

The problem with the first story is that (a) although it is easy to tell during elections and inquiries, (b) you always struggle to find it when you actually study government.

The problem with the second is that, (a) although it seems realistic when you study government, (b) few people will buy it when they are seeking to hold ministers and governments to account. This problem may be exacerbated by the terms of reference of reports: few will accept a pragmatic response, based on the second story of complexity, if you start out by using the first story of central control to say that you will track down and solve the problem!

Lesson 2: if you assume central control you will find chaos (and struggle to produce feasible recommendations to deal with it). The manipulation of evidence takes place in a complex policymaking system over which no individual or ‘core executive’ has control. Indeed, no single person or organisation could even pay attention to all that goes on within government. This insight requires pragmatic inquiries and solutions, not the continuous reassertion of central control and discovery of ‘chaos’.

It might be possible to develop a third lesson if we put these two together. One part of the EU debate reflects our inability to understand EU policymaking and relate it to the relatively clear processes in the UK, in which you know who is in charge and therefore who to blame. The EU seems less democratic because it is so complex and remote. Yet, if we follow this other story about complexity in the UK, we often find that UK politics is also difficult to follow. Its image does not describe reality.

Lesson 3: when you find policymaking complexity in the EU, don’t assume it is any better in the UK! Instead, try to compare like with like.

See also

I expand on both lessons in The Politics of Evidence-Based Policymaking

Cock-up, not conspiracy, conceals evidence for policy

Government buries its own research – and that’s bad for democracy

The rationality paradox of Nudge: rational tools of government in a world of bounded rationality

6 Comments

Filed under Evidence Based Policymaking (EBPM), public policy, UK politics and policy

Policy Concepts in 1000 Words: Framing


(podcast download)

‘Framing’ is a metaphor to describe the ways in which we understand, and use language selectively to portray, policy problems. There are many ways to describe this process in many disciplines, including communications, psychological, and sociological research. There is also more than one way to understand the metaphor.

For example, I think that most scholars describe this image (from litemind) of someone deciding which part of the world to focus on.

[image: framing with hands]

However, I have also seen colleagues use this image, of a timber frame, to highlight the structure of a discussion which is crucial but often unseen and taken for granted:

[image: timber frame]

  1. Intentional framing and cognition.

The first kind of framing relates to bounded rationality or the effect of our cognitive processes on the ways in which we process information (and influence how others process information):

  • We use major cognitive shortcuts to turn an infinite amount of information into the ‘signals’ we perceive or pay attention to.
  • These cognitive processes often produce interesting conclusions, such as when (a) we place higher value on the things we own/ might lose rather than the things we don’t own/ might gain (‘prospect theory’) or (b) we value, or pay more attention to, the things with which we are most familiar and can process more easily (‘fluency’).
  • We often rely on other people to process and select information on our behalf.
  • We are susceptible to simple manipulation based on the order (or other ways) in which we process information, and the form it takes.

In that context, you can see one meaning of framing: other actors portray information selectively to influence the ways in which we see the world, or which parts of the world capture our attention (here is a simple example of wind farms).

In policy theory, framing studies focus on ambiguity: there are many ways in which we can understand and define the same policy problem (note terms such as ‘problem definition’ and a ‘policy image’). Therefore, actors exercise power to draw attention to, and generate support for, one particular understanding at the expense of others. They do this with simple stories or the selective presentation of facts, often coupled with emotional appeals, to manipulate the ways in which we process information.

  2. Frames as structures.

Think about the extent to which we take for granted certain ways to understand or frame issues. We don’t begin each new discussion with reference to ‘first principles’. Instead, we discuss issues with reference to:

(a) debates that have been won and may not seem worth revisiting (imagine, for example, the ways in which ‘socialist’ policies are treated in the US)

(b) other well-established ways to understand the world which, when they seem to dominate our ways of thinking, are often described as ‘hegemonic’ or with reference to paradigms.

In such cases, the timber frame metaphor serves two purposes:

(a) we can conclude that it is difficult but not impossible to change.

(b) if it is hidden by walls, we do not see it; we often take it for granted even though we should know it exists.

Framing the social, not physical, world

These metaphors can only take us so far, because the social world does not have such easily identifiable physical structures. Instead, when we frame issues, we don’t just choose where to look; we also influence how people describe what we are looking at. Or, ‘structural’ frames relate to regular patterns of behaviour or ways of thinking which are more difficult to identify than in a building. Consequently, we do not all describe structural constraints in the same way even though, ostensibly, we are looking at the same thing.

In this respect, for example, the well-known ‘Overton window’ is a sort-of helpful but also problematic concept, since it suggests that policymakers are bound to stay within the limits of what Kingdon calls the ‘national mood’. The public will only accept so much before it punishes you in events such as elections. Yet, of course, there is no such thing as the public mood. Rather, some actors (policymakers) make decisions with reference to their perception of such social constraints (how will the public react?) but they also know that they can influence how we interpret those constraints with reference to one or more proxies, including opinion polls, public consultations, media coverage, and direct action:

[image: JEPP public opinion]

They might get it wrong, and suffer the consequences, but it still makes sense to say that they have a choice to interpret and adapt to such ‘structural’ constraints.

Framing, power and the role of ideas

We can bring these two ideas about framing together to suggest that some actors exercise power to reinforce dominant ways to think about the world. Power is not simply about visible conflicts in which one group with greater material resources wins and another loses. It also relates to agenda setting. First, actors may exercise power to reinforce social attitudes. If the weight of public opinion is against government action, maybe governments will not intervene. The classic example is poverty – if most people believe that it is caused by fecklessness, what is the role of government? In such cases, power and powerlessness may relate to the (in)ability of groups to persuade the public, media and/ or government that there is a reason to make policy; a problem to be solved.  In other examples, the battle may be about the extent to which issues are private (with no legitimate role for government) or public (and open to legitimate government action), including: should governments intervene in disputes between businesses and workers? Should they intervene in disputes between husbands and wives? Should they try to stop people smoking in private or public places?

Second, policymakers can only pay attention to a tiny proportion of the issues for which they are responsible. So, actors exercise power to keep some issues on their agenda at the expense of others. Issues on the agenda are sometimes described as ‘safe’: more attention to these issues means less attention to the imbalances of power within society.

45 Comments

Filed under 1000 words, agenda setting, PhD, public policy

The Psychology of Evidence Based Policymaking: Who Will Speak For the Evidence if it Doesn’t Speak for Itself?

Let’s begin with a simple – and deliberately naïve – prescription for evidence based policymaking (EBPM): there should be a much closer link between (a) the process in which scientists and knowledge brokers identify major policy problems, and (b) the process in which politicians make policy decisions. We should seek to close the ‘evidence-policy gap’. The evidence should come first and we should bemoan the inability of policymakers to act accordingly. I discuss why that argument is naïve here and here, in terms of the complexity of policy processes and the competing claims for knowledge-based policy. This post is about the link between EBPM and psychology.

Let’s consider the role of two types of thought process common to all people, policymakers included: (a) the intuitive, gut, emotional or other heuristics we use to process and act on information quickly; and (b) goal-oriented and reasoned, thoughtful behaviour. As Daniel Kahneman’s Thinking, Fast and Slow (p 20) puts it: ‘System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations … often associated with the subjective experience of agency, choice and concentration’.

The naïve description of EBPM requires System 2 (‘slow’) thinking, but what happens if most policymaking is characterised by System 1 (‘fast’)? The answer is ‘a whole bunch of cognitive shortcuts’, including:

  • the ‘availability heuristic’, when people relate the size, frequency or probability of a problem to how easy it is to remember or imagine
  • the ‘representativeness heuristic’, when people overestimate the probability of vivid events
  • ‘prospect theory’, when people value losses more than equivalent gains
  • ‘framing effects’, based on emotional and moral judgements
  • confirmation bias
  • optimism bias, or unrealistic expectations about our aims working out well when we commit to them
  • status quo bias
  • a tendency to use exemplars of social groups to represent general experience; and
  • a ‘need for coherence’ and to establish patterns and causal relationships when they may not exist (see Paul Lewis, p 7).

The ‘availability heuristic’ may also be linked to more recent studies of ‘processing fluency’ – which suggests that people’s decisions are influenced by their familiarity with things; with the ease with which they process information (see Alter and Oppenheimer, 2009). Fluency can take several forms, including conceptual, perceptual, and linguistic. For example, people may pay more attention to an issue or statement if they already possess some knowledge of it and find it easy to understand or recall. They may pay attention to people when their faces seem familiar and find fewer faults with systems they comprehend. They may place more value on things they find familiar, such as their domestic currency, items that they own compared to items they would have to buy, or the stocks of companies with more pronounceable names – even if they are otherwise identical. Or, their ability to imagine things in an abstract or concrete form may relate to their psychological ‘distance’ from it.
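The asymmetry behind ‘prospect theory’ can be sketched numerically. The snippet below is a minimal illustration of the Kahneman–Tversky value function, using commonly cited parameter estimates (alpha and beta around 0.88, loss-aversion weight lambda around 2.25); the function and parameter values are illustrative assumptions for this sketch, not taken from the post itself.

```python
# Illustrative sketch of the prospect-theory value function
# (Kahneman & Tversky). Parameter values are commonly cited
# estimates, used here purely for illustration.

def prospect_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Subjective value of a gain or loss of size x, relative to a
    reference point of 0."""
    if x >= 0:
        return x ** alpha           # gains: diminishing sensitivity
    return -lam * (-x) ** beta      # losses: steeper curve, weighted by lam

# A loss of 100 'hurts' more than an equivalent gain 'pleases':
gain = prospect_value(100)     # roughly 57.5
loss = prospect_value(-100)    # roughly -129.5
print(abs(loss) > gain)        # losses loom larger than gains
```

The point of the sketch is simply that the same objective change (100 units) produces an asymmetric subjective response, which is why we place higher value on what we might lose than on what we might gain.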

Is fast thinking bad thinking? Views from psychology

Alter and Oppenheimer use these insights to warn policymakers against taking the wrong attitude to regulation or spending based on flawed assessments of risk – for example, they might spend disproportionate amounts of money on projects designed to address risks with which they are most familiar (Slovic suggests that feelings towards risk may even be influenced by the way in which it is described, for example as a percentage versus a 1 in X probability). Alter and Oppenheimer also worry about medical and legal judgements swayed by fluid diagnoses and stories. Haidt argues that the identification of the ‘intuitive basis of moral judgment’ can be used to help policymakers ‘avoid mistakes’ or allow people to develop ‘programs’ or an ‘environment’ to ‘improve the quality of moral judgment and behavior’. These studies compare with arguments focusing on the positive role of emotions in decision-making, either individually (Frank) or as part of social groups, with emotional responses providing useful information in the form of social cues (Van Kleef et al).

Is fast thinking bad thinking? Views from the political and policy sciences

Social Construction Theory suggests that policymakers make quick, biased, emotional judgements, then back up their actions with selective facts to ‘institutionalize’ their understanding of a policy problem and its solution. They ‘socially construct’ their target populations to argue that they are deserving either of governmental benefits or punishments. Schneider and Ingram (forthcoming) argue that the outcomes of social construction are often dysfunctional and not based on a well-reasoned, goal-oriented strategy: ‘Studies have shown that rules, tools, rationales and implementation structures inspired by social constructions send dysfunctional messages and poor choices may hamper the effectiveness of policy’.

However, not all policy scholars make such normative pronouncements. Indeed, the value of policy theory is often to show that policy results from the interaction between large numbers of people and institutions. So, the actions of a small number of policymakers would not be the issue; we need to know more about the cumulative effect of individual emotional decision making in a collective decision-making environment – in organisations, networks and systems. For example:

  • The Advocacy Coalition Framework suggests that people engage in coordinated activity to cooperate with each other and compete with other coalitions, based on their shared beliefs and a tendency to demonise their opponents. In some cases, there are commonly accepted ways to interpret the evidence. In others, it is a battle of ideas.
  • Multiple Streams Analysis and Punctuated Equilibrium Theory focus on uncertainty and ambiguity, exploring the potential for policymaker attention to lurch dramatically from one problem or ‘image’ (the way the problem is viewed or understood) to another. They identify the framing strategies – of actors such as ‘entrepreneurs’, ‘venue shoppers’ and ‘monopolists’ – based on a mixture of empirical facts and ‘emotional appeals’.
  • The Narrative Policy Framework combines a discussion of emotion with the identification of narrative strategies. Each narrative has a setting, characters, plot and moral. They can be compared to marketing, as persuasion based more on appealing to an audience’s beliefs (or exploiting their thought processes) than the evidence. People will pay attention to certain narratives because they are boundedly rational, seeking shortcuts to gather sufficient information – and prone to accept simple stories that seem plausible, confirm their biases, exploit their emotions, and/ or come from a source they trust.

In each case, we might see our aim as going beyond the simple phrase: ‘the evidence doesn’t speak for itself’. If ‘fast thinking’ is pervasive in policymaking, then ‘the evidence’ may only be influential if it can be provided in ways that are consistent with the thought processes of certain policymakers – such as by provoking a strong emotional reaction (to confirm or challenge biases), or framing messages in terms that are familiar to (and can be easily processed by) policymakers.

These issues are discussed further in these posts:

Is Evidence-Based Policymaking the same as good policymaking?

Policy Concepts in 1000 Words: The Psychology of Policymaking

And at more length in these papers:

PSA 2014 Cairney Psychology Policymaking 7.4.14

Cairney PSA 2014 EBPM 5.3.14

See also: Joseph Rowntree Foundation, Evidence alone won’t bring about social change

Discover Society (Delaney and Henderson) Risk and Choice in the Scottish Independence debate

4 Comments

Filed under Evidence Based Policymaking (EBPM), public policy

Policy Concepts in 1000 Words: The Psychology of Policymaking

(podcast download)

Psychology is at the heart of policymaking, but the literature on psychology is not always at the heart of policy theory. Most theories identify ‘bounded rationality’ which, on its own, is little more than a truism: people do not have the time, resources and cognitive ability to consider all information, all possibilities, all solutions, or anticipate all consequences of their actions. Consequently, they use informational shortcuts or heuristics – perhaps to produce ‘good-enough’ decisions. This is where psychology comes in, to:

  1. Describe the thought processes that people use to turn a complex world into something simple enough to understand and/ or respond to; and
  2. To compare types of thought process, such as (a) goal-oriented and reasoned, thoughtful behaviour and (b) the intuitive, gut, emotional or other heuristics we use to process and act on information quickly.

Where does policy theory come in? It seeks to situate these processes within a wider examination of policymaking systems and their environments, identifying the role of:

  • A wide range of actors making choices.
  • Institutions, as the rules, norms, and practices that influence behaviour.
  • Policy networks, as the relationships between policymakers and the ‘pressure participants’ with which they consult and negotiate.
  • Ideas – a broad term to describe beliefs, and the extent to which they are shared within groups, organisations, networks and political systems.
  • Context and events, to describe the extent to which a policymaker’s environment is in her control or how it influences her decisions.

Putting these approaches together is not easy. It presents us with an important choice regarding how to treat the role of psychology within explanations of complex policymaking systems – or, at least, on which aspect to focus.

Our first choice is to focus specifically on micro-level psychological processes, to produce hypotheses to test propositions regarding individual thought and action. There are many from which to choose, although from Daniel Kahneman’s Thinking, Fast and Slow (p 20), we can identify a basic distinction between two kinds: ‘System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. System 2 allocates attention to the effortful mental activities that demand it, including complex computations … often associated with the subjective experience of agency, choice and concentration’. Further, system 1 can be related to a series of cognitive shortcuts which develop over time as people learn from experience, including:

  • the ‘availability heuristic’, when people relate the size, frequency or probability of a problem to how easy it is to remember or imagine
  • the ‘representativeness heuristic’, when people overestimate the probability of vivid events
  • ‘prospect theory’, when people value losses more than equivalent gains
  • ‘framing effects’, based on emotional and moral judgements
  • confirmation bias
  • optimism bias, or unrealistic expectations about our aims working out well when we commit to them
  • status quo bias
  • a tendency to use exemplars of social groups to represent general experience; and
  • a ‘need for coherence’ and to establish patterns and causal relationships when they may not exist (see Paul Lewis, p 7).

The ‘availability heuristic’ may also be linked to more recent studies of ‘processing fluency’ – which suggests that people’s decisions are influenced by their familiarity with things; with the ease with which they process information (see Alter and Oppenheimer, 2009). Fluency can take several forms, including conceptual, perceptual, and linguistic. For example, people may pay more attention to an issue or statement if they already possess some knowledge of it and find it easy to understand or recall. They may pay attention to people when their faces seem familiar and find fewer faults with systems they comprehend. They may place more value on things they find familiar, such as their domestic currency, items that they own compared to items they would have to buy, or the stocks of companies with more pronounceable names – even if they are otherwise identical. Or, their ability to imagine things in an abstract or concrete form may relate to their psychological ‘distance’ from it.

Our second choice is to treat these propositions as assumptions, allowing us to build larger (‘meso’ or ‘macro’ level) models that produce other hypotheses. We ask what would happen if these assumptions were true, to allow us to theorise a social system containing huge numbers of people, and/ or focus on the influence of the system or environment in which people make decisions.

These choices are made in different ways in the policy theory literature:

  • The Advocacy Coalition Framework has tested the idea of ‘devil shift’ (coalitions romanticize their own cause and demonise their opponents, misperceiving their power, beliefs and/ or motives) but also makes assumptions about belief systems and prospect theory to build models and test other assumptions.
  • Multiple Streams Analysis and Punctuated Equilibrium Theory focus on uncertainty and ambiguity, exploring the potential for policymaker attention to lurch dramatically from one problem or ‘image’ (the way the problem is viewed or understood) to another. They identify the framing strategies of actors such as ‘entrepreneurs’, ‘venue shoppers’ and ‘monopolists’.
  • Social Construction Theory argues that policymakers make quick, biased, emotional judgements, then back up their actions with selective facts to ‘institutionalize’ their understanding of a policy problem and its solution.
  • The Narrative Policy Framework combines a discussion of emotion with the identification of ‘homo narrans’ (humans as storytellers – in stated contrast to ‘homo economicus’, or humans as rational beings). Narratives are used strategically to reinforce or oppose policy measures. Each narrative has a setting, characters, plot and moral. They can be compared to marketing, as persuasion based more on appealing to an audience’s beliefs (or exploiting their thought processes) than the evidence. People will pay attention to certain narratives because they are boundedly rational, seeking shortcuts to gather sufficient information – and prone to accept simple stories that seem plausible, confirm their biases, exploit their emotions, and/ or come from a source they trust.

These issues are discussed at more length in this paper: PSA 2014 Cairney Psychology Policymaking 7.4.14

34 Comments

Filed under 1000 words, agenda setting, public policy