Tag Archives: psychology of policymaking

Emotion and reason in politics: the rational/irrational distinction

In ‘How to communicate effectively with policymakers’, Richard Kwiatkowski and I use the distinction between ‘rational’ and ‘irrational’ cognitive shortcuts ‘provocatively’. I sort of wish we had been more direct, because I have come to realise that:

  1. My attempts to communicate with sarcasm and facial gestures may only ever appeal to a niche audience, and
  2. even if I use scare quotes – around a word like ‘irrational’ – to denote the word’s questionable use, it’s not always clear what I’m questioning, because
  3. you need to know the story behind someone’s discussion to know what they are questioning.*

So, here are some of the reference points I’m using when I tell a story about ‘irrationality’:

1. I’m often invited to be the type of guest speaker who challenges the audience; the audience is usually scientific, and the topic is usually evidence-based policymaking.

So, when I say ‘irrational’, I’m speaking to (some) scientists who think of themselves as rational and policymakers as irrational, and use this problematic distinction to complain about policy-based evidence, post-truth politics, and perhaps even the irrationality of voters for Brexit. Action based on this way of thinking would be counterproductive. In that context, I use the word ‘irrational’ as a way into some more nuanced discussions including:

  • all humans combine cognition and emotion to make choices; and,
  • emotions are one of many sources of ‘fast and frugal heuristics’ that help us make some decisions very quickly and often very well.

In other words, it is silly to complain that some people are irrational, when we are all making choices this way, and such decision-making is often a good thing.

2. This focus on scientific rationality is part of a wider discussion of what counts as good evidence or valuable knowledge. Examples include:

  • Policy debates on the value of bringing together many people with different knowledge claims – such as through user and practitioner experience – to ‘co-produce’ evidence.
  • Wider debates on the ‘decolonization of knowledge’ in which narrow ‘Western’ scientific principles help exclude the voices of many populations by undermining their claims to knowledge.

3. A focus on rationality versus irrationality is still used to maintain sexist and racist caricatures or stereotypes, and therefore dismiss people based on a misrepresentation of their behaviour.

I thought that, by now, we’d be done with dismissing women as emotional or hysterical, but apparently not. Indeed, as some recent racist and sexist coverage of Serena Williams demonstrates, the idea that black women are not rational is still tolerated in mainstream discussion.

4. Part of the reason that we can only conclude that people combine cognition and emotion, without being able to separate their effects in a satisfying way, is that the distinction is problematic.

It is difficult to demonstrate empirically. It is also difficult to assign some behaviours to one camp or the other, such as when we consider moral reasoning based on values and logic.

To sum up, I’ve been using the rational/irrational distinction explicitly to make a simple point that is relevant to the study of politics and policymaking:

  • All people use cognitive shortcuts to help them ignore almost all information about the world, to help them make decisions efficiently.
  • If you don’t understand and act on this simple insight, you’ll waste your time by trying to argue someone into submission, or by giving them an irrelevant 500-page report when they are looking for one page written in a way that makes sense to them.

Most of the rest has been mostly implicit, and communicated non-verbally, which is great when you want to keep a presentation brief and light, but not if you want to acknowledge nuance and more serious issues.


*which is why I’m increasingly interested in Riker’s idea of heresthetics, in which the starting point of a story is crucial. We can come to very different conclusions about a problem and its solution by choosing different starting points, to accentuate one aspect of a problem and downplay another, even when our beliefs and preferences remain basically the same.



Filed under Evidence Based Policymaking (EBPM), Psychology Based Policy Studies, public policy, Storytelling

Writing for Impact: what you need to know, and 5 ways to know it

This is a post for my talk at the ‘Politheor: European Policy Network’ event Write For Impact: Training In Op-Ed Writing For Policy Advocacy. There are other speakers with more experience of, and advice on, ‘op-ed’ writing. My aim is to describe key aspects of politics and policymaking to help the audience learn why they should write op-eds in a particular way for particular audiences.

A key rule in writing is to ‘know your audience’, but it’s easier said than done if you seek many sympathetic audiences in many parts of a complex policy process. Two simple rules should help make this process somewhat clearer:

  1. Learn how policymakers simplify their world, and
  2. Learn how policy environments influence their attention and choices.

We can use the same broad concepts to help explain both processes, in which many policymakers and influencers interact across many levels and types of government to produce what we call ‘policy’:

  1. Policymaker psychology: tell an evidence-informed story

Policymakers receive too much information, and seek ways to ignore most of it while making decisions. To do so, they use ‘rational’ and ‘irrational’ means: selecting a limited number of regular sources of information, and relying on emotion, gut instinct, habit, and familiarity with information. In other words, your audience combines cognition and emotion to deal with information, and they can ignore information for long periods then quickly shift their attention towards it, even if that information has not really changed.

Consequently, an op-ed focusing solely on ‘the facts’ can be relatively ineffective compared to an evidence-informed story, perhaps with a notional setting, plot, hero, and moral. Your aim shifts from providing more and more evidence to reduce uncertainty about a problem, to providing a persuasive reason to reduce ambiguity. Ambiguity relates to the fact that policymakers can understand a policy problem in many different ways – such as tobacco as an economic good, issue of civil liberties, or public health epidemic – but often pay exclusive attention to one.

So, your aim may be to influence the simple ways in which people understand the world, to influence their demand for more information. An emotional appeal can transform a factual case, but only if you know how people engage emotionally with information. Sometimes, the same story can succeed with one audience but fail with another.

  2. Institutions: learn the ‘rules of the game’

Institutions are the rules people use in policymaking, including the formal, written down, and well understood rules setting out who is responsible for certain issues, and the informal, unwritten, and unclear rules informing action. The rules used by policymakers can help define the nature of a policy problem, who is best placed to solve it, who should be consulted routinely, and who can safely be ignored. These rules can endure for long periods and become like habits, particularly if policymakers pay little attention to a problem or why they define it in a particular way.

  3. Networks and coalitions: build coalitions and establish trust

Such informal rules, about how to understand a problem and who to speak with about it, can be reinforced in networks of policymakers and influencers.

‘Policy community’ partly describes a sense that most policymaking is processed out of the public spotlight, often despite minimal high level policymaker interest. Senior policymakers delegate responsibility for policymaking to bureaucrats, who seek information and advice from groups. Groups exchange information for access to, and potential influence within, government, and policymakers have ‘standard operating procedures’ that favour particular sources of evidence and some participants over others.

‘Policy community’ also describes a sense that the network seems fairly stable, built on high levels of trust between participants, based on factors such as reliability (the participant was a good source of information, and did not complain too much in public about decisions), a common aim or shared understanding of the problem, or the sense that influencers represent important groups.

So, the same policy case can have a greater impact if told by a well trusted actor in a policy community. Or, that community member may use networks to build key coalitions behind a case, use information from the network to understand which cases will have most impact, or know which audiences to seek.

  4. Ideas: learn the ‘currency’ of policy argument

This use of networks relates partly to learning the language of policy debate in particular ‘venues’, to learn what makes a convincing case. This language partly reflects a well-established ‘world view’ or the ‘core beliefs’ shared by participants. For example, a very specific ‘evidence-based’ language is used frequently in public health, while treasury departments look for some recognition of ‘value for money’ (according to a particular understanding of how you determine VFM). So, knowing your audience is knowing the terms of debate that are often so central to their worldview that they take them for granted and, in contrast, the forms of argument that are more difficult to pursue because they are challenging or unfamiliar to some audiences. Imagine a case that challenges completely someone’s world view, or one which is entirely consistent with it.

  5. Socioeconomic factors and events: influence how policymakers see the outside world

Some worldviews can be shattered by external events or crises, but this is a rare occurrence. It may be possible to generate a sense of crisis with reference to socioeconomic changes or events, but people will interpret these developments through the ‘lens’ of their own beliefs. In some cases, events seem impossible to ignore, but we may not agree on their implications for action. In others, an external event only matters if policymakers pay attention to it. Indeed, we began this discussion with the insight that policymakers have to ignore almost all such information available to them.

Know your audience revisited: practical lessons from policy theories

To take into account all of these factors, while trying to make a very short and persuasive case, may seem impossible. Instead, we might pick up some basic rules of thumb from particular theories or approaches. We can discuss a few examples from ongoing work on ‘practical lessons from policy theories’.

Storytelling for policy impact

If you are telling a story with a setting, plot, hero, and moral, it may be more effective to focus on a hero than a villain. More importantly, imagine two contrasting audiences: one is moved by a personal story told to highlight some structural barriers to the wellbeing of key populations; another is unmoved, judges that person harshly, and thinks they would have done better in their shoes (perhaps they prefer to build policy on stereotypes of target populations). ‘Knowing your audience’ may involve some trial-and-error to determine which stories work under which circumstances.

Appealing to coalitions

Or, you may decide that it is impossible to write anything to appeal to all relevant audiences. Instead, you might tailor it to one, to reinforce its beliefs and encourage people to act. The ‘advocacy coalition framework’ describes such activities as routine: people go into politics to translate their beliefs into policy, they interpret the world through those beliefs, and they romanticise their own cause while demonising their opponents. If so, would a bland op-ed have much effect on any audience?

Learning from entrepreneurs

‘Policy entrepreneurs’ draw on three rules, two of which seem counterintuitive:

  1. Don’t focus on bombarding policymakers with evidence. Scientists focus on making more evidence to reduce uncertainty, but put people off with too much information. Entrepreneurs tell a good story, grab the audience’s interest, and the audience demands information.
  2. By the time people pay attention to a problem it’s too late to produce a solution. So, you produce your solution then chase problems.
  3. When your environment changes, your strategy changes. For example, at the US federal level, you’re in the sea, and you’re a surfer waiting for the big wave. At the smaller subnational level, on a low-attention and low-budget issue, you can be Poseidon moving the ‘streams’. At the US federal level, you need to ‘soften up’ solutions over a long time to generate support. At subnational levels, or in other countries, you have more opportunity to import and adapt ready-made solutions.

It all adds up to one simple piece of advice – timing and luck matter when making a policy case – but policy entrepreneurs know how to influence timing and help create their own luck.

On the day, we can use such concepts to help us think through the factors that you might think about while writing op-eds, even though it is very unlikely that you would mention them in your written work.


Filed under agenda setting, Evidence Based Policymaking (EBPM), public policy, Storytelling

Psychology Based Policy Studies: 5 heuristics to maximise the use of evidence in policymaking

Richard Kwiatkowski and I combine policy studies and psychology to (a) take forward ‘Psychology Based Policy Studies’, and (b) produce practical advice for actors engaged in the policy process.


Most policy studies, built on policy theory, explain policy processes without identifying practical lessons. They identify how and why people make decisions, and situate this process of choice within complex systems or environments in which many actors interact at multiple levels of government, subject to rules, norms, and group influences, forming networks, and responding to socioeconomic dynamics. This approach helps generate demand for more evidence of the role of psychology in these areas:

  1. To do more than ‘psychoanalyse’ a small number of key actors at the ‘centre’ of government.
  2. To consider how and why actors identify, understand, follow, reproduce, or seek to shape or challenge, rules within their organisations or networks.
  3. To identify the role of network formation and maintenance, and the extent to which it is built on heuristics to establish trust and the regular flow of information and advice.
  4. To examine the extent to which persuasion can be used to prompt actors to rethink their beliefs – such as when new evidence or a proposed new solution challenges the way that a problem is framed, how much attention it receives, and how it is solved.
  5. To consider (a) the effect of events such as elections on the ways in which policymakers process evidence (e.g. does it encourage short-term and vote-driven calculations?), and (b) what prompts them to pay attention to some contextual factors and not others.

This literature highlights the use of evidence by actors who anticipate or respond to lurches of attention, moral choices, and coalition formation built on bolstering one’s own position, demonising competitors, and discrediting (some) evidence. Although this aspect of choice should not be caricatured – it is not useful simply to bemoan ‘post-truth’ politics and policymaking ‘irrationality’ – it provides a useful corrective to the fantasy of a linear policy process in which evidence can be directed to a single moment of authoritative and ‘comprehensively rational’ choice based only on cognition. Political systems and human psychology combine to create a policy process characterised by many actors competing to influence continuous policy choice built on cognition and emotion.

What are the practical implications?

Few studies consider how those seeking to influence policy should act in such environments or give advice about how they can engage effectively in the policy process. Of course context is important, and advice needs to be tailored and nuanced, but that is not necessarily a reason to side-step the issue of moving beyond description. Further, policymakers and influencers do not have this luxury. They need to gather information quickly and effectively to make good choices. They have to take the risk of action.

To influence this process we need to understand it, and to understand it more we need to study how scientists try to influence it. Psychology-based policy studies can provide important insights to help actors begin to measure and improve the effectiveness of their engagement in policy by: taking into account cognitive and emotional factors and the effect of identity on possible thought; and, considering how political actors are ‘embodied’ and situated in time, place, and social systems.

5 tentative suggestions

However, few psychological insights have been developed from direct studies of policymaking, and there is a limited evidence base. So, we provide preliminary advice by identifying the most relevant avenues of conceptual research and deriving some helpful ‘tools’ for those seeking to influence policy.

Our working assumption is that policymakers need to gather information quickly and effectively, so they develop heuristics to allow them to make what they believe to be good choices. Their solutions often seem to be driven more by their emotions than a ‘rational’ analysis of the evidence, partly because we hold them to a standard that no human can reach. If so, and if they have high confidence in their heuristics, they will dismiss our criticism as biased and naïve. Under those circumstances, restating the need for ‘evidence-based policymaking’ is futile, and naively ‘speaking truth to power’ counterproductive.

For us, heuristics represent simple alternative strategies, built on psychological insights, for use in policy practice. They are broad prompts towards certain ways of thinking and acting, not specific blueprints for action in all circumstances:

  1. Develop ways to respond positively to ‘irrational’ policymaking

Instead of automatically bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond in a ‘fast and frugal’ way, to pursue the kinds of evidence informed policymaking that is realistic in a complex and constantly changing policymaking environment.

  2. Tailor framing strategies to policymaker bias

The usual advice is to minimise the cognitive burden of your presentation, and to use strategies tailored to the ways in which people pay attention to, and remember, information (at the beginning and end of statements, with repetition, and using concrete and immediate reference points).

What is the less usual advice? If policymakers are combining cognitive and emotive processes, combine facts with emotional appeals. If policymakers are making quick choices based on their values and simple moral judgements, tell simple stories with a hero and a clear moral. If policymakers are reflecting a group emotion, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with the ‘lens’ through which actors in those coalitions understand the world.


  3. Identify the right time to influence individuals and processes

Understand what it means to find the right time to exploit ‘windows of opportunity’. ‘Timing’ can refer to the right time to influence an individual, which is relatively difficult to identify but with the possibility of direct influence, or to act while several political conditions are aligned, which presents less chance for you to make a direct impact.

  4. Adapt to real-world dysfunctional organisations rather than waiting for an orderly process to appear

Politicians may appear confident of policy and with a grasp of facts and details, but are (a) often vulnerable and defensive, and closed to challenging information, and/or (b) inadequate in organisational politics, or unable to change the rules of their organisations. In the absence of institutional reforms, and presence of ‘dysfunctional’ processes, develop pragmatic strategies: form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

  5. Recognise that the biases we ascribe to policymakers are present in ourselves and our own groups.

Identifying only the biases in our competitors may help mask academic/scientific examples of group-think, and it may be counterproductive to use euphemistic terms like ‘low information’ to describe actors whose views we do not respect. This is a particular problem for scholars if they assume that most people do not live up to their own imagined standards of high-information-led action.

It may be more effective to recognise that: (a) people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves; and (b) a fundamental aspect of evolutionary psychology is that people need to get on with each other, so showing simple respect – or going further, to ‘mirror’ that person’s non-verbal signals – can be useful even if it looks facile.

This leaves open the ethical question of how far we should go to identify our biases, to accept the need to work with people whose ways of thinking we do not share, and to secure their trust without lying about our own beliefs.


Filed under Evidence Based Policymaking (EBPM), Psychology Based Policy Studies, public policy

Using psychological insights in politics: can we do it without calling our opponents mental, hysterical, or stupid?

One of the most dispiriting parts of fierce political debate is the casual use of mental illness or old and new psychiatric terms to undermine an opponent: she is mad, he is crazy, she is a nutter, they are wearing tin foil hats, get this guy a straitjacket and the men in white coats because he needs to lie down in a dark room, she is hysterical, his position is bipolar, and so on. This kind of statement reflects badly on the campaigner rather than their opponent.

I say this because, while doing some research for a paper on the psychology of politics and policymaking (this time with Richard Kwiatkowski, as part of this special collection), I have come across potentially useful concepts that seem difficult to insulate from such political posturing. There is great potential to use them cynically against opponents rather than benefit from their insights.

The obvious ‘live’ examples relate to ‘rational’ versus ‘irrational’ policymaking. For example, one might argue that, while scientists develop facts and evidence rationally, using tried and trusted and systematic methods, politicians act irrationally, based on their emotions, ideologies, and groupthink. So, we as scientists are the arbiters of good sense and they are part of a pathological political process that contributes to ‘post truth’ politics.

The obvious problem with such accounts is that we all combine cognitive and emotional processes to think and act. We are all subject to bias in the gathering and interpretation of evidence. So, the more positive, but less tempting, option is to consider how this process works – when both competing sides act ‘rationally’ and emotionally – and what we can realistically do to mitigate the worst excesses of such exchanges. Otherwise, we will not get beyond demonising our opponents and romanticising our own cause. It gives us the warm and fuzzies on twitter and in academic conferences but contributes little to political conversations.

A less obvious example comes from modern work on the links between genes and attitudes. There is now a research agenda which uses surveys of adult twins to compare the effect of genes and environment on political attitudes. For example, Oskarsson et al (2015: 650) argue that existing studies ‘report that genetic factors account for 30–50% of the variation in issue orientations, ideology, and party identification’. One potential mechanism is cognitive ability: put simply, and rather cautiously and speculatively, with a million caveats, people with lower cognitive ability are more likely to see ‘complexity, novelty, and ambiguity’ as threatening and to respond with fear, risk aversion, and conservatism (2015: 652).

My immediate thought, when reading this stuff, is about how people would use it cynically, even at this relatively speculative stage in testing and evidence gathering: my opponent’s genes make him stupid, which makes him fearful of uncertainty and ambiguity, and therefore anxious about change and conservative in politics (in other words, the Yoda hypothesis applied only to stupid people). It’s not his fault, but his stupidity is an obstacle to progressive politics. If you add in some psychological biases, in which people inflate their own sense of intelligence and underestimate that of their opponents, you have evidence-informed, really shit political debate! ‘My opponent is stupid’ seems a bit better than ‘my opponent is mental’ but only in the sense that eating a cup of cold sick is preferable to eating shit.

I say this as we try to produce some practical recommendations (for scientists and advocates of EBPM) to engage with politicians to improve the use of evidence in policy. I’ll let you know if it goes beyond a simple maxim: adapt to their emotional and cognitive biases, but don’t simply assume they’re stupid.

See also: the many commentaries on how stupid it is to treat your political opponents as stupid

Stop Calling People “Low Information Voters”


Filed under Evidence Based Policymaking (EBPM), Uncategorized