Category Archives: Psychology Based Policy Studies

Policy Concepts in 500 Words: Social Construction and Policy Design

Why would a democratic political system produce ‘degenerative’ policy that undermines democracy? Social Construction and Policy Design (SCPD) describes two main ways in which policymaking alienates many citizens:

1. The Social Construction of Target Populations

High-profile politics and electoral competition can cause alienation:

  1. Political actors compete to tell ‘stories’ that assign praise or blame to groups of people. For example, politicians make value judgements about who should be rewarded or punished by government, basing them on stereotypes of ‘target populations’: (a) exploiting the ways in which many people think about groups, or (b) making emotional and superficial judgements, backed up with a selective use of facts.
  2. These judgements have a ‘feed-forward’ effect: they are reproduced in policies, practices, and institutions. Such ‘policy designs’ can endure for years or decades. The distribution of rewards and sanctions is cumulative and difficult to overcome.
  3. Policy design has an impact on citizens, who participate in politics according to how they are characterised by government. Many know they will be treated badly; their engagement will be dispiriting.

Some groups have the power to challenge the way they are described by policymakers (and the media and public), and receive benefits behind the scenes despite their poor image. However, many people feel powerless, become disenchanted with politics, and do not engage in the democratic process.

SCPD depicts this dynamic with a 2-by-2 table, in which target populations are constructed positively or negatively and are more or less able to respond:

[Figure: the SCPD 2-by-2 table, crossing social construction (positive/negative) with power to respond (high/low)]
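
To make the table concrete: each target population sits in one of four cells, defined by its social construction (positive or negative) and its power to respond (high or low). Here is a minimal Python sketch, using Schneider and Ingram’s standard labels for the four cells; the encoding itself is illustrative:

```python
# Schneider and Ingram's 2-by-2: social construction (positive/negative)
# crossed with power to respond (high/low). The cell labels are theirs;
# the encoding is an illustrative sketch.
CELLS = {
    ("positive", "high"): "advantaged",  # rewarded openly
    ("negative", "high"): "contenders",  # benefit behind the scenes despite a poor image
    ("positive", "low"):  "dependents",  # praised, but given little
    ("negative", "low"):  "deviants",    # punished openly
}

def classify(construction: str, power: str) -> str:
    """Return the cell for a target population."""
    return CELLS[(construction, power)]

print(classify("negative", "high"))  # -> contenders
```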

2. Bureaucratic and expert politics

Most policy issues are not salient and politicised in this way. Yet, low salience can exacerbate problems of citizen exclusion. Policies dominated by bureaucratic interests often alienate citizens receiving services. Or a small elite dominates policymaking when there is high acceptance that (a) the best policy is ‘evidence based’, and (b) the evidence should come from experts.

Overall, SCPD describes a political system with major potential to diminish democracy, containing key actors (a) politicising issues to reward or punish populations or (b) depoliticising issues with reference to science and objectivity. In both cases, policy design is not informed by routine citizen participation.

Take home message for students: SCPD began as Schneider and Ingram’s description of the US political system’s failure to solve major problems – including inequality, poverty, crime, racism, and sexism – and to provide effective universal healthcare and education. Think about how its key drivers apply elsewhere: (1) some people make and exploit quick and emotional judgements for political gain, while others refer to expertise to limit debate; (2) these judgements inform policy design; and (3) policy design sends signals to citizens which can diminish or boost their incentive to engage in politics.

For more, see the 1000-word and 5000-word versions. The latter has a detailed guide to further reading.


Stop Treating Cognitive Science Like a Disease


At the beginning is a guest post by Professor Steven Sloman, responding to Professor Daniel Sarewitz’s post in the Guardian, ‘Stop treating science denial like a disease’. At the end is Dan Sarewitz’s reply. If you are wondering why this debate is now playing out on my website, there is a connection of sorts: (a) the work of the European Commission’s JRC, with Sloman speaking at its annual conference EU4Facts, and (b) the work of INGSA on government-science advice, in which Sarewitz plays a key role.

Modern science has its problems. As reviewed in a recent editorial by Daniel Sarewitz, many branches of science have been suffering from a replication crisis. Scientists are under tremendous pressure to publish, and cutting scientific corners has, in some fields, become normal. This, he thinks, justifies a kind of science denialism, one that recognizes that not every word expressed by a scientist should be taken on faith.

Sarewitz is right on a couple of counts: Not every branch of science has equal authority. And in many areas, too much of too little value is being published. Some of it does not pass even weak tests of scientific care and rigor. But his wild claim in favor of denialism is bluster: Science is making faster progress today than at any time in history.

Sarewitz’s intended victim in his piece is cognitive science. He argues that cognitive science appeals to a deficit model (my term) to explain science denialism. People are ignorant, in Sarewitz’s parody of cognitive science, and therefore they fail to understand science. If only they were smarter, or taught the truth about science, they wouldn’t deny it, but rather use it as a guide to truth, justice, and all things good.

This is a position in cognitive science, especially the cognitive science of the 1970s and 1980s. But even cognitive science makes progress, and today it is a minority view. What does modern cognitive science actually suggest about our understanding of science denial? The answer is detailed in our book, The Knowledge Illusion, which Sarewitz takes issue with. He would have done well to read it before reviewing it, because what we say is diametrically opposed to his report, and largely consistent with his view, though a whole lot more nuanced.

The deficit model applies to one form of reasoning, what we call intuition. The human brain generates beliefs based on naïve causal models about how the world works. These are often sketchy and flawed (consider racists’ understanding of people of other races). Individuals are quite ignorant about how the world works, not because people are stupid, but because the world is so complex. The chaotic, uncertain nature of the universe means that everything we encounter is a tangle of enormous numbers of elements and interactions, far more than any individual could comprehend, never mind retain. As we show in our book, even the lowly ballpoint pen represents untold complexity. The source of ignorance is not so much about the biology of the individual; it’s about the complexity of the world that the individual lives in.

Despite their ignorance, humans have accomplished amazing things, from creating symphonies to laptops. How? In large part by relying on a second form of human reasoning, deliberation. Deliberation is not constrained wholly by biology because it extends beyond the individual. Deliberative thought uses the body to remember for us and even to compute. That’s why emotions are critical for good decision making and why children use their fingers to count. Thinking also uses the world. We compute whether it’s safe to cross the street by looking to see if a car is coming, and we use the presence of dirty dishes on the counter to tell us whether the dishes need doing.

But more than anything, deliberation uses other people. Whether we’re getting our dishwasher fixed, our spiritual lives developed, or our political opinions formed, we are guided by those we deem experts and those we respect in our communities. To a large extent, people are not the rational processors of information that some enlightenment philosophers dreamed about; we are shills for our communities.

The positive side of this is that people are built to collaborate; we are social entities in the most fundamental way, as thinkers. The negative side is that we can subscribe to ideologies that are perpetuated to pursue the self-interest of community leaders, ideologies that have no rational basis. Indeed, the most fervent adherents of a view tend to know the least about it. Fortunately, we have found (not just assumed, as Sarewitz says) that when people are asked to explain the consequences of the policies they adhere to, they become less extremist as they discover they don’t really understand them.

Scientists live in communities too, and science is certainly vulnerable to these same social forces. That’s why the scientific method was developed: to put ideas to the test and let the cream rise to the top. This takes time, but because science reports eventually to the truth inherent in nature, human foible and peer review can only steer it off course temporarily.

Cognitive science has historically bought into the deficit model, treating failures of science literacy as a kind of disease. But Sarewitz should practice the care and rigor that he preaches by reporting correctly: Cognitive science, like many forms of science, is slowly getting it right.

Reply by Dan Sarewitz

Normally I don’t respond to this kind of thing, but a couple of points demand rebuttal.

First: I actually did read their book, cover to cover. Neither the Guardian piece nor the longer talk from which it draws is a book review; they are critiques of the larger intellectual program that The Knowledge Illusion positions itself within.

Second, the idea that deliberative and collaborative activities are a powerful source of human creativity that overcomes the cognitive limits of the individual is an entirely familiar one that has been well-recognized for centuries. As Professor Sloman indicates, it occupies much of his book, and much of his comment above. But it was not relevant to my concerns, which were how Sloman and co-author Fernbach position human cognitive limits as a source of so much difficulty in today’s world. They write: “Because we confuse the knowledge in our heads with the knowledge we have access to, we are largely unaware of how little we understand. We live with the belief that we understand more than we do. As we will explore in the rest of the book, many of society’s most pressing problems stem from this illusion.” [p. 129, my italics] They wrote this, I didn’t.

Third, having read their book carefully, I am indeed well aware that Sloman and Fernbach understand the limits of the deficit model. But as they make clear in a subsection of chapter 8, entitled “Filling the Deficit,” they still believe that IF ONLY people understood more about science, then “many of society’s most pressing problems” could be more effectively addressed: “And the knowledge illusion means that we don’t check our understanding often or deeply enough. This is a recipe for antiscientific thinking.” [p. 169] A page later they write: “[P]erhaps it is too soon to give up on the deficit model.”

This is where my posting sought to engage with Professor Sloman’s book.  I don’t think that people’s understanding of scientific knowledge has much at all to do with “many of society’s most pressing problems,” for reasons that I point toward in the Guardian piece, and have written extensively about in many other forums.  Professor Sloman may not agree with this position, but his comments above fail to indicate that he actually recognizes or understands it.

Finally, Professor Sloman writes, both tendentiously and with an apparently tone-deaf ear, that my “wild claim in favor of denialism is bluster: Science is making faster progress today than at any time in history.”  The progress of science is irrelevant to my argument, which addresses the intersection of politics and a scientific enterprise that is always pushing into the realm of the uncertain, the unknown, the unknowable, the contestable, the contingent—even as it also, sometimes and in some directions, makes magnificent progress.

Perhaps there is a valuable discussion to be had about whether poor understanding of science by the public is relevant to “many of society’s most pressing problems.”  My view is that this is an overblown, distracting, and to some extent dangerous belief on the part of some scientists, as I indicate in the Guardian post and in many other writings.  Professor Sloman may disagree, but his complaints here are about something else entirely, something that I didn’t write.


Three ways to communicate more effectively with policymakers

By Paul Cairney and Richard Kwiatkowski

Use psychological insights to inform communication strategies

Policymakers cannot pay attention to all of the things for which they are responsible, or understand all of the information they use to make decisions. Like all people, they face limits on the information they can process (Baddeley, 2003; Cowan, 2001, 2010; Miller, 1956; Rock, 2008).

They must use short cuts to gather enough information to make decisions quickly: the ‘rational’, by pursuing clear goals and prioritizing certain kinds of information, and the ‘irrational’, by drawing on emotions, gut feelings, values, beliefs, habits, schemata, scripts, and what is familiar, to make decisions quickly. Unlike most people, they face unusually strong pressures on their cognition and emotion.

Policymakers need to gather information quickly and effectively, often in highly charged political atmospheres, so they develop heuristics to allow them to make what they believe to be good choices. Perhaps their solutions seem to be driven more by their values and emotions than a ‘rational’ analysis of the evidence, often because we hold them to a standard that no human can reach.

If so, and if they have high confidence in their heuristics, they will dismiss criticism from researchers as biased and naïve. Under those circumstances, we suggest that restating the need for ‘rational’ and ‘evidence-based policymaking’ is futile, naively ‘speaking truth to power’ counterproductive, and declaring ‘policy based evidence’ defeatist.

We use psychological insights to recommend a shift in strategy for advocates of the greater use of evidence in policy. The simple recommendation, to adapt to policymakers’ ‘fast thinking’ (Kahneman, 2011) rather than bombard them with evidence in the hope that they will get round to ‘slow thinking’, is already becoming established in evidence-policy studies. However, we provide a more sophisticated understanding of policymaker psychology, to help understand how people think and make decisions as individuals and as part of collective processes. It allows us to (a) combine many relevant psychological principles with policy studies to (b) provide several recommendations for actors seeking to maximise the impact of their evidence.

To ‘show our work’, we first summarise insights from policy studies already drawing on psychology to explain policy process dynamics, and identify key aspects of the psychology literature which show promising areas for future development.

Then, we emphasise the benefit of pragmatic strategies, to develop ways to respond positively to ‘irrational’ policymaking while recognising that the biases we ascribe to policymakers are present in ourselves and our own groups. Instead of bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond effectively. Instead of identifying only the biases in our competitors, and masking academic examples of group-think, let’s reject our own imagined standards of high-information-led action. This more self-aware and humble approach will help us work more successfully with other actors.

On that basis, we provide three recommendations for actors trying to engage skilfully in the policy process:

  1. Tailor framing strategies to policymaker bias. If people are cognitive misers, minimise the cognitive burden of your presentation. If policymakers combine cognitive and emotive processes, combine facts with emotional appeals. If policymakers make quick choices based on their values and simple moral judgements, tell simple stories with a hero and a moral. If policymakers reflect a ‘group emotion’, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with those beliefs (see the sketch after this list).
  2. Identify ‘windows of opportunity’ to influence individuals and processes. ‘Timing’ can refer to the right time to influence an individual, depending on their current way of thinking, or to act while political conditions are aligned.
  3. Adapt to real-world ‘dysfunctional’ organisations rather than waiting for an orderly process to appear. Form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.
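
As a compact restatement of the first recommendation, the framing advice amounts to a lookup from a working assumption about how a policymaker is processing information to a presentation strategy. A minimal Python sketch; the mode labels and function names are illustrative shorthand, not a published taxonomy:

```python
# Recommendation 1 as a lookup table: a working assumption about how a
# policymaker is processing information -> a framing strategy.
# The mode labels are illustrative shorthand, not a published taxonomy.
FRAMING_STRATEGY = {
    "cognitive miser":     "minimise the cognitive burden of your presentation",
    "cognitive + emotive": "combine facts with emotional appeals",
    "values and morals":   "tell a simple story with a hero and a moral",
    "group emotion":       "frame new evidence as consistent with coalition beliefs",
}

def advise(assumed_mode: str) -> str:
    """Return a framing strategy, or a prompt to study the audience first."""
    return FRAMING_STRATEGY.get(assumed_mode, "first find out how this audience thinks")

print(advise("group emotion"))
```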

These tips are designed to produce effective, not manipulative, communicators. They help foster the clearer communication of important policy-relevant evidence, rather than imply that we should bend evidence to manipulate or trick politicians. We argue that it is pragmatic to work on the assumption that people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves. To persuade them to change course requires showing simple respect and seeking ways to secure their trust, rather than simply ‘speaking truth to power’. Effective engagement requires skilful communication and good judgement as much as good evidence.


This is the introduction to our revised and resubmitted paper for the special issue of Palgrave Communications, ‘The politics of evidence-based policymaking: how can we maximise the use of evidence in policy?’ Please get in touch if you are interested in submitting a paper to the series.

Full paper: Cairney Kwiatkowski Palgrave Comms resubmission CLEAN 14.7.17


The role of evidence in UK policymaking after Brexit

We are launching a series of papers on evidence and policy in Palgrave Communications. Of course, we used Brexit as a hook, to tap into current attention to instability and major policy change. However, many of the issues we discuss are timeless and about surprising levels of stability and continuity in policy processes, despite periods of upheaval.

In my day, academics would build their careers on being annoying, and sometimes usefully annoying. This would involve developing counterintuitive insights, identifying gaps in analysis, and challenging a ‘common wisdom’ in political studies. Although not exactly common wisdom, the ideas of ‘post-truth’ politics, a reduction in respect for ‘experts’, and a belief that Brexit is a policymaking game-changer are great candidates for some annoyingly contrary analysis.

In policy studies, many of us argue that things like elections, changes of government, and even constitutional changes are far less important than commonly portrayed. In media and social media accounts, we find hyperbole about the destabilising and changing impact of the latest events. In policy studies, we often stress stability and continuity. My favourite old example concerns the debates from the 1970s about electoral reform. While some argued that first-past-the-post was a disastrous electoral system, since it produces swings of government, instability, and incoherent policy change, Richardson and Jordan pointed out surprisingly high levels of stability and continuity.


In part, this is because the state is huge, policymakers can only pay attention to a tiny part of it, and therefore most of it is processed at a low level of government, out of the public spotlight.

[Image: excerpt from Understanding Public Policy, p. 106]

These insights still have profound relevance today, for two key reasons.

  1. The role of experts is more important than you think

This larger process provides far more opportunities for experts than we’d associate with ‘tip of the iceberg’ politics.

Some issues are salient. They command the interest of elected politicians, and those politicians often have firm beliefs that limit the ‘impact’ of any evidence that does not support their beliefs.

However, most issues are not salient. They command minimal interest, they are processed by other policymakers, and those policymakers are looking for information and advice from reliable experts.

Indeed, a lot of policy studies highlight the privileged status of certain experts, at the expense of most members of the public (which is a useful corrective to the story, associated with Brexit, that the public is too emotionally driven, too sceptical of experts, and too much in charge of the future of constitutional change).

So, Brexit will change the role of experts, but expect that change to relate to the venue in which they engage, and the networks of which they are a part, more than the practices of policymakers. Much policymaking is akin to an open door to government for people with useful information and a reputation for being reliable in their dealings with policymakers.

  2. Provide less evidence for more impact

If the problem is that policymakers can only pay attention to a tiny proportion of their responsibilities, the solution is not to bombard them with a huge amount of evidence. Instead, assume that they seek ways to ignore almost all information while still managing to make choices. The trick may be to provide just enough information to prompt demand for more, not oversupply evidence on the assumption that you have only one chance for influence.

With Richard Kwiatkowski, I draw on policy and psychology studies to help us understand how to supply evidence to anyone using ‘rational’ and ‘irrational’ ways to limit their attention, information processing, and thought before making decisions.

Our working assumption is that policymakers need to gather information quickly and effectively, so they develop heuristics to allow them to make what they believe to be good choices. Their solutions often seem to be driven more by their emotions than a ‘rational’ analysis of the evidence, partly because we hold them to a standard that no human can reach. If so, and if they have high confidence in their heuristics, they will dismiss our criticism as biased and naïve. Under those circumstances, restating the need for ‘evidence-based policymaking’ is futile, and naively ‘speaking truth to power’ counterproductive.

Instead, try out these strategies:

  1. Develop ways to respond positively to ‘irrational’ policymaking

Instead of automatically bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond pragmatically, to pursue the kinds of evidence informed policymaking that is realistic in a complex and constantly changing policymaking environment.

  2. Tailor framing strategies to policymaker cognition

The usual advice is to minimise the cognitive burden of your presentation, and to use strategies tailored to the ways in which people pay attention to, and remember, information.

The less usual advice includes:

  • If policymakers are combining cognitive and emotive processes, combine facts with emotional appeals.
  • If policymakers are making quick choices based on their values and simple moral judgements, tell simple stories with a hero and a clear moral.
  • If policymakers are reflecting a ‘group emotion’, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with the ‘lens’ through which actors in those coalitions understand the world.
  3. Identify the right time to influence individuals and processes

Understand what it means to find the right time to exploit ‘windows of opportunity’.

‘Timing’ can refer to the right time to influence an individual, which involves how open they are to, say, new arguments and evidence.

Or, timing refers to a ‘window of opportunity’ when political conditions are aligned. I discuss the latter in a separate paper on effective ‘policy entrepreneurs’.
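
This second sense of timing echoes the ‘multiple streams’ account of agenda setting that policy studies draw on: a window opens only when attention to a problem, a feasible solution, and receptive politics align at once. A toy sketch under that assumption (the class and function names are ours):

```python
# Toy model of a 'window of opportunity' in multiple-streams terms:
# the window opens only when the problem, policy, and politics
# streams align at the same time.
from dataclasses import dataclass

@dataclass
class Streams:
    problem_salient: bool     # is attention focused on the problem?
    solution_ready: bool      # is a worked-up, feasible solution available?
    politics_receptive: bool  # are policymakers motivated and able to act?

def window_open(s: Streams) -> bool:
    return s.problem_salient and s.solution_ready and s.politics_receptive

# Two of three streams aligned is not enough: no window yet.
print(window_open(Streams(True, True, False)))  # -> False
```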

  4. Adapt to real-world organisations rather than waiting for an orderly process to appear

Politicians may appear confident, with a grasp of facts and details, but they are (a) often vulnerable, and therefore defensive or closed to challenging information, and/or (b) inadequate in organisational politics, or unable to change the rules of their organisations.

So, develop pragmatic strategies: form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

  5. Recognise that the biases we ascribe to policymakers are present in ourselves and our own groups.

Identifying only the biases in our competitors may help mask academic/ scientific examples of group-think, and it may be counterproductive to use euphemistic terms like ‘low information’ to describe actors whose views we do not respect. This is a particular problem for scholars if they assume that most people do not live up to their own imagined standards of high-information-led action (often described as a ‘deficit model’ of engagement).

It may be more effective to recognise that: (a) people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves; and (b) a fundamental aspect of evolutionary psychology is that people need to get on with each other, so showing simple respect – or going further, to ‘mirror’ that person’s non-verbal signals – can be useful even if it looks facile.

This leaves open the ethical question of how far we should go to identify our biases, to accept the need to work with people whose ways of thinking we do not share, and to secure their trust without lying about our own beliefs.

At the very least, we do not suggest these 5 strategies as a way to manipulate people for personal gain. They are better seen as ways to use psychology to communicate well. They are also likely to remain just as important to policy engagement regardless of Brexit. Venues may change quickly, but the ways in which people process information and make choices may not.


I know my audience, but does my other audience know I know my audience?

‘Know your audience’ is a key phrase for anyone trying to convey a message successfully. To ‘know your audience’ is to understand the rules they use to make sense of your message, and therefore the adjustments you have to make to produce an effective message. Simple examples include:

  • The sarcasm rules. The first rule is fairly explicit. If you want to insult someone’s shirt, you (a) say ‘nice shirt, pal’, but also (b) use facial expressions or unusual speech patterns to signal that you mean the opposite of what you are saying. Otherwise, you’ve inadvertently paid someone a compliment, which is just not on. The second rule is implicit: sarcasm is sometimes OK – as a joke, or as some nice passive aggression – whereas a direct insult (‘that shirt is shite, pal’) is harder to pull off as a joke.
  • The joke rule. If you say that you went to the doctor because a strawberry was growing out of your arse, and the doctor gave you some cream for it, you’d expect your audience to know you were joking, because it’s such a ridiculous scenario and there’s a pun. Still, if you say it quickly and with a straight face, if your audience is not expecting a joke, and/or if your audience’s first language is not English, there’s a chance they will take you seriously, if only for a second. It’s hilarious if your audience goes along with you, and a bit awkward if your audience asks kindly about your welfare.
  • Keep it simple stupid. If someone says KISS, or some modern equivalent – ‘it’s the economy, stupid’ – the rule is that, generally, they are not calling you stupid (even though the insertion of the comma, in modern phrases, makes it look like they are). They are referring to the value of a simple design or explanation that as many people as possible can understand. If your audience doesn’t know the phrase, they may think you’re calling them stupid, stupid.

These rules can be analysed from various perspectives: linguistics, focusing on how and why rules of language develop; and philosophy, to help articulate how and why rules matter in sense making.

There is also a key role for psychological insights, since – for example – a lot of these rules relate to the routine ways in which people engage emotionally with the ‘signals’ or information they receive.

Think of the simple example of twitter engagement, in which people with emotional attachments to one position over another (say, pro- or anti- Brexit), respond instantly to a message (say, pro- or anti- Brexit). While some really let themselves down when they reply with their own tweet, and others don’t say a word, neither audience is immune from that emotional engagement with information. So, to ‘know your audience’ is to anticipate and adapt to the ways in which they will inevitably engage ‘rationally’ and ‘irrationally’ with your message.

I say this partly because I’ve been messing around with some simple ‘heuristics’ built on insights from psychology, including Psychology Based Policy Studies: 5 heuristics to maximise the use of evidence in policymaking.

Two audiences in the study of ‘evidence based policymaking’

I also say it because I’ve started to notice a big unintended consequence of knowing my audience: my one audience doesn’t like the message I’m giving the other. It’s a bit like gossip: maybe you only get away with it if only one audience is listening. If they are both listening, one audience seems to appreciate some new insights, while the other wonders if I’ve ever read a political science book.

The problem here is that two audiences have different rules to understand the messages that I help send. Let’s call them ‘science’ and ‘political science’ (please humour me – you’ve come this far). Then, let’s make some heroic binary distinctions in the rules each audience would use to interpret similar issues in a very different way.

You get the idea: a belief taken for granted in one field will be treated as controversial in another. In one day, you can go to one workshop and hear the story of objective evidence, post-truth politics, and irrational politicians with low political will to select evidence-based policies, then go to another workshop and hear the story of subjective knowledge claims.

Or, I can give the same presentation and get two very different reactions. If these are the expectations of each audience, they will interpret and respond to my messages in very different ways.

So, imagine I use some psychology insights to appeal to the ‘science’ audience. I know that,  to keep it on side and receptive to my ideas, I should begin by being sympathetic to its aims. So, my implicit story is along the lines of, ‘if you believe in the primacy of science and seek evidence-based policy, here is what you need to do: adapt to irrational policymaking and find out where the action is in a complex policymaking system’. Then, if I’m feeling energetic and provocative, I’ll slip in some discussion about knowledge claims by saying something like, ‘politicians (and, by the way, some other scholars) don’t share your views on the hierarchy of evidence’, or inviting my audience to reflect on how far they’d go to override the beliefs of other people (such as the local communities or service users most affected by the evidence-based policies that seem most effective).

The problem with this story is that key parts are implicit and, by appearing to go along with my audience, I provoke a reaction in another audience: don’t you know that many people have valid knowledge claims? Politics is about values and power, don’t you know?

So, that’s where I am right now. I feel like I ‘know my audience’ but I am struggling to explain to my original political science audience that I need to describe its insights in a very particular way to have any traction in my other science audience. ‘Know your audience’ can only take you so far unless your other audience knows that you are engaged in knowing your audience.

If you want to know more, see:

Kathryn Oliver and I have just published an article on the relationship between evidence and policy

How far should you go to secure academic ‘impact’ in policymaking? From ‘honest brokers’ to ‘research purists’ and Machiavellian manipulators

Why doesn’t evidence win the day in policy and policymaking?

The Science of Evidence-based Policymaking: How to Be Heard

When presenting evidence to policymakers, engage with the policy process that exists, not the process you wish existed


Psychology Based Policy Studies: 5 heuristics to maximise the use of evidence in policymaking

Richard Kwiatkowski and I combine policy studies and psychology to (a) take forward ‘Psychology Based Policy Studies’, and (b) produce practical advice for actors engaged in the policy process.


Most policy studies, built on policy theory, explain policy processes without identifying practical lessons. They identify how and why people make decisions, and situate this process of choice within complex systems and environments in which there are many actors at multiple levels of government, subject to rules, norms, and group influences, forming networks, and responding to socioeconomic dynamics. This approach helps generate demand for more evidence on the role of psychology in these areas:

  1. To do more than ‘psychoanalyse’ a small number of key actors at the ‘centre’ of government.
  2. To consider how and why actors identify, understand, follow, reproduce, or seek to shape or challenge, rules within their organisations or networks.
  3. To identify the role of network formation and maintenance, and the extent to which it is built on heuristics to establish trust and the regular flow of information and advice.
  4. To examine the extent to which persuasion can be used to prompt actors to rethink their beliefs – such as when new evidence or a proposed new solution challenges the way that a problem is framed, how much attention it receives, and how it is solved.
  5. To consider (a) the effect of events such as elections on the ways in which policymakers process evidence (e.g. does it encourage short-term and vote-driven calculations?), and (b) what prompts them to pay attention to some contextual factors and not others.

This literature highlights the use of evidence by actors who anticipate or respond to lurches of attention, moral choices, and coalition formation built on bolstering one’s own position, demonising competitors, and discrediting (some) evidence. Although this aspect of choice should not be caricatured – it is not useful simply to bemoan ‘post-truth’ politics and policymaking ‘irrationality’ – it provides a useful corrective to the fantasy of a linear policy process in which evidence can be directed to a single moment of authoritative and ‘comprehensively rational’ choice based only on cognition. Political systems and human psychology combine to create a policy process characterised by many actors competing to influence continuous policy choice built on cognition and emotion.

What are the practical implications?

Few studies consider how those seeking to influence policy should act in such environments or give advice about how they can engage effectively in the policy process. Of course context is important, and advice needs to be tailored and nuanced, but that is not necessarily a reason to side-step the issue of moving beyond description. Further, policymakers and influencers do not have this luxury. They need to gather information quickly and effectively to make good choices. They have to take the risk of action.

To influence this process we need to understand it, and to understand it more we need to study how scientists try to influence it. Psychology-based policy studies can provide important insights to help actors begin to measure and improve the effectiveness of their engagement in policy, by (a) taking into account cognitive and emotional factors and the effect of identity on possible thought, and (b) considering how political actors are ‘embodied’ and situated in time, place, and social systems.

5 tentative suggestions

However, few psychological insights have been developed from direct studies of policymaking, and there is a limited evidence base. So, we provide preliminary advice by identifying the most relevant avenues of conceptual research and deriving some helpful ‘tools’ for those seeking to influence policy.

Our working assumption is that policymakers need to gather information quickly and effectively, so they develop heuristics to allow them to make what they believe to be good choices. Their solutions often seem to be driven more by their emotions than a ‘rational’ analysis of the evidence, partly because we hold them to a standard that no human can reach. If so, and if they have high confidence in their heuristics, they will dismiss our criticism as biased and naïve. Under those circumstances, restating the need for ‘evidence-based policymaking’ is futile, and naively ‘speaking truth to power’ counterproductive.

For us, heuristics represent simple alternative strategies, built on psychological insights, for putting those insights to use in policy practice. They are broad prompts towards certain ways of thinking and acting, not specific blueprints for action in all circumstances:

  1. Develop ways to respond positively to ‘irrational’ policymaking

Instead of automatically bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond in a ‘fast and frugal’ way, to pursue the kinds of evidence informed policymaking that is realistic in a complex and constantly changing policymaking environment.

  2. Tailor framing strategies to policymaker bias

The usual advice is to minimise the cognitive burden of your presentation, and use strategies tailored to the ways in which people pay attention to, and remember information (at the beginning and end of statements, with repetition, and using concrete and immediate reference points).

What is the less usual advice? If policymakers are combining cognitive and emotive processes, combine facts with emotional appeals. If policymakers are making quick choices based on their values and simple moral judgements, tell simple stories with a hero and a clear moral. If policymakers are reflecting a group emotion, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with the ‘lens’ through which actors in those coalitions understand the world.


  3. Identify the right time to influence individuals and processes

Understand what it means to find the right time to exploit ‘windows of opportunity’. ‘Timing’ can refer to the right time to influence an individual, which is relatively difficult to identify but with the possibility of direct influence, or to act while several political conditions are aligned, which presents less chance for you to make a direct impact.

  4. Adapt to real-world dysfunctional organisations rather than waiting for an orderly process to appear

Politicians may appear confident, with a grasp of facts and details, but they are (a) often vulnerable and defensive, and closed to challenging information, and/or (b) inadequate in organisational politics, or unable to change the rules of their organisations. In the absence of institutional reforms, and in the presence of ‘dysfunctional’ processes, develop pragmatic strategies: form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

  5. Recognise that the biases we ascribe to policymakers are present in ourselves and our own groups.

Identifying only the biases in our competitors may help mask academic/ scientific examples of group-think, and it may be counterproductive to use euphemistic terms like ‘low information’ to describe actors whose views we do not respect. This is a particular problem for scholars if they assume that most people do not live up to their own imagined standards of high-information-led action.

It may be more effective to recognise that: (a) people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves; and (b) a fundamental aspect of evolutionary psychology is that people need to get on with each other, so showing simple respect – or going further, to ‘mirror’ that person’s non-verbal signals – can be useful even if it looks facile.

This leaves open the ethical question of how far we should go to identify our biases, to accept the need to work with people whose ways of thinking we do not share, and to secure their trust without lying about our own beliefs.
