Tag Archives: policymaking

#EU4Facts: 3 take-home points from the JRC annual conference

See EU4FACTS: Evidence for policy in a post-fact world

The JRC’s annual conference has become a key forum in which to discuss the use of evidence in policy. At this scale, with many hundreds of people attending plenary discussions, it feels like an annual mass rally for science; a ‘call to arms’ to protect the role of science in the production of evidence, and the role of evidence in policy deliberation. There is not much discussion of storytelling, but we tell each other a fairly similar story about our fears for the future unless we act now.

Last year, the main story was of fear for the future of heroic scientists: the rise of Trump and the Brexit vote prompted many discussions of post-truth politics and reduced trust in experts. An immediate response was to describe attempts to come together, and stick together, to support each other’s scientific endeavours during a period of crisis. There was little call for self-analysis and reflection on the contribution of scientists and experts to barriers between evidence and policy.

This year was a bit different. There was the same concern for reduced trust in science, evidence, and/or expertise, and some references to post-truth politics and populism, but also some new voices describing the positive value of politics, often when discussing the need for citizen engagement, and the need to understand the relationship between facts, values, and politics.

For example, a panel on psychology opened up the possibility that we might consider our own politics and cognitive biases while we identify them in others, and one panellist spoke eloquently about the importance of narrative and storytelling in communicating to audiences such as citizens and policymakers.

A focus on narrative is not new, but it provides a challenging agenda when interacting with a sticky story of scientific objectivity. For the unusually self-reflective, it also reminds us that our annual discussions are not particularly scientific; the usual rules to assess our statements do not apply.

As in studies of policymaking, we can say that there is high support for such stories when they remain vague and driven more by emotion than the pursuit of precision. When individual speakers try to make sense of the same story, they do it in different – and possibly contradictory – ways. As in policymaking, the need to deliver something concrete helps focus the mind, and prompts us to make choices between competing priorities and solutions.

I describe these discussions in two ways: tables, in which I try to boil down each speaker’s speech into a sentence or two (you can get their full details in the programme and the speaker bios); and a synthetic discussion of the top 3 concerns, paraphrasing and combining arguments from many speakers:

1. What are facts?

The key distinction began as one between politics, values, and facts – a distinction which is impossible to maintain in practice.

Yet, subsequent discussion revealed a more straightforward distinction between facts and opinion, ‘fake news’, and lies. The latter sums up an ever-present fear of the diminishing role of science in an alleged ‘post-truth’ era.

2. What exactly is the problem, and what is its cause?

The tables below provide a range of concerns about the problem, from threats to democracy to the need to communicate science more effectively. A theme of growing importance is the need to deal with the cognitive biases and informational shortcuts of people receiving evidence: communicate with reference to values, beliefs, and emotions; build up trust in your evidence via transparency and reliability; and, be prepared to discuss science with citizens and to be accountable for your advice. There was less discussion of the cognitive biases of the suppliers of evidence.

3. What is the role of scientists in relation to this problem?

Not all speakers described scientists as the heroes of this story:

  • Some described scientists as the good people acting heroically to change minds with facts.
  • Some described their potential to co-produce important knowledge with citizens (although primarily with like-minded citizens who learn the value of scientific evidence?).
  • Some described the scientific ego as a key barrier to action.
  • Some identified their low confidence to engage, their uncertainty about what to do with their evidence, and/ or their scientist identity which involves defending science as a cause/profession and drawing the line between providing information and advocating for policy. This hope to be an ‘honest broker’ was pervasive in last year’s conference.
  • Some (rightly) rejected the idea of separating facts/ values and science/ politics, since evidence is never context free (and gathering evidence without thought to context is amoral).

Often in such discussions it is difficult to know if some scientists are naïve actors or sophisticated political strategists, because their public statements could be identical. For the former, an appeal to objective facts and the need to privilege science in EBPM may be sincere: scientists are, and should be, separate from and above politics. For the latter, the same appeal – made again and again – may be designed to energise scientists and maximise the role of science in politics.

Yet, energy is only the starting point, and it remains unclear how exactly scientists should communicate and how to ‘know your audience’: would many scientists know who to speak to, in governments or the Commission, if they had something profoundly important to say?

Keynotes and introductory statements from panel chairs
Vladimír Šucha: We need to understand the relationship between politics, values, and facts. Facts are not enough. To make policy effectively, we need to combine facts and values.
Tibor Navracsics: Politics is swayed more by emotions than carefully considered arguments. When making policy, we need to be open and inclusive of all stakeholders (including citizens), communicate facts clearly and at the right time, and be aware of our own biases (such as groupthink).
Sir Peter Gluckman: ‘Post-truth’ politics is not new, but it is pervasive and easier to achieve via new forms of communication. People rely on like-minded peers, religion, and anecdote as forms of evidence underpinning their own truth. When describing the value of science, to inform policy and political debate, note that it is more than facts; it is a mode of thinking about the world, and a system of verification to reduce the effect of personal and group biases on evidence production. Scientific methods help us define problems (e.g. in discussion of cause/ effect) and interpret data. Science advice involves expert interpretation, knowledge brokerage, a discussion of scientific consensus and uncertainty, and standing up for the scientific perspective.
Carlos Moedas: Safeguard trust in science by (1) explaining the process you use to come to your conclusions; (2) providing safe and reliable places for people to seek information (e.g. when they Google); and (3) making sure that science is robust and scientific bodies have integrity (such as when dealing with a small number of rogue scientists).
Pascal Lamy: 1. ‘Deep change or slow death’: we need to involve more citizens in the design of publicly financed projects such as major investments in science. Many scientists complain that there is already too much political interference, drowning scientists in extra work. However, we will face a major backlash – akin to the backlash against ‘globalisation’ – if we do not subject key debates on the future of science and technology-driven change (e.g. on AI, vaccines, drone weaponry) to democratic processes involving citizens. 2. The world changes rapidly, and evidence gathering is context-dependent, so we need to monitor regularly the fitness of our scientific measures (of e.g. trade).
Jyrki Katainen: ‘Wicked problems’ have no perfect solution, so we need the courage to choose the best imperfect solution. Technocratic policymaking is not the solution; it does not meet the democratic test. We need the language of science to be understandable to citizens: ‘a new age of reason reconciling the head and heart’.

Panel: Why should we trust science?
Jonathan Kimmelman: Some experts make outrageous and catastrophic claims. We need a toolbox to decide which experts are most reliable, by comparing their predictions with actual outcomes. Prompt them to make precise probability statements and test them. Only those who are willing to be held accountable should be involved in science advice.
Johannes Vogel: We should devote 15% of science funding to public dialogue. Scientific discourse, and a science-literate population, are crucial for democracy. EU Open Society Policy is a good model for stakeholder inclusiveness.
Tracey Brown: Create a more direct link between society and evidence production, to ensure discussions involve more than the ‘usual suspects’. An ‘evidence transparency framework’ helps create a space in which people can discuss facts and values. ‘Be open, speak human’ describes showing people how you make decisions. How can you expect the public to trust you if you don’t trust them enough to tell them the truth?
Francesco Campolongo: Jean-Claude Juncker’s starting point is that Commission proposals and activities should be ‘based on sound scientific evidence’. Evidence comes in many forms. For example, economic models provide simplified versions of reality to make decisions. Economic calculations inform profoundly important policy choices, so we need to make the methodology transparent, communicate probability, and be self-critical and open to change.

Panel: The politician’s perspective
Janez Potočnik: The shift of the JRC’s remit allowed it to focus on advocating science for policy rather than policy for science. Still, such advocacy needs to be backed by an economic argument (this policy will create growth and jobs). A narrow focus on facts and data ignores the context in which we gather facts, such as a system which undervalues human capital and the environment.
Máire Geoghegan-Quinn: Policy should be ‘solidly based on evidence’ and we need well-communicated science to change the hearts and minds of people who would otherwise rely on their beliefs. Part of the solution is to get, for example, kids to explain what science means to them.

Panel: Redesigning policymaking using behavioural and decision science
Steven Sloman: The world is complex. People overestimate their understanding of it, and this illusion is burst when they try to explain its mechanisms. People who know the least feel the strongest about issues, but if you ask them to explain the mechanisms their strength of feeling falls. Why? People confuse their knowledge with that of their community. The knowledge is not in their heads, but communicated across groups. If people around you feel they understand something, you feel like you understand, and people feel protective of the knowledge of their community. Implications? 1. Don’t rely on ‘bubbles’; generate more diverse and better coordinated communities of knowledge. 2. Don’t focus on giving people full information; focus on the information they need at the point of decision.
Stephan Lewandowsky: 97% of scientists agree that human-caused climate change is a problem, but the public thinks it’s roughly 50-50. We have a false-balance problem. One solution is to ‘inoculate’ people against its cause (science denial). We tell people the real figures and facts, warn them of the rhetorical techniques employed by science denialists (e.g. the use of false experts on smoking), and mock the false-balance argument. This allows us to reframe the problem as an investment in the future, not a cost now (and to find other ways to present facts in a non-threatening way). In our lab, this approach usually ‘neutralises’ misinformation, although with the risk that a ‘corrective message’ to challenge beliefs can entrench them.
Françoise Waintrop: It is difficult to experiment when public policy is handed down from on high. Or, experimentation is alien to established ways of thinking. However, our 12 new public innovation labs across France allow us to immerse ourselves in the problem (to define it well) and nudge people to action, working with their cognitive biases.
Simon Kuper: Stories combine facts and values. To change minds: persuade the people who are listening, not the sceptics; find go-betweens to link suppliers and recipients of evidence; speak in stories, not jargon; don’t overpromise the role of scientific evidence; and, never suggest science will side-line human beings (e.g. when technology costs jobs).

Panel: The way forward
Jean-Eric Paquet: We describe ‘fact based evidence’ rather than ‘science based’. A key aim is to generate ‘ownership’ of policy by citizens. Politicians are more aware of their cognitive biases than we technocrats are.
Anne Bucher: In the European Commission we used evidence initially to make the EU more accountable to the public, via systematic impact assessment and quality control. It was a key motivation for better regulation. We now focus more on generating inclusive and interactive ways to consult stakeholders.
Ann Mettler: Evidence-based policymaking is at the heart of democracy. How else can you legitimise your actions? How else can you prepare for the future? How else can you make things work better? Yet, a lot of our evidence presentation is so technical that it is difficult even for specialists to follow. The onus is on us to bring it to life, to make it clearer to the citizen and, in the process, to defend scientists (and journalists) during a period in which Western democracies seem to be at risk from anti-democratic forces.
Mariana Kotzeva: Our facts are now considered from an emotional and perception point of view. The process does not just involve our comfortable circle of experts; we are now challenged to explain our numbers. Attention to our numbers can be unpredictable (e.g. on migration). We need to build up trust in our facts, partly to anticipate or respond to the quick spread of poor facts.
Rush Holt: In society we can see an erosion of the feeling that science is relevant to ‘my life’, and few US policymakers ask ‘what does science say about this?’, partly because scientists set themselves above politics. Politicians have had too many bad experiences with scientists who might say ‘let me explain this to you in a way you can understand’. Policy is not about ‘science based evidence’; it is more about asking a question first, then asking what evidence you need. Then you collect the evidence in an open way so that it can be verified.

Phew!

That was 10 hours of discussion condensed into one post. If you can handle more discussion from me, see:

Psychology and policymaking: Three ways to communicate more effectively with policymakers

The role of evidence in policy: EBPM and How to be heard  

Practical Lessons from Policy Theories

The generation of many perspectives to help us understand the use of evidence

How to be an ‘entrepreneur’ when presenting evidence

Filed under Evidence Based Policymaking (EBPM), Psychology Based Policy Studies, public policy, Storytelling

How can governments better collaborate to address complex problems?

[Photo: William L. Swann and Seo Young Kim]

This is a guest post by William L. Swann (left) and Seo Young Kim (right), discussing how to use insights from the Institutional Collective Action Framework to think about how to improve collaborative governance. The full paper has been submitted to the Policy and Politics series Practical Lessons from Policy Theories.

[Figure: Collective Action 1]

Many public policy problems cannot be addressed effectively by a single, solitary government. Consider the problems facing the Greater Los Angeles Area, a heavily fragmented landscape of 88 cities and numerous unincorporated areas and special districts. Whether it is combatting rising homelessness, abating the country’s worst air pollution, cleaning the toxic L.A. River, or quelling gang violence, any policy alternative pursued unilaterally is limited by overlapping authority and externalities that alter the actions of other governments.

Problems of fragmented authority are not confined to metropolitan areas. They are also found in multi-level governance scenarios such as the restoration of Chesapeake Bay, as well as in international relations as demonstrated by recent global events such as “Brexit” and the U.S.’s withdrawal from the Paris Climate Agreement. In short, fragmentation problems manifest at every scale of governance, horizontally, vertically, and even functionally within governments.

What is an ‘institutional collective action’ dilemma?

In many cases governments would be better off coordinating and working together, but they face barriers that prevent them from doing so. These barriers are what the policy literature refers to as ‘institutional collective action’ (ICA) dilemmas, or collective action problems in which a government’s incentives do not align with collectively desirable outcomes. For example, all governments in a region benefit from less air pollution, but each government has an incentive to free ride and enjoy cleaner air without contributing to the cost of obtaining it.

The ICA Framework, developed by Professor Richard Feiock, has emerged as a practical analytical instrument for understanding and improving fragmented governance. This framework assumes that governments must match the scale and coerciveness of the policy intervention (or mechanism) to the scale and nature of the policy problem to achieve efficient and desired outcomes.

For example, informal networks (a mechanism) can be highly effective at overcoming simple collective action problems. But as problems become increasingly complex, more obtrusive mechanisms, such as governmental consolidation or imposed collaboration, are needed to achieve collective goals and more efficient outcomes. The more obtrusive the mechanism, however, the more actors’ autonomy diminishes and the higher the transaction costs (monitoring, enforcement, information, and agency) of governing.

[Figure: Collective Action 2]

Three ways to improve institutional collaborative governance

We explored what actionable steps policymakers can take to improve their results with collaboration in fragmented systems. Our study offers three general practical recommendations based on the empirical literature that can enhance institutional collaborative governance.

First, institutional collaboration is more likely to emerge and work effectively when policymakers employ networking strategies that incorporate frequent, face-to-face interactions.

Government actors networking with popular, well-endowed actors (“bridging strategies”) as well as developing closer-knit, reciprocal ties with a smaller set of actors (“bonding strategies”) will result in more collaborative participation, especially when policymakers interact often and in-person.

Policy network characteristics are also important to consider. Research on estuary governance indicates that in newly formed, emerging networks, bridging strategies may be more advantageous, at least initially, because they can provide organizational legitimacy and access to resources. However, once collaboratives mature, developing stronger and more reciprocal bonds with fewer actors reduces the likelihood of opportunistic behavior that can hinder collaborative effectiveness.

Second, policymakers should design collaborative arrangements that reduce transaction costs which hinder collaboration.

Well-designed collaborative institutions can lower the barriers to participation and information sharing, make it easier to monitor the behaviors of partners, grant greater flexibility in collaborative work, and allow for more credible commitments from partners.

Research suggests policymakers can achieve this by

  1. identifying similarities in policy goals, politics, and constituency characteristics with institutional partners
  2. specifying rules such as annual dues, financial reporting, and making financial records reviewable by third parties to increase commitment and transparency in collaborative arrangements
  3. creating flexibility by employing adaptive agreements with service providers, especially when services have limited markets/applications and performance is difficult to measure.

Considering the context, however, is crucial. Collaboratives that thrive on informal, close-knit, reciprocal relations, for example, may be severely damaged by the introduction of monitoring mechanisms that signal distrust.

Third, institutional collaboration is enhanced by the development and harnessing of collaborative capacity.

Research suggests signaling organizational competencies and capacities, such as budget, political support, and human resources, may be more effective at lowering barriers to collaboration than ‘homophily’ (a tendency to associate with similar others in networks). Policymakers can begin building collaborative capacity by seeking political leadership involvement, granting greater managerial autonomy, and looking to higher-level governments (e.g., national, state, or provincial governments) for financial and technical support for collaboration.

What about collaboration in different institutional contexts?

Finally, we recognize that not all policymakers operate in similar institutional contexts, and collaboration can often be mandated by higher-level authorities in more centralized nations. Nonetheless, visible joint gains, economic incentives, transparent rules, and equitable distribution of joint benefits and costs are critical components of voluntary or mandated collaboration.

Conclusions and future directions

The recommendations offered here are, at best, only the tip of the iceberg of the valuable practical insight that can be gleaned from collaborative governance research. While these suggestions are consistent with empirical findings from the broader public management and policy networks literatures, much could be learned from a closer inspection of the overlap between ICA studies and other streams of collaborative governance work.

Collaboration is a valuable tool of governance, and, like any tool, it should be utilized appropriately. Collaboration is not easily managed and can encounter many obstacles. We suggest that governments generally avoid collaborating unless there are joint gains that cannot be achieved alone. But the key to solving many of society’s intractable problems, or just simply improving everyday public service delivery, lies in a clearer understanding of how collaboration can be used effectively within different fragmented systems.


Filed under public policy

The Politics of Evidence

This is a draft of my review of Justin Parkhurst (2017) The Politics of Evidence (Routledge, Open Access)

Justin Parkhurst’s aim is to identify key principles to take forward the ‘good governance of evidence’. The good governance of scientific evidence in policy and policymaking requires us to address two fundamentally important ‘biases’:

  1. Technical bias. Some organisations produce bad evidence, some parts of government cherry-pick, manipulate, or ignore evidence, and some politicians misinterpret the implications of evidence when calculating risk. Sometimes, these things are done deliberately for political gain. Sometimes they are caused by cognitive biases which cause us to interpret evidence in problematic ways. For example, you can seek evidence that confirms your position, and/ or only believe the evidence that confirms it.
  2. Issue bias. Some evidence advocates use the mantra of ‘evidence based policy’ to depoliticise issues or downplay the need to resolve conflicts over values. They also focus on the problems most conducive to study via their most respected methods, such as randomised control trials (RCTs). Methodological rigour trumps policy relevance, and simple experiments trump the exploration of complex solutions. So, we lose sight of the unintended consequences of producing the ‘best’ evidence to address a small number of problems, and of the choices this involves about the allocation of research resources and attention. Again, this can be deliberate or caused by cognitive biases, such as the tendency to seek simpler and more answerable questions rather than complex questions with no obvious answer.

To address both problems, Parkhurst seeks pragmatic ways to identify principles to decide what counts as ‘good evidence to inform policy’ and ‘what constitutes the good use of evidence within a policy process’:

‘it is necessary to consider how to establish evidence advisory systems that promote the good governance of evidence – working to ensure that rigorous, sys­tematic and technically valid pieces of evidence are used within decision-making processes that are inclusive of, representative of and accountable to the multiple social interests of the population served’ (p8).

Parkhurst identifies some ways in which to bring evidence and policy closer together. First, to produce evidence more appropriate for, or relevant to, policymaking (‘good evidence for policy’):

  1. Relate evidence more closely to policy goals.
  2. Modify research approaches and methods to answer policy relevant questions.
  3. Ensure that the evidence relates to the local or relevant context.

Second, to produce the ‘good use of evidence’, combine three forms of ‘legitimacy’:

  1. Input, to ensure democratic representative bodies have the final say.
  2. Throughput, to ensure widespread deliberation.
  3. Output, to ensure proper consideration of the use of the most systematic, unbiased and rigorously produced scientific evidence relevant to the problem.

In the final chapter, Parkhurst suggests that these aims can be pursued in many ways depending on how governments want to design evidence advisory systems, but that it’s worth drawing on the examples of good practice he identifies. Parkhurst also explores the role for Academies of science, or initiatives such as the Cochrane Collaboration, to provide independent advice. He then outlines the good governance of evidence built on key principles: appropriate evidence, accountability in evidence use, transparency, and contestability (to ensure sufficient debate).

The overall result is a book full of interesting discussion and very sensible, general advice for people new to the topic of evidence and policy. This is no mean feat: most readers will seek a clearly explained and articulate account of the subject, and they get it here.

For me, the most interesting thing about Parkhurst’s book is the untold story, or the often-implicit reasoning behind the way in which it is framed. We can infer that it is not a study aimed primarily at a political science or social science audience, because most of that audience would take its starting point for granted: the use of evidence is political, and politics involves values. Yet, Parkhurst feels the need to remind the reader of this point, in specific (‘it is worth noting that the US presidency is a decidedly political role’, p43) and general circumstances (‘the nature of policymaking is inherently political’, p65). Throughout, the audience appears to be academics who begin with a desire for ‘evidence based policy’ without fully thinking through the implications: the lack of a magic bullet of evidence to solve a policy problem, how we might maintain a political system conducive to democratic principles and good evidence use, how we might design a system to reduce key ‘barriers’ between the supply of evidence by scientists and its demand by policymakers, and why few such designs have taken off.

In other words, the book appeals primarily to scientists trained outside social science, some of whom think about politics in their spare time, or encounter it in dispiriting encounters with policymakers. It appeals to that audience with a statement on the crucial role of high quality evidence in policymaking, highlights barriers to its use, tells scientists that they might be part of the problem, but then provides them with the comforting assurance that we can design better systems to overcome at least some of those barriers. For people trained in policy studies, this concluding discussion seems like a tall order, and I think most would read it with great scepticism.

Policy scientists might also be sceptical about the extent to which scientists from other fields think this way about hierarchies of scientific evidence and the desire to depoliticise politics with a primary focus on ‘what works’. Yet, I too hear this language regularly in interdisciplinary workshops (often while standing next to Justin!), and it is usually accompanied by descriptions of the pathology of policymaking, the rise of post-truth politics and rejection of experts, and the need to focus on the role of objective facts in deciding what policy solutions work best. Indeed, I was impressed recently by the skilled way in which another colleague prepared this audience for some provocative remarks when he suggested that the production and use of evidence is about power, not objectivity. OMG: who knew that policymaking was political and about power?!

So, the insights from this book are useful to a large audience of scientists while, for a smaller audience of policy scientists, they remind us that there is an audience out there for many of the statements that many of us would take for granted. Some evidence advocates use the language of ‘evidence based policymaking’ strategically, to get what they want. Others appear to use it because they believe it can exist. Keep this in mind when you read the book.



Filed under Evidence Based Policymaking (EBPM)

Three ways to communicate more effectively with policymakers

By Paul Cairney and Richard Kwiatkowski

Use psychological insights to inform communication strategies

Policymakers cannot pay attention to all of the things for which they are responsible, or understand all of the information they use to make decisions. Like all people, there are limits on what information they can process (Baddeley, 2003; Cowan, 2001, 2010; Miller, 1956; Rock, 2008).

They must use shortcuts to gather enough information to make decisions quickly: the ‘rational’, by pursuing clear goals and prioritizing certain kinds of information; and the ‘irrational’, by drawing on emotions, gut feelings, values, beliefs, habits, schemata, scripts, and what is familiar. Unlike most people, they face unusually strong pressures on their cognition and emotion.

Policymakers need to gather information quickly and effectively, often in highly charged political atmospheres, so they develop heuristics to allow them to make what they believe to be good choices. Perhaps their solutions seem to be driven more by their values and emotions than a ‘rational’ analysis of the evidence, often because we hold them to a standard that no human can reach.

If so, and if they have high confidence in their heuristics, they will dismiss criticism from researchers as biased and naïve. Under those circumstances, we suggest that restating the need for ‘rational’ and ‘evidence-based policymaking’ is futile, naively ‘speaking truth to power’ counterproductive, and declaring ‘policy based evidence’ defeatist.

We use psychological insights to recommend a shift in strategy for advocates of the greater use of evidence in policy. The simple recommendation, to adapt to policymakers’ ‘fast thinking’ (Kahneman, 2011) rather than bombard them with evidence in the hope that they will get round to ‘slow thinking’, is already becoming established in evidence-policy studies. However, we provide a more sophisticated understanding of policymaker psychology, to help understand how people think and make decisions as individuals and as part of collective processes. It allows us to (a) combine many relevant psychological principles with policy studies to (b) provide several recommendations for actors seeking to maximise the impact of their evidence.

To ‘show our work’, we first summarise insights from policy studies already drawing on psychology to explain policy process dynamics, and identify key aspects of the psychology literature which show promising areas for future development.

Then, we emphasise the benefit of pragmatic strategies, to develop ways to respond positively to ‘irrational’ policymaking while recognising that the biases we ascribe to policymakers are present in ourselves and our own groups. Instead of bemoaning the irrationality of policymakers, let’s marvel at the heuristics they develop to make quick decisions despite uncertainty. Then, let’s think about how to respond effectively. Instead of identifying only the biases in our competitors, and masking academic examples of group-think, let’s reject our own imagined standards of high-information-led action. This more self-aware and humble approach will help us work more successfully with other actors.

On that basis, we provide three recommendations for actors trying to engage skilfully in the policy process:

  1. Tailor framing strategies to policymaker bias. If people are cognitive misers, minimise the cognitive burden of your presentation. If policymakers combine cognitive and emotive processes, combine facts with emotional appeals. If policymakers make quick choices based on their values and simple moral judgements, tell simple stories with a hero and moral. If policymakers reflect a ‘group emotion’, based on their membership of a coalition with firmly-held beliefs, frame new evidence to be consistent with those beliefs.
  2. Identify ‘windows of opportunity’ to influence individuals and processes. ‘Timing’ can refer to the right time to influence an individual, depending on their current way of thinking, or to act while political conditions are aligned.
  3. Adapt to real-world ‘dysfunctional’ organisations rather than waiting for an orderly process to appear. Form relationships in networks, coalitions, or organisations first, then supply challenging information second. To challenge without establishing trust may be counterproductive.

These tips are designed to produce effective, not manipulative, communicators. They help foster the clearer communication of important policy-relevant evidence, rather than imply that we should bend evidence to manipulate or trick politicians. We argue that it is pragmatic to work on the assumption that people’s beliefs are honestly held, and policymakers believe that their role is to serve a cause greater than themselves. To persuade them to change course requires showing simple respect and seeking ways to secure their trust, rather than simply ‘speaking truth to power’. Effective engagement requires skilful communication and good judgement as much as good evidence.


This is the introduction to our revised and resubmitted paper to the special issue of Palgrave Communications The politics of evidence-based policymaking: how can we maximise the use of evidence in policy? Please get in touch if you are interested in submitting a paper to the series.

Full paper: Cairney Kwiatkowski Palgrave Comms resubmission CLEAN 14.7.17


Filed under agenda setting, Evidence Based Policymaking (EBPM), Psychology Based Policy Studies, public policy

The impact of multi-level policymaking on the UK energy system

[Image: Cairney et al, UKERC]

In September, we will begin a one-year UKERC-funded project examining current and future energy policy and multi-level policymaking and its impact on ‘energy systems’. This is no mean feat, since the meanings of ‘policy’, ‘policymaking’ (or the ‘policy process’), and ‘system’ are not clear, and our descriptions of the component parts of an energy system and a complex policymaking system may differ markedly. So, one initial aim is to provide some way to turn a complex field of study into something simple enough to understand and engage with.

We do so by focusing on ‘multi-level policymaking’ – which can encompass concepts such as multi-level governance and intergovernmental relations – to reflect the fact that responsibility for policies relevant to energy is often Europeanised, devolved, and shared between several levels of government. Brexit will have a major effect on energy and non-energy policies, and prompt the UK and devolved governments to develop new relationships, but we all need more clarity on the dynamics of current arrangements before we can talk sensibly about the future. To that end, we pursue three main work packages:

1. What is the ‘energy policymaking system’ and how does it affect the energy system?

Chaudry et al (2009: iv) define the UK energy system as ‘the set of technologies, physical infrastructure, institutions, policies and practices located in and associated with the UK which enable energy services to be delivered to UK consumers’. UK policymaking can have a profound impact, and constitutional changes might produce policy change, but their impacts require careful attention. So, we ‘map’ the policy process and the effect of policy change on energy supply and demand. Mapping sounds fairly straightforward, but it contains a series of tasks whose level of difficulty rises with each step:

  1. Identify which level or type of government is responsible – ‘on paper’ and in practice – for the use of each relevant policy instrument.
  2. Identify how these actors interact to produce what we call ‘policy’, which can range from statements of intent to final outcomes.
  3. Identify an energy policy process containing many actors at many levels, the rules they follow, the networks they form, the ‘ideas’ that dominate discussion, and the conditions and events (often outside policymaker control) which constrain and facilitate action. By this stage, we need to draw on particular policy theories to identify key venues, such as subsystems, and specific collections of actors, such as advocacy coalitions, to produce a useful model of activity.

2. Who is responsible for action to reduce energy demand?

Energy demand is more challenging to policymakers than energy supply because the demand side involves millions of actors who, in the context of household energy use, also constitute the electorate. There are political tensions in making policies to reduce energy demand and carbon emissions where this involves cost and inconvenience for private actors who do not necessarily value the societal returns achieved, and the political dynamics often differ from those of policies to regulate industrial demand. There are tensions around public perceptions of whose responsibility it is to take action – including local, devolved, national, or international government agencies – and governments look like they are trying to shift responsibility to each other, or to individuals and firms.

So, there is no end of ways in which energy demand could be regulated or influenced – including energy labelling and product/building standards, emissions reduction measures, promotion of efficient generation, and buildings performance measures – but it is an area of policy which is notoriously diffuse and lacking in co-ordination. For the most part, therefore, we consider whether Brexit provides a ‘window of opportunity’ to change policy and policymaking by, for example, clarifying responsibilities and simplifying relationships.

3. Does Brexit affect UK and devolved policy on energy supply?

It is difficult for single governments to coordinate an overall energy mix to secure supply from many sources, and multi-level policymaking adds a further dimension to planning and cooperation. Yet, the effect of constitutional changes is highly uneven. For example, devolution has allowed Scotland to go its own way on renewable energy, nuclear power and fracking, but Brexit’s impact ranges from high to low. It presents new and sometimes salient challenges for cooperation to supply renewable energy but, while fracking and nuclear are often the most politically salient issues, Brexit may have relatively little impact on policymaking within the UK.

We explore the possibility that renewables policy may be most impacted by Brexit, while nuclear and fracking are examples in which Brexit may have a minimal direct impact on policy. Overall, the big debates are about the future energy mix, and how local, devolved, and UK governments balance the local environmental impacts of, and likely political opposition to, energy development against the economic and energy supply benefits.

For more details, see our 4-page summary

Powerpoint for 13.7.17


Filed under Fracking, public policy, UKERC

Policy in 500 Words: The Policy Process

We talk a lot about ‘the policy process’ without really saying what it is. If you are new to policy studies, maybe you think that you’ll learn what it is eventually if you read enough material. This would be a mistake! Instead, when you seek a definition of the policy process, you’ll find two common responses.

  1. Many will seek to define policy or public policy instead of ‘the policy process’.
  2. Some will describe the policy process as a policy cycle with stages.

Both responses seem inadequate: one avoids giving an answer, and the other gives the wrong answer!

However, we can combine elements of each approach to give you just enough of a sense of ‘the policy process’ to continue reading:

  1. The beauty of the ‘what is policy?’ question is that we don’t give you an answer. I give you a working definition to help raise further questions. Look at the questions we need to ask if we begin with the definition, ‘the sum total of government action, from signals of intent to the final outcomes’.
  2. The beauty of the policy cycle approach is that it provides a simple way to imagine policy ‘dynamics’, or events and choices producing a sequence of other events and choices. Look at the stages to identify many different tasks within one ‘process’, and to get the sense that policymaking is continuous and often ‘its own cause’.

There are more complicated but better ways of describing policymaking dynamics

This picture is the ‘policy process’ equivalent of my definition of public policy. It captures the main elements of the policy process described (in different ways) by most policy theories. It is there to give you enough of an answer to help you ask the right questions.

[Figure: Cairney 2017 image of the policy process]

In the middle is ‘policy choice’. At the heart of most policy theory is ‘bounded rationality’, which describes (a) the cognitive limits of people, and (b) how they overcome those limits to make decisions. They use ‘rational’ and ‘irrational’ shortcuts to action.

Surrounding choice is what we’ll call the ‘policy environment’, containing: policymakers in many levels and types of government, the ideas or beliefs they share, the rules they follow, the networks they form with influencers, and the ‘structural’ or socioeconomic context in which they operate.

This picture is only the beginning of analysis, raising further questions that will make more sense when you read further, including: should policymaker choice be at the centre of this picture? Why are there arrows (describing the order of choice) in the cycle but not in my picture?

Take home message for students: don’t describe ‘the policy process’ without giving the reader some sense of its meaning. Its definition overlaps with ‘policy’ considerably, but the ‘process’ emphasises modes and dynamics of policymaking, while ‘policy’ emphasises outputs. Then, think about how each policy model or theory tries, in different ways, to capture the key elements of the process. A cycle focuses on ‘stages’ but most theories in this series focus on ‘environments’.

Filed under 500 words, public policy

Three habits of successful policy entrepreneurs

This post is one part of a series: Practical Lessons from Policy Theories.

‘Policy entrepreneurs’ invest their time wisely for future reward, and possess key skills that help them adapt particularly well to their environments. They are the agents of policy change who possess the knowledge, power, tenacity, and luck to be able to exploit key opportunities. They draw on three strategies:

1. Don’t focus on bombarding policymakers with evidence.

Scientists focus on producing more evidence to reduce uncertainty, but put people off with too much information. Entrepreneurs tell a good story, grab the audience’s interest, and the audience demands information.

Table 1

2. By the time people pay attention to a problem, it’s too late to produce a solution.

So, you produce your solution then chase problems.

Table 2

3. When your environment changes, your strategy changes.

For example, at the US federal level, you’re in the sea, and you’re a surfer waiting for the big wave. At the smaller subnational level, on a low-attention, low-budget issue, you can be Poseidon moving the ‘streams’. At the US federal level, you need to ‘soften up’ solutions over a long time to generate support. At subnational levels, or in other countries, you have more opportunity to import and adapt ready-made solutions.

Table 3

It all adds up to one simple piece of advice – timing and luck matter when making a policy case – but policy entrepreneurs know how to influence timing and help create their own luck.

For the full paper, see: Cairney Practical Lessons Policy Entrepreneurs Revised 5 June 17

For more on ‘multiple streams’ see:

Paul Cairney and Michael Jones (2016) ‘Kingdon’s Multiple Streams Approach: What Is the Empirical Impact of this Universal Theory?’ Policy Studies Journal, 44, 1, 37-58 PDF (Annex to Cairney Jones 2016) (special issue of PSJ)

Paul Cairney and Nikos Zahariadis (2016) ‘Multiple streams analysis: A flexible metaphor presents an opportunity to operationalize agenda setting processes’ in Zahariadis, N. (eds) Handbook of Public Policy Agenda-Setting (Cheltenham: Edward Elgar) PDF see also


Filed under agenda setting, Evidence Based Policymaking (EBPM), Folksy wisdom, public policy, Storytelling