
Policy Analysis in 750 words: Catherine Smith (2016) Writing Public Policy

Please see the Policy Analysis in 750 words series overview before reading the summary.

Catherine Smith (2016) Writing Public Policy (Oxford University Press)

Smith focuses on the communication of policy analysis within US government. Effective communication requires ‘conceptual and contextual awareness’. Policy actors communicate from a particular viewpoint, representing their role, interests, and objectives.

In government, policy analysts often write (1) on behalf of policymakers, projecting a specific viewpoint, and (2) for policymakers, which requires them to (a) work remarkably quickly and (b) produce concise reports that (c) reflect policymakers’ need to process information efficiently.

Actors outside government are less constrained by (1), but still need to write in a similar way. Their audience makes quick judgements about any presentation: the source of the information, its relevance, and whether to read it fully.

‘General Method of Communicating in a Policy Process’

Smith identifies the questions to ask yourself when communicating policy analysis, summarised as follows:

‘Step 1: Prepare’

  • To what policy do I refer?
  • Which audiences are relevant?
  • What is the political context, and the major sites of agreement/ disagreement?
  • How do I frame the problem, and which stories are relevant to my audience?

‘Step 2: Plan’

  • What is this communication’s purpose?
  • What is my story and message?
  • What is my role and interest?
  • ‘For whom does this communication speak?’
  • Who is my audience?
  • What will they learn?
  • What is the context and timeframe?
  • What should be the form, content, and tone of the communication?

‘Step 3: Produce’

  • Make a full draft, seek comments during a review, then revise.

Smith provides two ‘checklists’ to assess such communications:

  1. Effectiveness. Speak with an audience in mind, highlight a well-defined problem and purpose, project authority, and use the right form of communication.
  2. Excellence. Focus on clarity, precision, conciseness, and credibility.

Smith then focuses on specific aspects of this general method, including:

  • Framing involves describing the nature of the problem – its scope, and who is affected – and connecting this definition to current or new solutions.
  • Evaluation requires critical skills to question ‘conventional wisdom’ and assess the selective use of information by others. Use the ‘general method’ to ask how others frame problems and solutions, then provide a fresh perspective (compare with Bacchi).
  • Know the Record involves researching previous solutions. This process reflects the importance of ‘precedent’: telling a story of previous attempts to solve the problem helps provide context for new debates (and demonstrates your knowledge of the issue).
  • Know the Arguments involves engaging with the ideas of your allies and competitors. Understand your own position, make a reasoned argument in relation to others, present a position paper, establish its scope (the big picture or specific issue), and think strategically (and ethically) about how to maximise its impact in current political debates.
  • Inform Policymakers suggests maximising policymaker interest by keeping communication concise, polite, and tailored to a policymaker’s values and interests.
  • Public Comment focuses on the importance of working with administrative officials even after legislation is passed (especially if ‘street level bureaucrats’ make policy as they deliver it).

Policy analysis in a wider context

Although Smith does not focus on policy process theories, knowledge of policy processes guides this advice. For example, Smith advises that:

  • There is no linear and orderly policy cycle in which to present written analysis. The policymaking environment is more complex and less predictable than this model suggests (although Smith still draws a sharp distinction between legislation to make policy and administration to deliver it – compare with the ACF).
  • There is no blueprint or uniform template for writing policy analysis. The mix of policy problems is too diverse to manage with one approach, and ‘context’ may be more important than the ‘content’ of your proposal. Consequently, Smith provides a huge number of real-world examples to highlight the need to adapt policy analysis to the task at hand (see also Bacchi on analysts creating problems as they frame them).
  • Policy communication is not a rational/ technical process. It is a political exercise, built on the use of values to frame and try to solve problems. Analysis takes place in often highly divisive debates. People communicate using stories, and they use framing and persuasion techniques. They need to tailor their arguments to specific audiences, rather than hoping that one document could appeal to everyone (see Deborah Stone’s Policy Paradox).
  • Everyone may have the ability to frame issues, but only some policymakers ‘have authority to decide’ to pay attention to and interpret problems (see PET).
  • Communication comes in many forms to reflect many possible venues (such as, in the US context, processes of petition and testimony to public hearings alongside appeals to the executive and legislative branches).

See also: Policy Analysis in 750 words (the overview)

Theory and Practice: How to Communicate Policy Research beyond the Academy

Notes (and audio) for my first talk at the University of Queensland, Wednesday 24th October, 12.30pm, Graduate Centre, room 402.

Here is the powerpoint that I tend to use to inform discussions with civil servants (CS). I first used it for discussion with CS in the Scottish and UK governments, followed by remarkably similar discussions in parts of New Zealand and Australian government. Partly, it provides a way into common explanations for gaps between the supply of, and demand for, research evidence. However, it also provides a wider context within which to compare abstract and concrete reasons for those gaps, which inform a discussion of possible responses at individual, organisational, and systemic levels. Some of the gap is caused by a lack of effective communication, but we should also discuss the wider context in which such communication takes place.

I begin by telling civil servants about the message I give to academics about why policymakers might ignore their evidence:

  1. There are many claims to policy relevant knowledge.
  2. Policymakers have to ignore most evidence.
  3. There is no simple policy cycle in which we all know at what stage to provide what evidence.

[Slide 3, 24.10.18]

In such talks, I go into different images of policymaking, comparing the simple policy cycle with images of ‘messy’ policymaking, then introducing my own image which describes the need to understand the psychology of choice within a complex policymaking environment.

Under those circumstances, key responses include:

  • framing evidence in terms of the ways in which your audience understands policy problems
  • engaging in networks to identify and exploit the right time to act, and
  • venue shopping to find sympathetic audiences in different parts of political systems.

However, note the context of those discussions. I tend to be speaking with scientific researcher audiences to challenge some preconceptions about: what counts as good evidence, how much evidence we can reasonably expect policymakers to process, and how easy it is to work out where and when to present evidence. It’s generally a provocative talk, to identify the massive scale of the evidence-to-policy task, not a simple ‘how to do it’ guide.

In that context, I suggest to civil servants that many academics might be interested in more CS engagement, but might be put off by the overwhelming scale of their task, and – even if they remained undeterred – would face some practical obstacles:

  1. They may not know where to start: who should they contact to start making connections with policymakers?
  2. The incentives and rewards for engagement may not be clear. The UK’s ‘impact’ agenda has changed things, but not to the extent that any engagement is good engagement. Researchers need to tell a convincing story that they made an impact on policy/ policymakers with their published research, so there is a notional tipping point at which engagement reaches a scale that makes it worth doing.
  3. The costs are clearer. For example, any time spent doing engagement is time away from writing grant proposals and journal articles (in other words, the things that still make careers).
  4. The rewards and costs are not spread evenly. Put most simply, white male professors may have the most opportunities and face the fewest penalties for engagement in policymaking and social media. Or, the opportunities and rewards may vary markedly by discipline. In some, engagement is routine. In others, it is time away from core work.

In that context, I suggest that CS should:

  • provide clarity on what they expect from academics, and when they need information
  • describe what they can offer in return (which might be as simple as a written and signed acknowledgement of impact, or formal inclusion on an advisory committee)
  • show some flexibility: you may have a tight deadline, but can you reasonably expect an academic to drop what they are doing at short notice?
  • engage routinely with academics, to help form networks and identify the right people at the right time.

These introductory discussions provide a way into common descriptions of the gap between academic and policymaker:

  • Technical languages/ jargon to describe their work
  • Timescales to supply and demand information
  • Professional incentives (such as valuing scientific novelty in academia but evidential synthesis in government)
  • Comfort with uncertainty (often, scientists project relatively high uncertainty and don’t want to get ahead of the evidence; often policymakers need to project certainty and decisiveness)
  • Assessments of the relative value of scientific evidence compared to other forms of policy-relevant information
  • Assessments of the role of values and beliefs (some scientists want to draw the line between providing evidence and advice; some policymakers want them to go much further)

To discuss possible responses, I use the European Commission Joint Research Centre’s ‘knowledge management for policy’ project, in which they identify the 8 core skills of organisations bringing together the suppliers and demanders of policy-relevant knowledge.

Figure 1

However, I also use the following table to highlight some caution about the things we can achieve with general skills development and organisational reforms. Sometimes, the incentives to engage will remain low. Further, engagement is no guarantee of agreement.

In a nutshell, the table provides three very different models of ‘evidence-informed policymaking’ when we combine political choices about what counts as good evidence, and what counts as good policymaking (discussed at length in teaching evidence-based policy to fly). Discussion and clearer communication may help clarify our views on what makes a good model, but I doubt it will produce any agreement on what to do.

[Table 1: 3 ideal types of EBPM]

In the latter part of the talk, I go beyond that powerpoint into two broad examples of practical responses:

  1. Storytelling

The Narrative Policy Framework describes the ‘science of stories’: we can identify stories with a 4-part structure (setting, characters, plot, moral) and measure their relative impact. Jones/Crow and Crow/Jones provide an accessible way into these studies. Also look at Davidson’s article on the ‘grey literature’ as a rich source of stories on stories.

On one hand, I think that storytelling is a great possibility for researchers: it helps them produce a core – and perhaps emotionally engaging – message that they can share with a wider audience. Indeed, I’d see it as an extension of the process that academics are used to: identifying an audience and framing an argument according to the ways in which that audience understands the world.

On the other hand, it is important not to get carried away by the possibilities:

  • My reading of the NPF empirical work is that the most impactful stories reinforce the beliefs of the audience – mobilising them to act – rather than changing their minds.
  • Also look at the work of the Frameworks Institute, which experiments with individual versus thematic stories because people react to them in very different ways. Some might empathise with an individual story; some might judge it harshly. For example, they discuss stories about low income families and healthy eating, using the theme of a maze to help people understand the lack of good choices available to people in areas with limited access to healthy food.

See: Storytelling for Policy Change: promise and problems

  2. Evidence for advocacy

The article I co-authored with Oxfam staff shows how far advocates may need to go to maximise the impact of research evidence. Their strategies include:

  1. Identifying the policy change they would like to see.
  2. Identifying the powerful actors they need to influence.
  3. A mixture of tactics: insider, outsider, and supporting others by, for example, boosting local civil society organisations.
  4. A mix of ‘evidence types’ for each audience.

[Oxfam, Table 2]

  5. Wider public campaigns to address the political environment in which policymakers consider choices.
  6. Engaging stakeholders in the research process (often called the ‘co-production of knowledge’).
  7. Framing: personal stories, ‘killer facts’, visuals, credible messengers.
  8. Exploiting ‘windows of opportunity’.
  9. Monitoring, learning, trial and error.

In other words, a source of success stories may provide a model for engagement or the sense that we need to work with others to engage effectively. Clear communication is one thing. Clear impact at a significant scale is another.

See: Using evidence to influence policy: Oxfam’s experience