This post forms one part of the Policy Analysis in 750 words series overview.
Throughout this series you may notice three different conceptions of the scope of policy analysis:
- ‘Ex ante’ (before the event) policy analysis. Focused primarily on defining a problem, and predicting the effect of solutions, to inform current choice (as described by Meltzer and Schwartz and Thissen and Walker).
- ‘Ex post’ (after the event) policy analysis. Focused primarily on monitoring and evaluating that choice, perhaps to inform future choice (as described famously by Weiss).
- Some combination of both, to treat policy analysis as a continuous (never-ending) process (as described by Dunn).
As usual, these are not hard-and-fast distinctions, but they help us clarify expectations in relation to different scenarios.
The impact of old-school ex ante policy analysis
Radin provides a valuable historical discussion of policymaking with the following elements:
- a small number of analysts, generally inside government (such as senior bureaucrats, scientific experts, and, in particular, economists),
- giving technical or factual advice,
- about policy formulation,
- to policymakers at the heart of government,
- on the assumption that policy problems would be solved via analysis and action.
This kind of image signals an expectation for high impact: policy analysts face low competition, enjoy a clearly defined and powerful audience, and their analysis is expected to feed directly into choice.
Radin goes on to describe a much different, modern policy environment: more competition, more analysts spread across and outside government, with a less obvious audience, and – even if there is a client – high uncertainty about where the analysis fits into the bigger picture.
Yet, the impetus to seek high and direct impact remains.
This combination of shifting conditions but unshifting hopes and expectations helps explain a lot of the pragmatic forms of policy analysis you will see in this series, including:
- Keep it catchy, gather data efficiently, tailor your solutions to your audience, and tell a good story (Bardach)
- Speak with an audience in mind, highlight a well-defined problem and purpose, project authority, use the right form of communication, and focus on clarity, precision, conciseness, and credibility (Smith)
- Address your client’s question, by their chosen deadline, in a clear and concise way that they can understand (and communicate to others) quickly (Weimer and Vining)
- Client-oriented advisors identify the beliefs of policymakers and anticipate the options worth researching (Mintrom)
- Identify your client’s resources and motivation, such as how they seek to use your analysis, the format of analysis they favour (make it ‘concise’ and ‘digestible’), their deadline, and their ability to make or influence the policies you might suggest (Meltzer and Schwartz).
- ‘Advise strategically’, to help a policymaker choose an effective solution within their political context (Thissen and Walker).
- Focus on producing ‘policy-relevant knowledge’ by adapting to the evidence-demands of policymakers and rejecting a naïve attachment to ‘facts speaking for themselves’ or ‘knowledge for its own sake’ (Dunn).
The impact of research and policy evaluation
Many of these recommendations are familiar to scientists and researchers, but generally in the context of far lower expectations about their likely impact, particularly if those expectations are informed by policy studies (compare Oliver & Cairney with Cairney & Oliver). From this perspective, research tends to feed into policymaking in several ways:
- to inform solutions to a problem identified by policymakers
- as one of many sources of information used by policymakers, alongside ‘stakeholder’ advice and professional and service user experience
- as a resource used selectively by politicians, with entrenched positions, to bolster their case
- as a tool of government, to show it is acting (by setting up a scientific study), or to measure how well policy is working
- as a source of ‘enlightenment’, shaping how people think over the long term (compare with this discussion of ‘evidence based policy’ versus ‘policy based evidence’).
In other words, researchers may have a role, but they struggle (a) to navigate the politics of policy analysis, (b) to find the right time to act, and (c) to secure attention, in competition with many other policy actors.
The potential for a form of continuous impact
Dunn suggests that the idea of ‘ex ante’ policy analysis is misleading, since policymaking is continuous, and evaluations of past choices inform current choices. Think of each policy analysis step as ‘interdependent’, in which new knowledge to inform one step also informs the other four. For example, routine monitoring helps identify whether actors comply with regulations, whether resources and services reach ‘target groups’, whether money is spent correctly, and whether we can make a causal link between policy solutions and outcomes. The impact of this kind of analysis is often better understood as background information with intermittent influence.
Key conclusions to bear in mind
- The demand for information from policy analysts may be disproportionately high when policymakers pay attention to a problem, and disproportionately low when they feel that they have addressed it.
- Common advice for policy analysts and researchers often looks very similar: keep it concise, tailor it to your audience, make evidence ‘policy relevant’, and give advice (don’t sit on the fence). However, unless researchers are prepared to act quickly, gather data efficiently (not comprehensively), and meet a tight brief for a client, they are not really in the impact business described by most policy analysis texts.
- A lot of routine, continuous, impact tends to occur out of the public spotlight, based on rules and expectations that most policy actors take for granted.
See the Policy Analysis in 750 words series overview to continue reading on policy analysis.
See the ‘evidence-based policymaking’ page to continue reading on research impact.
Bristol powerpoint: Paul Cairney Bristol EBPM January 2020