Many research funders are interested in supporting researchers as they engage with government to inform policy and practice.
In that context, here is a long read on how (a) policy analysis and (b) policy process research can help funders understand the likely impact of such ‘engagement-with-government’ initiatives.
I draw on four blog post series to identify:
- The wider policymaking context in which such initiatives take place (see posts in 500 or 1000 words, and on ‘evidence based’ policymaking)
- The advice given to policy analysts about how to improve their impact with research (in 750 words).
I introduce seven broad themes to help us interpret the effectiveness of specific approaches to engagement (in other words, I am not evaluating specific initiatives directly*).
In each case, I suggest that evaluations of impact initiatives lack meaning without a clear sense of (a) what engagement and impact is for, (b) how far researchers and research organisations are willing to go to support them, and (c) how individual initiatives relate to each other (in a vaguely defined ‘culture’ or ‘ecosystem’ of activity).
Seven key themes
- Rather than trying to design a new system, learn from and adapt to the systems and practices that exist
- Clarify the role of support directed at individuals, institutions, and systems.
- Reduce ambiguity by defining the policy problem and strategy in practice
- Tailor research support to ‘multi-centric policymaking’
- Clarify the role of researchers when they engage with policymakers
- Establish the credibility of research through expertise and/or co-production
- Establish clear and realistic expectations for researcher engagement practices
1. Rather than trying to design a new system, learn from and adapt to the systems and practices that exist
Studies of ‘evidence based policymaking’ (EBPM), from a researcher perspective, tend to identify (a) the barriers between their evidence and policy, and (b) what a good system of research production and use might look like. These discussions form part of a continuous debate on how research organisations might design better political and research systems to improve the production and use of evidence.
Yet, the power to redesign systems is not in their gift. Instead, they form one part of a policymaking system over which they have incomplete knowledge and minimal control.
This specific problem for researchers is part of a general problem identified in policy studies: do not confuse your requirements of policymaking with their actual dynamics or your ability to influence them.
For example, common 5-step guides to policy analysis appear to correspond to a well-known policy cycle model of policymaking, in which analysts: define the problem, identify potential solutions, choose the criteria to compare and evaluate them, recommend a solution, monitor its effects, and evaluate past policy to inform current policy. However, this model is based on functional requirements, not actual policymaking practices.
Instead, the modern study of policymaking seeks to understand a far messier and more complex policymaking environment in which it is not clear (a) who the most relevant policymakers are, (b) how they think about policy problems and the research that may be relevant, and (c) whether they can turn that evidence into policy outcomes.
In that context, the general advice to policy analysts is to be flexible and iterative rather than rationalistic, to learn the ‘art and craft’ of policy analysis through experience, and to consider many possible analytical styles that could emphasise what is good for knowledge, debate, process, or the client.
- Focus less on the abstract design of research engagement strategies based on what researchers would like to see.
- Focus more on tailoring support to researchers in relation to understanding what policymakers do.
In particular, while few academics will see policymakers as their ‘clients’, they would benefit from knowing more about the many different ways in which policymakers and analysts gather and use evidence. Academic-practitioner workshops provide a general forum for such insights, but we need a collection of more specific analyses of the ways in which full-time analysts and civil servants gather evidence to meet policymaker demands.
2. Clarify the role of support directed at individuals, institutions, and systems
We can think of research support in terms of individuals, institutions, or systems, but the nature of each type of support – and the relationship between them – is not clear.
Studies of academic policy impact and policy analysis share a focus on the successful individuals who are often called policy entrepreneurs. For Mintrom,
- they ‘are energetic actors who engage in collaborative efforts in and around government to promote policy innovations’,
- their attributes are ‘ambition’, ‘social acuity’, ‘credibility’, ‘sociability’, and ‘tenacity’
- their skills are ‘strategic thinking’, ‘team building’, ‘collecting evidence’, ‘making arguments’, ‘engaging multiple audiences’, ‘negotiating’, and ‘networking’
- their strategies include ‘problem framing’, ‘using and expanding networks’, ‘working with advocacy coalitions’, ‘leading by example’, and ‘scaling up change processes’.
These descriptions of successful individuals are familiar to readers of the personal accounts of impactful researchers, whose recommendations are summarised by Oliver and Cairney:
(1) Do high quality research; (2) make your research relevant and readable; (3) understand policy processes; (4) be accessible to policymakers: engage routinely, flexibly, and humbly; (5) decide if you want to be an issue advocate or honest broker; (6) build relationships (and ground rules) with policymakers; (7) be ‘entrepreneurial’ or find someone who is; and (8) reflect continuously: should you engage, do you want to, and is it working?
One approach is to learn from, and seek to transfer, their success. However, analyses of entrepreneurship suggest that most fail, and that their success depends more on the nature of (a) their policymaking environments, and (b) the social backgrounds that facilitate their opportunities to engage (reinforcing inequalities in relation to factors such as race and gender, as well as level of seniority, academic discipline, and type of university).
In that context, the idea of institutionalising engagement is attractive. Instead of leaving impact to the individual, set up a system of rules and norms (or a new ‘culture’) to make engagement routine and expected (the range of such initiatives is discussed in more depth in Oliver et al’s report).
However, research on policy analysis and process shows that the ability to set the ‘rules of the game’ is not shared equally, and researchers generally need to adapt to – rather than co-design – the institutions in government with which they engage.
This point informs a tendency to describe research-government interactions in relation to a complex ‘system’ or ‘ecosystem’. This metaphorical language can be useful, to encourage participants not to expect to engage in simple (or even understandable) policy processes, and to encourage ‘systems thinking’. Further, the same systems-language is useful to describe the cross-cutting nature of policy problems, to encourage (a) interdisciplinary research and (b) cross-departmental cooperation in government to solve major policy problems.
However, there are at least 10 different ways to define systems thinking, such as in relation to policy problems and social or policymaking behaviour. Further, many accounts provide contradictory messages:
- some highlight the potential to find the right ‘leverage’ to make a disproportionate impact from a small reform, while
- others highlight the inability of governments to understand and control policymaking systems (or for systems to be self-governing).
- Identify how (and why) you would strike a balance between support for individuals, institutions, and systems (and consider the effect of a redistribution of support).
- Engage in a meaningful discussion of the potential trade-offs between aims, such as to maximise the impact of already successful individuals or support less successful groups.
- If seeking to encourage the ‘institutionalisation’ of research engagement cultures, clarify the extent to which one organisation can influence overall institutional design.
- If engaging with a research-policy ‘ecosystem’, clarify what you mean and how researchers can seek to understand and engage in it.
3. Reduce ambiguity by defining the policy problem and strategy in practice
Single strategy documents are useful to identify aims and objectives. However, they do not determine outcomes, partly because (a) they only form one guide to organisational practices, (b) they do not give unambiguous priority to some objectives over others, and (c) many organisational practices and aims are often unwritten. For example, engagement priorities could relate to a mix of aims:
- To foster specific government policy aims
- To foster an image of high UK research expertise and the value of investing in research funders
- To foster outcomes not identified explicitly by current governments, such as to (a) provide a long-term institutional memory, or (b) pursue a critical and emancipatory role for research, which often leads researchers to oppose government policy.
- To ensure an equal distribution of opportunities for researchers, in recognition of the effect of impact on careers.
This mix of aims requires organisations to clarify further their objectives when they seek to deliver them in practice, and there will always be potential trade-offs between them. Examples include:
- The largest opportunity to demonstrate visible and recordable social science research impact seems to be in cooperation with Westminster, but impact on a House of Commons committee is not an effective way to secure direct impact on policy outcomes.
- Initiatives may appear to succeed on one measure (such as to give researchers the support to develop skills or the space to criticise policy) and fail on another (such as to demonstrate a direct and positive impact on current policy).
- Clarify the objective of each form of engagement support
- Evaluate current initiatives in relation to those aims (which may not be the aims of the original project)
4. Tailor research support to ‘multi-centric policymaking’
Policy process research describes multi-centric policymaking in which central governments share responsibility with many other policymakers spread across many levels and types of government. To some extent, it results from choice, such as when the UK government shares responsibilities with devolved and local governments. However, it also results from necessity, since policymakers are only able to pay attention to a tiny proportion of their responsibilities, and they engage in a policymaking environment over which they have limited understanding and less control. This environment contains many policymakers and influencers spread across many venues, each with their own institutions, networks, ideas (and ways to frame policy), and responses to socio-economic context and events.
This image of the policy process presents the biggest challenge to identifying where to invest research funding support. On the one hand, it could support researchers to engage with policy actors with a clearly defined formal role, including government ministers (supported by civil servants) and parliamentary committees (supported by clerks). However, the vast majority of policy takes place out of the public spotlight, processed informally at a relatively low level of central government, or by non-departmental public bodies or subnational governments.
Policy analysts or interest groups may deal with this problem by investing their time to identify: which policymaking venues matter, their informal rules and networks, and which ideas (ways of thinking) are in good currency. In many cases, they work behind closed doors, seek compromise, and accept that they will receive minimal public credit for their work. Many such options may seem unattractive to research funders or researchers (when they seek to document impact), since they involve a major investment of time with no expectation of a demonstrable payoff.
- Equip researchers with a working knowledge of policy processes
- Incorporate policy science insights into impact training (alongside advice from practitioners on the ‘nuts-and-bolts’ of engagement)
- Identify how to address trade-offs between forms of formal and informal engagement
- Identify how to address the potential trade-offs between high-impact-but-low-visibility and low-impact-but-high-visibility engagement.
5. Clarify the role of researchers when they engage with policymakers
Policy analysts face profound choices on how to engage ethically, and key questions include:
- Is your primary role to serve individual clients or a wider notion of the ‘public good’?
- Should you maximise your role as an individual or play your part in a wider profession?
- What forms of knowledge and evidence count in policy analysis?
- What does it mean to communicate policy analysis responsibly?
- Should you provide a clear recommendation or encourage reflection?
In the field of policy analysis, we can find a range of responses, from
- the pragmatic client-oriented analyst focused on a brief designed for them, securing impact in the short term, to the
- critical researcher willing to question a policymaker’s definition of the problem and choice regarding what knowledge counts and what solutions are feasible, to seek impact over the long term.
Further, these choices necessitate the development of a wide range of different skills, and a choice about which to invest in. For example, Radin (2019: 48) identifies the value of:
Case study methods, Cost-benefit analysis, Ethical analysis, Evaluation, Futures analysis, Historical analysis, Implementation analysis, Interviewing, Legal analysis, Microeconomics, Negotiation, mediation, Operations research, Organizational analysis, Political feasibility analysis, Public speaking, Small-group facilitation, Specific program knowledge, Statistics, Survey research methods, Systems analysis.
Such a wide range of possible skills may prompt research funders to consider how to prioritise skills training as a whole, and in relation to each discipline or individual.
- Evaluate the effectiveness of current initiatives through the lens of appropriate practices and requisite skills.
6. Establish the credibility of research through expertise and/or co-production
Expectations for ‘evidence based’ or ‘coproduced’ policies are not mutually exclusive, but we should not underestimate the potential for major tensions between their aims and practices, in relation to questions such as:
- How many people should participate (a small number of experts or large number of stakeholders)?
- Whose knowledge counts (including research, experiential, practitioner learning)?
- Who should coordinate the ‘co-production’ of research and policy (such as researchers leading networks to produce knowledge to inform policy, or engaging in government networks)?
For example, when informing public services, researchers will be following different models of evidence-informed governance, from engagement with central governments to roll out a uniform ‘evidence based’ model, to engagement at a local level to encourage storytelling-based learning.
Further, these forms of engagement not only require very different skills but also involve competing visions of what counts as policy-relevant evidence. Governments tend not to make such tensions explicit, perhaps prompting researchers to navigate them via experience more than training.
- Evaluate initiatives according to clearly-stated expectations for researchers when they engage with stakeholders
- Clarify if researchers should be responsible for forming or engaging with existing networks to foster co-produced research
7. Establish clear and realistic expectations for researcher engagement practices
Policy analysts and associated organisations appear to expect far more (direct and short-term) impact from their research than academic researchers (more likely to perform a long-term ‘enlightenment’ function). Academic researchers may desire more direct impact in theory, but also be wary of the costs and compromises in practice.
We can clarify these differences by identifying the ways in which common advice to analysts goes beyond common advice to academics (the latter is summarised in 8 categories by Oliver and Cairney above):
- Gather data efficiently, tailor your solutions to your audience, and tell a good story (Bardach)
- Address your client’s question, by their chosen deadline, in a clear and concise way that they can understand (and communicate to others) quickly (Weimer and Vining)
- Client-oriented advisors identify the beliefs of policymakers and anticipate the options worth researching (Mintrom)
- Identify your client’s resources and motivation, such as how they seek to use your analysis, the format of analysis they favour (make it ‘concise’ and ‘digestible’), their deadline, and their ability to make or influence the policies you might suggest (Meltzer and Schwartz).
- ‘Advise strategically’, to help a policymaker choose an effective solution within their political context (Thissen and Walker).
- Focus on producing ‘policy-relevant knowledge’ by adapting to the evidence-demands of policymakers and rejecting a naïve attachment to ‘facts speaking for themselves’ or ‘knowledge for its own sake’ (Dunn).
In that context, it would be reasonable to set high expectations for engagement and impact, but recognise the professional and practical limits to that engagement.
- Identify the relationships, between researchers and policymakers, that underpin expectations for engagement and impact
*I apologise for being a little enigmatic at this stage, and would welcome your thoughts on any aspect of this work-in-development, either in the comment function or via email (p.a.cairney [at] stir.ac.uk).