Policy Analysis in 750 Words: Political feasibility and policy success

Policy studies and policy analysis guidebooks identify the importance of feasible policy solutions:

  • Technical feasibility: will this solution work as intended if implemented?
  • Political feasibility: will it be acceptable to enough powerful people?

For example, Kingdon treats feasibility as one of three conditions for major policy change during a ‘window of opportunity’: (1) there is high attention to the policy problem, (2) a feasible solution already exists, and (3) key policymakers have the motive and opportunity to select it.

Guidebooks relate this requirement initially to your policymaker client: what solutions will they rule out, to the extent that they are not even worth researching as options (at least for the short term)?

Further, this assessment relates to types of policy ‘tool’ or ‘instrument’: one simple calculation is that ‘redistributive’ measures are harder to sell than ‘distributive’, while both may be less attractive than regulation (although complex problems likely require a mix of instruments).

These insights connect to Lindblom’s classic vision of:

  1. Incremental analysis. It is better to research in-depth a small number of feasible options than spread your resources too thinly to consider all possibilities.
  2. Strategic analysis. The feasibility of a solution relates strongly to current policy. The more radical a departure from the current negotiated position, the harder it will be to sell.

As many posts in the Policy Analysis in 750 words series describe, this advice is not entirely useful for actors who seek rapid and radical departures from the status quo. Lindblom’s response to such critics was to seek radical change via a series of non-radical steps (at least in political systems like the US), which (broadly speaking) represents one of two possible approaches.

While incrementalism is not as popular as it once was (as a description of, or prescription for, policymaking), it tapped into the enduring insight that policymaking systems produce huge amounts of minor change. Rapid and radical policy change is rare, and it is even rarer to be able to connect it to influential analysis and action (at least in the absence of a major event). This knowledge should not put people off trying, but rather help them understand the obstacles that they seek to overcome.

Relating feasible solutions and strategies to ‘policy success’

One way to incorporate this kind of advice is to consider how (especially elected) policymakers would describe their own policy success. The determination of success and failure is a highly contested and political process (not simply a technical exercise called ‘evaluation’), and policymakers may refer – often implicitly – to the following questions when seeking success:

  1. Political. Will this policy boost my government’s credibility and chances of re-election?
  2. Process. Will it be straightforward to legitimise and maintain support for this policy?
  3. Programmatic. Will it achieve its stated objectives and produce beneficial outcomes if implemented?

The benefit to analysts of asking themselves these questions is that they help to identify potential solutions that are technically but not politically feasible (or vice versa).

The absence of clear technical feasibility does not necessarily rule out solutions with wider political benefits (for example, it can be beneficial to look like you are trying to do something good). Hence the popular phrase ‘good politics, bad policy’.

Nor does a politically unattractive option rule out a technically feasible solution (not all politicians flee the prospect of ‘good policy, bad politics’). However, it should prompt attention to hard choices about whose support to seek, how long to wait, or how hard to push, to seek policy change. You can see this kind of thinking as ‘entrepreneurial’ or ‘systems thinking’ depending on how much faith you have in agency in highly-unequal political contexts.

Further reading

It is tempting to conclude that these obstacles to ‘good policy’ reflect the pathological nature of politics. However, if we want to make this argument, we should at least do it well:

1. You can find this kind of argument in fields such as public health and climate change studies, where researchers bemoan the gap between (a) their high-quality evidence on an urgent problem and (b) a disproportionately weak governmental response. To do it well, we need to separate analytically (or at least think about): (a) the motivation and energy of politicians (usually the source of most criticism of low ‘political will’), and (b) the policymaking systems that constrain even the most sincere and energetic policymakers. See the EBPM page for more.

2. Studies of Social Construction and Policy Design are useful to connect policymaking research with a normative agenda to address ‘degenerative’ policy design.

Policy Analysis in 750 Words: Changing things from the inside

How should policy actors seek radical changes to policy and policymaking?

This question prompts two types of answer:

1. Be pragmatic, and change things from the inside

Pragmatism is at the heart of most of the policy analysis texts in this series. They focus on the needs and beliefs of clients (usually policymakers). Policymakers are time-pressed, so keep your analysis short and relevant. See the world through their eyes. Focus on solutions that are politically as well as technically feasible. Propose non-radical steps, which may add up to radical change over the long term.

This approach will seem familiar to students of research ‘impact’ strategies which emphasise relationship-building, being available to policymakers, and responding to the agendas of governments to maximise the size of your interested audience.

It will also ring bells for advocates of radical reforms in policy sectors such as (public) health and intersectoral initiatives such as gender mainstreaming:

  • Health in All Policies is a strategy to encourage radical changes to policy and policymaking to improve population health. Common advice is to: identify to policymakers how HiAP fits into current policy agendas, seek win-win strategies with partners in other sectors, and go to great lengths to avoid the sense that you are interfering in their work (‘health imperialism’).
  • Gender mainstreaming is a strategy to consider gender in all aspects of policy and policymaking. An equivalent playbook involves steps to: clarify what gender equality is, and what steps may help achieve it; make sure that these ideas translate across all levels and types of policymaking; adopt tools to ensure that gender is a part of routine government business (such as budget processes); and, modify existing policies or procedures while increasing the representation of women in powerful positions.

In other words, the first approach is to pursue your radical agenda via non-radical means, using a playbook that is explicitly non-confrontational.  Use your insider status to exploit opportunities for policy change.

2. Be radical, and challenge things from the outside

Challenging the status quo, for the benefit of marginalised groups, is at the heart of critical policy analysis:

  • Reject the idea that policy analysis is a rationalist, technical, or evidence-based process. Rather, it involves the exercise of power to (a) depoliticise problems to reduce attention to current solutions, and (b) decide whose knowledge counts.
  • Identify and question the dominant social constructions of problems and populations, asking who decides how to portray these stories and who benefits from their outcomes.

This approach resonates with frequent criticisms of ‘impact’ advice, emphasising the importance of producing research independent of government interference, to challenge policies that further harm already-marginalised populations.

It will also ring bells among advocates of more confrontational strategies to seek radical changes to policy and policymaking. They include steps to: find more inclusive ways to generate and share knowledge, produce multiple perspectives on policy problems and potential solutions, focus explicitly on the impact of the status quo on marginalised populations, politicise issues continuously to ensure that they receive sufficient attention, and engage in outsider strategies to protest current policies and practices.

Does this dichotomy make sense?

It is tempting to say that this dichotomy is artificial and that we can pursue the best of both worlds, such as working from within when it works and resorting to outsider action and protest when it doesn’t.

However, the blandest versions of this conclusion tend to ignore or downplay the politics of policy analysis in favour of more technical fixes. Sometimes collaboration and consensus politics is a wonderful feat of human endeavour. Sometimes it is a cynical way to depoliticise issues, stifle debate, and marginalise unpopular positions.

This conclusion also suggests that it is possible to establish what strategies work, and when, without really saying how (or providing evidence for success that would appeal to audiences associated with both approaches). Indeed, a recurrent feature of research in these fields is that most attempts to produce radical change prove to be dispiriting struggles. Non-radical strategies tend to be co-opted by more powerful actors, to mainstream new ways of thinking without changing the old. Radical strategies are often too easy to dismiss or counter.

The latter point reminds us to avoid excessively optimistic overemphasis on the strategies of analysts and advocates at the expense of context and audience. The 500 and 1000 words series perhaps tip us too far in the other direction, but provide a useful way to separate (analytically) the reasons for often-minimal policy change. To challenge dominant forms of policy and policymaking requires us to separate the intentional sources of inertia from the systemic issues that would constrain even the most sincere and energetic reformer.

Further reading

This post forms one part of the Policy Analysis in 750 words series, including posts on the role of analysts and marginalised groups. It also relates to work with St Denny, Kippin, and Mitchell (drawing on this draft paper) and posts on ‘evidence based policymaking’.

Call for papers for a JEPP Special Issue, ‘The politics of policy analysis: theoretical insights on real world problems’

Note: this call will appear shortly on the JEPP page. See also my 750 words series on policy analysis.

For a special edition of the Journal of European Public Policy, we invite proposals for papers that reflect on the theory and practice of policy analysis. This special issue will include state of the art articles on the politics of policy analysis, and empirical studies that use theoretical insights to analyse and address real world problems.

Policy analysis is not a rationalist, technocratic, centrally managed, or ‘evidence based’ process to solve policy problems. Rather, critical policy analysis and mainstream policy studies describe contemporary policy analysis as a highly contested (but unequal) process in which many policymakers, analysts, and influencers cooperate or compete across many centres of government. Further, governments are not in the problem solving business. Instead, they inherit policies that address some problems and create or exacerbate others, benefit some groups and marginalize others, or simply describe problems as too difficult to solve. The highest profile problems, such as global public health and climate change, require the kinds of (1) cooperation across many levels of government (and inside and outside of government), and (2) attention to issues of justice and equity, of which analysts could only dream.

This description of policymaking complexity presents a conundrum. On the one hand, there exist many five-step guides to analysis, accompanied by simple stage-based descriptions of policy processes, but they describe what policy actors would need or like to happen rather than policymaking reality. On the other, policy theory-informed studies are essential to explanation, but not yet essential reading for policy analysts. Policy theorists may be able to describe policy processes – and the role of policy analysts – more accurately than simple guides, but do not offer a clear way to guide action. Practitioner audiences are receptive to accurate descriptions of policymaking reality, but also want a take-home message that they can pick up and use in their work. Critical policy analysts may appreciate insights on the barriers to policy and policymaking change, but only if there is equal attention to how to overcome them.

We seek contributions that engage with this conundrum. We welcome papers which use theories, concepts and frameworks that are considered the policy studies mainstream, but also contributions from critical studies that use research to support marginalized populations as they analyse contemporary policy problems. We focus on Europe broadly defined, but welcome contributions with direct lessons from any other region.

Potential themes include but are not limited to:

  • State of the art articles that use insights from policy theories and/ or critical policy analysis to guide the study and practice of policy analysis
  • Articles that situate the analysis of contemporary policy problems within a wider policymaking context, to replace wishful thinking with more feasible (but equally ambitious) analysis
  • Articles that engage critically with contemporary themes in policy analysis and design, such as how to encourage ‘entrepreneurial’ policy analysis, foster ‘co-production’ during policy analysis and design, or engage in ‘systems thinking’ without relying on jargon and gimmicks.
  • Articles that engage with the unrealistic idea of ‘evidence-based policymaking’ to produce more feasible (and less technocratic) images of evidence-informed policymaking.

Expressions of interest consisting of a title, author(s) names and affiliation, and a short abstract (no more than 300 words) should be sent to p.a.cairney@stir.ac.uk by Feb 28th 2022. Successful authors should have a full article draft for submission into the JEPP review process by August 30th 2022.

Policy Analysis in 750 Words: Two approaches to policy learning and transfer

This post forms one part of the Policy Analysis in 750 words series. It draws on work for an in-progress book on learning to reduce inequalities. Some of the text will seem familiar if you have read other posts. Think of it as an adventure game in which the beginning is the same but you don’t know the end.

Policy learning is the use of new information to update policy-relevant knowledge. Policy transfer involves the use of knowledge about policy and policymaking in one government to inform policy and policymaking in another.

These processes may seem to relate primarily to research and expertise, but they require many kinds of political choices (explored in this series). They take place in complex policymaking systems over which no single government has full knowledge or control.

Therefore, while the agency of policy analysts and policymakers still matters, they engage with a policymaking context that constrains or facilitates their action.

Two approaches to policy learning: agency and context-driven stories

Policy analysis textbooks focus on learning and transfer as an agent-driven process with well-established guidance (often with five main steps). They form part of a functionalist analysis where analysts identify the steps required to turn comparative analysis into policy solutions, or part of a toolkit to manage stages of the policy process.

Agency is less central to policy process research, which describes learning and transfer as contingent on context. Key factors include:

Analysts compete to define problems and determine the manner and sources of learning, in a multi-centric environment where different contexts will constrain and facilitate action in different ways. For example, varying structural factors – such as socioeconomic conditions – influence the feasibility of proposed policy change, and each centre’s institutions provide different rules for gathering, interpreting, and using evidence.

The result is a mixture of processes in which:

  1.  Learning from experts is one of many possibilities. For example, Dunlop and Radaelli also describe ‘reflexive learning’, ‘learning through bargaining’, and ‘learning in the shadow of hierarchy’.
  2.  Transfer takes many forms.

How should analysts respond?

Think of two different ways to respond to this description of the policy process with this lovely blue summary of concepts. One is your agency-centred strategic response. The other is me telling you why it won’t be straightforward.

An image of the policy process (see 5 images)

There are many policy makers and influencers spread across many policymaking ‘centres’

  1. Find out where the action is and tailor your analysis to different audiences.
  2. There is no straightforward way to influence policymaking if multiple venues contribute to policy change and you don’t know who does what.

Each centre has its own ‘institutions’

  1. Learn the rules of evidence gathering in each centre: who takes the lead, how do they understand the problem, and how do they use evidence?
  2. There is no straightforward way to foster policy learning between political systems if each is unaware of each other’s unwritten rules. Researchers could try to learn their rules to facilitate mutual learning, but with no guarantee of success.

Each centre has its own networks

  1. Form alliances with policymakers and influencers in each relevant venue.
  2. The pervasiveness of policy communities complicates policy learning because the boundary between formal power and informal influence is not clear.

Well-established ‘ideas’ tend to dominate discussion

  1. Learn which ideas are in good currency. Tailor your advice to your audience’s beliefs.
  2. The dominance of different ideas precludes many forms of policy learning or transfer. A popular solution in one context may be unthinkable in another.

Many policy conditions (historic-geographic, technological, social and economic factors) command the attention of policymakers and are out of their control. Routine events and non-routine crises prompt policymaker attention to lurch unpredictably.

  1. Learn from studies of leadership in complex systems or the policy entrepreneurs who find the right time to exploit events and windows of opportunity to propose solutions.
  2. The policy conditions may be so different in each system that policy learning is limited and transfer would be inappropriate. Events can prompt policymakers to pay disproportionately low or high attention to lessons from elsewhere, and this attention relates weakly to evidence from analysts.

Feel free to choose one or both forms of advice. One is useful for people who see analysts and researchers as essential to major policy change. The other is useful if it serves as a source of cautionary tales rather than fatalistic responses.

See also:

Policy Concepts in 1000 Words: Policy Transfer and Learning

Teaching evidence based policy to fly: how to deal with the politics of policy learning and transfer

Policy Concepts in 1000 Words: the intersection between evidence and policy transfer

Policy learning to reduce inequalities: a practical framework

Three ways to encourage policy learning

Epistemic versus bargaining-driven policy learning

The ‘evidence-based policymaking’ page explores these issues in more depth

We are recruiting a temporary lecturer in International Politics at the University of Stirling

Please see our Vacancy page for the details: https://www.stir.ac.uk/about/work-at-stirling/list/details/?jobId=2841&jobTitle=Lecturer%20in%20International%20Politics

I am one of the pre-interview contacts and these are my personal thoughts on that process, which blend background information and some helpful advice. These notes are also there to address a potentially major imbalance in the informal side of recruitment: if you do not have the contacts and networks that help give you the confidence to seek information (on the things not mentioned in the further particulars), here is the next best thing – the information I would otherwise give you on the phone.

This approach is also handy under the current circumstances, in which (a) the vacancy will run for a short period (deadline: 29th November), because (b) we need someone to start in January.

In contrast to most of the positions I have described on this blog, this post is temporary (12 months, beginning in January). It arises from (very welcome) grant success, which prompted us to rejuggle our teaching and administration at short notice (the essential criteria and descriptions are narrower than usual because we have in mind some very specific teaching requirements).

Here are some general tips on the application and interview processes.

The application process:

  • At this stage, the main documents are the CV and the cover letter.
  • You should keep the cover letter short to show your skills at concise writing (I suggest 1-page). Focus on what you can offer the Division specifically, given the nature of our call and further particulars.
  • You will be competing with many other applicants who have completed a PhD – so what makes your CV stand out?
  • We take teaching very seriously. Within our division, we plan an overall curriculum together, discuss regularly if it is working, and come to agreements about how to teach and assess work. We pride ourselves on being a small and friendly bunch of people, open to regular student contact and, for example, committed to meaningful and regular feedback.
  • You might think generally about how you would contribute to teaching and learning in that context. In particular, you should think about how, for example, you would deliver large undergraduate modules (in which you may only be an expert on some of the material) as well as the smaller, more specialist and advanced, modules closer to your expertise. However, please also note that your main contribution is specific:
  • Dissertation supervision at Undergraduate and Postgraduate levels;
  • Coordinating and delivering specialist modules in the Undergraduate programme (including the advanced module POLU9PE Global Political Economy, and one other advanced module)
  • Coordinating and delivering the International Conflict and Cooperation (ICC) Postgraduate taught programme (ICCPP02 International Organisations)

The interview process

The shortlisting is on the 10th December. All going well, you will know if you have reached the interview stage by the 13th. The interviews will take place on the 16th December (morning). 

The interview stage

Here is how I would describe an open-ended lectureship. By the interview stage, you should normally know the following:

  • The teaching and research specialisms of the division and their links to cross-divisional research.
  • The kinds of courses that the division would expect you to teach.

Perhaps most importantly, you need to be able to articulate why you want to come and work at Stirling. ‘Why Stirling?’ or ‘Why this division?’ is usually the first question in an interview, so you should think about it in advance. We recommend doing some research on Stirling and the division/ faculty, to show in some detail that you have a considered reply (beyond ‘it is a beautiful campus’). We will see through a generic response and, since it is the first question, your answer will set the tone for the rest of the interview. You might check, for example, who you might share interests with in the Division, and how you might develop links beyond the division or faculty, since this is likely to be a featured question too.

  • Then you might think about what you would bring to the University in a wider sense, such as through well-established (domestic and international) links with other scholars in academic networks.
  • Further, since ‘impact’ is of rising importance, you might discuss your links with people and organisations outside of the University, and how you have pursued meaningful engagement with the public or practitioners to maximise the wider contribution of your research.

Here is how I would qualify that advice for this post: we are likely to focus relatively intensely on specific questions regarding the likely teaching, so please do not feel that you should research the history of the University as preparation.

The interview format

For open-ended contracts, we tend to combine (a) presentations to divisional (and other interested) staff in the morning, with (b) interviews in the afternoon. However, in this case, we will ask you to present briefly to the interview panel.

“Please prepare a 10-minute presentation (with no obligation to use powerpoint) on this question: How would your teaching experience contribute to this Lectureship? Please focus on:

  • coordinating and delivering the advanced undergraduate module Global Political Economy
  • what other advanced undergraduate module you could deliver (based on your research expertise)
  • coordinating and delivering the postgraduate taught module International Organisations
  • supervising undergraduate dissertations in international politics”

In addition:

  • We recommend keeping the (online, via Teams) presentation compact, to show that you can present complex information in a concise and clear way. Presentations are usually a mix of what you do in teaching, research, and what you will contribute in a wider sense to the University (but this one is more focused).
  • The usual interview panel at this level has four members: one subject specialist from the Division (me), one member of the Faculty (our Head of Division), the Head of Faculty of Arts and Humanities, and a senior academic in another Faculty.
  • In other words, only 1 member of your panel will be a subject specialist in Politics (and, in this case, not International Politics). This means that (at the very least) you need to describe your success in a way that a wider audience will appreciate.

It sounds daunting, but we are a friendly bunch and want you to do well. You might struggle to retain all of our names (although they are written on Teams), so focus on the types of question we ask – for example, the general question to get you started will be from the senior manager. There are often more men than women on the panel (I think this one will be 50-50), and they are often all-white panels, but we are committed to making such routine imbalances a thing of the past.

Please email – p.a.cairney@stir.ac.uk – if you have further questions.

Policy Analysis in 750 Words: How to deal with ambiguity

This post forms one part of the Policy Analysis in 750 words series. It draws on this 500 Words post, then my interpretation of co-authored work with Drs Emily St Denny and John Boswell (which I would be delighted to share if it gets published). It trails off at the end.

In policy studies, ambiguity describes the ability to entertain more than one interpretation of a policy problem. There are many ways to frame issues as problems. However, only some frames receive high policymaker attention, and policy change relates strongly to that attention. Resolving ambiguity in your favour is the prize.

Policy studies focus on different aspects of this dynamic, including:

  1. The exercise of power, such as of the narrator to tell stories and the audience to engage with or ignore them.
  2. Policy learning, in which people collaborate (and compete) to assign concrete meaning to abstract aims.
  3. A complex process in which many policymakers and influencers are cooperating/ competing to define problems in many policymaking centres.

They suggest that resolving ambiguity affects policy in different ways.

The latter descriptions, reflecting multi-centric policymaking, seem particularly relevant to major contemporary policy problems – such as global public health and climate crises – in which cooperation across (and outside of) many levels and types of government is essential.

Resolving ambiguity in policy analysis texts

This context helps us to interpret common (Step 1) advice in policy analysis textbooks: define a policy problem for your client, using your skills of research and persuasion but tailoring your advice to your client’s interests and beliefs. Yet, gone are the mythical days of elite analysts communicating to a single core executive in charge of formulating and implementing all policy instruments. Many analysts engage with many centres producing (or co-producing) many instruments. Resolving ambiguity in one centre does not guarantee the delivery of your aims across many.

Two ways to resolve ambiguity in policy analysis

Classic debates would highlight two different responses:

  • ‘Top down’ accounts see this issue through the lens of a single central government, examining how to reassert central control by minimising implementation gaps. Policy analysis may focus on (a) defining the policy problem, and (b) ensuring the implementation of its solution.

  • ‘Bottom up’ accounts identify the inevitability (and legitimacy) of policy influence in multiple centres. Policy analysis may focus on how to define the problem in cooperation with other centres, or to set a strategic direction and encourage other centres to make sense of it in their context.

This terminology went out of fashion, but note the existence of each tendency in two ideal-type approaches to contemporary policy problems:

1. Centralised and formalised approaches.

Seek clarity and order to address urgent policy problems. Define the policy problem clearly, translate that definition into strategies for each centre, and develop a common set of effective ‘tools’ to ensure cooperation and delivery.

Policy analysis may focus on technical aspects, such as how to create a fine-detail blueprint for action, backed by performance management and accountability measures that tie actors to specific commitments.

The tagline may be: ambiguity is a problem to be solved, to direct policy actors towards a common goal.

2. Decentralised, informal, collaborative approaches.

Seek collaboration to make sense of, and address, problems. Reject a single definition of the problem, encourage actors in each centre (or in concert) to deliberate to make sense of problems together, and co-create the rules to guide a continuous process of collective behaviour.

Policy analysis may focus on how to contribute to a collaborative process of sense-making and rule-making.

The tagline may be: ambiguity presents an opportunity to energise policy actors, to harness the potential for innovation arising from deliberation.

Pick one approach and stick with it?

Describing these approaches in such binary terms makes the situation – and choice between approaches – look relatively straightforward. However, note the following issues:

  • Many policy sectors (and intersectoral agendas) are characterised by intense disagreement on which choice to make. These disagreements intersect with others (such as when people seek not only transformative policy change to solve global problems, but also equitable process and outcomes).
  • Some sectors seem to involve actors seeking the best of both worlds (centralise and localise, formalise and deliberate) without recognising the trade-offs and dilemmas that arise.
  • I have described these options as choices, but did not establish if anyone is in a position to make or contribute to that choice.

In that context, resolving ambiguity in your favour may still be the prize, but where would you even begin?

Further reading

Well, that was an unsatisfying end to the post, eh? Maybe I’ll write a better one when some things are published. In the meantime, some of these papers and posts explore some of these issues:

Policy in 500 Words: Trust

This post summarises ‘COVID-19: effective policymaking depends on trust in experts, politicians, and the public’ by Adam Wellstead and me.

The meaning of trust

We define trust as ‘a belief in the reliability of other people, organizations, or processes’, but it is one of those terms – like ‘policy’ – that defies a single comprehensive definition. The term ‘distrust’ complicates things further, since it does not simply mean the absence of trust.

Its treatment in social science also varies, which makes our statement – ‘Trust is necessary for cooperation, coordination, social order, and to reduce the need for coercive state imposition’ – one of many ways to understand its role.

A summary of key concepts

Social science accounts of trust relate it to:

1. Individual choice

I may trust someone to do something if I value their integrity (if they say they will do it, I believe them), credibility (I believe their claim is accurate and feasible), and competence (I believe they have the ability).

This perception of reliability depends on:

  • The psychology of the truster. The truster assesses the risk of relying on others, combining cognition and emotion to weigh the risk of making themselves vulnerable against the benefit of collective action, and drawing on an expectation of reciprocity.
  • The behaviour of the trustee. They demonstrate their trustworthiness in relation to past performance, which shows their competence and reliability and perhaps their selflessness in favour of collective action.
  • Common reference points. The trustee and truster may use shortcuts to collective action, such as a reference to something they have in common (e.g. their beliefs or social background), their past interactions, or the authority of the trustee.

2. Social and political rules (aka institutions).

Perhaps ideally, we would learn who to trust via our experiences of working together, but we also need to trust people we have never met, and put equivalent trust in organisations and ‘systems’.

In that context, approaches such as the Institutional Analysis and Development (IAD) identify the role of many different kinds of rules in relation to trust:

  • Rules can be formal, written, and widely understood (e.g. to help assign authority regardless of levels of interaction) or informal, unwritten, and only understood by some (e.g. resulting from interactions in some contexts).
  • Rules can represent low levels of trust and a focus on deterring breaches (e.g. creating and enforcing contracts) or high levels of trust (e.g. to formalize ‘effective practices built on reciprocity, emotional bonds, and/or positive expectations’).

3. Societal necessity and interdependence.

Trust is a functional requirement. We need to trust people because we cannot maintain a functional society or political system without working together. Trust-building underpins the study of collaboration (or cooperation and bargaining), such as in the Ecology of Games approach (which draws on the IAD).

  • In that context, trust is a resource (to develop) that is crucial to a required outcome.

Is trust good and distrust bad?

We describe trust as ‘necessary for cooperation’ and distrust as a ‘potent motivator’ that may prompt people to ignore advice or defy cooperation or instruction. Yet, neither is necessarily good or bad. Too much trust may be a function of: (1) the abdication of our responsibility to engage critically with leaders in political systems, (2) vulnerability to manipulation, and/ or (3) excessive tribalism, prompting people to romanticise their own cause and demonise others, each of which could lead us to accept uncritically the cynical choices of policymakers.

Further reading

Trust is a slippery concept, and academics often make it slippier by assuming rather than providing a definition. In that context, why not read all of the 500 Words series and ask yourself where trust/ distrust fit in?

Policy Analysis in 750 Words: power and knowledge

This post adapts Policy in 500 Words: Power and Knowledge (the body of this post) to inform the Policy Analysis in 750 words series (the top and tail).

One take home message from the 750 Words series is to avoid seeing policy analysis simply as a technical (and ‘evidence-based’) exercise. Mainstream policy analysis texts break down the process into technical-looking steps, but also show how each step relates to a wider political context. Critical policy analysis texts focus more intensely on the role of politics in the everyday choices that we might otherwise take for granted or consider to be innocuous. The latter connect strongly to wider studies of the links between power and knowledge.

Power and ideas

Classic studies suggest that the most profound and worrying kinds of power are the hardest to observe. We often witness highly visible political battles and can use pluralist methods to identify who has material resources, how they use them, and who wins. However, key forms of power ensure that many such battles do not take place. Actors often use their resources to reinforce social attitudes and policymakers’ beliefs, to establish which issues are policy problems worthy of attention and which populations deserve government support or punishment. Key battles may not arise because not enough people think they are worthy of debate. Attention and support for debate may rise, only to be crowded out of a political agenda in which policymakers can only debate a small number of issues.

Studies of power relate these processes to the manipulation of ideas or shared beliefs under conditions of bounded rationality (see for example the NPF). Manipulation might describe some people getting other people to do things they would not otherwise do. They exploit the beliefs of people who do not know enough about the world, or themselves, to know how to identify and pursue their best interests. Or, they encourage social norms – in which we describe some behaviour as acceptable and some as deviant – which are enforced by (1) the state (for example, via criminal justice and mental health policy), (2) social groups, and (3) individuals who govern their own behaviour with reference to what they feel is expected of them (and the consequences of not living up to expectations).

Such beliefs, norms, and rules are profoundly important because they often remain unspoken and taken for granted. Indeed, some studies equate them with the social structures that appear to close off some action. If so, we may not need to identify manipulation to find unequal power relationships: strong and enduring social practices help some people win at the expense of others, by luck or design.

Relating power to policy analysis: whose knowledge matters?

The concept of ‘epistemic violence’ is one way to describe the act of dismissing an individual, social group, or population by undermining the value of their knowledge or claim to knowledge. Specific discussions include: (a) the colonial West’s subjugation of colonized populations, diminishing the voice of the subaltern; (b) privileging scientific knowledge and dismissing knowledge claims via personal or shared experience; and (c) erasing the voices of women of colour from the history of women’s activism and intellectual history.

It is in this context that we can understand ‘critical’ research designed to ‘produce social change that will empower, enlighten, and emancipate’ (p51). Powerlessness can relate to the visible lack of economic material resources and factors such as the lack of opportunity to mobilise and be heard.

750 Words posts examining this link between power and knowledge

Some posts focus on the role of power in research and/ or policy analysis:

These posts ask questions such as: who decides what evidence will be policy-relevant, whose knowledge matters, and who benefits from this selective use of evidence? They help to (1) identify the exercise of power to maintain evidential hierarchies (or prioritise scientific methods over other forms of knowledge gathering and sharing), and (2) situate this action within a wider context (such as when focusing on colonisation and minoritization). They reflect on how (and why) analysts should respect a wider range of knowledge sources, and how to produce more ethical research with an explicit emancipatory role. As such, they challenge the – naïve or cynical – argument that science and scientists are objective and that science-informed analysis is simply a technical exercise (see also Separating facts from values).

Many posts incorporate these discussions into many policy analysis themes.

See also

Policy Concepts in 1000 Words: Power and Ideas

Education equity policy: ‘equity for all’ as a distraction from race, minoritization, and marginalization. It discusses studies of education policy (many draw on critical policy analysis)

There are also many EBPM posts that slip this discussion of power and politics into discussions of evidence and policy. They don’t always use the word ‘power’ though (see Evidence-informed policymaking: context is everything)

Policy Analysis in 750 Words: Separating facts from values

This post begins by reproducing Can you separate the facts from your beliefs when making policy? (based on the 1st edition of Understanding Public Policy) …

A key argument in policy studies is that it is impossible to separate facts and values when making policy. We often treat our beliefs as facts, or describe certain facts as objective, but perhaps only to simplify our lives or support a political strategy (a ‘self-evident’ fact is very handy for an argument). People make empirical claims infused with their values and often fail to realise just how their values or assumptions underpin their claims.

This is not an easy argument to explain. One strategy is to use extreme examples to make the point. For example, Herbert Simon points to Hitler’s Mein Kampf as the ultimate example of value-based claims masquerading as facts. We can also identify historic academic research which asserts that men are more intelligent than women and some races are superior to others. In such cases, we would point out, for example, that the design of the research helped produce such conclusions: our values underpin our (a) assumptions about how to measure intelligence or other measures of superiority, and (b) interpretations of the results.

‘Wait a minute, though’ (you might say). “What about simple examples in which you can state facts with relative certainty – such as the statement ‘there are X number of words in this post’”. ‘Fair enough’, I’d say (you will have to speak with a philosopher to get a better debate about the meaning of your X words claim; I would simply say that it is trivially true). But this statement doesn’t take you far in policy terms. Instead, you’d want to say that there are too many or too few words, before you decided what to do about it.

In that sense, we have the most practical explanation of the unclear fact/ value distinction: the use of facts in policy is to underpin evaluations (assessments based on values). For example, we might point to the routine uses of data to argue that a public service is in ‘crisis’ or that there is a public health related epidemic (note: I wrote the post before COVID-19; it referred to crises of ‘non-communicable diseases’). We might argue that people only talk about ‘policy problems’ when they think we have a duty to solve them.

Or, facts and values often seem the hardest to separate when we evaluate the success and failure of policy solutions, since the measures used for evaluation are as political as any other part of the policy process. The gathering and presentation of facts is inherently a political exercise, and our use of facts to encourage a policy response is inseparable from our beliefs about how the world should work.

It continues with an edited excerpt from p59 of Understanding Public Policy, which explores the implications of bounded rationality for contemporary accounts of ‘evidence-based policymaking’:

‘Modern science remains value-laden … even when so many people employ so many systematic methods to increase the replicability of research and reduce the reliance of evidence on individual scientists. The role of values is fundamental. Anyone engaging in research uses professional and personal values and beliefs to decide which research methods are the best; generate research questions, concepts and measures; evaluate the impact and policy relevance of the results; decide which issues are important problems; and assess the relative weight of ‘the evidence’ on policy effectiveness. We cannot simply focus on ‘what works’ to solve a problem without considering how we used our values to identify a problem in the first place. It is also impossible in practice to separate two choices: (1) how to gather the best evidence and (2) whether to centralize or localize policymaking. Most importantly, the assertion that ‘my knowledge claim is superior to yours’ symbolizes one of the most worrying exercises of power. We may decide to favour some forms of evidence over others, but the choice is value-laden and political rather than objective and innocuous’.

Implications for policy analysis

Many highly-intelligent and otherwise-sensible people seem to get very bothered with this kind of argument. For example, it gets in the way of (a) simplistic stories of heroic-objective-fact-based-scientists speaking truth to villainous-stupid-corrupt-emotional-politicians, (b) the ill-considered political slogan that you can’t argue with facts (or ‘science’), (c) the notion that some people draw on facts while others only follow their feelings, and (d) the idea that you can divide populations into super-facty versus post-truthy people.

A more sensible approach is to (1) recognise that all people combine cognition and emotion when assessing information, (2) treat politics and political systems as valuable and essential processes (rather than obstacles to technocratic policymaking), and (3) find ways to communicate evidence-informed analyses in that context. This article and 750 Words post explore how to reflect on this kind of communication.

Most relevant posts in the 750 series

Linda Tuhiwai Smith (2012) Decolonizing Methodologies 

Carol Bacchi (2009) Analysing Policy: What’s the problem represented to be? 

Deborah Stone (2012) Policy Paradox

Who should be involved in the process of policy analysis?

William Riker (1986) The Art of Political Manipulation

Using Statistics and Explaining Risk (David Spiegelhalter and Gerd Gigerenzer)

Barry Hindess (1977) Philosophy and Methodology in the Social Sciences

See also

To think further about the relevance of this discussion, see this post on policy evaluation, this page on the use of evidence in policymaking, this book by Douglas, and this short commentary on ‘honest brokers’ by Jasanoff.

Policy Analysis in 750 Words: How to communicate effectively with policymakers

This post forms one part of the Policy Analysis in 750 words series overview. The title comes from this article by Cairney and Kwiatkowski on ‘psychology based policy studies’.

One aim of this series is to combine insights from policy research (1000, 500) and policy analysis texts. How might we combine insights to think about effective communication?

1. Insights from policy analysis texts

Most texts in this series relate communication to understanding your audience (or client) and the political context. Your audience has limited attention or time to consider problems. They may have good antennae for the political feasibility of any solution, but less knowledge of (or interest in) the technical details. In that context, your aim is to help them treat the problem as worthy of their energy (e.g. as urgent and important) and the solution as doable. Examples include:

  • Bardach: communicating with a client requires coherence, clarity, brevity, and minimal jargon.
  • Dunn: argumentation involves defining the size and urgency of a problem, assessing the claims made for each solution, synthesising information from many sources into a concise and coherent summary, and tailoring reports to your audience.
  • Smith: your audience makes a quick judgement on whether or not to read your analysis. Ask yourself questions including: how do I frame the problem to make it relevant, what should my audience learn, and how does each solution relate to what has been done before? Maximise interest by keeping communication concise, polite, and tailored to a policymaker’s values and interests.

2. Insights from studies of policymaker psychology

These insights emerged from the study of bounded rationality: policymakers do not have the time, resources, or cognitive ability to consider all information, possibilities, solutions, or consequences of their actions. They use two types of informational shortcut associated with concepts such as cognition and emotion, thinking ‘fast and slow’, ‘fast and frugal heuristics’, or, if you like more provocative terms:

  • ‘Rational’ shortcuts. Goal-oriented reasoning based on prioritizing trusted sources of information.
  • ‘Irrational’ shortcuts. Emotional thinking, or thought fuelled by gut feelings, deeply held beliefs, or habits.

We can use such distinctions to examine the role of evidence-informed communication, to reduce:

  • Uncertainty, or a lack of policy-relevant knowledge. Focus on generating ‘good’ evidence and concise communication as you collate and synthesise information.
  • Ambiguity, or the ability to entertain more than one interpretation of a policy problem. Focus on argumentation and framing as you try to maximise attention to (a) one way of defining a problem, and (b) your preferred solution.

Many policy theories describe the latter, in which actors: combine facts with emotional appeals, appeal to people who share their beliefs, tell stories to appeal to the biases of their audience, and exploit dominant ways of thinking or social stereotypes to generate attention and support. These possibilities produce ethical dilemmas for policy analysts.

3. Insights from studies of complex policymaking environments

None of this advice matters if it is untethered from reality.

Policy analysis texts focus on political reality to note that even a perfectly communicated solution is worthless if technically feasible but politically unfeasible.

Policy process texts focus on policymaking reality: showing that ideal-types such as the policy cycle do not guide real-world action, and describing more accurate ways to guide policy analysts.

For example, they help us rethink the ‘know your audience’ mantra by:

  • identifying a tendency for most policy to be processed in policy communities or subsystems
  • showing that many policymaking ‘centres’ create the instruments that produce policy change.

Gone are the mythical days of a small number of analysts communicating to a single core executive (and of the heroic researcher changing the world by speaking truth to power). Instead, we have many analysts engaging with many centres, creating a need to not only (a) tailor arguments to different audiences, but also (b) develop wider analytical skills (such as to foster collaboration and the use of ‘design principles’).

How to communicate effectively with policymakers

In that context, we argue that effective communication requires analysts to:

1. Understand your audience and tailor your response (using insights from psychology)

2. Identify ‘windows of opportunity’ for influence (while noting that these windows are outside of anyone’s control)

3. Engage with real world policymaking rather than waiting for a ‘rational’ and orderly process to appear (using insights from policy studies).

See also:

Why don’t policymakers listen to your evidence?

3. How to combine principles on ‘good evidence’, ‘good governance’, and ‘good practice’

Entrepreneurial policy analysis

Policy in 500 Words: Peter Hall’s policy paradigms

Several 500 Word and 1000 Word (a, b, c) posts try to define and measure policy change.

Most studies agree that policymaking systems produce huge amounts of minor change and rare instances of radical change, but not how to explain these patterns. For example:

  • Debates on incrementalism questioned if radical change could be managed via non-radical steps.
  • Punctuated equilibrium theory describes policy change as a function of disproportionately low or high attention to problems, and akin to the frequency of earthquakes (a huge number of tiny changes, and more major changes than we would see in a ‘normal distribution’).

One of the most famous accounts of major policy change is by Peter Hall. ‘Policy paradigms’ help explain a tendency towards inertia, punctuated rarely by radical change (compare with discussions of path dependence and critical junctures).

A policy paradigm is a dominant and often taken-for-granted worldview (or collection of beliefs) about: policy goals, the nature of a policy problem, and the instruments to address it.

Paradigms can operate for long periods, subject to minimal challenge or defended successfully during events that call current policies into question. Adherence to a paradigm produces two ‘orders’ of change:

  • 1st order: frequent, routine changes to the settings of policy instruments, while maintaining the instruments and policy goals.
  • 2nd order: less frequent, non-routine changes to the instruments themselves, while maintaining policy goals.

Radical and rare – 3rd order – policy change may only follow a crisis in which policymakers cannot solve a policy problem or explain why policy is failing. It prompts a reappraisal and rejection of the dominant paradigm, by a new government with new ways of thinking and/or a government rejecting current experts in favour of new ones. Hall’s example was of rapid paradigm shift in UK economic policy – from ‘Keynesianism’ to ‘Monetarism’ – within very few years.

Hall’s account prompted two different debates:

1. Some describe Hall’s case study as unusual.

Many scholars produced different phrases to describe a more likely pattern of (a) non-radical policy changes contributing to (b) long-term paradigm change and (c) institutional change, perhaps over decades. They include: ‘gradual change with transformative results’ and ‘punctuated evolution’ (see also 1000 Words: Evolution).

2. Some describe Hall’s case study as inaccurate.

This UK paradigm change did not actually happen. Instead, there was:

(a) A sudden and profound policy change that did not represent a paradigm shift (the UK experiment with Monetarism was short-lived).

(b) A series of less radical changes that produced paradigm change over decades: from Keynesianism to ‘neo-Keynesianism’, or from state intervention to neoliberalism (such as to foster economic growth via private rather than public borrowing and spending)

These debates connect strongly to issues in policy analysis, particularly if analysts seek transformative policy change to challenge unequal and unfair outcomes (such as in relation to racism or the climate crisis):

  1. Is paradigm change generally only possible over decades?
  2. How will we know if this transformation is actually taking place and here to stay (if even the best of us can be fooled by temporary developments)?

See also:

1. Beware the use of the word ‘evolution’

2. This focus on the endurance of policy instrument change connects to studies of policy success (see Great Policy Successes).

3. Paul Cairney and Chris Weible (2015) ‘Comparing and Contrasting Peter Hall’s Paradigms and Ideas with the Advocacy Coalition Framework’ in (eds) M. Howlett and J. Hogan Policy Paradigms in Theory and Practice (Basingstoke: Palgrave) PDF


Policy in 500 words and Policy Analysis in 750 words: writing about policy

This post is a shortened version of The Politics of Policy Analysis Annex A. It shows how to use insights from policy process research in policy analysis and policymaking coursework (much like the crossover between Scooby-Doo and Batman). It describes a range of exercises, including short presentations, policy analysis papers, blog posts, and essays. In each case, it explains the rationale for each exercise and the payoff to combining them.

If you prefer me to describe these insights less effectively, there is also a podcast.

[See also Writing About Policy 2: Write Harder, which describes how to write a 10000 word dissertation]

One step to combining policy analysis and policy process research is to modify the former according to the insights of the latter. In other words, consider how a ‘new policy sciences’ inspired policy analysis differs from the analyses already provided by 5-step guides.

It could turn out that the effects of our new insights on a policy briefing could be so subtle that you might blink and miss them. Or, there are so many possibilities from which to choose that it is impossible to provide a blueprint for new policy science advice. Therefore, I encourage students to be creative in their policy analysis and reflective in their assessment of their analysis. Our aim is to think about the skills you need to analyse policy, from producing or synthesising evidence, to crafting an argument based on knowing your audience, and considering how your strategy might shift in line with a shifting context.

To encourage creativity, I set a range of tasks so that students can express themselves in different ways, to different audiences, with different constraints. For example, we can learn how to be punchy and concise from a 3-minute presentation or 500-word blog, and use that skill to get to the point more quickly in policy analysis or clarify the research question in the essay.

The overall effect should be that students can take what they have learned from each exercise and use it for the others.

In each section below, I reproduce the ways in which I describe this mix of coursework to students then, in each box, note the underlying rationale.

1. A 3-minute spoken presentation to your peers in a seminar.

In 3 minutes, you need to identify a problem, describe one or more possible solutions, and end your presentation in a convincing way. For example, if you don’t make a firm recommendation, what can you say to avoid looking like you are copping out? Focus on being persuasive, to capture your audience’s imagination. Focus on the policy context, in which you want to present a problem as solvable (who will pay attention to an intractable problem?) but not make inflated claims about how one action can solve a major problem. Focus on providing a memorable take home message.

The presentation can be as creative as you wish, but it should not rely on powerpoint in the room. Imagine that none of the screens work or that you are making your pitch to a policymaker as you walk along the street: can you make this presentation engaging and memorable without any reference to someone else’s technology? Can you do it without just reading out your notes? Can you do it well in under 3 minutes? We will then devote 5 minutes to questions from the audience about your presentation. Being an active part of the audience – and providing peer review – is as important as doing a good presentation of your own.

BOX A1: Rationale for 3-minute presentation.

If students perform this task first (before the coursework is due), it gives them an initial opportunity to see how to present only the most relevant information, and to gauge how an audience responds to their ideas. Audience questions provide further peer-driven feedback. I also plan a long seminar to allow each student (in a group of 15-20 people) to present, then ask all students about which presentation they remember and why. This exercise helps students see that they are competing with each other for limited policymaker attention, and learn from their peers about what makes an effective pitch. Maybe you are wondering why I discourage powerpoint. It’s largely because it will cause each presenter to go way over time by cramming in too much information, and this problem outweighs the benefit of being able to present an impressive visualisation. I prefer to encourage students to only tell the audience what they will remember (by only presenting what they remember).

2. A policy analysis paper, and 3. A reflection on your analysis

Provide a policy analysis paper which has to make a substantive argument or recommendation in approximately two pages (1000 words), on the assumption that busy policymakers won’t read much else before deciding whether or not to pay attention to the problem and your solutions. Then provide a reflection paper (also approximately 1000 words) to demonstrate your theoretical understanding of the policy process. You can choose how to split the 2000-word total between analysis and reflection: give each exercise 1000 words (roughly a 2-page analysis), provide a shorter analysis and more reflection, or widen the analysis and reject the need for conceptual reflection. The choice is yours to make, as long as you justify it in your reflection.

When writing policy analysis, I ask you to keep it super-short on the assumption that you have to make your case quickly to people with 99 other things to do. For example, what can you tell someone in one paragraph or a half-page to get them to read all 2 pages?  It is tempting to try to tell someone everything you know, because everything is connected and to simplify is to describe a problem simplistically. Instead, be smart enough to know that such self-indulgence won’t impress your audience. In person, they might smile politely, but their eyes are looking at the elevator lights. In writing, they can skim your analysis or simply move on. So, use these three statements to help you focus less on your need to supply information and more on their demand:

  1. Your aim is not to give a full account of a problem. It is to get powerful people to care about it.
  2. Your aim is not to give a painstaking account of all possible solutions. It is to give a sense that at least one solution is feasible and worth pursuing.
  3. Your guiding statement should be: policymakers will only pay attention to your problem if they think they can solve it, and without that solution being too costly.

Otherwise, I don’t like to give you too much advice because I want you to be creative about your presentation; to be confident enough to take chances and feel that you’ll see the reward of making a leap. At the very least, you have three key choices to make about how far you’ll go to make a point:

  1. Who is your audience? Our discussion of the limits to centralised policymaking suggests that your most influential audience will not necessarily be an elected policymaker, but who else would it be?
  2. How ‘manipulative’ should you be? Our discussions of ‘bounded rationality’ and ‘evidence-based policymaking’ suggest that policymakers combine ‘rational’ and ‘irrational’ shortcuts to gather information and make choices. So, do you appeal to their desire to set goals and gather a lot of scientific information, make an emotional appeal, or rely on Riker-style heresthetics?
  3. What is your role? Contemporary discussions of science advice to government highlight unresolved debates about the role of unelected advisors: should you simply lay out some possible solutions or advocate one solution strongly?

For our purposes, there are no wrong answers to these questions. Instead, I want you to make and defend your decisions. That is the aim of your policy paper ‘reflection’: to ‘show your work’. You still have some room to be creative in your reflection: tell me what you know about policy theory and how it informed your decisions. Here are some examples, but it is up to you to decide what to highlight:

  1. Show how your understanding of policymaker psychology helped you decide how to present information on problems and solutions.
  2. Extract insights from policy theories, such as from punctuated equilibrium theory on policymaker attention, multiple streams analysis on timing and feasibility, or the NPF on how to tell persuasive stories.
  3. Explore the implications of the lack of ‘comprehensive rationality’ and absence of a ‘policy cycle’: feasibility is partly about identifying the extent to which a solution is ‘doable’ when central governments have limited powers. What ‘policy style’ or policy instruments would be appropriate for the solution you favour?

I use the following questions to guide the marking on the policy paper: Tailored properly to a clearly defined audience? Punchy and concise summary? Clearly defined problem? Good evidence or argument behind the solution? Clear recommendations backed by a sense that the solution is feasible? Evidence of substantial reading, accompanied by well explained further reading?

In my experience of marking, successful students gave a very clear and detailed account of the nature and size of the policy problem. The best reports used graphics and/ or statistics to describe the problem in several ways. Some identified a multi-faceted problem – such as in health outcomes and health inequalities – without presenting confusing analysis. Some were able to present an image of urgency, to separate this problem from the many others that might grab policymaker attention. Successful students presented one or more solutions which seemed technically and/ or politically feasible. By technically feasible, I mean that there is a good chance that the policy will work as intended if implemented. For example, they provided evidence of its success in a comparable country (or in the past) or outlined models designed to predict the effects of specific policy instruments. By politically feasible, I mean that they considered how open their audience would be to the solution, and how likely the suggestion was to be acceptable to key policymakers. Some students added to a good discussion of feasibility by comparing the pros/ cons of different scenarios. In contrast, some relatively weak reports proposed solutions which were vague, untested, and/ or not likely to be acted upon.

BOX A2: Rationale for policy analysis and reflection

Students already have 5-step policy analysis texts at their disposal, and they give some solid advice about the task. However, I want to encourage students to think more about how their knowledge of the policy process will guide their analysis. First, what do you do if you think that one audience will buy your argument, and another reject it wholeheartedly? Just pretend to be an objective analyst and put the real world in the ‘too hard’ pile? Or, do you recognise that policy analysts are political actors and make your choices accordingly? For me, an appeal to objectivity, combined with insufficient recognition of the ways in which people respond emotionally to information, is a total cop-out. I don’t want to contribute to a generation of policy analysts who provide long, rigorous, and meticulous reports that few people read and fewer people use. Instead, I want students to show me how to tell a convincing story with a clear moral, or frame policy analysis to grab their audience’s attention and generate enthusiasm to try to solve a problem. Then, I want them to reflect on how they draw the line between righteous persuasion and unethical manipulation.

Second, how do you account for policymaking complexity? You can’t assume that there is a cycle in which a policymaker selects a solution and it sets in train a series of stages towards successful implementation. Instead, you need to think about the delivery of your policy as much as the substance. Students have several choices. In some cases, they will describe how to deliver policy in a multi-level or multi-centric environment, in which, say, a central government actor will need to use persuasion or cooperation rather than command-and-control. Or, if they are feeling energetic, they might compare a top-down delivery option with support for Ostrom-style polycentric arrangements. Maybe they’ll recommend pilots and/ or trial and error, to monitor progress continuously instead of describing a one-shot solution. Maybe they’ll reflect on multiple streams analysis and think about how you can give dependable advice in a policy process containing some serendipity. Who knows? Policy process research is large and heterogeneous, which opens the possibility for some creative solutions that I won’t be able to anticipate.

4. One kind of blog post (for the policy analysis)

Write a short and punchy blog post which recognises the need to make an argument succinctly and grab attention with the title and first sentence/ paragraph, on the assumption that your audience will be reading it on their phone and will move on to something else quickly. In this exercise, your blog post is connected to your policy analysis. Think, for example, about how you would make the same case for a policy solution to a wider ‘lay’ audience. Or, use the blog post to gauge the extent to which your client could sell your policy solution. If they would struggle, should you make this recommendation in the first place?

Your blog post audience is wider than your policy analysis audience. You are trying to make an argument that will capture the attention of a larger group of people who are interested in politics and policy, but without being specialists. They will likely access your post from Twitter/ Facebook or via a search engine. This constraint produces a new requirement: present a punchy title which sums up the whole argument in under 280 characters (a statement is often better than a vague question); summarise the whole argument in approximately 100 words in the first paragraph (what is the problem and solution?); then provide more information up to a maximum of 500 words. The reader can then be invited to read the whole policy analysis.
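
As a rough illustration of these limits, here is a hypothetical checker – written for this post, not a tool I actually use or provide – that flags a draft whose title or word counts break the constraints described above.

```python
# Hypothetical helper, for illustration only: check a draft post against the rough
# limits described above (title <= 280 characters, first paragraph around 100 words,
# whole post no more than about 500 words).
def check_blog_draft(title: str, paragraphs: list[str]) -> list[str]:
    warnings = []
    if len(title) > 280:
        warnings.append(f"Title is {len(title)} characters; aim for 280 or fewer.")
    first_para_words = len(paragraphs[0].split()) if paragraphs else 0
    if first_para_words > 120:
        warnings.append(f"First paragraph is {first_para_words} words; aim for roughly 100.")
    total_words = sum(len(p.split()) for p in paragraphs)
    if total_words > 500:
        warnings.append(f"Post is {total_words} words; aim for 500 or fewer.")
    return warnings

# Example use (hypothetical draft):
# for warning in check_blog_draft("Minimum unit pricing: a feasible next step", ["...", "..."]):
#     print(warning)
```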

The style of blog posts varies markedly, so you should consult many examples before attempting your own (for example, compare the LSE with The Conversation and newspaper blogs to get a sense of variations in style). When you read other posts, take note of their strengths and weaknesses. For example, many posts associated with newspapers introduce a personal or case study element to ground the discussion in an emotional appeal. Sometimes this works, but sometimes it causes the reader to scroll down quickly to find the main argument. Perhaps ironically, I recommend storytelling but I often skim past people’s stories. Many academic posts are too long (well beyond your 500-word limit), take too long to get to the point, and do not make explicit recommendations, so you should not emulate them. You should aim to be better than the scholars whose longer work you read. You should not just chop down your policy analysis to 500 words; you need a new kind of communication.

Hopefully, by the end of this fourth task, you will appreciate the transferable life skills. I have generated some uncertainty about your task to reflect the sense among many actors that they don’t really know how to make a persuasive case and who to make it to. We can follow some basic Bardach-style guidance, but a lot of this kind of work relies on trial-and-error. I maintain a short word count to encourage you to get to the point, and I bang on about ‘stories’ in modules to encourage you to present a short and persuasive story to policymakers.

This process seems weird at first, but isn’t it also intuitive? For example, next time you’re in my seminar, measure how long it takes you to get bored and look forward to the weekend. Then imagine that policymakers have the same attention span as you. That’s how long you have to make your case! Policymakers are not magical beings with an infinite attention span. In fact, they are busier and under more pressure than us, so you need to make your pitch count.

BOX A3: Rationale for blog post 1

This exercise forces students to make their case in 500 words. It helps them understand the need to communicate in different ways to different audiences. It suggests that successful communication is largely about knowing how your audience consumes information, rather than telling people all you know. I gauge success according to questions such as: Punchy and eye grabbing title? Tailored to an intelligent ‘lay’ audience rather than a specific expert group? Clearly defined problem? Good evidence or argument behind the solution? Clear recommendations backed by a sense that the solution is feasible? Well embedded weblinks to further relevant reading?

5. Writing a theory-informed essay

I tend to set this simple-looking question for coursework in policy modules: what is policy, how much has it changed, and why? Students get to choose the policy issue, timeframe, political system, and relevant explanatory concepts.

On the face of it, it looks very straightforward. Give it a few more seconds, and you can see the difficulties:

  1. We spend a lot of time in class agreeing that it seems almost impossible to define policy
  2. There are many possible measures of policy change
  3. There is an almost unmanageable number of models, concepts, and theories to use to explain policy dynamics.

I try to encourage some creativity when solving this problem, but also advise students to keep their discussion as simple and jargon-free as possible (often by stretching an analogy with competitive diving, in which a well-executed simple essay can score higher than a belly-flopped hard essay).

Choosing a format: the initial advice

  1. Choose a policy area (such as health) or issue (such as alcohol policy).
  2. Describe the nature of policy, and the extent of policy change, in a particular time period (such as in a particular era, after an event or constitutional change, or after a change in government).
  3. Select one or more policy concepts or theory to help structure your discussion and help explain how and why policy has changed.

For example, a question might be: What is tobacco policy in the UK, how much has it changed since the 1980s, and why? I use this example because I try to answer that question myself, even though some of my work is too theory-packed to be a good model for a student essay (Cairney, 2007 is essentially a bad model for students).

Choosing a format: the cautionary advice

You may be surprised about how difficult it is to answer a simple question like ‘what is policy?’ and I will give you a lot of credit for considering how to define and measure it; by identifying, for example, the use of legislation/ regulation, funding, staff, and information sharing, and/ or by considering the difference between, say, policy as a statement of intent or a long term outcome. In turn, a good description and explanation of policy change is difficult. If you are feeling ambitious, you can go further, to compare, say, two issues (such as tobacco and alcohol) or places (such as UK Government policy and the policy of another country), but sometimes a simple and narrow discussion can be more effective. Similarly, you can use many theories or concepts to aid explanation, but one theory may do. Note that (a) your description of your research question, and your essay structure, is more important than (b) your decision on what topic or concepts to use.

BOX A4: Rationale for the essay

The wider aim is to encourage students to think about the relationship between different perspectives on policy theory and analysis. For example, in a blog and policy analysis paper they try to generate attention to a policy problem and advocate a solution. Then, they draw on policy theories and concepts to reflect on their papers, highlighting (say): the need to identify the most important audience; the importance of framing issues with a mixture of evidence and emotional appeals; and, the need to present ‘feasible’ solutions.

The reflection can provide a useful segue to the essay, since we’re already identifying important policy problems, advocating change, reflecting on how best to encourage it – such as by presenting modest objectives – and then, in the essay, trying to explain (say) why governments have not taken that advice in the past. Their interest in the policy issue can prompt interest in researching the issue further; their knowledge of the issue and the policy process can help them develop politically-aware policy analysis. All going well, it produces a virtuous circle.

BOX A5: Rationale for blog post 2

I get students to do the analysis/reflection/blog combination in the first module, and an essay/ blog combo in the second module. The second blog post has a different aim. Students use the 500 words to present a jargon-free analysis of policy change. The post represents a useful exercise in theory translation. Without it, students tend to use a large amount of jargon because I am the audience and I understand it. By explaining the same thing to a lay audience, they are obliged to explain key developments in plain language. This requirement should also help them present a clearer essay, because people (academics and students) often use jargon to cover the fact that they don’t really know what they are saying.


The future of education equity policy: ‘neoliberal’ versus ‘social justice’ approaches

This post summarises Cairney and Kippin’s qualitative systematic review of peer-reviewed research on education equity policy. See also: The future of equity policy in education and health: will intersectoral action be the solution? and posts on ‘Health in All Policies’ and health inequalities.

Governments, international organisations, and researchers all express a high and enduring commitment to ‘education equity’. Yet, this is where the agreement ends.

The definition of the problem of inequity and the feasibility of solutions is highly contested, to the extent that it is common to identify two competing approaches:

1. A ‘neoliberal’ approach, focusing on education’s role in the economy, market-based reforms, and ‘new public management’ reforms to schools.

2. A ‘social justice’ approach, focusing on education’s role in student wellbeing and life opportunities, and state-led action to address the wider social determinants of education outcomes.

Almost all of the research included in our review suggests that the neoliberal approach dominates international and domestic policy agendas at the expense of the wider focus on social justice.

We describe education equity researchers as the narrators of cautionary tales of education inequity. Most employ critical policy analysis to challenge what they call the dominant stories of education that hinder meaningful equity policies.

First, many describe common settings, including a clear sense that unfair inequalities endure despite global and domestic equity rhetoric.

They also describe the multi-level nature of the governance of education, but with less certainty about relationships across levels. A small number of international organisations and countries are key influencers of a global neoliberal agenda and there is discretion to influence policy at local and school levels. In that context, some studies relate the lack of progress to the malign influence of one or more levels, such as global and central government agendas undermining local change, or local actors disrupting central initiatives.

Second, studies describe similar plots. Many describe stymied progress on equity caused by the negative impacts of neoliberalism: undermining equity by (1) equating it with narrow definitions of equal access to well-performing schools and test-based attainment outcomes, and (2) taking attention from social justice to focus on economic competitiveness.

Many describe policymakers using a generic focus on equity as a facade, to ignore and reproduce inequalities in relation to minoritized populations. Or, equity is a ‘wicked’ issue that defies simple solutions. Many plots involve a contrast between agency-focused narratives that emphasise hopefulness (e.g. among ‘change agents’) and systemic or structural narratives that emphasise helplessness.

Third, they present common ideas about characters. In global narratives, researchers challenge the story told by international organisations, in which they are the heroes providing funding backed by crucial instructions to make education systems and economies competitive. Most education articles portray neoliberal international organisations and central governments as the villains: narrowing equity to simplistic measures of performance at the expense of more meaningful outcomes.

At a national and local level, they criticise the dominant stories of equity within key countries, such as the US, that continue to reproduce highly unequal outcomes while projecting a sense of progress. The most vividly told story is of white parents, who portray their ‘gifted’ children as most deserving of advantage in the school system, and therefore the victims of attempts to widen access or redistribute scarce resources (high quality classes and teachers). Rather, these parents are the villains standing – sometimes unintentionally, but mostly intentionally – in the way of progress.

The only uncertainty regards the role of local and school leaders. In some cases, they are the initially-heroic figures, able to find ways to disrupt a damaging national agenda and become the ‘change agents’ that shift well-established rules and norms before being thwarted by community and parental opposition. In others, they are perhaps-unintentional villains who reproduce racialised, gendered, or class-based norms regarding which students are ‘gifted’ and worthy of investment versus which students need remedial classes or disrupt other learners.

Fourth, the moral of the story is mostly clear. Almost all studies criticise the damaging impact of neoliberal definitions of equity and the performance management and quasi-market techniques that support it. They are sold as equity measures but actually exacerbate inequalities. As such, the moral is to focus our efforts elsewhere: on social justice, the social and economic determinants of education, and the need to address head-on the association between inequalities and minoritized populations (to challenge ‘equity for all’ messages). However, it is difficult to pinpoint the source of much-needed change. In some cases, strong direction from central governments is necessary to overcome obstacles to change. In others, only bottom-up action by local and school leaders will induce change.

Perhaps the starkest difference in approaches relates to expectations for the future. For ‘neoliberal’ advocates, solutions such as market incentives or education system reforms will save schools and the next generation of students. In contrast, ‘social justice’ advocates expect these reforms to fail and cause irreparable damage to the prospect of education equity.


The future of equity policy in education and health: will intersectoral action be the solution?

This post was first published by NORRAG. It summarises key points from two qualitative systematic reviews of peer-reviewed research on health equity policy (Cairney, St Denny, Mitchell) and education equity policy (Cairney, Kippin) for the European Research Council funded IMAJINE project. Our focus on comparing strategies within sectors supplements a wider focus on spatial justice (and cross-sectoral gender equity) strategies. It is published in conjunction with a GHC and NORRAG joint event “The Future of Equity Policy in Education and Health: Will Intersectoral Action be the Solution?” scheduled for 02 November at 17:00-18:30 CET/Geneva, which will discuss the opportunities and challenges to intersectoral research, practice and policy in education and health. Register for the event here

Many governments, international organisations, practitioners, and researchers express high rhetorical support for more equitable policy outcomes. However, the meaning of equity is vague, the choice of policy solutions is highly contested, and approaches to equity policy vary markedly in different policy sectors. 

In that context, it is common for policymakers to back up this equity policy rhetoric with a commitment to intersectoral action and collaboration inside and outside of government, described with terms such as holistic, joined-up, collaborative, or systems approaches to governance. At the same time, it is common for research on policymaking to highlight the ever-present and systemic obstacles to the achievement of such admirable but vague aims.

Our reviews of equity policy and policymaking in two different sectors – health and education – highlight these obstacles in different ways.

In health, the global equity strategy Health in All Policies (HiAP) describes a coherent and convincing rationale for intersectoral action and collaboration inside and outside of government:

  1. Health is a human right to be fostered and protected by all governments.
  2. Most determinants of health inequalities are social – relating to income, wealth, education, housing, social, and physical environments – and we should focus less on individual choices and healthcare.
  3. Policies to address social determinants are not in the gift of health sectors, so we need intersectoral action to foster policy changes, such as in relation to tax and spending, education, and housing. 
  4. Effective collaborative strategies foster win-win solutions and the co-production of policy, and avoid the perception of ‘health imperialism’ or interference in the work of other professions. 

Yet, our review of HiAP articles suggests that very few projects deliver on these aims. In some cases, authors express frustration that people in other sectors do not take their health aims seriously enough. Or, those actors make sense of HiAP aims in different ways, turning a social determinants approach into projects focusing more on individual lifestyles. These experiences highlight governance dilemmas, in which the need to avoid ‘health imperialism’ leads to minimal challenges to the status quo, or HiAP advocates seek contradictory approaches such as to formalize HiAP strategies from the top-down (to ensure high-level commitment to reform) and encourage collaborative ‘bottom-up’ approaches (to let go of those reforms to foster creative and locally tailored solutions). 

In education, it is more difficult to identify a single coherent rationale for wider intersectoral action. Within ‘social justice’ approaches, there is some focus on the ‘out of school’ factors crucial to learning and attainment processes and outcomes, particularly when describing the marginalization and minoritization of social groups. There are also some studies of systems-based approaches to education. However, there is a more general tendency to focus on sector-specific activities and solutions, including reforms to education systems and school governance. Further, agenda setting organizations such as the OECD foster the sense that investment in early years education, well governed schools and education systems, and reallocations of resources to boost capacity in schools in deprived areas, can address problems of unequal attainment. 

In other words, in both sectors we can often find a convincing rationale for practitioners in one sector to seek cooperation with other sectors. However, no study describes an effective way to do it, or even progress towards new ways of thinking. Indeed, perhaps the most striking proxy indicator of meaningful intersectoral action comes from the bibliographies of these articles. It is clear from the reading lists of each sector that they are not reading each other’s work. The literature on intersectoral action comes with a narrow sectoral lens. 

In sum, intersectoral action and collaboration remains a functional requirement – and a nice idea – rather than a routine activity.


Creeping crisis: the UK government’s response to COVID-19 (and the role of experts)

These are my presentation notes on the UK government’s initial response to COVID-19, as part of the ‘The Role of Experts’ panel at the Creeping Crisis conference. I draw on articles and blog posts stored in my COVID-19 page. You can find more on the idea of a COVID-19 creeping crisis in Hiding in Plain Sight: Conceptualizing the Creeping Crisis:

In December 2019, a new Coronavirus emerged in China. As little was known about the immediate consequences of the virus, the world paid scant attention. That hardly changed when China announced that the outbreak of the virus was dangerous and subsequently locked down its entire population, bringing its juggernaut economy to a sudden halt. When the first cases emerged in one European country, other countries did not take any measures. When the World Health Organization branded Europe as the new hot spot of the pandemic, the United States did not react. When the first deaths were registered on the U.S. West Coast, the New York City mayor admonished his citizens to stick with their routines (keep going to restaurants!). The COVID‐19 crisis crept up on countries, cities, and hospitals. It arrived in full view, yet still surprised politicians, hospital administrators, pundits, business owners, and citizens

(Boin et al, 2020: 2).

A creeping crisis is a threat to widely shared societal values or life‐sustaining systems that evolves over time and space, is foreshadowed by precursor events, subject to varying degrees of political and/or societal attention, and impartially or insufficiently addressed by authorities

(Boin et al, 2020: 7).

Our conference’s aim is to answer two broad questions, adapted as follows:

  1. What are the main lessons to learn from the initial UK government response?
  2. What are the main research challenges for our community?

Q1 Lessons from initial research: analysing policy failures

From The UK government’s COVID‑19 policy: assessing evidence‑informed policy analysis in real time and What have we learned so far from the government’s COVID-19 policy?

By late March 2020, COVID-19 prompted almost-unprecedented policy change, towards state intervention, at a speed and magnitude that seemed unimaginable before 2020. Yet, many have criticised the UK government’s response as slow and insufficient, contributing to the UK’s internationally high number of excess deaths.

Initial criticisms include that ministers did not:

  • Take COVID-19 seriously enough in relation to existing evidence, when its devastating effect was apparent in China in January and Italy from February
  • Act as quickly as other countries to test for infection to limit its spread
  • Introduce swift-enough measures to close schools, businesses, and major social events.
  • Secure enough personal protective equipment (PPE), testing capacity, and an effective test-trace-and-isolate system.
  • Respond to the right epidemic (assuming that COVID-19 could be treated like influenza)
  • Pursue an elimination strategy to minimise its spread until a vaccine could be developed.
  • Use the right models and data to estimate the R (rate of transmission) and ‘doubling time’ of cases (which suggested locking down earlier; see the illustrative calculation below).
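
On the doubling-time point, the toy calculation below uses illustrative numbers only – not the models or data used by the government or its advisers – to show why the estimated growth rate mattered: small differences in daily growth imply very different doubling times, and therefore very different windows for action.

```python
# Illustrative arithmetic only (made-up growth rates, not SAGE's models or data):
# how the doubling time of cases follows from the daily growth rate.
import math

def doubling_time(daily_growth_rate: float) -> float:
    """Days for case numbers to double, given a constant daily growth rate (0.1 = 10% per day)."""
    return math.log(2) / math.log(1 + daily_growth_rate)

for growth in (0.05, 0.10, 0.20, 0.30):
    print(f"{growth:.0%} daily growth -> cases double roughly every {doubling_time(growth):.1f} days")

# At 5% daily growth, cases double roughly every 14 days; at 30%, roughly every 2.6 days.
# Under-estimating the growth rate therefore makes the epidemic look much slower than it is.
```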

Q1 Lessons from Parliament: failure and success

A new report by the House of Commons Health and Social Care and Science and Technology Committees describes Covid-19 as ‘the biggest crisis our country has faced in generations’, which ‘disrupted our lives to an extent few predicted’. Its ‘lessons learned to date’ are similarly negative, although there are some positive lessons on vaccine development and roll-out (the following points appear in the Executive Summary).

Negative lessons

To explain why ‘in 2020 the UK did significantly worse in terms of covid deaths than many countries’:

  • Poor pandemic preparedness. UK planning was based on influenza, not more relevant experiences such as SARS.
  • Insufficient initial action, to pursue ‘non-pharmaceutical interventions’. The initial UK policy response was based on fatalism. It assumed that infection spread was inevitable and that people would not tolerate lockdown or ‘social distancing’ measures. It should have intervened more quickly when it emerged that there was no feasible alternative to lockdown. Its subsequent actions show that the UK public supported and followed lockdown measures.
  • Groupthink and an inability to learn from best practice. The rejection of lessons from ‘East and South East Asian countries’ by policymakers and their scientific advisers reflects a wider problem of groupthink:

‘The fact that the UK approach reflected a consensus between official scientific advisers and the Government indicates a degree of groupthink that was present at the time which meant we were not as open to approaches being taken elsewhere as we should have been’.

  • Limited capacity to test, trace and isolate. The government gave up too early on community testing before eventually ramping up capacity. Then, the establishment of its NHS Test and Trace was chaotic, largely because it established a new centralised system rather than relying on more-established local capacity. Test and trace policy did not deliver on its aim to develop an alternative to prevent further lockdowns. The test and trace system is now good, but is not accompanied by an effective compensation model to allow people to isolate.
  • Insufficient National Health Service (NHS) capacity. The government acted quickly and well to boost emergency hospital capacity, but without ensuring the maintenance of equally important core services (e.g. cancer treatment).
  • Failure to protect social care. Policymakers and scientific advisers were too late to recognise the impact of discharging people from hospitals to social care ‘without adequate testing or rigorous isolation’ (again, without learning from international practice).
  • Excessive ministerial optimism underpinned the rejection of science advice. Ministers paid insufficient attention to scientific advice on the need for further lockdowns to address surges of infection in Autumn 2020.
  • Lack of attention to inequitable outcomes. ‘Black, Asian and Minority Ethnic communities’ faced  disproportionately (a) high rates of death and illness, (b) low access to PPE, and (c) low access to safe housing and working conditions. ‘People with learning disabilities and autistic people’ faced (a) higher mortality risk (exacerbated by inappropriate ‘do not resuscitate’ orders), (b) lower access to essential care services, and (c) diminished contact with family members and carers.

Positive lessons

  • The vaccine rollout was a success. The UK vaccine programme was ‘one of the most effective in Europe and, for a country of our size one of the most effective in the world’. It resulted from major and early investment in research and development, an effective regulatory response, and a Vaccines Taskforce led with authority.
  • Its research on COVID-19 treatment is world-leading.

Q2. Research challenges for a research and practice community

There are many different research challenges based on what researchers want to do, including:

Using research to change the minds of policymakers: COVID-19 crisis strategy

One specific criticism is that UK ministers and their advisers defined the COVID-19 policy problem incorrectly. Critics sought a shift of approach from (a) managing a chronic and seasonal problem, to (b) pursuing an elimination strategy until a vaccine was available, but expressed continual frustration about their lack of impact on government policy and official advice (a complaint captured partly by the idea of ‘groupthink’).

Summary of the UK Government’s definition of the policy problem (from here)

1. The UK Government’s COVID-19 Policy: What Does “Guided by the Science” Mean in Practice?

2. COVID-19: effective policymaking depends on trust in experts, politicians, and the public

These articles help to explain this relative lack of ‘impact’ by most potential expert advisers:

First, to all intents and purposes, policymakers need to find ways to ignore almost all information. One way is to rely on a small number of trusted experts. ‘Guided by the science’ means guided by our scientific advisers.

Second, a classic categorisation of interest group strategy and status helps to categorise the role and status of expert advisers.

  • An insider strategy follows the ‘rules of the game’, including: accept a government’s definition of the problem (or right to define it), be pragmatic, present modest demands, and don’t criticise the outcomes in public.
  • Non-governmental advisers may learn – and largely follow – similar rules.
  • Some advisers are also civil servants, expected to follow additional – formal and informal – rules associated with their conduct in government.
  • One example of a formal rule is to defend a distinction between (a) officials giving evidence and advice and (b) ministers making policy.
  • Informal rules may describe how to conduct yourself in discussion (in ways that contrast with the idea of maverick scientists speaking truth to power at all costs)

Third, governments assign status to groups based on their resources, policy positions, and willingness to pursue an insider strategy.

  • Core insiders (senior government science advisers) are consulted regularly in relation to the general problem. They know and follow the rules, and can navigate complex policy processes.
  • Specialist insiders (such as members of the Scientific Advisory Group for Emergencies, SAGE) provide advice on specific issues. They appear sensitive to informal rules when speaking in public, and may have some navigation skills.
  • Peripheral insiders are consulted cosmetically, and few have enough experience of engagement to learn and follow the rules.
  • Outsiders may be of no use to government and/or reject the rules of the game. Many prefer the rules or principles that they associate with their profession, including transparency, visibility, responsibility, integrity, independence, and accountability.

Take-home message. This dynamic suggests that a vague criticism of groupthink, or push for the reform of advisory systems, will not address the routine assignment of core insider status to very few people. Ministers will identify good reasons to trust very few advisers, and have virtually no incentive to listen to external critics.

‘some experts remain core insiders if they advise on policies that they do not necessarily support, while outsiders have the freedom to criticize the policy they were unable to influence’

Using research to change the minds of policymakers: COVID-19 and inequalities

A more general criticism is that governments (such as the UK) do not back up their ‘Health in All Policies’ rhetoric with substantive action. HiAP focuses on the ‘social determinants’ of health and health inequalities:

‘significant and persistent disparities in health outcomes caused by structural inequities in social and economic factors, including employment opportunities, the law and the justice systems, education, housing, neighborhood environments, and transportation. These elements are otherwise known as the social determinants of health. The opportunity or lack of opportunity to be healthy is too often associated with a person’s socioeconomic status, race, ethnicity, gender, religion, sexual identity, or disability’

(Bliss et al., 2016: S88).

 The future of public health policymaking after COVID-19 explores how HiAP advocates seek (largely in vain) to use policy theory insights to challenge the lack of policy progress, such as via framing and coalition building strategies, and seeking ‘windows of opportunity’ to act. One challenge for this kind of work is that policy theories are designed largely to explain policymaking constraints, not to help overcome them:

‘relatively abstract policy theories will rarely provide concrete advice of how to act and what to do in all given contexts. There are too many variables in play to make this happen. The complexity of policy processes, its continuously changing nature, and its diversity across contexts, prevent precise prediction for policy actors seeking influence or policy change’

(Weible and Cairney, 2018: 186)

Q2. Research challenges for policy scholars

The most relevant challenge is to juggle three different statements on policy learning:

  1. We should encourage certain kinds of research-informed ‘policy learning’.

The House of Commons report (discussed in Q1) is a good example of exhortation to learn from – and perhaps adopt – ‘international best practice’. It is part of a collection of continuous and energetic calls for the UK Government to learn from the policies of more successful countries such as South Korea. Indeed, Dominic Cummings (former Special Adviser to Prime Minister Boris Johnson) declared that: ‘Essentially if we just cut and pasted what they were doing in Singapore or Taiwan or whatever, and just said that’s our policy everything would have been better’ (Oral evidence to the House of Commons Science and Technology Committee, 26.5.21).

  2. Few accounts describe how to do it.

Very few accounts provide enough (a) clarity to describe convincingly what and how to learn, or (b) awareness of political and policymaking reality, to present plausible claims.

This lack of clarity is apparent in published academic articles that claim – misleadingly – to facilitate policy learning or transfer: Intra-crisis learning and prospective policy transfer in the COVID-19 pandemic.

  3. Learning (and ‘policy transfer’) is a political act. We need to understand why governments will not learn from most other governments.

This work-in-progress presents three questions to guide policy learning, followed by a simple distinction between (a) what policy analysts or designers may seek (an agency-focused approach), and (b) what we would expect to actually happen in complex policymaking environments (a context-focused approach).

1. What is the evidence for one government’s success, and from where does it come?   

  • Policy analysis: seek multiple independent sources of evidence.
  • Policy process: (a) political actors compete to define good evidence and its implications; (b) governance choices (on the extent to which policy is centralised) influence evidence choices.

2. What story do exporters/ importers of policy tell about the problem they seek to solve?

  • Policy analysis: improve comparability by establishing how each government defines the policy problem, establishes the feasibility of solutions, and measures success.
  • Policy process: it is often not possible to determine a policymaker’s motivation, especially when many venues or levels of government contribute to policy.

3. Do they have comparable political and policymaking systems?

  • Policy analysis: identify the comparable features of each political system (e.g. federal/ unitary).
  • Policy process: identify the comparable features of policymaking systems (e.g. actors, institutions, networks, ideas, socioeconomic context).

A quick application of these questions to UK learning from South Korea helps to demonstrate the low likelihood of it happening:

  1. What is the evidence for one government’s success, and from where does it come? UK ministers and scientific advisers expressed scepticism about the long-term success of countries – like China and South Korea – that introduced very strong lockdown protocols. Advisers predicted that these countries would minimise a first wave of infection but then open up to cause a much larger second wave. As such, the evidence of success was highly contested.
  2. What story do exporters/ importers of policy tell about the problem they seek to solve? The UK government’s definition of the policy problem (described above) is not conducive to learning from countries like South Korea. It portrayed Korean-style restrictions as politically infeasible (before the UK lockdown).
  3. Do they have comparable political and policymaking systems? Their political system differences are relatively straightforward to identify since, for example, the UK is a liberal democracy with a less established tradition of the kinds of state intervention now taken for granted in 2020. It would be difficult to know where to begin to compare their policymaking systems (containing multiple authoritative venues, each with their own institutions, networks, and ideas).

The more general take-home message is to ‘beware the insufficient analysis of the connection between functional requirements and policymaking dynamics. Too often, researchers highlight what they need from governments to secure policy change, while policy theories identify the low likelihood that governments can meet that need’.


Education equity policy: ‘equity for all’ as a distraction from race, minoritization, and marginalization

By Paul Cairney and Sean Kippin

This post summarizes a key section of our review of education equity policymaking [see the full article for references to the studies summarized here].

One of the main themes is that many governments present a misleading image of their education policies. There are many variations on this theme, in which policymakers:

  1. Describe the energetic pursuit of equity, and use the right language, as a way to hide limited progress.
  2. Pursue ‘equity for all’ initiatives that ignore or downplay the specific importance of marginalization and minoritization, such as in relation to race and racism, immigration, ethnic minorities, and indigenous populations.
  3. Pursue narrow definitions of equity in terms of access to schools, at the expense of definitions that pay attention to ‘out of school’ factors and social justice.

Minoritization is a strong theme in US studies in particular. US experiences help us categorise multiple modes of marginalisation in relation to race and migration, driven by witting and unwitting action and explicit and implicit bias:

  • The social construction of students and parents. Examples include: framing white students as ‘gifted’ and more deserving of merit-based education (or victims of equity initiatives); framing non-white students as less intelligent, more in need of special needs or remedial classes, and having cultural or other learning ‘deficits’ that undermine them and disrupt white students; and, describing migrant parents as unable to participate until they learn English.
  • Maintaining or failing to challenge inequitable policies. Examples include higher funding for schools and colleges with higher white populations, and tracking (segregating students according to perceived ability), which benefit white students disproportionately.
  • Ignoring social determinants or ‘out of school’ factors.
  • Creating the illusion of equity with measures that exacerbate inequalities. For example, promoting school choice policies while knowing that the rules restrict access to sought-after schools.
  • Promoting initiatives to ignore race, including so-called ‘color blind’ or ‘equity for all’ initiatives.
  • Prioritizing initiatives at the expense of racial or socio-economic equity, such as measures to boost overall national performance at the expense of targeted measures.
  • Game playing and policy subversion, including school and college selection rules to restrict access and improve metrics.

The wider international – primarily Global North – experience suggests that minoritization and marginalization in relation to race, ethnicity, and migration is a routine impediment to equity strategies, albeit with some uncertainty about which policies would have the most impact.

Other country studies describe the poor treatment of citizens in relation to immigration status or ethnicity, often while presenting the image of a more equitable system. Until recently, Finland’s global reputation for education equity built on universalism and comprehensive schools has contrasted with its historic ‘othering’ of immigrant populations. Japan’s reputation for containing a homogeneous population, allowing its governments to present an image of classless egalitarianism and harmonious society, contrasts with its discrimination against foreign students. Multiple studies of Canadian provinces provide the strongest accounts of the symbolic and cynical use of multiculturalism for political gains and economic ends.

As in the US, many countries use ‘special needs’ categories to segregate immigrant and ethnic minority populations. Mainstreaming versus special needs debates have a clear racial and ethnic dimension when (1) some groups are more likely to be categorised as having learning disabilities or behavioural disorders, and (2) language and cultural barriers are listed as disabilities in many countries. Further, ‘commonwealth’ country studies identify the marginalisation of indigenous populations in ways comparable to the US marginalisation of students of colour.

Overall, these studies generate the sense that the frequently used language of education equity policy can signal a range of possibilities, from (1) high energy and sincere commitment to social justice, to (2) the cynical use of rhetoric and symbolism to protect historic inequalities.

Examples:

  • Turner, E.O., and Spain, A.K., (2020) ‘The Multiple Meanings of (In)Equity: Remaking School District Tracking Policy in an Era of Budget Cuts and Accountability’, Urban Education, 55, 5, 783-812 https://doi.org/10.1177%2F0042085916674060
  • Thorius, K.A. and Maxcy, B.D. (2015) ‘Critical Practice Analysis of Special Education Policy: An RTI Example’, Remedial and Special Education, 36, 2, 116-124 https://doi.org/10.1177%2F0741932514550812
  • Felix, E.R. and Trinidad, A. (2020) ‘The decentralization of race: tracing the dilution of racial equity in educational policy’, International Journal of Qualitative Studies in Education, 33, 4, 465-490 https://doi.org/10.1080/09518398.2019.1681538
  • Alexiadou, N. (2019) ‘Framing education policies and transitions of Roma students in Europe’, Comparative Education, 55, 3,  https://doi.org/10.1080/03050068.2019.1619334

See also: https://paulcairney.wordpress.com/2017/09/09/policy-concepts-in-500-words-social-construction-and-policy-design/


Perspectives on academic impact and expert advice to policymakers

A blog post prompted by this fascinating post by Dr Christiane Gerblinger: Are experts complicit in making their advice easy for politicians to ignore?

There is a lot of advice out there for people seeking to make an ‘impact’ on policy with their research, but some kinds of advice must seem like they are a million miles apart.

For the sake of brevity, here are some exemplars of the kinds of discussion that you might find:

Advice from former policymakers

Here is what you could have done to influence my choices when I was in office. Almost none of you did it.

(for a nicer punchline see How can we demonstrate the public value of evidence-based policy making when government ministers declare that the people ‘have had enough of experts’?)

Advice from former civil servants

If you don’t know and follow the rules here, people will ignore your research. We despair when you just email your articles.

(for nicer advice see Creating and communicating social research for policymakers in government)

Advice from training courses on communication

Be concise and engaging.

Advice from training courses on policy impact

Find out where the action is, learn the rules, build up relationships and networks, become a trusted guide, be in the right place at the right time to exploit opportunities, give advice rather than sitting on the fence.

(see for example Knowledge management for policy impact: the case of the European Commission’s Joint Research Centre)

Advice from researchers with some experience of engagement

Do great research, make it relevant and readable, understand your policymaking context, decide how far you want to go have an impact, be accessible, build relationships, be entrepreneurial.

(see Beware the well-intentioned advice of unusually successful academics)

Advice from academic-practitioner exchanges

Note the different practices and incentives that undermine routine and fruitful exchanges between academics, practitioners, and policymakers.

(see Theory and Practice: How to Communicate Policy Research beyond the Academy and ANZOG Wellington).

Advice extrapolated from policy studies

Your audience decides if your research will have impact; policymakers will necessarily ignore almost all of it; a window of opportunity may never arise; and, your best shot may be to tailor your research findings to policymakers whose beliefs you may think are abhorrent.

(discussed in how much impact can you expect from your analysis? and book The Politics of Policy Analysis)

Inference from my study of UK COVID-19 policy

Very few expert advisers had a continuous impact on policy, some had decent access, but almost all were peripheral players or outsiders by choice.

(see The UK government’s COVID-19 policy: what does ‘guided by the science’ mean in practice? and COVID-19 page)

Inference from Dr Gerblinger

Experts ensure that they are ignored when: ‘focussing extensively on one strand of enquiry while sidestepping the wider context; expunging complexity; and routinely raising the presence of inconclusiveness’.

What can we make of all of this advice?

One way to navigate all of this material is to make some basic distinctions between:

Sensible basic advice to early career researchers

Know your audience, and tailor your communication accordingly; see academic-practitioner exchange as two-way conversation rather than one-way knowledge transfer.

Take home message: here are some sensible ways to share experiences with people who might find your research useful.

Reflections from people with experience

Their advice will likely not reflect your position or experience (but might be useful sometimes).

Take home message: I think this stuff worked for me, but I am not really sure, and I doubt you will have the same resources.

Reflections from studies of academic-practitioner exchange

This research tends to find minimal evidence that people are (a) evaluating research engagement projects, and (b) finding tangible evidence of success (see Research engagement with government: insights from research on policy analysis and policymaking)

Take home message: there is a lot of ‘impact’ work going on, but no one is sure what it all adds up to.

Policy initiatives such as the UK Research Excellence Framework, which requires case studies of policy (or other) impact to arise directly from published research.

Take home message: I have my own thoughts, but see Rethinking policy ‘impact’: four models of research-policy relations

Reflections from people like me

Policy studies can be quite dispiriting. It often looks like I am saying that none of these activities will make much of a difference to policy or policymaking. Rather, I am saying: beware the temptation to turn studies that describe policymaking complexity (e.g. 500 Words) into an agent-centred story of heroically impactful researchers (see for example the Discussion section of this article on health equity policy).

Take home message: don’t confuse studies of policymaking with advice for policy participants.

In other words, identify what you are after before you start to process all of this advice. If you want to engage more with policymakers, you will find some sensible practical advice. If you want to be responsible for a fundamental change of public policy in your field, I doubt any of the available advice will help (unless you seek an explanation for failure).


The future of public health policymaking after COVID-19: lessons from Health in All Policies

Paul Cairney, Emily St Denny, Heather Mitchell 

This post summarises new research on the health equity strategy Health in All Policies. As our previous post suggests, it is common to hope that a major event will create a ‘window of opportunity’ for such strategies to flourish, but the current COVID-19 experience suggests otherwise. If so, what do HIAP studies tell us about how to respond, and do they offer any hope for future strategies? The full report is on Open Research Europe, accompanied by a brief interview on its contribution to the Horizon 2020 project – IMAJINE – on spatial justice.

COVID-19 should have prompted governments to treat health improvement as fundamental to public policy

Many governments had made strong rhetorical commitments to public health strategies focused on preventing a pandemic of non-communicable diseases (NCDs). To do so, they would address the ‘social determinants’ of health, defined by the WHO as ‘the unfair and avoidable differences in health status’ that are ‘shaped by the distribution of money, power and resources’ and ‘the conditions in which people are born, grow, live, work and age’.

COVID-19 reinforces the impact of the social determinants of health. Health inequalities result from factors such as income and social and environmental conditions, which influence people’s ability to protect and improve their health. COVID-19 had a visibly disproportionate impact on people with (a) underlying health conditions associated with NCDs, and (b) less ability to live and work safely.

Yet, the opposite happened. The COVID-19 response side-lined health improvement

Health departments postponed health improvement strategies and moved resources to health protection.

This experience shows that the evidence does not speak for itself

The evidence on social determinants is clear to public health specialists, but the idea of social determinants is less well known or convincing to policymakers.

It also challenges the idea that the logic of health improvement is irresistible

Health in All Policies (HIAP) is the main vehicle for health improvement policymaking, underpinned by: a commitment to health equity by addressing the social determinants of health; the recognition that the most useful health policies are not controlled by health departments; the need for collaboration across (and outside) government; and, the search for high level political commitment to health improvement.

Its logic is undeniable to HIAP advocates, but not to policymakers. A government’s public commitment to HIAP does not lead inevitably to the roll-out of a fully-formed HIAP model. There is a major gap between the idea of HIAP and its implementation. It is difficult to generate HIAP momentum, and it can be lost at any time.

Instead, we need to generate more realistic lessons from health improvement and promotion policy

However, most HIAP research does not provide these lessons. Most HIAP research combines:

  1. functional logic (here is what we need)
  2. programme logic (here is what we think we need to do to achieve it), and
  3. hope.

Policy theory-informed empirical studies of policymaking could help produce a more realistic agenda, but very few HIAP studies seem to exploit their insights.

To that end, this review identifies lessons from studies of HIAP and policymaking

It summarises a systematic qualitative review of HIAP research. It includes 113 articles (2011-2020) that refer to policymaking theories or concepts while discussing HIAP.

We produced these conclusions from pre-COVID-19 studies of HIAP and policymaking, but our new policymaking context – and its ironic impact on HIAP – is impossible to ignore.

It suggests that HIAP advocates produced a 7-point playbook for the wrong game

The seven most common pieces of advice add up to a plausible but incomplete strategy:

  1. adopt a HIAP model and toolkit
  2. raise HIAP awareness and support in government
  3. seek win-win solutions with partners
  4. avoid the perception of ‘health imperialism’ when fostering intersectoral action
  5. find HIAP policy champions and entrepreneurs
  6. use HIAP to support the use of health impact assessments (HIAs)
  7. challenge the traditional cost-benefit analysis approach to valuing HIAP.

Yet, two emerging pieces of advice highlight the limits to the current playbook and the search for its replacement:

  1. treat HIAP as a continuous commitment to collaboration and health equity, not a uniform model; and,
  2. address the contradictions between HIAP aims.

As a result, most country studies report a major, unexpected, and disappointing gap between HIAP commitment and actual outcomes

These general findings are apparent in almost all relevant studies. They stand out in the ‘best case’ examples where: (a) there is high political commitment and strategic action (such as South Australia), or (b) political and economic conditions are conducive to HIAP (such as Nordic countries).

These studies show that the HIAP playbook has unanticipated results, such as when the win-win strategy leads to HIAP advocates giving ground but receiving little in return.

HIAP strategies to challenge the status quo are also overshadowed by more important factors, including (a) a far higher commitment to existing healthcare policies and the core business of government, and (b) state retrenchment. Additional studies of decentralised HIAP models find major gaps between (a) national strategic commitment (backed by national legislation) and (b) municipal government progress.

Some studies acknowledge the need to use policymaking research to produce new ways to encourage and evaluate HIAP success

Studies of South Australia situate HIAP in a complex policymaking system in which the link between policy activity and outcomes is not linear.  

Studies of Nordic HIAP show that a commitment to municipal responsibility and stakeholder collaboration rules out the adoption of a national uniform HIAP model.

However, most studies do not use policymaking research effectively or appropriately

Almost all HIAP studies only scratch the surface of policymaking research (while some try to synthesise its insights, but at the cost of clarity).

Most HIAP studies use policy theories to:

  1. produce practical advice (such as to learn from ‘policy entrepreneurs’), or
  2. supplement their programme logic (to describe what they think causes policy change and better health outcomes).

Most policy theories were not designed for this purpose.

Policymaking research helps primarily to explain the HIAP ‘implementation gap’

Its main lesson is that policy outcomes are beyond the control of policymakers and HIAP advocates. This explanation does not show how to close implementation gaps.

Its practical lessons come from critical reflection on dilemmas and politics, not the reinvention of a playbook

It prompts advocates to:

  • Treat HIAP as a political project, not a technical exercise or puzzle to be solved.
  • Re-examine the likely impact of a focus on intersectoral action and collaboration, to recognise the impact of imbalances of power and the logic of policy specialisation.
  • Revisit the meaning-in-practice of the vague aims that they take for granted without explaining, such as co-production, policy learning, and organisational learning.
  • Engage with key trade-offs, such as between a desire for uniform outcomes (to produce health equity) and acceptance of major variations in HIAP policy and policymaking.
  • Avoid reinventing phrases or strategies when facing obstacles to health improvement.

We describe these points in more detail here:

Our Open Research Europe article (peer reviewed) The future of public health policymaking… (europa.eu)

Paul summarises the key points as part of a HIAP panel: Health in All Policies in times of COVID-19

ORE blog on the wider context of this work: forthcoming


I am not Peter Matthews

Some notes for my guest appearance on @urbaneprofessor's module

Peter’s description

Paul comes from a Political Science background and started off his project trying to understand why politicians don’t make good policy. He uses a lot of Political Science theory to understand the policy process (what MPP students have been learning) and theory from Public Policy about how to make the policy process better.

I come from a Social Policy background. I presume policy will be bad, and approach policy analysis from a normative position, analysing and criticising it from theoretical and critical perspectives.

Paul’s description

I specialize in the study of public policy and policymaking. I ‘synthesise’ and use policy concepts and theories to ask: how do policy processes work, and why?

Most theories and concepts – summarized in 1000 and 500 words – engage with that question in some way.

As such, I primarily seek to describe and explain policymaking, without spending much time thinking about making it better (unless asked to do so, or unless I feel very energetic).

In particular, I can give you a decent account of how all of these policy theories relate to each other, which is more important than it first seems.

A story of complex government

This ‘synthesis’ relates to my story about key elements of policy theories, with a different context influencing how I tell it. For example, I tend to describe ‘The Policy Process’ in 500 or 1000 words with the ‘Westminster Model’ versus ‘policy communities’ stories in mind (and a US scholar might tell this story in a different way):

Bounded rationality (500, 1000):

  • Individual policymakers can only pay attention to and understand a tiny proportion of (a) the available information, and (b) the policy problems for which they are ostensibly responsible
  • So, they find cognitive shortcuts to pay attention to some issues/ information and ignore the rest (goal setting, relying on trusted advisors, belief translation, gut instinct, etc.)
  • Governmental organisations have more capacity, but also develop ‘standard operating procedures’ to limit their attention, and rely on many other actors for information and advice

Complex Policymaking Environments consisting of:

  • Many actors in many venues
  • Institutions (formal and informal rules)
  • Networks (relationships between policymakers and influencers)
  • Ideas (dominant beliefs, influencing the interpretation of problems and solutions)
  • Socioeconomic context and events

As such, the story of, say, multi-centric policymaking (or MLG, or complexity theory) contrasts with the idea of highly centralized control in the UK government.

A story of ‘evidence based policymaking’

That story provides context for applications to the agendas taken forward by other disciplines or professions.

  • The most obvious example is ‘evidence based policymaking’: my role is to explain why it is little more than a political slogan, and why people should not expect (or indeed want) it to exist, not to lobby for its existence
  • Also working on similar stories in relation to policy learning and policy design: my role is to highlight dilemmas and cautionary tales, not be a policy designer.

The politics of policymaking research

Most of the theories I describe relate to theory-informed empirical projects, generally originating from the US, and generally described as ‘positivist’ in contrast to (say) ‘interpretive’ (or, say, ‘constructivist’).

However, there are some interesting qualifications:

  • Some argue that these distinctions are overcooked (or, I suppose, overboiled)
  • Some try to bring in postpositivist ideas to positivist networks (NPF)
  • Some emerged from ‘critical policy analysis’ (SCPD)

The politics of policy analysis

This context helps understand my most recent book: The Politics of Policy Analysis

The initial podcast tells a story about MPP development, in which I used to ask students to write policy analyses (1st semester) without explaining what policy analysis was, or how to do it. My excuse is that the punchline of the module was: your account of the policy theories/ policy context is more important than your actual analysis (see the Annex to the book).

Since then, I have produced a webpage – 750 – which:

  • summarises the stories of the most-used policy analysis texts (e.g. Bardach) which identify steps including: define the problem; identify solutions; use values to compare trade-offs between solutions; predict their effects; make a recommendation
  • relates those texts to policy theories, to identify how bounded rationality and complexity change that story (and the story of the policy cycle)
  • relates both to ‘critical’ policy analysis and social science texts (some engage directly – like Stone, like Bacchi – while some provide insights – such as on critical race theory – without necessarily describing ‘policy analysis’)

A description of ‘critical’ approaches is fairly broad, but I think they tend to have key elements in common:

  • a commitment to use research to improve policy for marginalized populations (described by Bacchi as siding with the powerless against the powerful, usually in relation to class, race, ethnicity, gender, sexuality, disability)
  • analysing policy to identify: who is portrayed positively/negatively; who benefits or suffers as a result
  • analysing policymaking to identify: whose knowledge counts (e.g. as high quality and policy relevant), who is included or excluded
  • identifying ways to challenge (a) dominant and damaging policy frames and (b) insulated/ exclusive versus participatory/ inclusive forms of policymaking

If so, I would see these three approaches as ways to understand and engage with policymaking that could be complementary or contradictory. In other words, I would warn against assuming in advance that they must be one or the other.


The COVID-19 exams fiasco across the UK: why did policymaking go so wrong?

This post first appeared on the LSE British Politics and Policy blog, and it summarises our new article: Sean Kippin and Paul Cairney (2021) ‘The COVID-19 exams fiasco across the UK: four nations and two windows of opportunity’, British Politics, PDF Annex. The focus on inequalities of attainment is part of the IMAJINE project on spatial justice and territorial inequalities.

In the summer of 2020, after cancelling exams, the UK and devolved governments sought teacher estimates on students’ grades, but supported an algorithm to standardise the results. When the results produced a public outcry over unfair consequences, they initially defended their decision but reverted quickly to teacher assessment. These experiences, argue Sean Kippin and Paul Cairney, highlight the confluence of events and choices in which an imperfect and rejected policy solution became a ‘lifeline’ for four beleaguered governments. 

In 2020, the UK and devolved governments performed a ‘U-turn’ on their COVID-19 school exams replacement policies. The experience was embarrassing for education ministers and damaging to students. There are significant differences between (and often within) the four nations in terms of the structure, timing, weight, and relationship between the different examinations. However, in general, the A-level (England, Northern Ireland, Wales) and Higher/ Advanced Higher (Scotland) examinations have similar policy implications, dictating entry to further and higher education, and influencing employment opportunities. The Priestley review, commissioned by the Scottish Government after its U-turn, described the task of replacing these exams as an ‘impossible task’.

Initially, each government defined the new policy problem in relation to the need to ‘credibly’ replicate the purpose of exams to allow students to progress to tertiary education or employment. All four quickly announced their intention to allocate grades to students in some form, rather than replace the assessments with, for example, remote examinations. However, mindful of the long-term credibility of the examinations system and of ensuring fairness, each government opted to maintain the qualifications and seek a similar distribution of grades to previous years. A key consideration was that UK universities accept large numbers of students from across the UK.

One potential solution open to policymakers was to rely solely on teacher grading, in the form of Centre Assessed Grades (CAGs). CAGs are ‘based on a range of evidence including mock exams, non-exam assessment, homework assignments and any other record of student performance over the course of study’. Potential problems included the risk of high variation and discrepancies between different centres, the potential overload of the higher education system, and the tendency for teacher-predicted grades to reward already privileged students and punish disabled, non-white, and economically deprived children.

A second option was to take CAGs as a starting point, then use an algorithm to produce ‘standardisation’, which was potentially attractive to each government as it allowed students to complete secondary education and to progress to the next level in similar ways to previous (and future) cohorts. Further, the emphasis on the technical nature of this standardisation was a key part of its (temporary) viability: qualifications agencies took the lead in designing the process by which grades would be allocated, and opted not to share the details of the algorithm. Each government then made similar claims when defending the problem and selecting the solution. Yet this approach reduced both the debate on the unequal impact of this process on students, and the chance for other experts to examine if the algorithm would produce the desired effect. Policymakers in all four governments assured students that the grading would be accurate and fair, with teacher discretion playing a large role in the calculation of grades.
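
To illustrate the kind of mechanism being described (a deliberately simplified sketch, not the actual algorithm used by Ofqual, the SQA, or any other qualifications agency, and with hypothetical names and numbers), a rank-based standardisation takes the teachers’ ranking of students within a centre and maps it onto that centre’s historical grade distribution. The sketch shows why a student’s final grade can depend heavily on how their school performed in previous years, which is the unequal effect at the heart of the subsequent backlash.

```python
# Purely illustrative sketch of rank-based standardisation.
# NOT the Ofqual/SQA algorithm: students, grades, and shares are hypothetical.

def standardise(centre_ranking, historical_distribution):
    """Map a centre's ranked students onto its historical grade distribution.

    centre_ranking: students ordered best-first, based on teacher judgement.
    historical_distribution: grade -> share of past students, best grade first.
    Returns a dict of student -> standardised grade.
    """
    n = len(centre_ranking)
    grades = {}
    position = 0
    grade_order = list(historical_distribution)  # assumes best-to-worst order
    for i, grade in enumerate(grade_order):
        # Give each grade its historical quota; the final grade absorbs rounding.
        if i == len(grade_order) - 1:
            quota = n - position
        else:
            quota = round(n * historical_distribution[grade])
        for student in centre_ranking[position:position + quota]:
            grades[student] = grade
        position += quota
    return grades

# Hypothetical centre: five students ranked by teachers, in a school where
# only 20% of past students achieved an A.
ranking = ["Ana", "Ben", "Cal", "Dee", "Eli"]
past_distribution = {"A": 0.2, "B": 0.4, "C": 0.4}
print(standardise(ranking, past_distribution))
# {'Ana': 'A', 'Ben': 'B', 'Cal': 'B', 'Dee': 'C', 'Eli': 'C'}
# Even students with strong teacher assessments are capped by their school's
# past results, which is why downgrades clustered in historically
# lower-attaining (often more deprived) schools.
```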

To these governments, it appeared at first that they had found a fair and efficient (or at least defensible) way to allocate grades, and public opinion did not respond negatively to its announcement. However, these appearances proved to be profoundly deceptive, and they vanished on the day each set of exam results was published. The Scottish national mood shifted so intensely that, after a few days, pursuing standardisation no longer seemed politically feasible. The intense criticism centred on the unequal level of reductions of grades after standardisation, rather than the unequal overall rise in grade performance after teacher assessment and standardisation (which advantaged poorer students).

Despite some recognition that similar problems were afoot elsewhere, this shift of problem definition did not happen in the rest of the UK until (a) their published exam results highlighted similar problems regarding the role of previous school performance on standardised results, and (b) the Scottish Government had already changed course. Upon the release of grades outside Scotland, it became clear that downgrades were also concentrated in more deprived areas. For instance, in Wales, 42% of students saw their A-Level results lowered from their Centre Assessed Grades, with the figure close to a third for Northern Ireland.

Each government thus faced a similar choice between defending the original system by challenging the emerging consensus around its apparent unfairness; modifying it by changing the appeals process; or abandoning it altogether and reverting to solely teacher assessed grades. Ultimately, all three governments followed the same path. Initially, they opted to defend their original policy choice. However, by 17 August, the UK, Welsh, and Northern Irish education secretaries announced (separately) that examination grades would be based solely on CAGs – unless the standardisation process had generated a higher grade (students would receive whichever was highest).

Scotland’s initial experience was instructive to the rest of the UK, and its example provided the UK government with a blueprint to follow (eventually). It began with a new policy choice – reverting to teacher assessed grades – sold as fairer to victims of the standardisation process. Once this precedent had been set, a different course for policymakers at the UK level became difficult to resist, particularly when faced with a similar backlash. The UK government’s decision in turn influenced the Welsh and Northern Irish governments.

In short, we can see that the particular ordering of choices created a cascading effect across the four governments, which initially produced one policy solution before triggering a U-turn. This focus on order and timing should not be lost during the inevitable inquiries and reports on the examinations systems. The take-home message is not to ignore the policy process when evaluating the long-term effect of these policies. A focus on why the standardisation processes went wrong is welcome, but we should also focus on why the policymaking process malfunctioned, to produce a wildly inconsistent approach to the same policy choice in such a short space of time. Examining both aspects of this fiasco will be crucial to the grading process in 2021, given that governments will be seeking an alternative to exams for a second year.

__________________________

Note: the above draws on the authors’ published work in British Politics.
