Monthly Archives: December 2019

Policy Analysis in 750 Words: policy analysis for marginalized groups in racialized political systems

Note: this post forms one part of the Policy Analysis in 750 words series overview.

For me, this story begins with a tweet by Professor Jamila Michener, about a new essay by Dr Fabienne Doucet, ‘Centering the Margins: (Re)defining Useful Research Evidence Through Critical Perspectives’:

https://twitter.com/povertyscholar/status/1207054211759910912

Research and policy analysis for marginalized groups

Doucet’s (2019: 1) essay begins by describing the William T. Grant Foundation’s focus on improving the ‘use of research evidence’ (URE), and the key questions that we should ask when improving URE:

  1. For what purposes do policymakers find evidence useful?

Examples include to: inform a definition of problems and solutions, foster practitioner learning, support an existing political position, or impose programmes backed by evidence (compare with How much impact can you expect from your analysis?).

  2. Who decides what to use, and what is useful?

For example, usefulness could be defined by the researchers providing evidence, the policymakers using it, the stakeholders involved in coproduction, or the people affected by research and policy (compare with Bacchi, Stone and Who should be involved in the process of policy analysis?).

  3. How do critical theories inform these questions? (compare with T. Smith)

First, they remind us that so-called ‘rational’ policy processes have incorporated research evidence to help:

‘maintain power hierarchies and accept social inequity as a given. Indeed, research has been historically and contemporaneously (mis)used to justify a range of social harms from enslavement, colonial conquest, and genocide, to high-stakes testing, disproportionality in child welfare services, and “broken windows” policing’ (Doucet, 2019: 2)

Second, they help us redefine usefulness in relation to:

‘how well research evidence communicates the lived experiences of marginalized groups so that the understanding of the problem and its response is more likely to be impactful to the community in the ways the community itself would want’ (Doucet, 2019: 3)

In that context, potential responses include to:

  1. Recognise the ways in which research and policy combine to reproduce the subordination of social groups.
  • General mechanisms include: the reproduction of the assumptions, norms, and rules that produce a disproportionate impact on social groups (compare with Social Construction and Policy Design).
  • Specific mechanisms include: judging marginalised groups harshly according to ‘Western, educated, industrialized, rich and democratic’ (‘WEIRD’) norms.
  2. Reject the idea that scientific research can be seen as objective or neutral (and that researchers are beyond reproach for their role in subordination).
  3. Give proper recognition to ‘experiential knowledge’ and ‘transdisciplinary approaches’ to knowledge production, rather than privileging scientific knowledge.
  4. Commit to social justice, to help ‘eliminate oppressions and to emancipate and empower marginalized groups’, such as by disrupting ‘the policies and practices that disproportionately harm marginalized groups’ (2019: 5-7).
  5. Develop strategies to ‘center race’, ‘democratize’ research production, and ‘leverage’ transdisciplinary methods (including poetry, oral history and narrative, art, and discourse analysis – compare with Lorde) (2019: 10-22).

See also Doucet, F. (2021) ‘Identifying and Testing Strategies to Improve the Use of Antiracist Research Evidence through Critical Race Lenses’.

Policy analysis in a ‘racialized polity’

A key way to understand these processes is to use, and improve, policy theories to explain the dynamics and impacts of a racialized political system. For example, ‘policy feedback theory’ (PFT) draws on elements from historical institutionalism and SCPD to identify the rules, norms, and practices that reinforce subordination.

In particular, Michener’s (2019: 424) ‘Policy Feedback in a Racialized Polity’ develops a ‘racialized feedback framework (RFF)’ to help explain the ‘unrelenting force with which racism and White supremacy have pervaded social, economic, and political institutions in the United States’. Key mechanisms include (2019: 424-6):

  1. ‘Channelling resources’, in which the rules used to distribute government resources benefit some social groups and punish others.
  • Examples include: privileging White populations in social security schemes and the design/ provision of education, and punishing Black populations disproportionately in prisons (2019: 428-32).
  • These rules also influence the motivation of social groups to engage in politics to influence policy (some citizens are emboldened, others alienated).
  2. ‘Generating interests’, in which ‘racial stratification’ is a key factor in the power of interest groups (and the balance of power within them).
  3. ‘Shaping interpretive schema’, in which race is a lens through which actors understand, interpret, and seek to solve policy problems.
  4. The ways in which centralization (making policy at the federal level) or decentralization influence policy design.
  • For example, the ‘historical record’ suggests that decentralization is more likely to ‘be a force of inequality than an incubator of power for people of color’ (2019: 433).

Insufficient attention to race and racism: what are the implications for policy analysis?

One potential consequence of this lack of attention to race, and the inequalities caused by racism in policy, is that we place too much faith in the vague idea of ‘pragmatic’ policy analysis.

Throughout the 750 words series, you will see me refer generally to the benefits of pragmatism.

In that context, pragmatism relates to the idea that policy analysis consists of ‘art and craft’, in which analysts assess what is politically feasible if taking a low-risk client-oriented approach.

In this context, pragmatism may be read as a euphemism for conservatism and status quo protection.

In other words, other posts in the series warn against too-high expectations for entrepreneurial and systems thinking approaches to major policy change, but they should not be read as an excuse to reject ambitious plans for much-needed changes to policy and policy analysis (compare with Meltzer and Schwartz, who engage with this dilemma in client-oriented advice).

Connections to blog themes

This post connects well to:


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), public policy, Storytelling

Policy Analysis in 750 Words: entrepreneurial policy analysis

This post forms one part of the Policy Analysis in 750 words series overview and connects to ‘Three habits of successful policy entrepreneurs’.

The idea of a ‘policy entrepreneur’ is important to policy studies and policy analysis.

Let’s begin with its positive role in analysis, then use policy studies to help qualify its role within policymaking environments.

The take-home messages are to:

  1. recognise the value of entrepreneurship, and invest in relevant skills and strategies, but
  2. not overstate its spread or likely impact, and
  3. note the unequal access to political resources associated with entrepreneurs.

[Image: Box 11.3, UPP 2nd ed, on entrepreneurs]

Entrepreneurship and policy analysis

Mintrom identifies the intersection between policy entrepreneurship and policy analysis, to highlight the benefits of ‘positive thinking’, creativity, deliberation, and leadership.

He expands on these ideas further in So you want to be a policy entrepreneur?:

‘Policy entrepreneurs are energetic actors who engage in collaborative efforts in and around government to promote policy innovations. Given the enormous challenges now facing humanity, the need is great for such actors to step forward and catalyze change processes’ (Mintrom, 2019: 307).

Although many entrepreneurs seem to be exceptional people, Mintrom (2019: 308-20) identifies:

  1. Key attributes to compare
  • ‘ambition’, to invest resources for future reward
  • ‘social acuity’, to help anticipate how others are thinking
  • ‘credibility’, based on authority and a good track record
  • ‘sociability’, to empathise with others and form coalitions or networks
  • ‘tenacity’, to persevere during adversity
  2. The skills that can be learned
  • ‘strategic thinking’, to choose a goal and determine how to reach it
  • ‘team building’, to recognise that policy change is a collective effort, not the responsibility of heroic individuals (compare with Oxfam)
  • ‘collecting evidence’, and using it ‘strategically’ to frame a problem and support a solution
  • ‘making arguments’, using ‘tactical argumentation’ to ‘win others to their cause and build coalitions of supporters’ (2019: 313)
  • ‘engaging multiple audiences’, by tailoring arguments and evidence to their beliefs and interests
  • ‘negotiating’, such as by trading your support in this case for their support in another
  • ‘networking’, particularly when policymaking authority is spread across multiple venues.
  3. The strategies built on these attributes and skills.
  • ‘problem framing’, such as to tell a story of a crisis in need of urgent attention
  • ‘using and expanding networks’, to generate attention and support
  • ‘working with advocacy coalitions’, to mobilise a collection of actors who already share the same beliefs
  • ‘leading by example’, to signal commitment and allay fears about risk
  • ‘scaling up change processes’, using policy innovation in one area to inspire wider adoption.

[Image: Mintrom (2019: 308)]

Overall, entrepreneurship is ‘tough work’ requiring ‘courage’, but necessary for policy disruption, by: ‘those who desire to make a difference, who recognize the enormous challenges now facing humanity, and the need for individuals to step forward and catalyze change’ (2019: 320; compare with Luetjens).

Entrepreneurship and policy studies

  1. Most policy actors fail

It is common to relate entrepreneurship to stories of exceptional individuals and invite people to learn from their success. However, the logical conclusion is that success is exceptional and most policy actors will fail.

A focus on key skills takes us away from this reliance on exceptional actors, and ties in with other policy studies-informed advice on how to navigate policymaking environments (see ‘Three habits of successful policy entrepreneurs’, these ANZSOG talks, and box 6.3 below).

[Image: box 6.3]

However, note the final sentence, which reminds us that it is possible to invest a huge amount of time and effort in entrepreneurial skills without any of that investment paying off.

  2. Even if entrepreneurs succeed, the explanation comes more from their environments than their individual skills

The other side of the entrepreneurship coin is the policymaking environment in which actors operate.

Policy studies of entrepreneurship (such as Kingdon on multiple streams) rely heavily on metaphors of evolution. Entrepreneurs are the actors most equipped to thrive within their environments (see Room).

However, Kingdon uses the additional metaphor of ‘surfers waiting for the big wave’, which suggests that their environments are far more important than them (at least when operating on a US federal scale – see Kingdon’s Multiple Streams Approach).

Entrepreneurs may be more influential at a more local scale, but the evidence of their success (independent of the conditions in which they operate) is not overwhelming. So, self-aware entrepreneurs know when to ‘surf the waves’ or try to move the sea.

  3. The social background of influential actors

Many studies of entrepreneurs highlight the stories of tenacious individuals with limited resources but the burning desire to make a difference.

The alternative story is that political resources are distributed profoundly unequally. Few people have the resources to:

  • run for elected office
  • attend elite universities, or find other ways to develop the kinds of personal networks that often relate to social background
  • develop the credibility built on a track record in a position of authority (such as in government or science),
  • be in the position to invest resources now, to secure future gains, or
  • be in an influential position to exploit windows of opportunity.

Therefore, when focusing on entrepreneurial policy analysis, we should encourage the development of a suite of useful skills, but not expect equal access to that development or the same payoff from entrepreneurial action.

See also:

Compare these skills with the ones we might associate with ‘systems thinking’.

If you want to see me say these depressing things with a big grin:


Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), public policy, Uncategorized

Policy Analysis in 750 Words: complex systems and systems thinking

This post forms one part of the Policy Analysis in 750 words series overview and connects to previous posts on complexity. The first 750 words tick along nicely, then there is a picture of a cat hanging in there baby to signal where it can all go wrong. I updated it (22.6.20) to add category 11 then again (30.9.20) when I realised that the former category 11 was a lot like 6.

There are a million-and-one ways to describe systems and systems thinking. These terms are incredibly useful, but also at risk of meaning everything and therefore nothing (compare with planning and consultation).

Let’s explore how the distinction between policy studies and policy analysis can help us clarify the meaning of ‘complex systems’ and ‘systems thinking’ in policymaking.

For example, how might we close a potentially large gap between these two stories?

  1. Systems thinking in policy analysis.
  • Avoid the unintended consequences of too-narrow definitions of problems and processes (systems thinking, not simplistic thinking).
  • If we engage in systems thinking effectively, we can understand systems well enough to control, manage, or influence them.
  2. The study of complex policymaking systems.
  • Policy emerges from complex systems in the absence of: (a) central government control and often (b) policymaker awareness.
  • We need to acknowledge these limits properly, accept them, and avoid the mechanistic language of ‘policy levers’, which exaggerates human or government control.

See also: Systems science and systems thinking for public health: a systematic review of the field

Six meanings of complex systems in policy and policymaking

Let’s begin by trying to clarify many meanings of complex system and relate them to systems thinking storylines.

For example, you will encounter three different meanings of complex system in this series alone, and each meaning presents different implications for systems thinking:

  1. A complex policymaking system

Policy outcomes seem to ‘emerge’ from policymaking systems in the absence of central government control. As such, we should rely less on central government-driven targets (in favour of local discretion to adapt to environments), encourage trial-and-error learning, and rethink the ways in which we think about government ‘failure’ (see, for example, Hallsworth on ‘system stewardship’, the OECD on ‘Systemic Thinking for Policy Making’, and this thread).

  • Systems thinking is about learning and adapting to the limits to policymaker control.

  2. Complex policy problems

Dunn (2017: 73) describes the interdependent nature of problems:

‘Subjectively experienced problems – crime, poverty, unemployment, inflation, energy, pollution, health, security – cannot be decomposed into independent subsets without running the risk of producing an approximately right solution to the wrong problem. A key characteristic of systems of problems is that the whole is greater – that is, qualitatively different – than the simple sum of its parts’ (contrast with Meltzer and Schwartz on creating a ‘boundary’ to make problems seem solvable).

  • Systems thinking is about addressing policy problems holistically.
  3. Complex policy mixes

What we call ‘policy’ is actually a collection of policy instruments. Their overall effect is ‘non-linear’, difficult to predict, and subject to emergent outcomes, rather than cumulative (compare with Lindblom’s hopes for incrementalist change).

This point is crucial to policy analysis: does it involve a rethink of all instruments, or merely add a new instrument to the pile?

  • Systems thinking is about anticipating the disproportionate effect of a new policy instrument.

These three meanings are joined by at least three more (from Munro and Cairney on energy systems):

  4. Socio-technical systems (Geels)

Used to explain the transition from unsustainable to sustainable energy systems.

  • Systems thinking is about identifying the role of new technologies, protected initially in a ‘niche’, and fostered by a supportive ‘social and political environment’.
  5. Socio-ecological systems (Ostrom)

Used to explain how and why policy actors might cooperate to manage finite resources.

  • Systems thinking is about identifying the conditions under which actors develop layers of rules to foster trust and cooperation.
  6. Performing the metaphor of systems

Governments often use the language of complex systems – rather loosely – to indicate an awareness of the interconnectedness of things. They often perform systems thinking to give the impression that they are thinking and acting differently, but without backing up their words with tangible changes to policy instruments.

  • Systems thinking is about projecting the sense that (a) policy and policymaking is complicated, but (b) governments can still look like they are in control.

Four more meanings of systems thinking

Now, let’s compare these storylines with a small sample of wider conceptions of systems thinking:

  7. The old way of establishing order from chaos

Based on the (now-diminished) faith in science and rational management techniques to control the natural world for human benefit (compare Hughes and Hughes on energy with Checkland on ‘hard’ v ‘soft’ systems approaches, then see What you need as an analyst versus policymaking reality and Radin on the old faith in rationalist governing systems).

  • Systems thinking was about the human ability to turn potential chaos into well-managed systems (such as ‘large technical systems’ to distribute energy)
  8. The new way of accepting complexity but seeking to make an impact

Based on the idea that we can identify ‘leverage points’, or the places that help us ‘intervene in a system’ (see Meadows then compare with Arnold and Wade).

  • Systems thinking is about the human ability to use a small shift in a system to produce profound changes in that system.
  9. A way to rethink cause-and-effect

Based on the idea that current research methods are too narrowly focused on linearity rather than the emergent properties of systems of behaviour (for example, Rutter et al on how to analyse the cumulative effect of public health interventions, and Greenhalgh on responding more effectively to pandemics).

  • Systems thinking is about rethinking the ways in which governments, funders, or professions conduct policy-relevant research on social behaviour.

  10. A way of thinking about ourselves

Embrace the limits to human cognition, and accept that all understandings of complex systems are limited.

  • Systems thinking is about developing the ‘wisdom’ and ‘humility’ to accept our limited knowledge of the world.

[Image: ‘hang in there, baby’ cat poster]

How can we clarify systems thinking and use it effectively in policy analysis?

Now, imagine you are in a room of self-styled systems thinkers, and that no-one has yet suggested a brief conversation to establish what you all mean by systems thinking. I reckon you can make a quick visual distinction by seeing who looks optimistic.

I’ll be the morose-looking guy sitting in the corner, waiting to complain about ambiguity, so you would probably be better off sitting next to Luke Craven who still ‘believes in the power of systems thinking’.

If you can imagine some amalgam of these pessimistic/ optimistic positions, perhaps the conversation would go like this:

  1. Reasons to expect some useful collaboration.

Some of these 10 discussions seem to complement each other. For example:

  • We can use 3 and 9 to reject one narrow idea of ‘evidence-based policymaking’, in which the focus is on (a) using experimental methods to establish cause and effect in relation to one policy instrument, without showing (b) the overall impact on policy and outcomes (e.g. compare FNP with more general ‘families’ policy).
  • 1-3 and 10 might be about the need for policy analysts to show humility when seeking to understand and influence complex policy problems, solutions, and policymaking systems.

In other words, you could define systems thinking in relation to the need to rethink the ways in which we understand – and try to address – policy problems. If so, you can stop here and move on to the next post. There is no benefit to completing this post.

  2. Reasons to expect the same old frustrating discussions based on no-one defining terms well enough (collectively) to collaborate effectively (beyond using the same buzzwords).

Although all of these approaches use the language of complex systems and systems thinking, note some profound differences:

Holding on versus letting go.

  • Some are about intervening to take control of systems or, at least, make a disproportionate difference from a small change.
  • Some are about accepting our inability to understand, far less manage, these systems.

Talking about different systems.

  • Some are about managing policymaking systems, and others about social systems (or systems of policy problems), without making a clear connection between both endeavours.

For example, if you use approach 9 to rethink societal cause-and-effect, are you then going to pretend that you can use approach 7 to do something about it? Or, will our group have a difficult discussion about the greater likelihood of 6 (metaphorical policymaking) in the context of 1 (the inability of governments to control the policymaking systems we need to solve the problems raised by 9)?

In that context, the reason that I am sitting in the corner, looking so morose, is that too much collective effort goes into (a) restating, over and over and over again, the potential benefits of systems thinking, leaving almost no time for (b) clarifying systems thinking well enough to move on to these profound differences in thinking. Systems thinking has not even helped us solve these problems with systems thinking.

See also:

Why systems thinkers and data scientists should work together to solve social challenges


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), Prevention policy, public policy, UKERC

Policy Analysis in 750 Words: how much impact can you expect from your analysis?

This post forms one part of the Policy Analysis in 750 words series overview.

Throughout this series you may notice three different conceptions about the scope of policy analysis:

  1. ‘Ex ante’ (before the event) policy analysis. Focused primarily on defining a problem, and predicting the effect of solutions, to inform current choice (as described by Meltzer and Schwartz and Thissen and Walker).
  2. ‘Ex post’ (after the event) policy analysis. Focused primarily on monitoring and evaluating that choice, perhaps to inform future choice (as described famously by Weiss).
  3. Some combination of both, to treat policy analysis as a continuous (never-ending) process (as described by Dunn).

As usual, these are not hard-and-fast distinctions, but they help us clarify expectations in relation to different scenarios.

  1. The impact of old-school ex ante policy analysis

Radin provides a valuable historical discussion of policymaking with the following elements:

  • a small number of analysts, generally inside government (such as senior bureaucrats, scientific experts, and – in particular – economists),
  • giving technical or factual advice,
  • about policy formulation,
  • to policymakers at the heart of government,
  • on the assumption that policy problems would be solved via analysis and action.

This kind of image signals an expectation for high impact: policy analysts face low competition, enjoy a clearly defined and powerful audience, and their analysis is expected to feed directly into choice.

Radin goes on to describe a much different, modern policy environment: more competition, more analysts spread across and outside government, with a less obvious audience, and – even if there is a client – high uncertainty about where the analysis fits into the bigger picture.

Yet, the impetus to seek high and direct impact remains.

This combination of shifting conditions but unshifting hopes/ expectations helps explain a lot of the pragmatic forms of policy analysis you will see in this series, including:

  • Keep it catchy, gather data efficiently, tailor your solutions to your audience, and tell a good story (Bardach)
  • Speak with an audience in mind, highlight a well-defined problem and purpose, project authority, use the right form of communication, and focus on clarity, precision, conciseness, and credibility (Smith)
  • Address your client’s question, by their chosen deadline, in a clear and concise way that they can understand (and communicate to others) quickly (Weimer and Vining)
  • Client-oriented advisors identify the beliefs of policymakers and anticipate the options worth researching (Mintrom)
  • Identify your client’s resources and motivation, such as how they seek to use your analysis, the format of analysis they favour (make it ‘concise’ and ‘digestible’), their deadline, and their ability to make or influence the policies you might suggest (Meltzer and Schwartz).
  • ‘Advise strategically’, to help a policymaker choose an effective solution within their political context (Thissen and Walker).
  • Focus on producing ‘policy-relevant knowledge’ by adapting to the evidence-demands of policymakers and rejecting a naïve attachment to ‘facts speaking for themselves’ or ‘knowledge for its own sake’ (Dunn).
  2. The impact of research and policy evaluation

Many of these recommendations are familiar to scientists and researchers, but generally in the context of far lower expectations about their likely impact, particularly if those expectations are informed by policy studies (compare Oliver & Cairney with Cairney & Oliver).

In that context, Weiss’ work is a key reference point. It gives us a menu of ways in which policymakers might use policy evaluation (and research evidence more widely):

  • to inform solutions to a problem identified by policymakers
  • as one of many sources of information used by policymakers, alongside ‘stakeholder’ advice and professional and service user experience
  • as a resource used selectively by politicians, with entrenched positions, to bolster their case
  • as a tool of government, to show it is acting (by setting up a scientific study), or to measure how well policy is working
  • as a source of ‘enlightenment’, shaping how people think over the long term (compare with this discussion of ‘evidence based policy’ versus ‘policy based evidence’).

In other words, researchers may have a role, but they struggle (a) to navigate the politics of policy analysis, (b) to find the right time to act, and (c) to secure attention, in competition with many other policy actors.

  3. The potential for a form of continuous impact

Dunn suggests that the idea of ‘ex ante’ policy analysis is misleading, since policymaking is continuous, and evaluations of past choices inform current choices. Think of each policy analysis step as ‘interdependent’, in which new knowledge to inform one step also informs the other four. For example, routine monitoring helps identify compliance with regulations, whether resources and services reach ‘target groups’, whether money is spent correctly, and whether we can make a causal link between policy solutions and outcomes. This kind of analysis is often better seen as background information with intermittent impact.

Key conclusions to bear in mind

  1. The demand for information from policy analysts may be disproportionately high when policymakers pay attention to a problem, and disproportionately low when they feel that they have addressed it.
  2. Common advice for policy analysts and researchers often looks very similar: keep it concise, tailor it to your audience, make evidence ‘policy relevant’, and give advice (don’t sit on the fence). However, unless researchers are prepared to act quickly, gather data efficiently (not comprehensively), and meet a tight brief for a client, they are not really in the impact business described by most policy analysis texts.
  3. A lot of routine, continuous, impact tends to occur out of the public spotlight, based on rules and expectations that most policy actors take for granted.

Further reading

See the Policy Analysis in 750 words series overview to continue reading on policy analysis.

See the ‘evidence-based policymaking’ page to continue reading on research impact.


Bristol powerpoint: Paul Cairney Bristol EBPM January 2020


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), Policy learning and transfer, public policy

Policy Analysis in 750 Words: what you need as an analyst versus policymaking reality

This post forms one part of the Policy Analysis in 750 words series overview. Note for the eagle eyed: you are not about to experience déjà vu. I’m just using the same introduction.

When describing ‘the policy sciences’, Lasswell distinguishes between:

  1. ‘knowledge of the policy process’, to foster policy studies (the analysis of policy)
  2. ‘knowledge in the process’, to foster policy analysis (analysis for policy)

The lines between each approach are blurry, and each element makes less sense without the other. However, the distinction is crucial to help us overcome the major confusion associated with this question:

Does policymaking proceed through a series of stages?

The short answer is no.

The longer answer is that you can find about 40 blog posts (of 500 and 1000 words) which compare (a) a stage-based model called the policy cycle, and (b) the many, many policy concepts and theories that describe a far messier collection of policy processes.

[Image: the policy cycle]

In a nutshell, most policy theorists reject this image because it oversimplifies a complex policymaking system. The image provides a great way to introduce policy studies, and serves a political purpose, but it does more harm than good:

  1. Descriptively, it is profoundly inaccurate (unless you imagine thousands of policy cycles interacting with each other to produce less orderly behaviour and less predictable outputs).
  2. Prescriptively, it gives you rotten advice about the nature of your policymaking task (for more on these points, see this chapter, article, article, and series).

Why does the stages/ policy cycle image persist? Two relevant explanations


  1. It arose from a misunderstanding in policy studies

In another nutshell, Chris Weible and I argue (in a secret paper) that the stages approach represents a good idea gone wrong:

  • If you trace it back to its origins, you will find Lasswell’s description of decision functions: intelligence, recommendation, prescription, invocation, application, appraisal and termination.
  • These functions correspond reasonably well to a policy cycle’s stages: agenda setting, formulation, legitimation, implementation, evaluation, and maintenance, succession or termination.
  • However, Lasswell was imagining functional requirements, while the cycle seems to describe actual stages.

In other words, if you take Lasswell’s list of what policy analysts/ policymakers need to do, multiply it by the number of actors (spread across many organisations or venues) trying to do it, then you get the multi-centric policy processes described by modern theories. If, instead, you strip all that activity down into a single cycle, you get the wrong idea.

  2. It is a functional requirement of policy analysis

This description should seem familiar, because the classic policy analysis texts appear to describe a similar series of required steps, such as:

  1. define the problem
  2. identify potential solutions
  3. choose the criteria to compare them
  4. evaluate them in relation to their predicted outcomes
  5. recommend a solution
  6. monitor its effects
  7. evaluate past policy to inform current policy.

However, these texts also provide a heavy dose of caution about your ability to perform these steps (compare Bardach, Dunn, Meltzer and Schwartz, Mintrom, Thissen and Walker, Weimer and Vining).

In addition, studies of policy analysis in action suggest that:

  • an individual analyst’s need for simple steps, to turn policymaking complexity into useful heuristics and pragmatic strategies,

should not be confused with

  • an accurate description of actual policy processes and outcomes.

What you need versus what you can expect

Overall, this discussion of policy studies and policy analysis reminds us of a major difference between:

  1. Functional requirements. What you need from policymaking systems, to (a) manage your task (the 5-8 step policy analysis) and (b) understand and engage in policy processes (the simple policy cycle).
  2. Actual processes and outcomes. What policy concepts and theories tell us about bounded rationality (which limit the comprehensiveness of your analysis) and policymaking complexity (which undermines your understanding and engagement in policy processes).

Of course, I am not about to provide you with a solution to these problems.

Still, this discussion should help you worry a little bit less about the circular arguments you will find in key texts: here are some simple policy analysis steps, but policymaking is not as ‘rational’ as the steps suggest, but (unless you can think of an alternative) there is still value in the steps, and so on.

See also:

The New Policy Sciences

4 Comments

Filed under 750 word policy analysis, agenda setting, public policy

Policy Analysis in 750 Words: Defining policy problems and choosing solutions

This post forms one part of the Policy Analysis in 750 words series overview.

When describing ‘the policy sciences’, Lasswell distinguishes between:

  1. ‘knowledge of the policy process’, to foster policy studies (the analysis of policy)
  2. ‘knowledge in the process’, to foster policy analysis (analysis for policy)

The idea is that both elements are analytically separable but mutually informative: policy analysis is crucial to solving real policy problems, policy studies inform the feasibility of analysis, the study of policy analysts informs policy studies, and so on.

Both elements focus on similar questions – such as What is policy? – and explore their descriptive (what do policy actors do?) and prescriptive (what should they do?) implications.

  1. What is the policy problem?

Policy studies tend to describe problem definition in relation to framing, narrative, social construction, power, and agenda setting.

Actors exercise power to generate attention for their preferred interpretation, and minimise attention to alternative frames (to help foster or undermine policy change, or translate their beliefs into policy).

Policy studies incorporate insights from psychology to understand (a) how policymakers might combine cognition and emotion to understand problems, and therefore (b) how to communicate effectively when presenting policy analysis.

Policy studies focus on the power to reduce ambiguity rather than simply the provision of information to reduce uncertainty. In other words, the power to decide whose interpretation of policy problems counts, and therefore to decide what information is policy-relevant.

This (unequal) competition takes place within a policy process over which no actor has full knowledge or control.

The classic 5-8 step policy analysis texts focus on how to define policy problems well, but they vary somewhat in their definition of doing it well (see also C. Smith):

  • Bardach recommends using rhetoric and eye-catching data to generate attention
  • Weimer and Vining and Mintrom recommend beginning with your client’s ‘diagnosis’, placing it in a wider perspective to help analyse it critically, and asking yourself how else you might define it (see also Bacchi, Stone)
  • Meltzer and Schwartz and Dunn identify additional ways to contextualise your client’s definition, such as by generating a timeline to help ‘map’ causation or using ‘problem-structuring methods’ to compare definitions and avoid making too many assumptions about a problem’s cause.
  • Thissen and Walker compare ‘rational’ and ‘argumentative’ approaches, treating problem definition as something to be measured scientifically or established rhetorically (see also Riker).

These approaches compare with more critical accounts that emphasise the role of power and politics to determine whose knowledge is relevant (L.T.Smith) and whose problem definition counts (Bacchi, Stone). Indeed, Bacchi and Stone provide a crucial bridge between policy analysis and policy studies by reflecting on what policy analysts do and why.

  1. What is the policy solution?

In policy studies, it is common to identify counterintuitive or confusing aspects of policy processes, including:

  • Few studies suggest that policy responses actually solve problems (and many highlight their potential to exacerbate them). Rather, ‘policy solutions’ is shorthand for proposed or alleged solutions.
  • Problem definition often sets the agenda for the production of ‘solutions’, but note the phrase solutions chasing problems (when actors have their ‘pet’ solutions ready, and they seek opportunities to promote them).

Policy studies: problem definition informs the feasibility and success of solutions

Generally speaking, to define the problem is to influence assessments of the feasibility of solutions:

  • Technical feasibility. Will they work as intended, given the alleged severity and cause of the problem?
  • Political feasibility. Will they receive sufficient support, given the ways in which key policy actors weigh up the costs and benefits of action?

Policy studies highlight the inextricable connection between technical and political feasibility. Put simply, (a) a ‘technocratic’ choice about the ‘optimality’ of a solution is useless without considering who will support its adoption, and (b) some types of solution will always be a hard sell, no matter their alleged effectiveness (Box 2.3 below).

In that context, policy studies ask: what types of policy tools or instruments are actually used, and how does their use contribute to policy change? Measures include the size, substance, speed, and direction of policy change.

[Box 2.3, Understanding Public Policy, 2nd edition]

In turn, problem definition informs: the ways in which actors will frame any evaluation of policy success, and the policy-relevance of the evidence to evaluate solutions. Simple examples include:

  • If you define tobacco in relation to: (a) its economic benefits, or (b) a global public health epidemic, evaluations relate to (a) export and taxation revenues, or (b) reductions in smoking in the population.
  • If you define ‘fracking’ in relation to: (a) seeking more benefits than costs, or (b) minimising environmental damage and climate change, evaluations relate to (a) factors such as revenue and effective regulation, or simply (b) how little it takes place.

Policy analysis: recognising and pushing boundaries

Policy analysis texts tend to accommodate these insights when giving advice:

  • Bardach recommends identifying solutions that your audience might consider, perhaps providing a range of options on a notional spectrum of acceptability.
  • Smith highlights the value of ‘precedent’, or relating potential solutions to previous strategies.
  • Weimer and Vining argue that ‘a professional mind-set’ may be more important than perfecting ‘technical skills’
  • Mintrom notes that some solutions are easier to sell than others
  • Meltzer and Schwartz describe the benefits of making a preliminary recommendation to inform an iterative process, drawing feedback from clients and stakeholder groups
  • Dunn warns against too-narrow forms of ‘evidence based’ analysis which undermine a researcher’s ability to adapt well to the evidence-demands of policymakers
  • Thissen and Walker relate solution feasibility to a wide range of policy analysis ‘styles’

Still, note the difference in emphasis.

Policy analysis education/ training may be about developing the technical skills to widen definitions and apply many criteria to compare solutions.

Policy studies suggest that problem definition and a search for solutions take place in an environment where many actors apply a much narrower lens and are not interested in debates on many possibilities (particularly if they begin with a solution).

I have exaggerated this distinction between each element, but it is worth considering the repeated interaction between them in practice: politics and policymaking provide boundaries for policy analysis, analysis could change those boundaries, and policy studies help us reflect on the impact of analysts.

I’ll take a quick break, then discuss how this conclusion relates to the idea of ‘entrepreneurial’ policy analysis.

Further reading

Understanding Public Policy (2020: 28) describes the difference between governments paying for and actually using the ‘tools of policy formulation’. To explore this point, see ‘The use and non-use of policy appraisal tools in public policy making‘ and The Tools of Policy Formulation.

[Understanding Public Policy, 2nd edition, p28: the tools of policy formulation]

4 Comments

Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 Words: What can you realistically expect policymakers to do?

This post forms one part of the Policy Analysis in 750 words series overview.

One aim of this series is to combine insights from policy research (1000, 500) and policy analysis texts.

In this case, modern theories of the policy process help you identify your audience and their capacity to follow your advice. This simple insight may have a profound impact on the advice you give.

Policy analysis for an ideal-type world

For our purposes, an ideal-type is an abstract idea, which highlights hypothetical features of the world, to compare with ‘real world’ descriptions. It need not be an ideal to which we aspire. For example, comprehensive rationality describes the ideal type, and bounded rationality describes the ‘real world’ limitations to the ways in which humans and organisations process information.

 

Imagine writing policy analysis in the ideal-type world of a single powerful ‘comprehensively rational’ policymaker at the heart of government, making policy via an orderly policy cycle.

Your audience would be easy to identify, your analysis would be relatively simple, and you would not need to worry about what happens after you make a recommendation for policy change.

You could adopt a simple 5-8 step policy analysis method, use widely-used tools such as cost-benefit analysis to compare solutions, and know where the results would feed into the policy process.
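As a purely illustrative aside (not part of the original post): the mechanics of a tool like cost-benefit analysis can be sketched in a few lines of Python. The option names, year-by-year net-benefit figures, and 3.5% discount rate below are all invented for the sketch, not drawn from any real appraisal.

```python
# Toy cost-benefit comparison: rank hypothetical policy options by the
# net present value (NPV) of their discounted net benefits.

def net_present_value(cash_flows, discount_rate=0.035):
    """Sum each year's net benefit, discounted back to year 0."""
    return sum(flow / (1 + discount_rate) ** year
               for year, flow in enumerate(cash_flows))

# Invented figures: year-by-year net benefits (benefits minus costs),
# in arbitrary units, starting with an up-front cost in year 0.
options = {
    "Option A (regulation)": [-100, 30, 40, 50, 50],
    "Option B (subsidy)":    [-150, 60, 60, 60, 40],
}

# Rank the options from highest to lowest NPV.
ranked = sorted(options.items(),
                key=lambda item: net_present_value(item[1]),
                reverse=True)

for name, flows in ranked:
    print(f"{name}: NPV = {net_present_value(flows):.1f}")
```

In this toy example, Option A’s discounted net benefits edge out Option B’s despite B’s larger headline benefits — a reminder that the result turns on contestable inputs (which costs and benefits count, and at what discount rate), which is exactly where the politics re-enters.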

I have perhaps over-egged this ideal-type pudding, but I think a lot of traditional policy analyses tapped into this basic idea and focused more on the science of analysis than the political and policymaking context in which it takes place (see Radin and Brans, Geva-May, and Howlett).

Policy analysis for the real world

Then imagine a far messier and less predictable world in which the nature of the policy issue is highly contested, responsibility for policy is unclear, and no single ‘centre’ has the power to turn a recommendation into an outcome.

This image is a key feature of policy process theories, which describe:

  • Many policymakers and influencers spread across many levels and types of government (as the venues in which authoritative choice takes place). Consequently, it is not a straightforward task to identify and know your audience, particularly if the problem you seek to solve requires a combination of policy instruments controlled by different actors.
  • Each venue resembles an institution driven by formal and informal rules. Formal rules are written-down or widely-known. Informal rules are unwritten, difficult to understand, and may not even be understood in the same way by participants. Consequently, it is difficult to know if your solution will be a good fit with the standard operating procedures of organisations (and therefore if it is politically feasible or too challenging).
  • Policymakers and influencers operate in ‘subsystems’, forming networks built on resources such as trust or coalitions based on shared beliefs. Effective policy analysis may require you to engage with – or become part of – such networks, to allow you to understand the unwritten rules of the game and encourage your audience to trust the messenger. In some cases, the rules relate to your willingness to accept current losses for future gains, to accept the limited impact of your analysis now in the hope of acceptance at the next opportunity.
  • Actors relate their analysis to shared understandings of the world – how it is, and how it should be – which are often so well-established as to be taken for granted. Common terms include paradigms, hegemons, core beliefs, and monopolies of understandings. These dominant frames of reference give meaning to your policy solution. They prompt you to couch your solutions in terms of, for example, a strong attachment to evidence-based cases in public health, value for money in treasury departments, or with regard to core principles such as liberalism or socialism in different political systems.
  • Your solutions relate to socioeconomic context and the events that seem (a) impossible to ignore and (b) out of the control of policymakers. Such factors range from a political system’s geography, demography, social attitudes, and economy, while events can be routine elections or unexpected crises.

What would you recommend under these conditions? Rethinking 5-step analysis

There is a large gap between policymakers’ (a) formal responsibilities versus (b) actual control of policy processes and outcomes. Even the most sophisticated ‘evidence based’ analysis of a policy problem will fall flat if uninformed by such analyses of the policy process. Further, the terms of your cost-benefit analysis will be highly contested (at least until there is agreement on what the problem is, and how you would measure the success of a solution).

Modern policy analysis texts try to incorporate such insights from policy theories while maintaining a focus on 5-8 steps. For example:

  • Meltzer and Schwartz contrast their ‘flexible’ and ‘iterative’ approach with a too-rigid ‘rationalistic approach’.
  • Bardach and Dunn emphasise the value of political pragmatism and the ‘art and craft’ of policy analysis.
  • Weimer and Vining invest 200 pages in economic analyses of markets and government, often highlighting a gap between (a) our ability to model and predict economic and social behaviour, and (b) what actually happens when governments intervene.
  • Mintrom invites you to see yourself as a policy entrepreneur, to highlight the value of ‘positive thinking’, creativity, deliberation, and leadership, and perhaps seek ‘windows of opportunity’ to encourage new solutions. Alternatively, a general awareness of the unpredictability of events can prompt you to be modest in your claims, since the policymaking environment may be more important (than your solution) to outcomes.
  • Thissen and Walker focus more on a range of possible roles than a rigid 5-step process.

Beyond 5-step policy analysis

  1. Compare these pragmatic, client-orientated, and communicative models with the questioning, storytelling, and decolonizing approaches by Bacchi, Stone, and L.T. Smith.
  • The latter encourage us to examine more closely the politics of policy processes, including the importance of framing, narrative, and the social construction of target populations to problem definition and policy design.
  • Without this wider perspective, we are focusing on policy analysis as a process rather than considering the political context in which analysts use it.
  2. Additional posts on entrepreneurs and ‘systems thinking’ [to be added] encourage us to reflect on the limits to policy analysis in multi-centric policymaking systems.

 

 

3 Comments

Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 Words: Reflecting on your role as a policy analyst

This post forms one part of the Policy Analysis in 750 words series overview.

One aim of this series is to combine insights from policy research (1000, 500) and policy analysis texts.

If we take key insights from policy theories seriously, we can use them to identify (a) the constraints to policy analytical capacity, and (b) the ways in which analysts might address them. I use the idea of policy analyst archetypes to compare a variety of possible responses.

Key constraints to policy analytical capacity

Terms like ‘bounded rationality’ highlight major limits on the ability of humans and organisations to process information.

Terms like policymaking ‘context’, ‘environments’, and multi-centric policymaking suggest that the policy process is beyond the limits of policymaker understanding and control.

  • Policy actors need to find ways to act, with incomplete information about the problem they seek to solve and the likely impact of their ‘solution’.
  • They gather information to help reduce uncertainty, but problem definition is really about exercising power to reduce ambiguity: select one way to interpret a problem (at the expense of most others), and therefore limit the relevance and feasibility of solutions.
  • This context informs how actors might use the tools of policy analysis. Key texts in this series highlight the use of tools to establish technical feasibility (will it work as intended?), but policymakers also select tools for their political feasibility (who will support or oppose this measure?).

[Box 2.3, Understanding Public Policy, 2nd edition]

How might policy analysts address these constraints ethically?

Most policy analysis texts (in this series) consider the role of professional ethics and values during the production of policy analysis. However, they also point out that there is not a clearly defined profession and associated code of conduct (e.g. see Adachi). In that context, let us begin with some questions about the purpose of policy analysis and your potential role:

  1. Is your primary role to serve individual clients or some notion of the ‘public good’?
  2. Should you maximise your role as an individual or play your part in a wider profession?
  3. What is the balance between the potential benefits of individual ‘entrepreneurship’ and collective ‘co-productive’ processes?
  4. Which policy analysis techniques should you prioritise?
  5. What forms of knowledge and evidence count in policy analysis?
  6. What does it mean to communicate policy analysis responsibly?
  7. Should you provide a clear recommendation or encourage reflection?

 

Policy analysis archetypes: pragmatists, entrepreneurs, manipulators, storytellers, and decolonisers

In that context, I have created a story of policy analysis archetypes to identify the elements that each text emphasises.

The pragmatic policy analyst

  • Bardach provides the classic simple, workable, 8-step system to present policy analysis to policymakers while subject to time and resource-pressed political conditions.
  • Dunn also uses Wildavsky’s famous phrase ‘art and craft’ to suggest that scientific and ‘rational’ methods can only take us so far.

The professional, client-orientated policy analyst

  • Weimer and Vining provide a similar 7-step client-focused system, but incorporating a greater focus on professional development and economic techniques (such as cost-benefit-analysis) to emphasise a particular form of professional analyst.
  • Meltzer and Schwartz also focus on advice to clients, but with a greater emphasis on a wide variety of methods or techniques (including service design) to encourage the co-design of policy analysis with clients.

The communicative policy analyst

  •  C. Smith focuses on how to write and communicate policy analysis to clients in a political context.
  • Compare with Spiegelhalter and Gigerenzer on how to communicate responsibly when describing uncertainty, probability, and risk.

The manipulative policy analyst

  • Riker helps us understand the relationship between two aspects of agenda setting: the rules/ procedures used to make choices, and the framing of policy problems and solutions.

The entrepreneurial policy analyst

  • Mintrom shows how to combine insights from studies of policy entrepreneurship and policy analysis, to emphasise the benefits of collaboration and creativity.

The questioning policy analyst

  • Bacchi analyses the wider context in which people give and use policy advice, to identify the emancipatory role of analysis and encourage policy analysts to challenge dominant social constructions of problems and populations.

The storytelling policy analyst

  • Stone identifies the ways in which people use storytelling and argumentation techniques to define problems and justify solutions. This process is about politics and power, not objectivity and optimal solutions.

The decolonizing policy analyst

  • L.T. Smith does not describe policy analysis directly, but shows how the ‘decolonization of research methods’ can inform the generation and use of knowledge.
  • Compare with Hindess on the ways in which knowledge-based hierarchies rely on an untenable, circular logic.
  • Compare with Michener’s thread, discussing Doucet’s new essay on (a) the role of power and knowledge in limiting (b) the ways in which we gather evidence to analyse policy problems.

https://twitter.com/povertyscholar/status/1207054211759910912

Using archetypes to define the problem of policy analysis

Studies of the field (e.g. Radin plus Brans, Geva-May, and Howlett) suggest that there are many ways to do policy analysis. Further, as Thissen and Walker describe, such roles are not mutually exclusive, your views on their relative value could change throughout the process of analysis, and you could perform many of these roles.

Further, each text describes multiple roles, and some seem clustered together:

  • pragmatic, client-orientated, and communicative could sum up the traditional 5-8 step approaches, while
  • questioning, storytelling, and decolonizing could sum up an important (‘critical’) challenge to narrow ways of thinking about policy analysis and the use of information.

Still, the emphasis matters.

Each text is setting an agenda or defining the problem of policy analysis more-or-less in relation to these roles. Put simply, the more you are reading about economic theory and method, the less you are reading about dominance and manipulation.

How can you read further?

Michener’s ‘Policy Feedback in a Racialized Polity’ connects to studies of historical institutionalism, and reminds us to use insights from policy theories to identify the context for policy analysis.

I have co-authored a lot about uncertainty/ ambiguity in relation to ‘evidence based policymaking’.

See also The new policy sciences for a discussion of how these issues inform Lasswell’s original vision for the policy sciences (combining the analysis of and for policy).

8 Comments

Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), feminism, public policy, Storytelling

Policy Analysis in 750 Words: Who should be involved in the process of policy analysis?

This post forms one part of the Policy Analysis in 750 words series overview.

Think of two visions for policy analysis. It should be primarily: (1) ‘evidence based’, drawing on the best available research evidence, or (2) ‘co-produced’, drawing on conversations between many policymakers, stakeholders, and citizens.

These choices are not mutually exclusive, but there are key tensions between them that should not be ignored, such as when we ask:

  • how many people should be involved in policy analysis?
  • whose knowledge counts?
  • who should control policy design?

Perhaps we can only produce a sensible combination of the two if we clarify their often very different implications for policy analysis. Let’s begin with one story for each and see where they take us.

A story of ‘evidence-based policymaking’

One story of ‘evidence based’ policy analysis is that it should be based on the best available evidence of ‘what works’.

Often, the description of the ‘best’ evidence relates to the idea that there is a notional hierarchy of evidence according to the research methods used.

At the top would be the systematic review of randomised control trials, and nearer the bottom would be expertise, practitioner knowledge, and stakeholder feedback.

This kind of hierarchy has major implications for policy learning and transfer, such as when importing policy interventions from abroad or ‘scaling up’ domestic projects.

Put simply, the experimental method is designed to identify the causal effect of a very narrowly defined policy intervention. Its importation or scaling up would be akin to the description of medicine, in which the evidence suggests the causal effect of a specific active ingredient to be administered with the correct dosage. A very strong commitment to a uniform model precludes the processes we might associate with co-production, in which many voices contribute to a policy design to suit a specific context (see also: the intersection between evidence and policy transfer).

A story of co-production in policymaking

One story of ‘co-produced’ policy analysis is that it should be ‘reflexive’ and based on respectful conversations between a wide range of policymakers and citizens.

Often, the description is of the diversity of valuable policy relevant information, with scientific evidence considered alongside community voices and normative values.

This rejection of a hierarchy of evidence also has major implications for policy learning and transfer. Put simply, a co-production method is designed to identify the positive effect – widespread ‘ownership’ of the problem and commitment to a commonly-agreed solution – of a well-discussed intervention, often in the absence of central government control.

Its use would be akin to a collaborative governance mechanism, in which the causal mechanism is perhaps the process used to foster agreement (including to produce the rules of collective action and the evaluation of success) rather than the intervention itself. A very strong commitment to this process precludes the adoption of a uniform model that we might associate with narrowly-defined stories of evidence based policymaking.

Where can you find these stories in the 750-words series?

  1. Texts focusing on policy analysis as evidence-based/ informed practice (albeit subject to limits) include: Weimer and Vining, Meltzer and Schwartz, Brans, Geva-May, and Howlett (compare with Mintrom, Dunn)
  2. Texts on being careful while gathering and analysing evidence include: Spiegelhalter
  3. Texts that challenge the ‘evidence based’ story include: Bacchi, T. Smith, Hindess, Stone

 

How can you read further?

See the EBPM page and special series ‘The politics of evidence-based policymaking: maximising the use of evidence in policy’

There are 101 approaches to co-production, but let’s see if we can get away with two categories:

  1. Co-producing policy (policymakers, analysts, stakeholders). Some key principles can be found in Ostrom’s work and studies of collaborative governance.
  2. Co-producing research to help make it more policy-relevant (academics, stakeholders). See the Social Policy and Administration special issue ‘Inside Co-production’ and Oliver et al’s ‘The dark side of coproduction’ to get started.

To compare ‘epistemic’ and ‘reflexive’ forms of learning, see Dunlop and Radaelli’s ‘The lessons of policy learning: types, triggers, hindrances and pathologies’

My interest has been to understand how governments juggle competing demands, such as to (a) centralise and localise policymaking, (b) encourage uniform and tailored solutions, and (c) embrace and reject a hierarchy of evidence. What could possibly go wrong when they entertain contradictory objectives? For example:

  • Paul Cairney (2019) “The myth of ‘evidence based policymaking’ in a decentred state”, forthcoming in Public Policy and Administration (Special Issue, The Decentred State) (accepted version)
  • Paul Cairney (2019) ‘The UK government’s imaginative use of evidence to make policy’, British Politics, 14, 1, 1-22 Open Access PDF
  • Paul Cairney and Kathryn Oliver (2017) ‘Evidence-based policymaking is not like evidence-based medicine, so how far should you go to bridge the divide between evidence and policy?’ Health Research Policy and Systems (HARPS), DOI: 10.1186/s12961-017-0192-x PDF
  • Paul Cairney (2017) “Evidence-based best practice is more political than it looks: a case study of the ‘Scottish Approach’”, Evidence and Policy, 13, 3, 499-515 PDF

 

9 Comments

Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 words: the old page

I am redesigning the 750 words page. This is the old version.

The posts in this new series summarise key texts in policy analysis. They present the most common advice about how to ‘do’ policy analysis (to identify a policy problem and possible solutions) and situate this advice within the study of politics, power, and public policy.

This combination of ‘how to’ advice and ‘what actually happens’ research allows you to produce policy analyses and reflect on the political and pragmatic choices you need to make. Policy analysis is not a ‘rational’ or ‘technocratic’ process and we should not pretend otherwise. Rather, our aim in this series is to understand policy analysis through the lens of the policy theories that highlight:

  • a competition to frame problems and identify the technical and political feasibility of solutions; in
  • a policymaking environment over which no one has full understanding or control (even if elected policymakers need to project their control to boost their image of governing competence), during which
  • governments add new policy solutions to an existing, complex, mix of solutions (rather than working from a blank canvas).

In that context, you may find that the summaries make more sense with reference to four main themes from the 1000 words and 500 word policy theories series (and this draft document):

  1. How do you define a policy problem, and what types of solutions are available?
  2. How does your chosen solution relate to existing policies?
  3. Who is your audience, and what can you realistically expect them to do?
  4. Who should be involved in the process of policy analysis?

I then add theme (5): Reflecting on your role as a policy analyst.

In each case, I prompt you to reflect on how (a) your knowledge of politics and policy processes, informs (b) the strategies you adopt when constructing policy analysis. The following description is a long read, but I think it provides essential context that could make the difference between effective and ineffective analysis. My MPP students can also note that this page is the same number of words as the policy analysis/ reflection exercise.

  1. How do you define a policy problem, and what types of solutions are available?

The classic introduction to policy analysis is to ask what is policy? and explore the types of policy tools or instruments that may be used to produce policy change.

Modern discussions also incorporate insights from psychology to understand (a) how policymakers might combine cognition and emotion to reduce uncertainty and ambiguity, to understand problems, and therefore (b) how to communicate effectively when presenting policy analysis. This process is about the power to reduce ambiguity rather than simply the provision of information to reduce uncertainty.

In turn, problem definition influences assessments of the – technical and political – feasibility of solutions, and the ways in which actors will frame any evaluation of policy success.

      2. How does your chosen solution relate to existing policies?

Although you may propose the adoption of one (or more) policy instrument, it will likely add to many others. Governments already combine a large number of instruments to make policy, including legislation, expenditure, economic incentives and penalties, education, and various forms of service delivery. Those instruments combine to represent a complex policy mix whose overall effects are not simple to predict. This interaction provides essential context, particularly if you are asked to provide, say, a simple logic model or ‘theory of change’ to describe the likely impact of your new solution.

      3. Who is your audience, and what can you realistically expect them to do?

For example, imagine writing policy analysis in the ideal-type world of a single powerful ‘comprehensively rational’ policymaker at the heart of an orderly policy cycle. Your analysis would be relatively simple, and you would not need to worry about what happens after you make a recommendation for policy change. You could focus on widely-used tools such as cost-benefit analysis and know where the results would feed into the policy process. I have perhaps over-egged this ideal-type pudding, but I think a lot of traditional policy analysis buys into this basic idea and focuses primarily on the science of analysis rather than the political policymaking context in which it takes place.

Then imagine a far messier and less predictable world in which the nature of the policy issue is highly contested, responsibility for policy is unclear, and no single ‘centre’ has the power to turn a recommendation into an outcome. This image is a key feature of policy process theories, which describe:

  • Many policymakers and influencers spread across many levels and types of government (as the venues in which authoritative choice takes place). Consequently, it is not a straightforward task to identify and know your audience, particularly if the problem you seek to solve requires a combination of policy instruments controlled by different actors.
  • Each venue resembles an institution driven by formal and informal rules. Formal rules are written-down or widely-known. Informal rules are unwritten, difficult to understand, and may not even be understood in the same way by participants. Consequently, it is difficult to know if your solution will be a good fit with the standard operating procedures of organisations (and therefore if it is politically feasible or too challenging).
  • Policymakers and influencers operate in ‘subsystems’, forming networks built on resources such as trust or coalitions based on shared beliefs. Effective policy analysis may require you to engage with – or become part of – such networks, to allow you to understand the unwritten rules of the game and encourage your audience to trust the messenger. In some cases, the rules relate to your willingness to accept current losses for future gains, to accept the limited impact of your analysis now in the hope of acceptance at the next opportunity.
  • Actors relate their analysis to shared understandings of the world – how it is, and how it should be – which are often so well-established as to be taken for granted. Common terms include paradigms, hegemons, core beliefs, and monopolies of understandings. These dominant frames of reference give meaning to your policy solution. They prompt you to couch your solutions in terms of, for example, a strong attachment to evidence-based cases in public health, value for money in treasury departments, or with regard to core principles such as liberalism or socialism in different political systems.
  • Your solutions relate to socioeconomic context and to events that seem (a) impossible to ignore and (b) out of the control of policymakers. Such factors include a political system’s geography, demography, social attitudes, and economy, while events range from routine elections to unexpected crises. To some extent, you could see yourself as a policy entrepreneur and treat events opportunistically, as ‘windows of opportunity’ to encourage new solutions. Alternatively, a general awareness of the unpredictability of events can prompt you to be modest in your claims, since the policymaking environment may matter more to outcomes than your favoured policy.

What would you recommend under these conditions?  The terms of your cost-benefit analysis would be contested (at least until there is agreement on what the problem is, and how you would measure the success of a solution). Further, even the most sophisticated ‘evidence based’ analysis of a policy problem will fall flat if uninformed by good analysis of the policy process.

Note that these problems amplify the limitations to policy analysis more commonly described in this series. For example, Weimer and Vining invest about 200 pages in economic analyses of markets and government, often highlighting a gap between (a) our ability to model and predict economic and social behaviour, and (b) what actually happens when governments intervene. To this, we should add the gap between policymakers’ formal responsibilities and their actual control of policy processes and outcomes.

      4. Who should be involved in the process of policy analysis?

Think of two visions for policy analysis: that it should be primarily (1) ‘evidence based’ or (2) ‘co-produced’. While these choices are not mutually exclusive, there are key tensions between them that should not be ignored, such as when we ask:

  • how many people should be involved in policy analysis?
  • whose knowledge counts?
  • who should control policy design?

Perhaps we can only produce a sensible combination of the two if we clarify their often very different implications for policy analysis. Let’s begin with one story for each (see also Approach 1 versus Approach 2) and see where they take us.

One story of ‘evidence based’ policy analysis is that it should be based on the best available evidence of ‘what works’. Often, the description of the ‘best’ evidence relates to the idea that there is a notional hierarchy of evidence (according to the research methods used). At the top would be the systematic review of randomised control trials, and nearer the bottom would be expertise, practitioner knowledge, and stakeholder feedback. This kind of hierarchy has major implications for policy learning and transfer, such as when importing policy interventions from abroad or ‘scaling up’ domestic projects. Put simply, the experimental method is designed to identify the causal effect of a very narrowly defined policy intervention. Its importation or scaling up would be akin to prescribing medicine, in which the evidence identifies the causal effect of a specific active ingredient to be administered at the correct dosage. A very strong commitment to a uniform model precludes the processes we might associate with co-production, in which many voices contribute to a policy design to suit a specific context.

One story of ‘co-produced’ policy analysis is that it should be ‘reflexive’ and based on respectful conversations between a wide range of policymakers and citizens. Often, the description is of the diversity of valuable policy relevant information, with scientific evidence considered alongside community voices and normative values. This rejection of a hierarchy of evidence also has major implications for policy learning and transfer. Put simply, a co-production method is designed to identify the positive effect – widespread ‘ownership’ of the problem and commitment to a commonly-agreed solution – of a well-discussed intervention, often in the absence of central government control. Its use would be akin to a collaborative governance mechanism, in which the causal mechanism is perhaps the process used to foster agreement (including to produce the rules of collective action and the evaluation of success) rather than the intervention itself. A very strong commitment to this process precludes the adoption of a uniform model that we might associate with narrowly-defined stories of evidence based policymaking.

      5. Reflecting on your role as a policy analyst

If we take insights from policy theories seriously, we need to incorporate them into policy analysis, to consider policymaker psychology and policymaking context alongside the tools of policy analysis. We also need to consider the role of societal and professional values during the production of policy analysis. In other words, consider the limits to your influence and the ethics of your task.

In that context, I have begun to create a story of policy analysis archetypes to help explain this point in context:

  • The pragmatic policy analyst. Bardach provides a simple, workable 8-step system to present policy analysis to policymakers while subject to time- and resource-pressed political conditions (and Dunn is pragmatic for other reasons).
  • The professional, client-oriented policy analyst. Weimer and Vining provide a similar 7-step client-focused system, but incorporate a greater focus on professional development and economic techniques (such as cost-benefit analysis) to foster efficiency.
  • The communicative policy analyst. Catherine Smith focuses on how to write and communicate policy analysis to clients in a political context.
  • The entrepreneurial policy analyst. Mintrom shows how to incorporate insights from studies of policy entrepreneurship.
  • The questioning policy analyst. Bacchi’s aim is to analyse the process in which people give and use such advice, and to encourage policy analysts to question their role.
  • The storytelling policy analyst. Stone’s aim is to identify the ways in which people use storytelling and argumentation techniques to define problems and justify solutions.
  • The decolonizing policy analyst. Linda Tuhiwai Smith does not describe policy analysis directly, but shows how the ‘decolonization of research methods’ informs new approaches to policymaking.
  • The manipulative policy analyst. A discussion of Riker helps us understand the relationship between two aspects of agenda setting: the rules/ procedures to make choices, and the framing of policy problems and solutions.

These descriptions allow you to reflect on your role, as part of a wider political or policymaking system:

  1. Is your primary role to serve and communicate to clients? Is it to get what you want?
  2. Are you primarily seeking to maximise your role as an individual or your part in a wider profession?
  3. What is the balance between the potential benefits of individual ‘entrepreneurship’ and collective ‘co-productive’ processes?
  4. Which policy analysis techniques and forms of knowledge do you prioritise?

The initial list of texts

My aim is to summarise the texts below (based initially on a module guide by Dr Raul Pacheco-Vega and any further suggestions you may have) and incorporate this analysis into the draft document How to write theory-driven policy analysis. See also Writing a Policy Paper. The reference has a weblink when a summary is available.

Eugene Bardach (2012) A Practical Guide for Policy Analysis 5th ed. (CQ Press) (see also A step-by-step policy analysis using Bardach’s Eight Step Model)

Carol Bacchi (2009) Analysing Policy: What’s the problem represented to be? (NSW: Pearson Australia)

Eugene Bardach and Erik Patashnik (2015) A Practical Guide for Policy Analysis 5th ed. (International Edition) (CQ Press)

Catherine Smith (2015) Writing Public Policy: A Practical Guide to Communicating in the Policy Making Process (Oxford: Oxford University Press)

David Weimer and Aidan Vining (2017) Policy Analysis: Concepts and Practice, 6th Edition (London: Routledge)

Michael Mintrom (2012) Contemporary Policy Analysis (Oxford: Oxford University Press) (see also Contemporary Policy Analysis (Mintrom 2012))

Dunn, W. (2017) Public Policy Analysis 6th Ed. (London: Routledge)

Geva-May, I. (2005) ‘Thinking Like a Policy Analyst. Policy Analysis as a Clinical Profession’, in Geva-May, I. (ed) Thinking Like a Policy Analyst. Policy Analysis as a Clinical Profession (Basingstoke: Palgrave) (scroll down after Radin) (see also Policy analysis as a clinical profession)

Meltzer, R. and Schwartz, A. (2019) Policy Analysis as Problem Solving (London: Routledge)

Radin, B. (2019) Policy Analysis in the Twenty-First Century (London: Routledge)

Marleen Brans, Iris Geva-May, and Michael Howlett (2017) Routledge Handbook of Comparative Policy Analysis

Thissen, W. and Walker, W. (Eds.). (2013) Public Policy Analysis (London: Springer)

Please let me know if you see any weird omissions from the list. The texts should give advice about policy analysis rather than describe the policy process (texts on the latter are covered in the 1000 words and 500 word policy theories series). That said, these are some of the books I use to widen the definition of policy analysis:

William H. Riker (1986) The Art of Political Manipulation (New Haven: Yale University Press)

Linda Tuhiwai Smith (2012) Decolonizing Methodologies (London: Zed Books) (also discusses Lorde, Santos)

Barry Hindess (1977) Philosophy and Methodology in the Social Sciences

Policy Analysis in 750 words: Using Statistics and Explaining Risk (Spiegelhalter)

Policy Analysis in 750 words: Deborah Stone (2012) Policy Paradox


Filed under 750 word policy analysis, Uncategorized

Policy Analysis in 750 words: Marleen Brans, Iris Geva-May, and Michael Howlett (2017) Routledge Handbook of Comparative Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary (and click here for the full list of authors). This post is a mere 500 words over budget (not including these words describing the number of words).

[Image: Brans et al (2017) book cover]

Marleen Brans, Iris Geva-May, and Michael Howlett (editors) (2017) Routledge Handbook of Comparative Policy Analysis (London: Routledge)

‘The Handbook … covers … the state of the art knowledge about the science, art and craft of policy analysis in different countries, at different levels of government and by all relevant actors in and outside government who contribute to the analysis of problems and the search for policy solutions’ (Brans et al, 2017: 1).

This book focuses on the interaction between (in Lasswell’s terms) ‘analysis for policy’ (policy analysis) and ‘analysis of policy’ (policy process research). In other words,

  • what can the study of policy analysis tell us about policymaking, and
  • what can studies of policymaking tell budding policy analysts about the nature of their task in relation to their policymaking environment?

Brans et al’s (2017: 1-6) opening discussion suggests that this task is rather unclear and complicated. They highlight the wide range of activity described by the term ‘policy analysis’:

  1. The scope of policy analysis is wide, and its meaning unclear

Analysts can be found in many levels and types of government, in bodies holding governments to account, and outside of government, including interest groups, think tanks, and specialist firms (such as global accountancy or management consultancy firms – Saint-Martin, 2017).

Further, ‘what counts’ as policy analysis can relate to the people that do it, the rules they follow, the processes in which they engage, the form of outputs, and the expectations of clients (Veselý, 2017: 103; Vining and Boardman, 2017: 264).

  2. The role of a policy analyst varies remarkably in relation to context

It varies over time, policy area, type of government (such as central, subnational, local), country, type of political system (e.g. majoritarian and consensus democracies), and ‘policy style’.

  3. Analysis involves ‘science, art and craft’ and the rules are written and unwritten

The process of policy analysis – such as to gather and analyse information, define problems, design and compare solutions, and give policy advice – includes ‘applied social and scientific research as well as more implicit forms of practical knowledge’, and ‘both formal and informal professional practices’ (see also studies of institutions and networks).

  4. The policy process is complex.

It is difficult to identify a straightforward process in which analysts are clearly engaged in multiple, well-defined ‘stages’ of policymaking.

  5. Key principles and practices can be institutionalised, contested, or non-existent.

The idea of policy analysis principles – ‘of transparency, effectiveness, efficiency and accountability through systematic and evidence-based analysis’ – may be entrenched in places like the US but not globally.

In some political systems (particularly in the ‘Anglo-Saxon family of nations’), the most-described forms of policy analysis (in the 750 words series) may be taken for granted (2017: 4).

Even so, the status of science and expertise is often contested, particularly in relation to salient and polarised issues, or more generally:

  • During ‘attempts by elected politicians to restore the primacy of political judgement in the policymaking process, at the expense of technical or scientific evidence’ (2017: 5).
  • When the ‘blending of expert policy analysis with public consultation and participation’ makes ‘advice more competitive and contested’ (2017: 5).
  • When evidence based really means evidence informed, given that there are many legitimate claims to knowledge, and evidence forms one part of a larger process of policy design (van Nispen and de Jong, 2017: 153).

In many political systems, there may be less criticism of the idea of ‘systematic and evidence-based analysis’ because there is less capacity to process information. It is difficult to worry about excessively technocratic approaches if they do not exist (a point that CW made to me just before I read this book).

Implications for policy analysis

  1. It is difficult to think of policy analysis as a ‘profession’.

We may wonder if ‘policy analysis’ can ever be based on common skills and methods (such as those described by Scott, 2017, and in Weimer and Vining), connected to ‘formal education and training’, a ‘code of professional conduct’, and the ability of organisations to control membership (Adachi, 2017: 28; compare with Radin and Geva-May).

  2. Policy analysis is a loosely-defined collection of practices that vary according to context.

Policy analysis may, instead, be considered a collection of ‘styles’ (Hassenteufel and Zittoun, 2017), influenced by:

  • competing analytical approaches in different political systems (2017: 65)
  • bureaucratic capacity for analysis (Mendez and Dussauge-Laguna, 2017: 82)
  • a relative tendency to contract out analysis (Veselý, 2017: 113)
  • the types and remits of advisory bodies (e.g. are they tasked simply with offering expert advice, or also to encourage wider participation to generate knowledge?) (Crowley and Head, 2017)
  • the level of government in which analysts work, such as ‘subnational’ (Newman, 2017) or ‘local’ (Lundin and Öberg, 2017)
  • the type of activity, such as when (‘performance’) budgeting analysis is influenced heavily by economic methods and ‘new public management’ reforms (albeit with limited success, followed by attempts at reform) (van Nispen and de Jong, 2017: 143-52)

Policy analysis can also describe a remarkably wide range of activity, including:

  • Public inquiries (Marier, 2017)
  • Advice to MPs, parliaments, and their committees (Wolfs and De Winter, 2017)
  • The strategic analysis of public opinion or social media data (Rothmayr Allison, 2017; Kuo and Cheng, 2017)
  • A diverse set of activities associated with ‘think tanks’ (Stone and Ladi, 2017) and ‘political party think tanks’ (Pattyn et al, 2017)
  • Analysis for and by ‘business associations’ (Vining and Boardman, 2017), unions (Schulze and Schroeder, 2017), and voluntary/ non-profit organisations (Evans et al, 2017), all of whom juggle policy advice to government with keeping members on board.
  • The more-or-less policy relevant work of academic researchers (Blum and Brans, 2017; compare with Dunn and see the EBPM page).
  3. The analysis of and for policy is not so easy to separate in practice.

When defining policy analysis largely as a collection of highly-variable practices, in complex policymaking systems, we can see the symbiotic relationship between policy analysis and policy research. Studying policy analysis allows us to generate knowledge of policy processes. Policy process research demonstrates that the policymaking context influences how we think about policy analysis.

  4. Policy analysis education and training is incomplete without policy process research

Put simply, we should not assume that graduates in ‘policy analysis’ will enter a central government with high capacity, coherent expectations, and a clear demand for the same basic skills. Yet Fukuyama argues that US university programmes largely teach students:

‘a battery of quantitative methods … applied econometrics, cost-benefit analysis, decision analysis, and, most recently, use of randomized experiments for program evaluation’ that ‘will tell you what the optimal policy should be’, but not ‘how to achieve that outcome. The world is littered with optimal policies that don’t have a snowball’s chance in hell of being adopted’.

In that context, additional necessary skills include: stakeholder mapping, to identify who is crucial to policy success; defining policy problems in ways that stakeholders and policymakers can support; and engaging those actors continuously during policy design and delivery. These skills are described at more length by Radin and Geva-May, while Botha et al (2017) suggest that policy analysis programmes (across North American and European universities) offer a more diverse range of skills (and support for experiential learning) than Fukuyama describes.


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 words: William Dunn (2017) Public Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary. This book is a whopper, with almost 500 pages and 101 (excellent) discussions of methods, so 800 words over budget seems OK to me. If you disagree, just read every second word.  By the time you reach the cat hanging in there baby you are about 300 (150) words away from the end.

[Image: Dunn (2017) book cover]

William Dunn (2017) Public Policy Analysis 6th Ed. (Routledge)

‘Policy analysis is a process of multidisciplinary inquiry aiming at the creation, critical assessment, and communication of policy-relevant knowledge … to solve practical problems. Its practitioners are free to choose among a range of scientific methods, qualitative as well as quantitative, and philosophies of science, so long as these yield reliable knowledge’ (Dunn, 2017: 2-3).

Dunn (2017: 4) describes policy analysis as pragmatic and eclectic. It involves synthesising policy relevant (‘usable’) knowledge, and combining it with experience and ‘practical wisdom’, to help solve problems with analysis that people can trust.

This exercise is ‘descriptive’, to define problems, and ‘normative’, to decide how the world should be and how solutions get us there (as opposed to policy studies/ research seeking primarily to explain what happens).

Dunn contrasts the ‘art and craft’ of policy analysts with other practices, including:

  1. The idea of ‘best practice’ characterised by 5-step plans.
  • In practice, analysis is influenced by: the cognitive shortcuts that analysts use to gather information; the role they perform in an organisation; the time constraints and incentive structures in organisations and political systems; the expectations and standards of their profession; and, the need to work with teams consisting of many professions/ disciplines (2017: 15-6)
  • The cost (in terms of time and resources) of conducting multiple research and analytical methods is high, and highly constrained in political environments (2017: 17-8; compare with Lindblom)
  2. The too-narrow idea of evidence-based policymaking
  • The naïve attachment to ‘facts speak for themselves’ or ‘knowledge for its own sake’ undermines a researcher’s ability to adapt well to the evidence-demands of policymakers (2017: 68; compare with Why don’t policymakers listen to your evidence?).

To produce ‘policy-relevant knowledge’ requires us to ask five questions before (Qs1-3) and after (Qs4-5) policy intervention (2017: 5-7; 54-6):

  1. What is the policy problem to be solved?
  • For example, identify its severity, urgency, cause, and our ability to solve it.
  • Don’t define the wrong problem, such as by oversimplifying or defining it with insufficient knowledge.
  • Key aspects of problems include ‘interdependency’ (each problem is inseparable from a host of others, and all problems may be greater than the sum of their parts), ‘subjectivity’ and ‘artificiality’ (people define problems), ‘instability’ (problems change rather than being solved), and ‘hierarchy’ (which level or type of government is responsible) (2017: 70; 75).
  • Problems vary in terms of how many relevant policymakers are involved, how many solutions are on the agenda, the level of value conflict, and the unpredictability of outcomes (high levels suggest ‘wicked’ problems, and low levels ‘tame’) (2017: 75)
  • ‘Problem-structuring methods’ are crucial, to: compare ways to define or interpret a problem, and ward against making too many assumptions about its nature and cause; produce models of cause-and-effect; and make a problem seem solve-able, such as by placing boundaries on its coverage. These methods foster creativity, which is useful when issues seem new and ambiguous, or new solutions are in demand (2017: 54; 69; 77; 81-107).
  • Problem definition draws on evidence, but is primarily the exercise of power to reduce ambiguity through argumentation, such as when defining poverty as the fault of the poor, the elite, the government, or social structures (2017: 79; see Stone).
  2. What effect will each potential policy solution have?
  • Many ‘forecasting’ methods can help provide ‘plausible’ predictions about the future effects of current/ alternative policies (Chapter 4 contains a huge number of methods).
  • ‘Creativity, insight, and the use of tacit knowledge’ may also be helpful (2017: 55).
  • However, even the most-effective expert/ theory-based methods to extrapolate from the past are flawed, and it is important to communicate levels of uncertainty (2017: 118-23; see Spiegelhalter).
  3. Which solutions should we choose, and why?
  • ‘Prescription’ methods help provide a consistent way to compare each potential solution, in terms of its feasibility and predicted outcome, rather than decide too quickly that one is superior (2017: 55; 190-2; 220-42).
  • They help to combine (a) an estimate of each policy alternative’s outcome with (b) a normative assessment.
  • Normative assessments are based on values such as ‘equality, efficiency, security, democracy, enlightenment’ and beliefs about the preferable balance between state, communal, and market/ individual solutions (2017: 6; 205 see Weimer & Vining, Meltzer & Schwartz, and Stone on the meaning of these values).
  • For example, cost benefit analysis (CBA) is an established – but problematic – economics method based on finding one metric – such as a $ value – to predict and compare outcomes (2017: 209-17; compare Weimer & Vining, Meltzer & Schwartz, and Stone)
  • Cost effectiveness analysis uses a $ value for costs, but compared with other units of measurement for benefits (such as outputs per $) (2017: 217-9)
  • Although such methods help us combine information and values to compare choices, note the inescapable role of power to decide whose values (and which outcomes, affecting whom) matter (2017: 204)
  4. What were the policy outcomes?
  • ‘Monitoring’ methods help identify (say): levels of compliance with regulations, if resources and services reach ‘target groups’, if money is spent correctly (such as on clearly defined ‘inputs’ such as public sector wages), and if we can make a causal link between the policy inputs/ activities/ outputs and outcomes (2017: 56; 251-5)
  • Monitoring is crucial because it is so difficult to predict policy success, and unintended consequences are almost inevitable (2017: 250).
  • However, the data gathered are usually no more than proxy indicators of outcomes. Further, the choice of indicators reflects what is available, ‘particular social values’, and ‘the political biases of analysts’ (2017: 262)
  • The idea of ‘evidence based policy’ is linked strongly to the use of experiments and systematic review to identify causality (2017: 273-6; compare with trial-and-error learning in Gigerenzer, complexity theory, and Lindblom).
  5. Did the policy solution work as intended? Did it improve policy outcomes?
  • Although we frame policy interventions as ‘solutions’, few problems are ‘solved’. Instead, try to measure the outcomes and the contribution of your solution, and note that evaluations of success and ‘improvement’ are contested (2017: 57; 332-41).  
  • Policy evaluation is not an objective process in which we can separate facts from values.
  • Rather, values and beliefs are part of the criteria we use to gauge success (and even their meaning is contested – 2017: 322-32).
  • We can gather facts about the policy process, and the impacts of policy on people, but this information has little meaning until we decide whose experiences matter.
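Dunn’s description of cost-benefit analysis (reducing each option to a single $ metric, such as net present value, to compare outcomes) can be illustrated with a minimal sketch. Everything here is invented for illustration: the two options, the five-year figures, the 3% discount rate, and the helper function name are mine, not Dunn’s.

```python
# A hypothetical cost-benefit comparison of two policy options,
# reducing each to one metric: net present value (NPV) in $.

def net_present_value(costs, benefits, discount_rate):
    """Discount each year's net benefit (benefit - cost) to present value."""
    return sum(
        (b - c) / (1 + discount_rate) ** year
        for year, (c, b) in enumerate(zip(costs, benefits))
    )

# Two invented options over a five-year horizon (all figures hypothetical).
option_a = net_present_value(
    costs=[100, 20, 20, 20, 20],    # heavy up-front investment
    benefits=[0, 60, 60, 60, 60],
    discount_rate=0.03,
)
option_b = net_present_value(
    costs=[40, 40, 40, 40, 40],     # steady spending
    benefits=[30, 45, 50, 50, 50],
    discount_rate=0.03,
)

print(f"Option A NPV: {option_a:.1f}")  # → 48.7
print(f"Option B NPV: {option_b:.1f}")  # → 22.3
```

Note how the choice of discount rate (and of what counts as a cost or benefit at all) shapes the result, which echoes Dunn’s point that such methods do not escape the question of whose values, and which outcomes, matter.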

Overall, the idea of ‘ex ante’ (forecasting) policy analysis is a little misleading, since policymaking is continuous, and evaluations of past choices inform current choices.

Policy analysis methods are ‘interdependent’, and ‘knowledge transformations’ describes the impact of knowledge regarding one question on the other four (2017: 7-13; contrast with Meltzer & Schwartz, Thissen & Walker).

Developing arguments and communicating effectively

Dunn (2017: 19-21; 348-54; 392) argues that ‘policy argumentation’ and the ‘communication of policy-relevant knowledge’ are central to policymaking (see Chapter 9 and Appendices 1-4 for advice on how to write briefs, memos, and executive summaries and prepare oral testimony).

He identifies seven elements of a ‘policy argument’ (2017: 19-21; 348-54), including:

  • The claim itself, such as a description (size, cause) or evaluation (importance, urgency) of a problem, and prescription of a solution
  • The things that support it (including reasoning, knowledge, authority)
  • The things that could undermine it (including any ‘qualifier’, the communication of uncertainty about current knowledge, and counter-arguments).

The key stages of communication (2017: 392-7; 405; 432) include:

  1. ‘Analysis’, focusing on ‘technical quality’ (of the information and methods used to gather it), meeting client expectations, challenging the ‘status quo’, albeit while dealing with ‘political and organizational constraints’ and suggesting something that can actually be done.
  2. ‘Documentation’, focusing on synthesising information from many sources, organising it into a coherent argument, translating from jargon or a technical language, simplifying, summarising, and producing user-friendly visuals.
  3. ‘Utilization’, by making sure that (a) communications are tailored to the audience (its size, existing knowledge of policy and methods, attitude to analysts, and openness to challenge), and (b) the process is ‘interactive’ to help analysts and their audiences learn from each other.

 


 

Policy analysis and policy theory: systems thinking, evidence based policymaking, and policy cycles

Dunn (2017: 31-40) situates this discussion within a brief history of policy analysis, which culminated in new ways to express old ambitions, such as to:

  1. Use ‘systems thinking’, to understand the interdependence between many elements in complex policymaking systems (see also socio-technical and socio-ecological systems).
  • Note the huge difference between (a) policy analysis discussions of ‘systems thinking’ built on the hope that if we can understand them we can direct them, and (b) policy theory discussions that emphasise ‘emergence’ in the absence of central control (and presence of multi-centric policymaking).
  • Also note that Dunn (2017: 73) describes policy problems – rather than policymaking – as complex systems. I’ll write another post (short, I promise) on the many different (and confusing) ways to use the language of complexity.
  2. Promote ‘evidence based policy’, as the new way to describe an old desire for ‘technocratic’ policymaking that accentuates scientific evidence and downplays politics and values (see also 2017: 60-4).

In that context, see Dunn’s (47-52) discussion of comprehensive versus bounded rationality:

  • Note the idea of ‘erotetic rationality’ in which people deal with their lack of knowledge of a complex world by giving up on the idea of certainty (accepting their ‘ignorance’), in favour of a continuous process of ‘questioning and answering’.
  • This approach is a pragmatic response to the lack of order and predictability of policymaking systems, which limits the effectiveness of a rigid attachment to ‘rational’ 5 step policy analyses (compare with Meltzer & Schwartz).

Dunn (2017: 41-7) also provides an unusually useful discussion of the policy cycle. Rather than seeing it as a mythical series of orderly stages, Dunn highlights:

  1. Lasswell’s original discussion of policymaking functions (or functional requirements of policy analysis, not actual stages to observe), including: ‘intelligence’ (gathering knowledge), ‘promotion’ (persuasion and argumentation while defining problems), ‘prescription’, ‘invocation’ and ‘application’ (to use authority to make sure that policy is made and carried out), and ‘appraisal’ (2017: 42-3).
  2. The constant interaction between all notional ‘stages’ rather than a linear process: attention to a policy problem fluctuates, actors propose and adopt solutions continuously, actors are making policy (and feeding back on its success) as they implement, evaluation (of policy success) is not a single-shot document, and previous policies set the agenda for new policy (2017: 44-5).

In that context, it is no surprise that the impact of a single policy analyst is usually minimal (2017: 57). Sorry to break it to you. Hang in there, baby.

hang-in-there-baby

 


Filed under 750 word policy analysis, public policy

Can A Government Really Take Control Of Public Policy?

This post first appeared on the MIHE blog to help sell my book.

During elections, many future leaders give the impression that they will take control of public policy. They promise major policy change and give little indication that anything might stand in their way.

This image has been a major feature of Donald Trump’s rhetoric during his US Presidency. It has also been a feature of campaigns for the UK withdrawal from the European Union (‘Brexit’) to allow its leaders to take back control of policy and policymaking. According to this narrative, Brexit would allow (a) the UK government to make profound changes to immigration and spending, and (b) Parliament and the public to hold the UK government directly to account, in contrast to a distant EU policy process less subject to direct British scrutiny.

Such promises are built on the false image of a single ‘centre’ of government, in which a small number of elected policymakers take responsibility for policy outcomes. This way of thinking is rejected continuously in the modern literature. Instead, policymaking is ‘multi-centric’: responsibility for policy outcomes is spread across many levels and types of government (‘centres’), and shared with organisations outside of government, to the extent that it is not possible simply to know who is in charge and whom to blame. This arrangement helps explain why leaders promise major policy change but most outcomes represent a minor departure from the status quo.

Some studies of politics relate this arrangement to the choice to share power across many centres. In the US, a written constitution ensures power sharing across different branches (executive, legislative, judicial) and between federal and state or local jurisdictions. In the UK, central government has long shared power with EU, devolved, and local policymaking organisations.

However, policy theories show that most aspects of multi-centric governance are necessary. The public policy literature provides many ways to describe such policy processes, but two are particularly useful.

The first approach is to explain the diffusion of power with reference to an enduring logic of policymaking, as follows:

  • The size and scope of the state is so large that it is always in danger of becoming unmanageable. Policymakers manage complexity by breaking the state’s component parts into policy sectors and sub-sectors, with power spread across many parts of government.
  • Elected policymakers can only pay attention to a tiny proportion of issues for which they are responsible. They pay attention to a small number and ignore the rest. They delegate policymaking responsibility to other actors such as bureaucrats, often at low levels of government.
  • At this level of government and specialisation, bureaucrats rely on specialist organisations for information and advice. Those organisations trade that information/advice and other resources for access to, and influence within, the government.
  • Most public policy is conducted primarily through small and specialist ‘policy communities’ that process issues at a level of government not particularly visible to the public, and with minimal senior policymaker involvement.

This description suggests that senior elected politicians are less important than people think, their impact on policy is questionable, and elections may not provide major changes in policy. Most decisions are taken in their name but without their intervention.

A second, more general, approach is to show that elected politicians deal with such limitations by combining cognition and emotion to make choices quickly. Although such actions allow them to be decisive, they occur within a policymaking environment over which governments have limited control. Government bureaucracies only have the coordinative capacity to direct policy outcomes in a small number of high priority areas. In most other cases, policymaking is spread across many venues, each with their own rules, networks, ways of seeing the world, and ways of responding to socio-economic factors and events.

In that context, we should always be sceptical when election candidates and referendum campaigners (or, in many cases, leaders of authoritarian governments) make such promises about political leadership and government control.

A more sophisticated knowledge of policy processes allows us to identify the limits to the actions of elected policymakers, and develop a healthier sense of pragmatism about the likely impact of government policy. The question of our age is not: how can governments take back control? Rather, it is: how can we hold policymakers to account in a complex system over which they have limited knowledge and even less control?


Filed under public policy, UK politics and policy

Policy Analysis in 750 words: Wil Thissen and Warren Walker (2013) Public Policy Analysis

Thissen Walker 2013 cover

Please see the Policy Analysis in 750 words series overview before reading the summary. Please note that this is an edited book and the full list of authors (PDF) is here. I’m using the previous sentence as today’s excuse for not sticking to 750 words.

Wil Thissen and Warren Walker (editors) (2013) Public Policy Analysis: New Developments (Springer)

Our premise is that there is no single, let alone ‘one best’, way of conducting policy analyses (Thissen and Walker, 2013: 2)

Thissen and Walker (2013: 2) begin by identifying the proliferation of (a) policy analysts inside and outside government, (b) the many approaches and methods that could count as policy analysis (see Radin), and therefore (c) a proliferation of concepts to describe it.

Like Weimer and Vining, they distinguish between:

  1. Policy analysis, as the advice given to clients before they make a choice. Thissen and Walker (2013: 4) describe analysts working with a potential range of clients, when employed directly by governments or organisations, or acting more as entrepreneurs with multiple audiences in mind (compare with Bardach, Weimer & Vining, Mintrom).
  2. Policy process research, as the study of such actors within policymaking systems (see 500 and 1000).

Policy theory: implications for policy analysis

Policy process research informs our understanding of policy analysis, identifying what analysts and their clients (a) can and cannot do, which informs (b) what they should do.

As Enserink et al (2012: 12-3) describe, policy analysis (analysis for policy) will differ profoundly if the policy process is ‘chaotic and messy’ rather than ‘neat and rational’.

The range of policy concepts and theories (analysis of policy) at our disposal helps add meaning to policy analysis as a practice. Like Radin, Enserink et al trace historic attempts to seek ‘rational’ policy analysis then conclude that modern theories – describing policymaking complexity – are ‘more in line with political reality’ (2012: 13-6).

As such, policy analysis shifts from:

(a) A centralised process with few actors inside government, to (b) a messy process including many policymakers and influencers, inside and outside government

(a) Translating science into policy, to (b) a competition to frame issues and assess policy-relevant knowledge

(a) An ‘optimal’ solution from one perspective, to (b) a negotiated solution based on many perspectives (in which optimality is contested)

(a) Analysing a policy problem/ solution with a common metric (such as cost benefit analysis), to (b) developing skills relating to: stakeholder analysis, network management, collaboration, mediation or conflict resolution based on sensitivity to the role of different beliefs, and the analysis of policymaking institutions to help resolve fragmentation (2013: 17-34).

Their Table 2.1 (2012: 35) outlines these potential differences (pop your reading glasses on …. now!):

Enserink et al 2012 page 35

In many cases, the role of an analyst remains uncertain. If we follow the ACF story, does an analyst appeal to one coalition or seek to mediate between them? If we follow MSA, do they wait for a ‘window of opportunity’ or seek to influence problem definition and motivation to adopt certain solutions?

Policy Analysis: implications for policy theory

In that context, rather than identify a 5-step plan for policy analysis, Mayer et al (2013: 43-50) suggest that policy analysts tend to perform one or more of six activities:

  1. ‘Research and analyze’, to collect information relevant to policy problems.
  2. ‘Design and recommend’, to produce a range of potential solutions.
  3. ‘Clarify values and arguments’, to identify potential conflicts and facilitate high quality debate.
  4. ‘Advise strategically’, to help a policymaker choose an effective solution within their political context.
  5. ‘Democratize’, to pursue a ‘normative and ethical objective: it should further equal access to, and influence on, the policy process for all stakeholders’ (2013: 47)
  6. ‘Mediate’, to foster many forms of cooperation between governments, stakeholders (including business), researchers, and/ or citizens.

Styles of policy analysis

Policy analysts do not perform these functions sequentially or with equal weight.

Rather, Mayer et al (2013: 50-5) describe ‘six styles of policy analysis’ that vary according to the analyst’s ‘assumptions about science (epistemology), democracy, learning, and change’ (and these assumptions may change during the process):

  1. Rational, based on the idea that we can conduct research in a straightforward way within a well-ordered policy process (or modify the analysis to reflect limits to research and order).
  2. Argumentative, based on a competition to define policy problems and solutions (see Stone).
  3. Client advice, based on the assumption that analysis is part of a ‘political game’, and analysts bring knowledge of political strategy and policymaking complexity.
  4. Participatory, to facilitate a more equal access to information and debate among citizens.
  5. Process, based on the idea that the faithful adherence to good procedures aids high quality analysis (and perhaps mitigates an ‘erratic and volatile’ policy process).
  6. Interactive, based on the idea that the rehearsal of many competing perspectives is useful to policymaker deliberations (compare with reflexive learning).

In turn, these styles prompt different questions to evaluate the activities associated with analysis (2013: 56):

p56 Mayer et al

In relation to the six policy analysis activities,

  • the criteria for good policy analysis include: the quality of knowledge, usefulness of advice to clients and stakeholders, quality of argumentation, pragmatism of advice, transparency of processes, and ability to secure a mediated settlement (2013: 58).
  • The positive role for analysts includes ‘independent scientist’ or expert, ‘ethicist’, ‘narrator’, ‘counsellor’, ‘entrepreneur’,’ democratic advocate’, or ‘facilitator’ (2013: 59).

Further, their – rather complicated – visualisations of these roles (e.g. p60; compare with the Appendix) project the (useful) sense that (a) individuals face a trade-off between roles (even if they seek to combine some), and (b) many people making many trade-offs adds up to a complex picture of activity.

Therefore, we should bear in mind that

(a) there exist some useful 5-step guides for budding analysts, but

(b) even if they adopt a simple strategy, analysts will also need to find ways to understand and engage with complex policymaking systems containing a huge number of analysts, policymakers, and influencers.

Policy Analysis styles: implications for problem definition and policy design

Thissen (2013: 66-9) extends the focus on policymaking context and policy analysis styles to problem definition, including:

  1. A rational approach relies on research knowledge to diagnose problems (the world is knowable, use the best scientific methods to produce knowledge, and subject the results to scientific scrutiny).
  2. A ‘political game model’ emphasises key actors and their perspectives, value conflicts, trust, and interdependence (assess the potential to make deals and use skills of mediation and persuasion to secure them).

These different starting points influence the ways in which analysts might take steps to identify how people perceive policy problems, whether other definitions are more useful, a problem’s cause and effect, and the likely effect of a proposed solution; communicate uncertainty; and relate the results to a ‘policy arena’ with its own rules on resolving conflict and producing policy instruments (2013: 70-84; 93-4).

Similarly, Bots (2013: 114) suggests that these styles inform a process of policy design, constructed to change people’s minds during repeated interactions with clients (such as by appealing to scientific evidence or argumentation).

Bruijn et al (2013: 134-5) situate such activities in modern discussions of policy analysis:

  1. In multi-centric systems, with analysts focused less on ‘unilateral decisions using command and control’ and more on ‘consultation and negotiation among stakeholders’ in networks.
  • The latter are necessary because there will always be contestation about what the available information tells us about the problem, often without a simple way to negotiate choices on solutions.
  2. In relation to categories of policy problems, including:
  • ‘tamed’ (high knowledge/ technically solvable, with no political conflict)
  • ‘untamed ethical/political’ (technically solvable, with high moral and political conflict)
  • ‘untamed scientific’ (high consensus but low scientific knowledge)
  • ‘untamed’ problems (low consensus, low knowledge).

Put simply, ‘rational’ approaches may help address low knowledge, while other skills are required to manage processes such as conflict resolution and stakeholder engagement (2013: 136-40).

Policy Analysis styles: implications for models

Part 2 of the book relates such styles (and assumptions about how ‘rational’ and comprehensive our analyses can be) to models of policy analysis. For example,

  1. Walker and van Daalen (2013: 157-84) explore models designed to compare the status quo with a future state, often based on the (shaky) assumption that the world is knowable and we can predict with sufficient accuracy the impact of policy solutions.
  2. Hermans and Cunningham (2013: 185-213) describe models to trace agent behaviour in networks and systems, and create multiple possible scenarios, which could help explore the ‘implementability’ of policies.
  3. Walker et al (2013: 215-61) relate policy analysis styles to how analysts might deal with uncertainty.
  • Some models may serve primarily to reduce ‘epistemic’ uncertainty associated with insufficient knowledge about the future (perhaps with a focus on methods and statistical analysis).
  • Others may focus on resolving ambiguity, in which many actors may define/ interpret problems and feasible solutions in different ways.

Overall, this book contains one of the most extensive discussions of 101 different technical models for policy analysis, but the authors emphasise their lack of value without initial clarity about (a) our beliefs regarding the nature of policymaking and (b) the styles of analysis we should use to resolve policy problems. Few of these initial choices can be resolved simply with reference to scientific analysis or evidence.


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), public policy, Research design

Policy Analysis in 750 words: Rachel Meltzer and Alex Schwartz (2019) Policy Analysis as Problem Solving

Please see the Policy Analysis in 750 words series overview before reading the summary. This post might well represent the largest breach of the ‘750 words’ limit, so please get comfortable. I have inserted a picture of a cat hanging in there baby after the main (*coughs*) 1400-word summary. The rest is bonus material, reflecting on the links between this book and the others in the series.

Meltzer Schwartz 2019 cover

Rachel Meltzer and Alex Schwartz (2019) Policy Analysis as Problem Solving (Routledge)

‘We define policy analysis as evidence-based advice giving, as the process by which one arrives at a policy recommendation to address a problem of public concern. Policy analysis almost always involves advice for a client’ (Meltzer and Schwartz, 2019: 15).

Meltzer and Schwartz (2019: 231-2) describe policy analysis as applied research, drawing on many sources of evidence, quickly, with limited time, access to scientific research, or funding to conduct a lot of new research. It requires:

  • careful analysis of a wide range of policy-relevant documents (including the ‘grey’ literature often produced by governments, NGOs, and think tanks) and available datasets
  • perhaps combined with expert interviews, focus groups, site visits, or an online survey (see 2019: 232-64 on methods).

Meltzer and Schwartz (2019: 21) outline a ‘five-step framework’ for client-oriented policy analysis. During each step, they contrast their ‘flexible’ and ‘iterative’ approach with a too-rigid ‘rationalistic approach’ (to reflect bounded, not comprehensive, rationality):

  1. ‘Define the problem’.

Problem definition is a political act of framing, not an exercise in objectivity (2019: 52-3). It is part of a narrative to evaluate the nature, cause, size, and urgency of an issue (see Stone), or perhaps to attach to an existing solution (2019: 38-40; compare with Mintrom).

In that context, ask yourself ‘Who is defining the problem? And for whom?’ and do enough research to be able to define it clearly and avoid misunderstanding among you and your client (2019: 37-8; 279-82):

  • Identify your client’s resources and motivation, such as how they seek to use your analysis, the format of analysis they favour, their deadline, and their ability to make or influence the policies you might suggest (2019: 49; compare with Weimer and Vining).
  • Tailor your narrative to your audience, albeit while recognising the need to learn from ‘multiple perspectives’ (2019: 40-5).
  • Make it ‘concise’ and ‘digestible’, not too narrowly defined, and not in a way that already closes off discussion by implying a clear cause and solution (2019: 51-2).

In doing so:

  • Ask yourself if you can generate a timeline, identify key stakeholders, and place a ‘boundary’ on the problem.
  • Establish if the problem is urgent, who cares about it, and who else might care (or not) (2019: 46).
  • Focus on the ‘central’ problem that your solution will address, rather than the ‘related’ and ‘underlying’ problems that are ‘too large and endemic to be solved by the current analysis’ (2019: 47).
  • Avoid misdiagnosing a problem with reference to one cause. Instead, ‘map’ causation with reference to (say) individual and structural causes, intended and unintended consequences, simple and complex causation, market or government failure, and/ or the ability to blame an individual or organisation (2019: 48-9).
  • Combine quantitative and qualitative data to frame problems in relation to: severity, trends in severity, novelty, proximity to your audience, and urgency or crisis (2019: 53-4).

During this process, interrogate your own biases or assumptions and how they might affect your analysis (2019: 50).

2. ‘Identify potential policy options (alternatives) to address the problem’.

Common sources of ideas include incremental changes from current policy, ‘client suggestions’, comparable solutions (from another time, place, or policy area), reference to common policy instruments, and ‘brainstorming’ or ‘design thinking’ (2019: 67-9; see box 2.3 and 7.1, below, from Understanding Public Policy).

box 2.3 2nd ed UPP

Identify a ‘wide range’ of possible solutions, then select the (usually 3-5) ‘most promising’ for further analysis (2019: 65). In doing so:

  • be careful not to frame alternatives negatively (e.g. ‘death tax’ – 2019: 66)
  • compare alternatives in ‘good faith’ rather than keeping some ‘off the table’ to ensure that your preferred solution looks good (2019: 66)
  • beware ‘best practice’ ideas that are limited in terms of (a) applicability (if made at a smaller scale, or in a very different jurisdiction), and (b) evidence of success (2019: 70; see studies of policy learning and transfer)
  • think about how to modify existing policies according to scale or geographical coverage, who to include (and based on what criteria), for how long, using voluntary versus mandatory provisions, and ensuring oversight (2019: 71-3)
  • consider combinations of common policy instruments, such as regulations and economic penalties/ subsidies (2019: 73-7)
  • consider established ways to ‘brainstorm’ ideas (2019: 77-8)
  • note the rise of instruments derived from the study of psychology and behavioural public policy (2019: 79-90)
  • learn from design principles, including ‘empathy’, ‘co-creating’ policy with service users or people affected, ‘prototyping’ (2019: 90-1)

box 7.1

3. ‘Specify the objectives to be attained in addressing the problem and the criteria to evaluate the attainment of these objectives as well as the satisfaction of other key considerations (e.g., equity, cost, feasibility)’.

Your objectives relate to your problem definition and aims: what is the problem, what do you want to happen when you address it, and why?

  • For example, questions to your client may include: what is your organization’s ‘mission’, what is feasible (in terms of resources and politics), which stakeholders do you want to include, and how will you define success (2019: 105; 108-12)?

In that values-based context, your criteria relate to ways to evaluate each policy’s likely impact (2019: 106-7). They should ensure:

  • Comprehensiveness. E.g. how many people, and how much of their behaviour, can you influence while minimizing the ‘burden’ on people, businesses, or government? (2019: 113-4)
  • Mutual Exclusiveness. In other words, don’t have two objectives doing the same thing (2019: 114).

Common criteria include (2019: 116):

  1. Effectiveness. The size of its intended impact on the problem (2019: 117).
  2. Equity (fairness). The impact in terms of ‘vertical equity’ (e.g. the better off should pay more), ‘horizontal equity’ (e.g. you should not pay more if unmarried), fair process, fair outcomes, and ‘intergenerational’ equity (e.g. don’t impose higher costs on future populations) (2019: 118-19).
  3. Feasibility (administrative, political, and technical). The likelihood of this policy being adopted and implemented well (2019: 119-21)
  4. Cost (or financial feasibility). Who would bear the cost, and their willingness and ability to pay (2019: 122).
  5. Efficiency. To maximise the benefit while minimizing costs (2019: 122-3).

 

4. ‘Assess the outcomes of the policy options in light of the criteria and weigh trade-offs between the advantages and disadvantages of the options’.

When explaining objectives and criteria,

  • ‘label’ your criteria in relation to your policy objectives (e.g. to ‘maximize debt reduction’) rather than using generic terms (2019: 123-7)
  • produce a table – with alternatives in rows, and criteria in columns – to compare each option
  • quantify your policies’ likely outcomes, such as in relation to numbers of people affected and levels of income transfer, or a percentage drop in the size of the problem, but also
  • communicate the degree of uncertainty related to your estimates (2019: 128-32; see Spiegelhalter)

Consider using cost-benefit analysis to identify (a) the financial and opportunity cost of your plans (what would you achieve if you spent the money elsewhere?), compared to (b) the positive impact of your funded policy (2019: 141-55).

  • The principle of CBA may be intuitive, but a thorough CBA process is resource-intensive, vulnerable to bias and error, and no substitute for choice. It requires you to make a collection of assumptions about human behaviour and likely costs and benefits, decide whose costs and benefits should count, turn all costs and benefits into a single measure, and imagine how to maximise winners and compensate losers (2019: 155-81; compare Weimer and Vining with Stone).
  • One alternative is cost-effectiveness analysis, which quantifies costs and relates them to outputs (e.g. number of people affected, and how) without trying to translate them into a single measure of benefit (2019: 181-3).
  • These measures can be combined with other thought processes, such as with reference to ‘moral imperatives’, a ‘precautionary approach’, and ethical questions on power/ powerlessness (2019: 183-4).
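The arithmetic behind these two approaches can be sketched briefly. The sketch below uses entirely hypothetical figures and an assumed 3% discount rate; it only illustrates the contrast Meltzer and Schwartz draw between monetising everything (CBA) and relating cost to a non-monetary output (cost-effectiveness):

```python
# Minimal sketch of cost-benefit vs cost-effectiveness analysis.
# All figures are hypothetical illustrations, not from the book.

def npv(flows, rate):
    """Net present value of a list of annual net flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Cost-benefit analysis: translate all costs and benefits into a single
# monetary measure, then discount future years back to the present.
annual_net_benefits = [-100_000, 30_000, 40_000, 50_000]  # hypothetical
cba_result = npv(annual_net_benefits, rate=0.03)  # positive => net benefit

# Cost-effectiveness analysis: relate cost to an output measure
# (e.g. cost per person reached) without monetising the benefit.
total_cost = 100_000
people_reached = 2_500
cost_per_person = total_cost / people_reached

print(round(cba_result, 2), cost_per_person)
```

Even this toy version shows why a thorough CBA is no substitute for choice: the result depends entirely on the assumed flows, the discount rate, and whose costs and benefits are counted.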

 

5. ‘Arrive at a recommendation’.

Predict the most likely outcomes of each alternative, while recognising high uncertainty (2019: 189-92). If possible,

  • draw on existing, comparable, programmes to predict the effectiveness of yours (2019: 192-4)
  • combine such analysis with relevant theories to predict human behaviour (e.g. consider price ‘elasticity’ if you seek to raise the price of a good to discourage its use) (2019: 193-4)
  • apply statistical methods to calculate the probability of each outcome (2019: 195-6), and modify your assumptions to produce a range of possibilities, but
  • note Spiegelhalter’s cautionary tales and anticipate the inevitable ‘unintended consequences’ (when people do not respond to policy in the way you would like) (2019: 201-2)
  • use these estimates to inform a discussion on your criteria (equity, efficiency, feasibility) (2019: 196-200)
  • present the results visually – such as in a ‘matrix’ – to encourage debate on the trade-offs between options
  • simplify choices by omitting irrelevant criteria and options that do not compete well with others (2019: 203-10)
  • make sure that your recommendation (a) flows from the analysis, and (b) is in the form expected by your client (2019: 211-12)
  • consider making a preliminary recommendation to inform an iterative process, drawing feedback from clients and stakeholder groups (2019: 212).
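The ‘matrix’ idea above can be sketched as follows. The alternatives, criteria, and scores are all hypothetical (a real analysis would also weight the criteria to reflect the client’s values, and present the matrix to encourage debate rather than settle it):

```python
# Hypothetical criteria matrix: alternatives in rows, criteria in
# columns, each scored from 1 (worst) to 5 (best).
matrix = {
    "Subsidy":    {"effectiveness": 4, "equity": 3, "feasibility": 2, "cost": 2},
    "Regulation": {"effectiveness": 3, "equity": 4, "feasibility": 3, "cost": 4},
    "Status quo": {"effectiveness": 1, "equity": 2, "feasibility": 5, "cost": 5},
}

def total(scores):
    # Equal weights here; weighting criteria differently would encode
    # a judgement about which objectives matter most.
    return sum(scores.values())

ranking = sorted(matrix, key=lambda alt: total(matrix[alt]), reverse=True)
print(ranking)
```

The point of the matrix is the trade-offs it makes visible (e.g. an alternative can win on feasibility and cost while losing on effectiveness), not the final ranking itself.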

 

hang-in-there-baby

 

Policy analysis in a wider context

Meltzer and Schwartz’s approach makes extra sense if you have already read some of the other texts in the series, including:

  1. Weimer and Vining, which represents an exemplar of an X-step approach informed heavily by the study of economics and application of economic models such as cost-benefit analysis (compare with Radin’s checklist).
  2. Geva-May on the existence of a policy analysis profession with common skills, heuristics, and (perhaps) ethics (compare with Meltzer and Schwartz, 2019: 282-93)
  3. Radin, on:
  • the proliferation of analysts across multiple levels of government, NGOs, and the private sector (compare with Meltzer and Schwartz, 2019: 269-77)
  • the historic shift of analysis from formulation to all notional stages (contrast with Meltzer and Schwartz, 2019: 16-7 on policy analysis not including implementation or evaluation)
  • the difficulty in distinguishing between policy analysis and advocacy in practice (compare with Meltzer and Schwartz, 2019: 276-8, who suggest that actors can choose to perform these different roles)
  • the emerging sense that it is difficult to identify a single client in a multi-centric policymaking system. Put another way, we might be working for a specific client but accept that their individual influence is low.
  4. Stone’s challenge to
  • a historic tendency for economics to dominate policy analysis,
  • the applicability of economic assumptions (focusing primarily on individualist behaviour and markets), and
  • the pervasiveness of ‘rationalist’ policy analysis built on X-steps.

Meltzer and Schwartz (2019: 1-3) agree that economic models are too dominant (identifying the value of insights from ‘other disciplines – including design, psychology, political science, and sociology’).

However, they argue that critiques of rational models exaggerate their limitations (2019: 23-6). For example:

  • these models need not rely solely on economic techniques or quantification, a narrow discussion or definition of the problem, or the sense that policy analysis should be comprehensive, and
  • it is not problematic for analyses to reflect their client’s values or for analysts to present ambiguous solutions to maintain wide support, partly because
  • we would expect the policy analysis to form only one part of a client’s information or strategy.

Further, they suggest that these critiques provide no useful alternative, to help guide new policy analysts. Yet, these guides are essential:

‘to be persuasive, and credible, analysts must situate the problem, defend their evaluative criteria, and be able to demonstrate that their policy recommendation is superior, on balance, to other alternative options in addressing the problem, as defined by the analyst. At a minimum, the analyst needs to present a clear and defensible ranking of options to guide the decisions of the policy makers’ (Meltzer and Schwartz, 2019: 4).

Meltzer and Schwartz (2019: 27-8) then explore ways to improve a 5-step model with insights from approaches such as ‘design thinking’, in which actors use a similar process – ‘empathize, define the problem, ideate, prototype, test and get feedback from others’ – to experiment with policy solutions without providing a narrow view on problem definition or how to evaluate responses.

Policy analysis and policy theory

One benefit to Meltzer and Schwartz’s approach is that it seeks to incorporate insights from policy theories and respond with pragmatism and hope. However, I think you also need to read the source material to get a better sense of those theories, key debates, and their implications. For example:

  1. Meltzer and Schwartz (2019: 32) note correctly that ‘incremental’ does not sum up policy change well. Indeed, Punctuated Equilibrium Theory shows that policy change is characterised by a huge number of small and a small number of huge changes.
  • However, the direct implications of PET are not as clear as they suggest. Baumgartner and Jones have both noted that they can measure these outcomes and identify the same basic distribution across a political system, but not explain or predict why particular policies change dramatically.
  • It is useful to recommend to policy analysts that they invest some hope in major policy change, but also sensible to note that – in the vast majority of cases – it does not happen.
  • On this point, see Mintrom on policy analysis for the long term, Weiss on the ‘enlightenment’ function of research and analysis, and Box 6.3 (from Understanding Public Policy), on the sense that (a) we can give advice to ‘budding policy entrepreneurs’ on how to be effective analysts, but (b) should note that all their efforts could be for nothing.

box 6.3

  2. Meltzer and Schwartz (2019: 32-3) tap briefly into the old debate on whether it is preferable to seek radical or incremental change. For more on that debate, see chapter 5 in the 1st ed of Understanding Public Policy in which Lindblom notes that proposals for radical/incremental changes are not mutually exclusive.
  3. Perhaps explore the possible tension between Meltzer and Schwartz’s (2019: 33-4) recommendation that (a) policy analysis should be ‘evidence-based advice giving’, and (b) ‘flexible and open-ended’.
  • I think that Stone’s response would be that phrases such as ‘evidence based’ are not ‘flexible and open-ended’. Rather, they tend to symbolise a narrow view of what counts as evidence (see also Smith, and Hindess).
  • Further, note that the phrase ‘evidence based policymaking’ is a remarkably vague term (see the EBPM page), perhaps better seen as a political slogan than a useful description or prescription of policymaking.

 

Finally, if you read enough of these policy analysis texts, you get a sense that many are bunched together even if they describe their approach as new or distinctive.

  • Indeed, Meltzer and Schwartz (2019: 22-3) provide a table (containing Bardach and Patashnik, Patton et al, Stokey and Zeckhauser, Hammond et al, and Weimer & Vining) of ‘quite similar’ X-step approaches.
  • Weimer and Vining also discuss the implications of policy theories and present the sense that X-step policy analysis should be flexible and adaptive.
  • Many texts – including Radin, and Smith (2016) – focus on the value of case studies to think through policy analysis in particular contexts, rather than suggesting that we can produce a universal blueprint.

However, as Geva-May might suggest, this is not a bad thing if our aim is to generate the sense that policy analysis is a profession with its own practices and heuristics.

 

 


Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 words: Beryl Radin (2019) Policy Analysis in the Twenty-First Century

Please see the Policy Analysis in 750 words series overview before reading the summary. As usual, the 750-word description is more for branding than accuracy.

Beryl Radin (2019) Policy Analysis in the Twenty-First Century (Routledge)

Radin cover 2019

‘The basic relationship between a decision-maker (the client) and an analyst has moved from a two-person encounter to an extremely complex and diverse set of interactions’ (Radin, 2019: 2).

Many texts in this series continue to highlight the client-oriented nature of policy analysis (Weimer and Vining), but within a changing policy process that has altered the nature of that relationship profoundly.

This new policymaking environment requires new policy analysis skills and training (see Mintrom), and limits the applicability of classic 8-step (or 5-step) policy analysis techniques (2019: 82).

We can use Radin’s work to present two main stories of policy analysis:

  1. The old ways of making policy resembled a club, or reflected a clear government hierarchy, involving:
  • a small number of analysts, generally inside government (such as senior bureaucrats, scientific experts, and – in particular – economists),
  • giving technical or factual advice,
  • about policy formulation,
  • to policymakers at the heart of government,
  • on the assumption that policy problems would be solved via analysis and action.
  2. Modern policy analysis is characterised by a more open and politicised process in which:
  • many analysts, inside and outside government,
  • compete to interpret facts, and give advice,
  • about setting the agenda, and making, delivering, and evaluating policy,
  • across many policymaking venues,
  • often on the assumption that governments have a limited ability to understand and solve complex policy problems.

As a result, the client-analyst relationship is increasingly fluid:

In previous eras, the analyst’s client was a senior policymaker, the main focus was on the analyst-client relationship, and ‘both analysts and clients did not spend much time or energy thinking about the dimensions of the policy environment in which they worked’ (2019: 59). Now, in a multi-centric policymaking environment:

  1. It is tricky to identify the client.
  • We could imagine the client to be someone paying for the analysis, someone affected by its recommendations, or all policy actors with the ability to act on the advice (2019: 10).
  • If there is ‘shared authority’ for policymaking within one political system, a ‘client’ (or audience) may be a collection of policymakers and influencers spread across a network containing multiple types of government, non-governmental actors, and actors responsible for policy delivery (2019: 33).
  • The growth in international cooperation also complicates the idea of a single client for policy advice (2019: 33-4).
  • This shift may limit the ‘face-to-face encounters’ that would otherwise provide information for – and perhaps trust in – the analyst (2019: 2-3).
  2. It is tricky to identify the analyst.
  • Radin (2019: 9-25) traces, from the post-war period in the US, a major expansion of policy analysts, from the notional centre of policymaking in federal government towards analysts spread across many venues, inside government (across multiple levels, ‘policy units’, and government agencies) and congressional committees, and outside government (such as in influential think tanks).
  • Policy analysts can also be specialist external companies contracted by organisations to provide advice (2019: 37-8).
  • This expansion shifted the image of many analysts, from a small number of trusted insiders towards many being treated as akin to interest groups selling their pet policies (2019: 25-6).
  • The nature – and impact – of policy analysis has always been a little vague, but now it seems more common to suggest that ‘policy analysts’ may really be ‘policy advocates’ (2019: 44-6).
  • As such, they may now have to work harder to demonstrate their usefulness (2019: 80-1) and accept that their analysis will have a limited impact (2019: 82, drawing on Weiss’ discussion of ‘enlightenment’).

Consequently, the necessary skills of policy analysis have changed:

Although many people value systematic policy analysis (and many rely on economists), an effective analyst does not simply apply economic or scientific techniques to analyse a problem or solution, or rely on one source of expertise or method, as if it were possible to provide ‘neutral information’ (2019: 26).

Indeed, Radin (2019: 31; 48) compares the old ‘acceptance that analysts would be governed by the norms of neutrality and objectivity’ with

(a) increasing calls to acknowledge that policy analysis is part of a political project to foster some notion of public good or ‘public interest’, and

(b)  Stone’s suggestion that the projection of reason and neutrality is a political strategy.

In other words, the fictional divide between political policymakers and neutral analysts is difficult to maintain.

Rather, think of analysts as developing wider skills to operate in a highly political environment in which the nature of the policy issue is contested, responsibility for a policy problem is unclear, and it is not clear how to resolve major debates on values and priorities:

  • Some analysts will be expected to see the problem from the perspective of a specific client with a particular agenda.
  • Other analysts may be valued for their flexibility and pragmatism, such as when they acknowledge the role of their own values, maintain or operate within networks, communicate by many means, and supplement ‘quantitative data’ with ‘hunches’ when required (2019: 2-3; 28-9).

Radin (2019: 21) emphasises a shift in skills and status:

The idea of (a) producing new and relatively abstract ideas, based on high control over available information, at the top of a hierarchical organisation, makes way for (b) developing the ability to:

  • generate a wider understanding of organisational and policy processes, reflecting the diffusion of power across multiple policymaking venues
  • identify a map of stakeholders,
  • manage networks of policymakers and influencers,
  • incorporate ‘multiple and often conflicting perspectives’,
  • make and deliver more concrete proposals (2019: 59-74), while recognising
  • the contested nature of information, and the practices used to gather it, even during multiple attempts to establish the superiority of scientific evidence (2019: 89-103),
  • the limits to a government’s ability to understand and solve problems (2019: 95-6),
  • the inescapable conflict over trade-offs between values and goals, which are difficult to resolve simply by weighting each goal (2019: 105-8; see Stone), and
  • do so flexibly, to recognise major variations in problem definition, attention and networks across different policy sectors and notional ‘stages’ of policymaking (2019: 75-9; 84).

Radin’s (2019: 48) overall list of relevant skills includes:

  1. ‘Case study methods, Cost-benefit analysis, Ethical analysis, Evaluation, Futures analysis, Historical analysis, Implementation analysis, Interviewing, Legal analysis, Microeconomics, Negotiation, mediation, Operations research, Organizational analysis, Political feasibility analysis, Public speaking, Small-group facilitation, Specific program knowledge, Statistics, Survey research methods, Systems analysis’

They develop alongside analytical experience and status, from the early career analyst trying to secure or keep a job, to the experienced operator looking forward to retirement (2019: 54-5).

A checklist for policy analysts

Based on these skills requirements, the contested nature of evidence, and the complexity of the policymaking environment, Radin (2019: 128-31) produces a 4-page checklist of – 91! – questions for policy analysts.

For me, it serves two main functions:

  1. It is a major contrast to the idea that we can break policy analysis into a mere 5-8 steps (rather, think of these small numbers as marketing for policy analysis students, akin to 7-minute abs)
  2. It presents policy analysis as an overwhelming task with absolutely no guarantee of policy impact.

To me, this cautious, eyes-wide-open, approach is preferable to the sense that policy analysts can change the world if they just get the evidence and the steps right.

Further Reading:

  1. Iris Geva-May (2005) ‘Thinking Like a Policy Analyst. Policy Analysis as a Clinical Profession’, in Geva-May (ed) Thinking Like a Policy Analyst. Policy Analysis as a Clinical Profession (Basingstoke: Palgrave)

Although the idea of policy analysis may be changing, Geva-May (2005: 15) argues that it remains a profession with its own set of practices and ways of thinking. As with other professions (like medicine), it would be unwise to practice policy analysis without education and training or otherwise learning the ‘craft’ shared by a policy analysis community (2005: 16-17). For example, while not engaging in clinical diagnosis, policy analysts can draw on a 5-step process to diagnose a policy problem and potential solutions (2005: 18-21). Analysts may also combine these steps with heuristics to determine the technical and political feasibility of their proposals (2005: 22-5), as they address inevitable uncertainty and their own bounded rationality (2005: 26-34; see Gigerenzer on heuristics). As with medicine, some aspects of the role – such as research methods – can be taught in graduate programmes, while others may be better suited to on the job learning (2005: 36-40). If so, it opens up the possibility that there are many policy analysis professions to reflect different cultures in each political system (and perhaps the venues within each system).

  2. Vining and Weimer’s take on the distinction between policy analysis and policy process research

 


Filed under 750 word policy analysis, public policy

Policy in 500 Words: Punctuated Equilibrium Theory

See also the original – and now 6 years old – 1000 Words post.

This 500 Words version is a modified version of the introduction to chapter 9 in the 2nd edition of Understanding Public Policy.  

UPP p147 PET box

 Punctuated equilibrium theory (PET) tells a story of complex systems that are stable and dynamic:

  • Most policymaking exhibits long periods of stability, but with the ever-present potential for sudden instability.
  • Most policies stay the same for long periods. Some change very quickly and dramatically.

We can explain this dynamic with reference to bounded rationality: since policymakers cannot consider all issues at all times, they ignore most and promote relatively few to the top of their agenda.

This lack of attention to most issues helps explain why most policies may not change, while intense periods of attention to some issues prompt new ways to frame and solve policy problems.

Some explanation comes from the power of participants, to (a) minimize attention and maintain an established framing, or (b) expand attention in the hope of attracting new audiences more sympathetic to new ways of thinking.

Further explanation comes from policymaking complexity, in which the scale of conflict is too large to understand, let alone control.

The original PET story

The original PET story – described in more detail in the 1000 Words version – applies two approaches – policy communities and agenda setting – to demonstrate stable relationships between interest groups and policymakers:

  • They endure when participants have built up trust and agreement – about the nature of a policy problem and how to address it – and ensure that few other actors have a legitimate role or interest in the issue.
  • They come under pressure when issues attract high policymaker attention, such as following a ‘focusing event’ or a successful attempt by some groups to ‘venue shop’ (seek influential audiences in another policymaking venue). When an issue reaches the ‘top’ of this wider political agenda it is processed in a different way: more participants become involved, and they generate more ways to look at (and seek to solve) the policy.

The key focus is the competition to frame or define a policy problem (to exercise power to reduce ambiguity). The successful definition of a policy problem as technical or humdrum ensures that issues are monopolized and considered quietly in one venue. The reframing of that issue as crucial to other institutions, or the big political issues of the day, ensures that it will be considered by many audiences and processed in more than one venue (see also Schattschneider).

The modern PET story

The modern PET story is about complex systems and attention.

Its analysis of bounded rationality and policymaker psychology remains crucial, since PET measures the consequences of the limited attention of individuals and organisations.

However, note the much greater quantification of policy change across entire political systems (see the Comparative Agendas Project).

PET shows how policy actors and organisations contribute to ‘disproportionate information processing’, in which attention to information fluctuates out of proportion to (a) the size of policy problems and (b) the information on problems available to policymakers.

It also shows that the same basic distribution of policy change – ‘hyperincremental’ in most cases, but huge in some – is present in every political system studied by the CAP (summed up by the image below).

True et al figure 6.2

See also:

5 images of the policy process


Filed under 500 words, agenda setting, public policy

Policy Analysis in 750 words: Deborah Stone (2012) Policy Paradox

Please see the Policy Analysis in 750 words series overview before reading the summary. This post is 750 words plus a bonus 750 words plus some further reading that doesn’t count in the word count even though it does.

Stone policy paradox 3rd ed cover

Deborah Stone (2012) Policy Paradox: The Art of Political Decision Making 3rd edition (Norton)

‘Whether you are a policy analyst, a policy researcher, a policy advocate, a policy maker, or an engaged citizen, my hope for Policy Paradox is that it helps you to go beyond your job description and the tasks you are given – to think hard about your own core values, to deliberate with others, and to make the world a better place’ (Stone, 2012: 15)

Stone (2012: 379-85) rejects the image of policy analysis as a ‘rationalist’ project, driven by scientific and technical rules, and separable from politics. Rather, every policy analyst’s choice is a political choice – to define a problem and solution, and in doing so choosing how to categorise people and behaviour – backed by strategic persuasion and storytelling.

The Policy Paradox: people entertain multiple, contradictory, beliefs and aims

Stone (2012: 2-3) describes the ways in which policy actors compete to define policy problems and public policy responses. The ‘paradox’ is that it is possible to define the same policies in contradictory ways.

‘Paradoxes are nothing but trouble. They violate the most elementary principle of logic: something can’t be two different things at once. Two contradictory interpretations can’t both be true. A paradox is just such an impossible situation, and political life is full of them’ (Stone, 2012: 2).

This paradox does not refer simply to a competition between different actors to define policy problems and the success or failure of solutions. Rather:

  • The same actor can entertain very different ways to understand problems, and can juggle many criteria to decide that a policy outcome was a success and a failure (2012: 3).
  • Surveys of the same population can report contradictory views – encouraging a specific policy response and its complete opposite – when asked different questions in the same poll (2012: 4; compare with Riker).

Policy analysts: you don’t solve the Policy Paradox with a ‘rationality project’

Like many posts in this series (Smith, Bacchi, Hindess), Stone (2012: 9-11) rejects the misguided notion of objective scientists using scientific methods to produce one correct answer (compare with Spiegelhalter and Weimer & Vining). A policy paradox cannot be solved by ‘rational, analytical, and scientific methods’, because such methods cannot arbitrate between equally plausible but contradictory interpretations of the same problem.

Further, Stone (2012: 10-11) rejects the over-reliance, in policy analysis, on the misleading claim that:

  • policymakers are engaging primarily with markets rather than communities (see 2012: 35 on the comparison between a ‘market model’ and ‘polis model’),
  • economic models can sum up political life, and
  • cost-benefit-analysis can reduce a complex problem into the sum of individual preferences using a single unambiguous measure.

Rather, many factors undermine such simplicity:

  1. People do not simply act in their own individual interest. Nor can they rank-order their preferences in a straightforward manner according to their values and self-interest.
  • Instead, they maintain a contradictory mix of objectives, which can change according to context and their way of thinking – combining cognition and emotion – when processing information (2012: 12; 30-4).
  2. People are social actors. Politics is characterised by ‘a model of community where individuals live in a dense web of relationships, dependencies, and loyalties’ and exercise power with reference to ideas as much as material interests (2012: 10; 20-36; compare with Ostrom, more Ostrom, and Lubell).
  3. Morals and emotions matter. If people juggle contradictory aims and measures of success, then a story infused with ‘metaphor and analogy’, and appealing to values and emotions, prompts people ‘to see a situation as one thing rather than another’ and therefore draw attention to one aim at the expense of the others (2012: 11; compare with Gigerenzer).

Policy analysis reconsidered: the ambiguity of values and policy goals

Stone (2012: 14) identifies the ambiguity of the criteria for success used in 5-step policy analyses. They do not form part of a solely technical or apolitical process to identify trade-offs between well-defined goals (compare Bardach, Weimer and Vining, and Mintrom). Rather, ‘behind every policy issue lurks a contest over conflicting, though equally plausible, conceptions of the same abstract goal or value’ (2012: 14). Examples of competing interpretations of valence issues include definitions of:

  1. Equity, according to: (a) which groups should be included, how to assess merit, how to identify key social groups, if we should rank populations within social groups, how to define need and account for different people placing different values on a good or service, (b) which method of distribution to use (competition, lottery, election), and (c) how to balance individual, communal, and state-based interventions (2012: 39-62).
  2. Efficiency, to use the least resources to produce the same objective, according to: (a) who determines the main goal and how to balance multiple objectives, (b) who benefits from such actions, and (c) how to define resources while balancing equity and efficiency – for example, does a public sector job and a social security payment represent a sunk cost to the state or a social investment in people? (2012: 63-84).
  3. Welfare or Need, according to factors including (a) the material and symbolic value of goods, (b) short term support versus a long term investment in people, (c) measures of absolute poverty or relative inequality, and (d) debates on ‘moral hazard’ or the effect of social security on individual motivation (2012: 85-106).
  4. Liberty, according to (a) a general balancing of freedom from coercion and freedom from the harm caused by others, (b) debates on individual and state responsibilities, and (c) decisions on whose behaviour to change to reduce harm to what populations (2012: 107-28).
  5. Security, according to (a) our ability to measure risk scientifically (see Spiegelhalter and Gigerenzer), (b) perceptions of threat and experiences of harm, (c) debates on how much risk to safety to tolerate before intervening, (d) who to target and imprison, and (e) the effect of surveillance on perceptions of democracy (2012: 129-53).

Policy analysis as storytelling for collective action

Actors use policy-relevant stories to influence the ways in which their audience understands (a) the nature of policy problems and feasibility of solutions, within (b) a wider context of policymaking in which people contest the proper balance between state, community, and market action. Stories can influence key aspects of collective action, including:

  1. Defining interests and mobilising actors, by drawing attention to – and framing – issues with reference to an imagined social group and its competition (e.g. the people versus the elite; the strivers versus the skivers) (2012: 229-47)
  2. Making decisions, by framing problems and solutions (2012: 248-68). Stone (2012: 260) contrasts the ‘rational-analytic model’ with real-world processes in which actors deliberately frame issues ambiguously, shift goals, keep feasible solutions off the agenda, and manipulate analyses to make their preferred solution seem the most efficient and popular.
  3. Defining the role and intended impact of policies, such as when balancing punishments versus incentives to change behaviour, or individual versus collective behaviour (2012: 271-88).
  4. Setting and enforcing rules (see institutions), in a complex policymaking system where a multiplicity of rules interact to produce uncertain outcomes, and a powerful narrative can draw attention to the need to enforce some rules at the expense of others (2012: 289-310).
  5. Persuasion, drawing on reason, facts, and indoctrination. Stone (2012: 311-30) highlights the context in which actors construct stories to persuade: people engage emotionally with information, people take certain situations for granted even though they produce unequal outcomes, facts are socially constructed, and there is unequal access to resources – held in particular by government and business – to gather and disseminate evidence.
  6. Defining human and legal rights, when (a) there are multiple, ambiguous, and intersecting rights (in relation to their source, enforcement, and the populations they serve) (b) actors compete to make sure that theirs are enforced, (c) inevitably at the expense of others, because the enforcement of rights requires a disproportionate share of limited resources (such as policymaker attention and court time) (2012: 331-53)
  7. Influencing debate on the powers of each potential policymaking venue – in relation to factors including (a) the legitimate role of the state in market, community, family, and individual life, (b) how to select leaders, (c) the distribution of power between levels and types of government – and who to hold to account for policy outcomes (2012: 354-77).

Key elements of storytelling include:

  1. Symbols, which sum up an issue or an action in a single picture or word (2012: 157-8)
  2. Characters, such as heroes or villains, who symbolise the cause of a problem or source of solution (2012: 159)
  3. Narrative arcs, such as a battle by your hero to overcome adversity (2012: 160-8)
  4. Synecdoche, to highlight one example of an alleged problem to sum up its whole (2012: 168-71; compare the ‘welfare queen’ example with SCPD)
  5. Metaphor, to create an association between a problem and something relatable, such as a virus or disease, a natural occurrence (e.g. earthquake), something broken, something about to burst if overburdened, or war (2012: 171-78; e.g. is crime a virus or a beast?)
  6. Ambiguity, to give people different reasons to support the same thing (2012: 178-82)
  7. Using numbers to tell a story, based on political choices about how to: categorise people and practices, select the measures to use, interpret the figures to evaluate or predict the results, project the sense that complex problems can be reduced to numbers, and assign authority to the counters (2012: 183-205; compare with Spiegelhalter)
  8. Assigning Causation, in relation to categories including accidental or natural, ‘mechanical’ or automatic (or in relation to institutions or systems), and human-guided causes that have intended or unintended consequences (such as malicious intent versus recklessness)
  • ‘Causal strategies’ include to: emphasise a natural versus human cause, relate it to ‘bad apples’ rather than systemic failure, and suggest that the problem was too complex to anticipate or influence
  • Actors use these arguments to influence rules, assign blame, identify ‘fixers’, and generate alliances among victims or potential supporters of change (2012: 206-28).

Wider Context and Further Reading: 1. Policy analysis

This post connects to several other 750 Words posts, which suggest that facts don’t speak for themselves. Rather, effective analysis requires you to ‘tell your story’, in a concise way, tailored to your audience.

For example, consider two ways to establish cause and effect in policy analysis:

One is to conduct and review multiple randomised control trials.

Another is to use a story of a hero or a villain (perhaps to mobilise actors in an advocacy coalition).

  2. Evidence-based policymaking

Stone (2012: 10) argues that analysts who try to impose one worldview on policymaking will find that ‘politics looks messy, foolish, erratic, and inexplicable’. For analysts who are more open-minded, politics opens up possibilities for creativity and cooperation (2012: 10).

This point is directly applicable to the ‘politics of evidence based policymaking’. A common question to arise from this worldview is ‘why don’t policymakers listen to my evidence?’ and one answer is ‘you are asking the wrong question’.

  3. Policy theories highlight the value of stories (to policy analysts and academics)

Policy problems and solutions necessarily involve ambiguity:

  1. There are many ways to interpret problems, and we resolve such ambiguity by exercising power to attract attention to one way to frame a policy problem at the expense of others (in other words, not with reference to one superior way to establish knowledge).
  2. Policy is actually a collection of – often contradictory – policy instruments and institutions, interacting in complex systems or environments, to produce unclear messages and outcomes. As such, what we call ‘public policy’ (for the sake of simplicity) is subject to interpretation and manipulation as it is made and delivered, and we struggle to conceptualise and measure policy change. Indeed, it makes more sense to describe competing narratives of policy change.

box 13.1 2nd ed UPP

  4. Policy theories and storytelling

People communicate meaning via stories. Stories help us turn (a) a complex world, which provides a potentially overwhelming amount of information, into (b) something manageable, by identifying its most relevant elements and guiding action (compare with Gigerenzer on heuristics).

The Narrative Policy Framework identifies the storytelling strategies of actors seeking to exploit other actors’ cognitive shortcuts, using a particular format – containing the setting, characters, plot, and moral – to focus on some beliefs over others, and reinforce someone’s beliefs enough to encourage them to act.

Compare with Tuckett and Nikolic on the stories that people tell to themselves.

 


Filed under 750 word policy analysis, Evidence Based Policymaking (EBPM), Psychology Based Policy Studies, public policy, Storytelling

Policy Analysis in 750 words: Using Statistics and Explaining Risk (Spiegelhalter and Gigerenzer)

Please see the Policy Analysis in 750 words series overview before reading the summary. This post is close to 750 words if you divide it by 2.

David Spiegelhalter (2018) The Art of Statistics: Learning from Data (Pelican, hardback)

Gerd Gigerenzer (2015) Risk Savvy (Penguin)

Spiegelhalter cover

Policy analysis: the skilful consumption and communication of information

Some use the phrase ‘lies, damned lies, and statistics’ to suggest that people can manipulate the presentation of information to reinforce whatever case they want to make. Common examples include the highly selective sharing of data, and the use of misleading images to distort the size of an effect or strength of a relationship between ‘variables’ (when we try to find out if a change in one thing causes a change in another).

In that context, your first aim is to become a skilled consumer of information.

Or, you may be asked to gather and present data as part of your policy analysis, without seeking to mislead people (see Mintrom and compare with Riker).

Your second aim is to become an ethical and skilled communicator of information.

In each case, a good rule of thumb is to assume that the analysts who help policymakers learn how to consume and interpret evidence are more influential than the researchers who produce it.

Research and policy analysis are not so different

Although research is not identical to policy analysis, it highlights similar ambitions and issues. Indeed, Spiegelhalter’s (2018: 6-7) description of ‘using data to help us understand the world and make better judgements’ sounds like Mintrom, and the PPDAC approach – identify a problem, plan how to study it, collect and manage data, analyse, and draw/ communicate conclusions (2018: 14) – is not so different from the ‘steps’ to policy analysis that you will find in Bardach or Weimer and Vining.

PPDAC requires us to understand what people need to do to ‘turn the world into data’, such as to produce precise definitions of things and use observation and modelling to estimate their number or the likelihood of their occurrence (2018: 6-7).

More importantly, consider our inability to define things precisely – e.g. economic activity and unemployment, or wellbeing and happiness – and need to accept that our estimates (a) come with often high levels of uncertainty, and (b) are ‘only the starting point to real understanding of the world’ (2018: 8-9).

In that context, the technical skill to gather and analyse information is necessary for research, while the skill to communicate findings is necessary to avoid misleading your audience.

The pitfalls of information communication

Spiegelhalter’s initial discussion highlights the great potential to mislead, via:

  1. deliberate manipulation,
  2. a poor grasp of statistics, and/ or
  3. insufficient appreciation of (a) your non-specialist audience’s potential reaction to (b) different ways to frame the same information (2018: 354-62), perhaps based on
  4. the unscientific belief that scientists are objective and can communicate the truth in a neutral way, rather than storytellers with imperfect data (2018: 68-9; 307; 338; 342-53).

Potentially influential communications include (2018: 19-38):

  1. The type of visual, with bar or line-based charts often more useful than pie charts (and dynamic often better than static – 2018: 71)
  2. The point at which you cut off the chart’s axis to downplay or accentuate the difference between results
  3. Framing the results positively (e.g. survival rate) versus negatively (e.g. death rate)
  4. Describing a higher relative risk (e.g. 18%) or absolute risk (e.g. from 6 in 100 to 7 in 100 cases)
  5. Describing risk in relation to decimal places, percentages, or numbers out of 100
  6. Using the wrong way to describe an average (mode, median, or mean – 2018: 46)
  7. Using a language familiar to specialists but confusing to – and subject to misinterpretation by – non-specialists (e.g. odds ratios)
  8. Translating numbers into words (e.g. what does ‘very likely’ mean?) to describe probability (2018: 320).
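As a quick illustration of the relative-versus-absolute framing point (a minimal Python sketch, with the 6-in-100 and 7-in-100 figures used only as an example), the same small change can be presented as a dramatic relative increase or a modest absolute one:

```python
# Illustrative only: the same underlying change, framed two ways.
baseline = 6 / 100   # 6 in 100 people affected in one group
exposed = 7 / 100    # 7 in 100 people affected in the other

relative_increase = (exposed - baseline) / baseline  # ~0.167
absolute_increase = exposed - baseline               # 0.01

print(f"Relative risk increase: {relative_increase:.0%}")   # 'risk up ~17%'
print(f"Absolute risk increase: {absolute_increase:.1%}")   # '1 extra case per 100'
```

A headline can truthfully report ‘risk up 17%’ or ‘one extra case per 100 people’; the second framing usually sounds far less alarming.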

These problems with the supply of information combine with the ways that citizens and policymakers consume it.

People use cognitive shortcuts, such as emotions and heuristics, to process information (see p60 of Understanding Public Policy, reproduced below).

These shortcuts can make them vulnerable to framing and manipulation, and prompt them to change their behaviour after misinterpreting evidence in relation to risk: e.g. eating certain foods (2018: 33), anticipating the weather, taking medicines, or refusing to fly after a vivid event (Gigerenzer, 2015: 2-13).

[Image: p60 of Understanding Public Policy, 2nd ed., on heuristics]

Dealing with scientific uncertainty

Communication is important, but the underlying problem may be actual scientific uncertainty about the ability of our data to give us accurate knowledge of the world, such as when:

  1. We use a survey of a sample population, in the hope that (a) respondents provide accurate answers, and (b) their responses provide a representative picture of the population we seek to understand. In such cases, professional standards and practices exist to minimise, but not remove, biases associated with questions and sampling (2018: 74).
  2. Some people ignore (and other people underestimate) the ‘margin of error’ in surveys, even though it could be larger than the reported change in data (2018: 189-92; 247).
  3. Alternatives to surveys have major unintended consequences, such as when government statistics are collected unsystematically or otherwise misrepresent outcomes (2018: 84-5).
  4. ‘Correlation does not equal causation’ (see also The Book of Why).
  • The cause of an association between two things could be either of those things, or another thing (2018: 95-9; 110-15).
  • It is usually prohibitively expensive to conduct and analyse research – such as multiple ‘randomised control trials’ to establish cause and effect in the same ways as medicines trials (2018: 104) – to minimise doubt.
  • Further, our complex and uncontrolled world is less conducive to experimental trials of social and economic policies.
  5. The misleading appearance of a short-term trend often relates to ‘chance variation’ rather than a long-term trend (e.g. in PISA education tables or murder rates – 2018: 131; 249).
  6. The algorithms used to process huge amounts of data may contain unhelpful rules and misplaced assumptions that bias the results, and this problem is worse if the rules are kept secret (2018: 178-87).
  7. Calculating the probability of events is difficult to do, to agree how to do, and to understand (2018: 216-20; 226; 239; 304-7).
  8. The likelihood of identifying ‘false positive’ results in research is high (2018: 278-80). Note the comparison to finding someone guilty when innocent, or innocent when guilty (2018: 284 and compare with Gigerenzer, 2015: 33-7; 161-8). However, the professional incentive to minimise these outcomes or admit the research’s limitations is low (2018: 278; 287; 294-302).
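The ‘margin of error’ point can be sketched with the standard approximation for a survey proportion (a minimal illustration assuming simple random sampling; real polls also carry design effects and the non-sampling biases described above):

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated from n respondents."""
    return z * math.sqrt(p * (1 - p) / n)

# A typical poll of 1,000 people reporting 50% support:
moe = margin_of_error(0.5, 1000)
print(f"Margin of error: +/- {moe:.1%}")  # roughly +/- 3 percentage points
```

A reported one-point shift in such a poll sits well inside this roughly three-point margin, which is the warning about reading too much into small changes.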

Developing statistical and risk ‘literacy’

In that context, Spiegelhalter (2018: 369-71) summarises key ways to consume data effectively, asking: how rigorous is the study? How much uncertainty remains? Are the measures chosen and communicated well? Can you trust the source to do the work well and not spin the results? Does the claim fit with other evidence, and does it have a good explanation? Is the effect important and relevant to key populations? However:

  1. Many such texts describe how they would like the world to work, and give advice to people to help foster that world. The logical conclusion is that this world does not exist, and most people do not have the training, or use these tips, to describe or consume statistics and their implications well.
  2. Policymaking is about making choices, often under immense time and political pressure, in the face of uncertainty versus ambiguity, and despite our inability to understand policy problems or the likely impact of solutions.

I’m not suggesting that, as a result, you should go full Riker. Rather, as with most of the posts in this series, reflect on how you would act – and expect others to act – during the (very long/ not very likely) transition from your world to this better one. What if your collective task is to make just enough sense of the available information, and your options, to make good enough choices?

[Image: cover of Risk Savvy]

In that context, Gigerenzer (2015) identifies the steps we can take to become ‘risk savvy’.

Begin by rejecting (a) the psychological drive to seek the ‘safety blanket’ of certainty (and avoid the ‘fear of doing something wrong and being blamed’), which causes people to (b) place too much faith in necessarily-flawed technologies or tests to reduce uncertainty, instead of (c) learning some basic tools to assess risk while accepting the inevitability of uncertainty and ambiguity (2015: 18-20; 43; 32-40).

Then, employ simple ‘mind tools’ to assess and communicate risk in each case:

  1. Communicate risk using appropriate visuals, categories, and descriptions (e.g. with reference to absolute risk and ‘natural frequencies’, expressed as a proportion of 100, not a % or decimal), and be sceptical if others do not (2015: 25-7; 168).
  • E.g. do not confuse calculations of risk based on (a) known frequencies (such as the coin toss), and (b) unknown frequencies (such as outcomes of complex systems) (2015: 21-6).
  • E.g. be aware of the difference between (a) the accuracy of a test of a problem (its ability to minimise false positive/ negative results), (b) the likelihood that you have the problem it is testing, and (c) the extent to which you will benefit from an intervention for that problem (2015: 33-7; 161-8; 194-7).
  2. Use heuristics that are shown to be efficient and reliable in particular situations:
  • rely frequently on ‘gut feeling’ (‘a judgment 1. that appears quickly in consciousness, 2. whose underlying reasons we are not fully aware of, yet 3. it is strong enough to act upon’)
  • accept the counterintuitive sense that ‘ignoring information can lead to better, faster, and safer decisions’ (2015: 30-1)
  • equate intuition with ‘unconscious intelligence based on personal experience and smart rules of thumb. You need both intuition and reasoning to be rational’ (2015: 123-4)
  • find efficient ways to trust in other people and practices (2015: 99-103)
  • ‘satisfice’ (choose the first option that satisfies an adequate threshold, rather than consider every option) (2015: 148-9)
  3. Value ‘good errors’ that allow us to learn efficiently (via ‘trial and error’) (2015: 47-51).
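Gigerenzer’s distinction between a test’s accuracy and the probability that you have the condition can be sketched with natural frequencies (the numbers below are assumed for this illustration, not taken from the book):

```python
# Illustrative natural-frequencies sketch: out of 1,000 people screened...
population = 1000
prevalence = 0.01           # 10 in 1,000 actually have the condition
sensitivity = 0.90          # the test catches 9 of those 10 (true positives)
false_positive_rate = 0.09  # but wrongly flags ~89 of the 990 without it

true_positives = population * prevalence * sensitivity                  # ~9
false_positives = population * (1 - prevalence) * false_positive_rate   # ~89

# Of everyone who tests positive, how many actually have the condition?
ppv = true_positives / (true_positives + false_positives)
print(f"Chance a positive result is correct: about {ppv:.0%}")
```

Even with a seemingly accurate test, most positives here are false, because the condition is rare: this is the base-rate point behind the comparisons at (2015: 33-7; 161-8).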

Wait a minute

I like these Gigerenzer-style messages a lot and, on reflection, I seem to make most of my choices using my gut and trial-and-error (and I only electrocuted myself that one time; I’m told that I barked).

Some of his examples – e.g. ask if a hospital uses airline-style checklists (2015: 53; see also Radin’s checklist), or ask your doctor how they would treat their relative, not yours (2015: 63) – are intuitively appealing. The explainers on risk are profoundly important.

However, note that Gigerenzer devotes a lot of his book to describing the defensive nature of sectors such as business, medicine, and government, linked strongly to the absence of the right ‘culture’ to allow learning through error.

Trial and error is a big feature in complexity theory, and Lindblom’s incrementalism, but also be aware that you are surrounded by people whose heuristic may be ‘make sure you don’t get the blame’ or ‘procedure over performance’ (2015: 65). To recommend trial-and-error policy analysis may be a hard sell.

Further reading

This post is part of the Policy Analysis in 750 words series

The 500 and 1000 words series describe how people act under conditions of bounded rationality and policymaking complexity

Winners and losers: communicating the potential impacts of policies (by Cameron Brick, Alexandra Freeman, Steven Wooding, William Skylark, Theresa Marteau & David Spiegelhalter)

See Policy in 500 Words: Social Construction and Policy Design and ask yourself if Gigerenzer’s (2015: 69) ‘fear whatever your social group fears’ is OK when you are running from a lion, but not if you are cooperating with many target populations.

The study of punctuated equilibrium theory is particularly relevant, since its results reject the sense that policy change follows a ‘normal distribution’. See the chart below (from Theories of the Policy Process 2; also found in 5 Images of the Policy Process) and visit the Comparative Agendas Project to see how they gather the data.

[Image: True et al., Figure 6.2]


Filed under 750 word policy analysis, public policy

Policy Analysis in 750 words: Barry Hindess (1977) Philosophy and Methodology in the Social Sciences

Please see the Policy Analysis in 750 words series overview before reading the summary. This post started off as 750 words before growing.


Barry Hindess (1977) Philosophy and Methodology in the Social Sciences (Harvester)

‘If the claims of philosophy to a special kind of knowledge can be shown to be without foundation, if they are at best dogmatic or else incoherent, then methodology is an empty and futile pursuit and its prescriptions are vacuous’ (Hindess, 1977: 4).

This book may seem like a weird addition to a series on policy analysis.

However, it follows the path set by Carol Bacchi, asking whose interests we serve when we frame problems for policy analysis, and Linda Tuhiwai Smith, asking whose research counts when we do so.

One important answer is that the status of research and the framing of the problem result from the exercise of power, rather than the objectivity of analysts and natural superiority of some forms of knowledge.

In other posts on ‘the politics of evidence based policymaking’, I describe some frustrations among many scientists that their views on a hierarchy of knowledge based on superior methods are not shared by many policymakers.  These posts can satisfy different audiences: if you have a narrow view of what counts as good evidence, you can focus on the barriers between evidence and policy; if you have a broader view, you can wonder why those barriers seem higher for other forms of knowledge (e.g. Linda Tuhiwai Smith on the marginalisation of indigenous knowledge).

In this post, I encourage you to go a bit further down this path by asking how people accumulate knowledge in the first place.  For example, see introductory accounts by Chalmers, entertaining debates involving Feyerabend, and Hindess’ book to explore your assumptions about how we know what we know.

My take-home point from these texts is that we are only really able to describe convincingly the argument that we are not accumulating knowledge!

The simple insight from Chalmers’ introduction is that inductive (observational) methods to generate knowledge are circular:

  • we engage inductively to produce theory (to generalise from individual cases), but
  • we use theory to engage in any induction, such as to decide what is important to study, and what observations are relevant/irrelevant, and why.

In other words, we need theories of the world to identify the small number of things to observe (to allow us to filter out an almost unlimited number of signals from our environments), but we need our observations to generate those theories!

Hindess shows that all claims to knowledge involve such circularity: we employ philosophy to identify the nature of the world (ontology) and how humans can generate valid knowledge of it (epistemology) to inform methodology, to state that scientific knowledge is only valid if it lives up to a prescribed method, then argue that the scientific knowledge validates the methodology and its underlying philosophy (1977: 3-22). If so, we are describing something that makes sense according to the rules and practices of its proponents, not an objective scientific method to help us accumulate knowledge.

Further, different social/ professional groups support different forms of working knowledge that they value for different reasons (such as to establish ‘reliability’ or ‘meaning’). To do so, they invent frameworks to help them theorise the world, such as to describe the relationship between concepts (and key concepts such as cause and effect). These frameworks represent a useful language to communicate about our world rather than simply existing independently of it and corresponding to it.

Hindess’ subsequent work explored the context in which we exercise power to establish the status of some forms of knowledge over others, to pursue political ends rather than simply the ‘objective’ goals of science. As described, it is as relevant now as it was then.

How do these ideas inform policy analysis?

Perhaps, by this stage, you are thinking: isn’t this a relativist argument, concluding that we should never assert the relative value of some forms of knowledge over others (like astronomy versus astrology)?

I don’t think so. Rather, it invites us to do two more sensible things:

  1. Accept that different approaches to knowledge may be ‘incommensurable’.
  • They may not share ‘a common set of perceptions’ (or even a set of comparable questions) ‘which would allow scientists to choose between one paradigm and the other . . . there will be disputes between them that cannot all be settled by an appeal to the facts’ (Hindess, 1988: 74)
  • If so, “there is no possibility of an extratheoretical court of appeal which can ‘validate’ the claims of one position against those of another” (Hindess, 1977: 226).
  2. Reject the sense of self-importance, and hubris, which often seems to accompany discussions of superior forms of knowledge. Don’t be dogmatic. Live by the maxim ‘don’t be an arse’. Reflect on the production, purpose, value, and limitations of our knowledge in different contexts (which Spiegelhalter does well).

On that basis, we can have honest discussions about why we should exercise power in a political system to favour some forms of knowledge over others in policy analysis, reflecting on:

  1. The relatively straightforward issue of internal consistency: is an approach coherent, and does it succeed on its own terms?
  • For example, do its users share a clear language, pursue consistent aims with systematic methods, find ways to compare and reinforce the value of each other’s findings, while contributing to a thriving research agenda (as discussed in box 13.3 below)?
  • Or, do they express their aims in other ways, such as to connect research to emancipation, or value respect for a community over the scientific study of that community?
  2. The less straightforward issue of overall consistency: how can we compare different forms of knowledge when they do not follow each other’s rules or standards?
  • E.g. what if one approach is (said to be) more rigorous and the other more coherent?
  • E.g. what if one produces more data but another produces more ownership?

In each case, the choice of criteria for comparison involves political choice (as part of a series of political choices), without the ability – described in relation to ‘cost benefit analysis’ – to translate all relevant factors into a single unit.

  3. The imperative to ‘synthesise’ knowledge.

Spiegelhalter provides a convincing description of the benefits of systematic review and ‘meta-analysis’ within a single, clearly defined, scientific approach containing high agreement on methods and standards for comparison.

However, this approach is not applicable directly to the review of multiple forms of knowledge.

So, what do people do?

  • E.g. some systematic reviewers apply the standards of their own field to all others, which (a) tends to produce the argument that very little high quality evidence exists because other people are doing it wrongly, and (b) perhaps exacerbates a tendency for policymakers to attach relatively low value to such evaluations.
  • E.g. policy analysts are more likely to apply different criteria: is it available, understandable, ‘usable’, and policy relevant (e.g. see ‘knowledge management for policy’)?

Each approach is a political choice to include/ exclude certain forms of knowledge according to professional norms or policymaking imperatives, not a technical process to identify the most objective information. If you are going to do it, you should at least be aware of what you are doing.

[Box 13.3, Understanding Public Policy, 2nd ed.]


Filed under 750 word policy analysis