Policy Analysis in 750 words: Classic 5-step advice

‘Policy analysis’ describes the identification of a policy problem and possible solutions.

Classic models of policy analysis are client-oriented. Most texts identify the steps that a policy analysis should follow, from identifying a problem and potential solutions, to finding ways to predict and evaluate the impact of each solution. Each text describes this process in different ways, as outlined in Boxes 1-5. However, for the most part, they follow the same five steps:

  1. Define a policy problem identified by your client.
  2. Identify technically and politically feasible solutions.
  3. Use value-based criteria and political goals to compare solutions.
  4. Predict the outcome of each feasible solution.
  5. Make a recommendation to your client.

Further, they share the sense that analysts need to adapt pragmatically to a political environment. Assume that your audience is not an experienced policy analyst. Assume a political environment in which there is limited attention or time to consider problems, and some policy solutions will be politically infeasible. Describe the policy problem for your audience, to help them see it as something worthy of their energy. Discuss a small number of possible solutions, the differences between them, and their respective costs and benefits. Keep it short with the aid of visual techniques that sum up the issue concisely, to minimise cognitive load and make the problem seem solvable.

Box 1. Bardach (2012) A Practical Guide for Policy Analysis

  1. ‘Define the problem’. Provide a diagnosis of a policy problem, using rhetoric and eye-catching data to generate attention.
  2. ‘Assemble some evidence’. Gather relevant data efficiently.
  3. ‘Construct the alternatives’. Identify the relevant and feasible policy solutions that your audience might consider.
  4. ‘Select the criteria’. Typical value judgements relate to efficiency, equity and fairness, the trade-off between individual freedom and collective action, and the extent to which a policy process involves citizens in deliberation.
  5. ‘Project the outcomes’. Focus on the outcomes that key actors care about (such as value for money), and quantify and visualise your predictions if possible.
  6. ‘Confront the trade-offs’. Compare the pros and cons of each solution, such as how much of a bad service policymakers will accept to cut costs.
  7. ‘Decide’. Examine your case through the eyes of a policymaker.
  8. ‘Tell your story’. Identify your target audience and tailor your case. Weigh up the benefits of oral versus written presentation. Provide an executive summary. Focus on coherence and clarity. Keep it simple and concise. Avoid jargon.

Box 2. Dunn (2017) Public Policy Analysis

  1. What is the policy problem to be solved? Identify its severity, urgency, cause, and our ability to solve it. Don’t define the wrong problem, such as by oversimplifying.
  2. What effect will each potential policy solution have? ‘Forecasting’ methods can help provide ‘plausible’ predictions about the future effects of current/ alternative policies.
  3. Which solutions should we choose, and why? Normative assessments are based on values such as ‘equality, efficiency, security, democracy, enlightenment’ and beliefs about the preferable balance between state, communal, and market/ individual solutions (2017: 6; 205).
  4. What were the policy outcomes? Monitoring is crucial because it is difficult to predict policy success, and unintended consequences are inevitable (2017: 250).
  5. Did the policy solution work as intended? Did it improve policy outcomes? Try to measure the outcomes of your solution, while noting that evaluations are contested (2017: 332-41).

Box 3. Meltzer and Schwartz (2019) Policy Analysis as Problem Solving

  1. ‘Define the problem’. Problem definition is a political act of framing, as part of a narrative to evaluate the nature, cause, size, and urgency of an issue.
  2. ‘Identify potential policy options (alternatives) to address the problem’. Identify many possible solutions, then select the ‘most promising’ for further analysis (2019: 65).
  3. ‘Specify the objectives to be attained in addressing the problem and the criteria to evaluate the attainment of these objectives as well as the satisfaction of other key considerations (e.g., equity, cost, feasibility)’.
  4. ‘Assess the outcomes of the policy options in light of the criteria and weigh trade-offs between the advantages and disadvantages of the options’.
  5. ‘Arrive at a recommendation’. Make a preliminary recommendation to inform an iterative process, drawing feedback from clients and stakeholder groups (2019: 212).

Box 4. Mintrom (2012) Contemporary Policy Analysis

  1. ‘Engage in problem definition’. Define the nature of a policy problem, and the role of government in solving it, while engaging with many stakeholders (2012: 3; 58-60).
  2. ‘Propose alternative responses to the problem’. Identify how governments have addressed comparable problems, and a previous policy’s impact (2012: 21).
  3. ‘Choose criteria for evaluating each alternative policy response’. ‘Effectiveness, efficiency, fairness, and administrative efficiency’ are common (2012: 21).
  4. ‘Project the outcomes of pursuing each policy alternative’. Estimate the cost of a new policy, in comparison with current policy, and in relation to factors such as savings to society or benefits to certain populations.
  5. ‘Identify and analyse trade-offs among alternatives’. Use your criteria and projections to compare each alternative in relation to their likely costs and benefits.
  6. ‘Report findings and make an argument for the most appropriate response’. Client-oriented advisors identify the beliefs of policymakers and tailor accordingly (2012: 22).

Box 5. Weimer and Vining (2017) Policy Analysis: Concepts and Practice

  1. ‘Write to Your Client’. Having a client such as an elected policymaker requires you to address the question they ask, by their deadline, in a clear and concise way that they can understand (and communicate to others) quickly (2017: 23; 370-4).
  2. ‘Understand the Policy Problem’. First, ‘diagnose the undesirable condition’. Second, frame it as ‘a market or government failure (or maybe both)’.
  3. ‘Be Explicit About Values’ (and goals). Identify (a) the values to prioritise, such as ‘efficiency’, ‘equity’, and ‘human dignity’, and (b) ‘instrumental goals’, such as ‘sustainable public finance or political feasibility’, to generate support for solutions.
  4. ‘Specify Concrete Policy Alternatives’. Explain potential solutions in sufficient detail to predict the costs and benefits of each ‘alternative’ (including current policy).
  5. ‘Predict and Value Impacts’. Short deadlines dictate that you use ‘logic and theory, rather than systematic empirical evidence’ to make predictions efficiently (2017: 27).
  6. ‘Consider the Trade-Offs’. Each alternative will fulfil certain goals more than others. Produce a summary table to make value-based choices about trade-offs (2017: 356-8).
  7. ‘Make a Recommendation’. ‘Unless your client asks you not to do so, you should explicitly recommend one policy’ (2017: 28).

This is an excerpt from The Politics of Policy Analysis, found here: https://paulcairney.wordpress.com/policy-analysis-in-750-words/


Policy Analysis in 750 words: Marleen Brans, Iris Geva-May, and Michael Howlett (2017) Routledge Handbook of Comparative Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary (and click here for the full list of authors). This post is a mere 500 words over budget (not including these words describing the number of words).


Marleen Brans, Iris Geva-May, and Michael Howlett (editors) (2017) Routledge Handbook of Comparative Policy Analysis (London: Routledge)

‘The Handbook … covers … the state of the art knowledge about the science, art and craft of policy analysis in different countries, at different levels of government and by all relevant actors in and outside government who contribute to the analysis of problems and the search for policy solutions’ (Brans et al, 2017: 1).

This book focuses on the interaction between (in Lasswell’s terms) ‘analysis for policy’ (policy analysis) and ‘analysis of policy’ (policy process research). In other words,

  • what can the study of policy analysis tell us about policymaking, and
  • what can studies of policymaking tell budding policy analysts about the nature of their task in relation to their policymaking environment?

Brans et al’s (2017: 1-6) opening discussion suggests that this task is rather unclear and complicated. They highlight the wide range of activity described by the term ‘policy analysis’:

  1. The scope of policy analysis is wide, and its meaning unclear

Analysts can be found in many levels and types of government, in bodies holding governments to account, and outside of government, including interest groups, think tanks, and specialist firms (such as global accountancy or management consultancy firms – Saint-Martin, 2017).

Further, ‘what counts’ as policy analysis can relate to the people that do it, the rules they follow, the processes in which they engage, the form of outputs, and the expectations of clients (Veselý, 2017: 103; Vining and Boardman, 2017: 264).

  2. The role of a policy analyst varies remarkably in relation to context

It varies over time, policy area, type of government (such as central, subnational, local), country, type of political system (e.g. majoritarian and consensus democracies), and ‘policy style’.

  3. Analysis involves ‘science, art and craft’ and the rules are written and unwritten

The process of policy analysis – such as to gather and analyse information, define problems, design and compare solutions, and give policy advice – includes ‘applied social and scientific research as well as more implicit forms of practical knowledge’, and ‘both formal and informal professional practices’ (see also studies of institutions and networks).

  4. The policy process is complex.

It is difficult to identify a straightforward process in which analysts are clearly engaged in multiple, well-defined ‘stages’ of policymaking.

  5. Key principles and practices can be institutionalised, contested, or non-existent.

The idea of policy analysis principles – ‘of transparency, effectiveness, efficiency and accountability through systematic and evidence-based analysis’ – may be entrenched in places like the US but not globally.

In some political systems (particularly in the ‘Anglo-Saxon family of nations’), the most-described forms of policy analysis (in the 750 words series) may be taken for granted (2017: 4).

Even so, the status of science and expertise is often contested, particularly in relation to salient and polarised issues, or more generally:

  • During ‘attempts by elected politicians to restore the primacy of political judgement in the policymaking process, at the expense of technical or scientific evidence’ (2017: 5).
  • When the ‘blending of expert policy analysis with public consultation and participation’ makes ‘advice more competitive and contested’ (2017: 5).
  • When ‘evidence based’ really means ‘evidence informed’, given that there are many legitimate claims to knowledge, and evidence forms one part of a larger process of policy design (van Nispen and de Jong, 2017: 153).

In many political systems, there may be less criticism of the idea of ‘systematic and evidence-based analysis’ because there is less capacity to process information. It is difficult to worry about excessively technocratic approaches if they do not exist (a point that CW made to me just before I read this book).

Implications for policy analysis

  1. It is difficult to think of policy analysis as a ‘profession’.

We may wonder if ‘policy analysis’ can ever be based on common skills and methods (such as described by Scott, 2017, and in Weimer and Vining), connected to ‘formal education and training’, a ‘code of professional conduct’, and the ability of organisations to control membership (Adachi, 2017: 28; compare with Radin and Geva-May).

  2. Policy analysis is a loosely-defined collection of practices that vary according to context.

Policy analysis may, instead, be considered a collection of ‘styles’ (Hassenteufel and Zittoun, 2017), influenced by:

  • competing analytical approaches in different political systems (2017: 65)
  • bureaucratic capacity for analysis (Mendez and Dussauge-Laguna, 2017: 82)
  • a relative tendency to contract out analysis (Veselý, 2017: 113)
  • the types and remits of advisory bodies (e.g. are they tasked simply with offering expert advice, or also to encourage wider participation to generate knowledge?) (Crowley and Head, 2017)
  • the level of government in which analysts work, such as ‘subnational’ (Newman, 2017) or ‘local’ (Lundin and Öberg, 2017)
  • the type of activity, such as when (‘performance’) budgeting analysis is influenced heavily by economic methods and ‘new public management’ reforms (albeit with limited success, followed by attempts at reform) (van Nispen and de Jong, 2017: 143-52)

Policy analysis can also describe a remarkably wide range of activity, including:

  • Public inquiries (Marier, 2017)
  • Advice to MPs, parliaments, and their committees (Wolfs and De Winter, 2017)
  • The strategic analysis of public opinion or social media data (Rothmayr Allison, 2017; Kuo and Cheng, 2017)
  • A diverse set of activities associated with ‘think tanks’ (Stone and Ladi, 2017) and ‘political party think tanks’ (Pattyn et al, 2017)
  • Analysis for and by ‘business associations’ (Vining and Boardman, 2017), unions (Schulze and Schroeder, 2017), and voluntary/ non-profit organisations (Evans et al, 2017), all of whom juggle policy advice to government with keeping members on board.
  • The more-or-less policy relevant work of academic researchers (Blum and Brans, 2017; compare with Dunn and see the EBPM page).
  3. The analysis of and for policy is not so easy to separate in practice.

When defining policy analysis largely as a collection of highly-variable practices, in complex policymaking systems, we can see the symbiotic relationship between policy analysis and policy research. Studying policy analysis allows us to generate knowledge of policy processes. Policy process research demonstrates that the policymaking context influences how we think about policy analysis.

  4. Policy analysis education and training is incomplete without policy process research.

Put simply, we should not assume that graduates in ‘policy analysis’ will enter a central government with high capacity, coherent expectations, and a clear demand for the same basic skills. Yet, Fukuyama argues that US University programmes largely teach students:

‘a battery of quantitative methods … applied econometrics, cost-benefit analysis, decision analysis, and, most recently, use of randomized experiments for program evaluation’ that ‘will tell you what the optimal policy should be’, but not ‘how to achieve that outcome. The world is littered with optimal policies that don’t have a snowball’s chance in hell of being adopted’.

In that context, additional necessary skills include: stakeholder mapping, to identify who is crucial to policy success, defining policy problems in a way that stakeholders and policymakers can support, and including those actors continuously during a process of policy design and delivery. These skills are described at more length by Radin and Geva-May, while Botha et al (2017) suggest that policy analysis programmes (across North American and European universities) offer a more diverse range of skills (and support for experiential learning) than Fukuyama describes.


Policy Analysis in 750 words: William Dunn (2017) Public Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary. This book is a whopper, with almost 500 pages and 101 (excellent) discussions of methods, so 800 words over budget seems OK to me. If you disagree, just read every second word.  By the time you reach the cat hanging in there baby you are about 300 (150) words away from the end.


William Dunn (2017) Public Policy Analysis 6th Ed. (Routledge)

‘Policy analysis is a process of multidisciplinary inquiry aiming at the creation, critical assessment, and communication of policy-relevant knowledge … to solve practical problems … Its practitioners are free to choose among a range of scientific methods, qualitative as well as quantitative, and philosophies of science, so long as these yield reliable knowledge’ (Dunn, 2017: 2-3).

Dunn (2017: 4) describes policy analysis as pragmatic and eclectic. It involves synthesising policy relevant (‘usable’) knowledge, and combining it with experience and ‘practical wisdom’, to help solve problems with analysis that people can trust.

This exercise is ‘descriptive’, to define problems, and ‘normative’, to decide how the world should be and how solutions get us there (as opposed to policy studies/ research seeking primarily to explain what happens).

Dunn contrasts the ‘art and craft’ of policy analysts with other practices, including:

  1. The idea of ‘best practice’ characterised by 5-step plans.
  • In practice, analysis is influenced by: the cognitive shortcuts that analysts use to gather information; the role they perform in an organisation; the time constraints and incentive structures in organisations and political systems; the expectations and standards of their profession; and, the need to work with teams consisting of many professions/ disciplines (2017: 15-6)
  • The cost (in terms of time and resources) of conducting multiple research and analytical methods is high, and highly constrained in political environments (2017: 17-8; compare with Lindblom)
  2. The too-narrow idea of evidence-based policymaking
  • The naïve attachment to ‘facts speak for themselves’ or ‘knowledge for its own sake’ undermines a researcher’s ability to adapt well to the evidence-demands of policymakers (2017: 68; compare with Why don’t policymakers listen to your evidence?).

To produce ‘policy-relevant knowledge’ requires us to ask five questions before (Qs1-3) and after (Qs4-5) policy intervention (2017: 5-7; 54-6):

  1. What is the policy problem to be solved?
  • For example, identify its severity, urgency, cause, and our ability to solve it.
  • Don’t define the wrong problem, such as by oversimplifying or defining it with insufficient knowledge.
  • Key aspects of problems include ‘interdependency’ (each problem is inseparable from a host of others, and all problems may be greater than the sum of their parts), ‘subjectivity’ and ‘artificiality’ (people define problems), ‘instability’ (problems change rather than being solved), and ‘hierarchy’ (which level or type of government is responsible) (2017: 70; 75).
  • Problems vary in terms of how many relevant policymakers are involved, how many solutions are on the agenda, the level of value conflict, and the unpredictability of outcomes (high levels suggest ‘wicked’ problems, and low levels ‘tame’) (2017: 75)
  • ‘Problem-structuring methods’ are crucial, to: compare ways to define or interpret a problem, and ward against making too many assumptions about its nature and cause; produce models of cause-and-effect; and make a problem seem solvable, such as by placing boundaries on its coverage. These methods foster creativity, which is useful when issues seem new and ambiguous, or new solutions are in demand (2017: 54; 69; 77; 81-107).
  • Problem definition draws on evidence, but is primarily the exercise of power to reduce ambiguity through argumentation, such as when defining poverty as the fault of the poor, the elite, the government, or social structures (2017: 79; see Stone).
  2. What effect will each potential policy solution have?
  • Many ‘forecasting’ methods can help provide ‘plausible’ predictions about the future effects of current/ alternative policies (Chapter 4 contains a huge number of methods).
  • ‘Creativity, insight, and the use of tacit knowledge’ may also be helpful (2017: 55).
  • However, even the most-effective expert/ theory-based methods to extrapolate from the past are flawed, and it is important to communicate levels of uncertainty (2017: 118-23; see Spiegelhalter).
  3. Which solutions should we choose, and why?
  • ‘Prescription’ methods help provide a consistent way to compare each potential solution, in terms of its feasibility and predicted outcome, rather than decide too quickly that one is superior (2017: 55; 190-2; 220-42).
  • They help to combine (a) an estimate of each policy alternative’s outcome with (b) a normative assessment.
  • Normative assessments are based on values such as ‘equality, efficiency, security, democracy, enlightenment’ and beliefs about the preferable balance between state, communal, and market/ individual solutions (2017: 6; 205; see Weimer & Vining, Meltzer & Schwartz, and Stone on the meaning of these values).
  • For example, cost benefit analysis (CBA) is an established – but problematic – economics method based on finding one metric – such as a $ value – to predict and compare outcomes (2017: 209-17; compare Weimer & Vining, Meltzer & Schwartz, and Stone)
  • Cost effectiveness analysis uses a $ value for costs, but compared with other units of measurement for benefits (such as outputs per $) (2017: 217-9)
  • Although such methods help us combine information and values to compare choices, note the inescapable role of power to decide whose values (and which outcomes, affecting whom) matter (2017: 204)
  4. What were the policy outcomes?
  • ‘Monitoring’ methods help identify (say): levels of compliance with regulations, if resources and services reach ‘target groups’, if money is spent correctly (such as on clearly defined ‘inputs’ such as public sector wages), and if we can make a causal link between the policy inputs/ activities/ outputs and outcomes (2017: 56; 251-5)
  • Monitoring is crucial because it is so difficult to predict policy success, and unintended consequences are almost inevitable (2017: 250).
  • However, the data gathered are usually no more than proxy indicators of outcomes. Further, the choice of indicators reflects what is available, ‘particular social values’, and ‘the political biases of analysts’ (2017: 262)
  • The idea of ‘evidence based policy’ is linked strongly to the use of experiments and systematic review to identify causality (2017: 273-6; compare with trial-and-error learning in Gigerenzer, complexity theory, and Lindblom).
  5. Did the policy solution work as intended? Did it improve policy outcomes?
  • Although we frame policy interventions as ‘solutions’, few problems are ‘solved’. Instead, try to measure the outcomes and the contribution of your solution, and note that evaluations of success and ‘improvement’ are contested (2017: 57; 332-41).  
  • Policy evaluation is not an objective process in which we can separate facts from values.
  • Rather, values and beliefs are part of the criteria we use to gauge success (and even their meaning is contested – 2017: 322-32).
  • We can gather facts about the policy process, and the impacts of policy on people, but this information has little meaning until we decide whose experiences matter.

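Dunn’s contrast between cost benefit analysis (one $ metric for both costs and benefits) and cost effectiveness analysis ($ costs set against non-monetary outputs) can be illustrated with a short sketch. To be clear, this example is not from the book: the discount rate, the two alternatives, and all figures are invented purely for demonstration.

```python
# Illustrative sketch (not from Dunn, 2017): comparing two hypothetical
# policy alternatives using CBA and cost-effectiveness logic.
# All figures are invented for demonstration.

def npv(flows, rate=0.03):
    """Net present value of a list of annual cash flows (year 0 first)."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

alternatives = {
    # name: (annual costs in $m, annual benefits in $m, households served)
    "Subsidy":    ([10, 10, 10], [4, 12, 16], 5000),
    "Regulation": ([6, 6, 6],    [2, 8, 10],  2500),
}

for name, (costs, benefits, households) in alternatives.items():
    c, b = npv(costs), npv(benefits)
    # CBA: one $ metric for both sides of the ledger
    print(f"{name}: net benefit = ${b - c:.2f}m, "
          f"benefit-cost ratio = {b / c:.2f}, "
          # Cost-effectiveness: $ cost per non-monetary output
          f"cost per household = ${c * 1e6 / households:,.0f}")
```

Even in this toy case, the two methods can rank the alternatives differently, which illustrates Dunn’s point that the choice of metric (and whose outcomes count) is itself an exercise of power.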
Overall, the idea of ‘ex ante’ (forecasting) policy analysis is a little misleading, since policymaking is continuous, and evaluations of past choices inform current choices.

Policy analysis methods are ‘interdependent’, and ‘knowledge transformations’ describes the impact of knowledge regarding one question on the other four (2017: 7-13; contrast with Meltzer & Schwartz, Thissen & Walker).

Developing arguments and communicating effectively

Dunn (2017: 19-21; 348-54; 392) argues that ‘policy argumentation’ and the ‘communication of policy-relevant knowledge’ are central to policymaking (see Chapter 9 and Appendices 1-4 for advice on how to write briefs, memos, and executive summaries and prepare oral testimony).

He identifies seven elements of a ‘policy argument’ (2017: 19-21; 348-54), including:

  • The claim itself, such as a description (size, cause) or evaluation (importance, urgency) of a problem, and prescription of a solution
  • The things that support it (including reasoning, knowledge, authority)
  • Incorporating the things that could undermine it (including any ‘qualifier’, the communication of uncertainty about current knowledge, and counter-arguments).

The key stages of communication (2017: 392-7; 405; 432) include:

  1. ‘Analysis’, focusing on ‘technical quality’ (of the information and methods used to gather it), meeting client expectations, challenging the ‘status quo’, albeit while dealing with ‘political and organizational constraints’ and suggesting something that can actually be done.
  2. ‘Documentation’, focusing on synthesising information from many sources, organising it into a coherent argument, translating from jargon or a technical language, simplifying, summarising, and producing user-friendly visuals.
  3. ‘Utilization’, by making sure that (a) communications are tailored to the audience (its size, existing knowledge of policy and methods, attitude to analysts, and openness to challenge), and (b) the process is ‘interactive’ to help analysts and their audiences learn from each other.

 

hang-in-there-baby

 

Policy analysis and policy theory: systems thinking, evidence based policymaking, and policy cycles

Dunn (2017: 31-40) situates this discussion within a brief history of policy analysis, which culminated in new ways to express old ambitions, such as to:

  1. Use ‘systems thinking’, to understand the interdependence between many elements in complex policymaking systems (see also socio-technical and socio-ecological systems).
  • Note the huge difference between (a) policy analysis discussions of ‘systems thinking’ built on the hope that if we can understand them we can direct them, and (b) policy theory discussions that emphasise ‘emergence’ in the absence of central control (and presence of multi-centric policymaking).
  • Also note that Dunn (2017: 73) describes policy problems – rather than policymaking – as complex systems. I’ll write another post (short, I promise) on the many different (and confusing) ways to use the language of complexity.
  2. Promote ‘evidence based policy’, as the new way to describe an old desire for ‘technocratic’ policymaking that accentuates scientific evidence and downplays politics and values (see also 2017: 60-4).

In that context, see Dunn’s (2017: 47-52) discussion of comprehensive versus bounded rationality:

  • Note the idea of ‘erotetic rationality’ in which people deal with their lack of knowledge of a complex world by giving up on the idea of certainty (accepting their ‘ignorance’), in favour of a continuous process of ‘questioning and answering’.
  • This approach is a pragmatic response to the lack of order and predictability of policymaking systems, which limits the effectiveness of a rigid attachment to ‘rational’ 5-step policy analyses (compare with Meltzer & Schwartz).

Dunn (2017: 41-7) also provides an unusually useful discussion of the policy cycle. Rather than seeing it as a mythical series of orderly stages, Dunn highlights:

  1. Lasswell’s original discussion of policymaking functions (or functional requirements of policy analysis, not actual stages to observe), including: ‘intelligence’ (gathering knowledge), ‘promotion’ (persuasion and argumentation while defining problems), ‘prescription’, ‘invocation’ and ‘application’ (to use authority to make sure that policy is made and carried out), and ‘appraisal’ (2017: 42-3).
  2. The constant interaction between all notional ‘stages’ rather than a linear process: attention to a policy problem fluctuates, actors propose and adopt solutions continuously, actors are making policy (and feeding back on its success) as they implement, evaluation (of policy success) is not a single-shot document, and previous policies set the agenda for new policy (2017: 44-5).

In that context, it is no surprise that the impact of a single policy analyst is usually minimal (2017: 57). Sorry to break it to you. Hang in there, baby.

hang-in-there-baby

 


Policy Analysis in 750 words: Rachel Meltzer and Alex Schwartz (2019) Policy Analysis as Problem Solving

Please see the Policy Analysis in 750 words series overview before reading the summary. This post might well represent the largest breach of the ‘750 words’ limit, so please get comfortable. I have inserted a picture of a cat hanging in there baby after the main (*coughs*) 1400-word summary. The rest is bonus material, reflecting on the links between this book and the others in the series.


Rachel Meltzer and Alex Schwartz (2019) Policy Analysis as Problem Solving (Routledge)

‘We define policy analysis as evidence-based advice giving, as the process by which one arrives at a policy recommendation to address a problem of public concern. Policy analysis almost always involves advice for a client’ (Meltzer and Schwartz, 2019: 15).

Meltzer and Schwartz (2019: 231-2) describe policy analysis as applied research, drawing on many sources of evidence, quickly, with limited time, access to scientific research, or funding to conduct a lot of new research. It requires:

  • careful analysis of a wide range of policy-relevant documents (including the ‘grey’ literature often produced by governments, NGOs, and think tanks) and available datasets
  • perhaps combined with expert interviews, focus groups, site visits, or an online survey (see 2019: 232-64 on methods).

Meltzer and Schwartz (2019: 21) outline a ‘five-step framework’ for client-oriented policy analysis. During each step, they contrast their ‘flexible’ and ‘iterative’ approach with a too-rigid ‘rationalistic approach’ (to reflect bounded, not comprehensive, rationality):

  1. ‘Define the problem’.

Problem definition is a political act of framing, not an exercise in objectivity (2019: 52-3). It is part of a narrative to evaluate the nature, cause, size, and urgency of an issue (see Stone), or perhaps to attach to an existing solution (2019: 38-40; compare with Mintrom).

In that context, ask yourself ‘Who is defining the problem? And for whom?’ and do enough research to be able to define it clearly and avoid misunderstanding between you and your client (2019: 37-8; 279-82):

  • Identify your client’s resources and motivation, such as how they seek to use your analysis, the format of analysis they favour, their deadline, and their ability to make or influence the policies you might suggest (2019: 49; compare with Weimer and Vining).
  • Tailor your narrative to your audience, albeit while recognising the need to learn from ‘multiple perspectives’ (2019: 40-5).
  • Make it ‘concise’ and ‘digestible’, not too narrowly defined, and not in a way that already closes off discussion by implying a clear cause and solution (2019: 51-2).

In doing so:

  • Ask yourself if you can generate a timeline, identify key stakeholders, and place a ‘boundary’ on the problem.
  • Establish if the problem is urgent, who cares about it, and who else might care (or not) (2019: 46).
  • Focus on the ‘central’ problem that your solution will address, rather than the ‘related’ and ‘underlying’ problems that are ‘too large and endemic to be solved by the current analysis’ (2019: 47).
  • Avoid misdiagnosing a problem with reference to one cause. Instead, ‘map’ causation with reference to (say) individual and structural causes, intended and unintended consequences, simple and complex causation, market or government failure, and/ or the ability to blame an individual or organisation (2019: 48-9).
  • Combine quantitative and qualitative data to frame problems in relation to: severity, trends in severity, novelty, proximity to your audience, and urgency or crisis (2019: 53-4).

During this process, interrogate your own biases or assumptions and how they might affect your analysis (2019: 50).

2. ‘Identify potential policy options (alternatives) to address the problem’.

Common sources of ideas include incremental changes from current policy, ‘client suggestions’, comparable solutions (from another time, place, or policy area), reference to common policy instruments, and ‘brainstorming’ or ‘design thinking’ (2019: 67-9; see boxes 2.3 and 7.1, below, from Understanding Public Policy).

[Box 2.3, from Understanding Public Policy, 2nd ed.]

Identify a ‘wide range’ of possible solutions, then select the (usually 3-5) ‘most promising’ for further analysis (2019: 65). In doing so:

  • be careful not to frame alternatives negatively (e.g. ‘death tax’ – 2019: 66)
  • compare alternatives in ‘good faith’ rather than keeping some ‘off the table’ to ensure that your preferred solution looks good (2019: 66)
  • beware ‘best practice’ ideas that are limited in terms of (a) applicability (if made at a smaller scale, or in a very different jurisdiction), and (b) evidence of success (2019: 70; see studies of policy learning and transfer)
  • think about how to modify existing policies according to scale or geographical coverage, who to include (and based on what criteria), for how long, using voluntary versus mandatory provisions, and ensuring oversight (2019: 71-3)
  • consider combinations of common policy instruments, such as regulations and economic penalties/subsidies (2019: 73-7)
  • consider established ways to ‘brainstorm’ ideas (2019: 77-8)
  • note the rise of instruments derived from the study of psychology and behavioural public policy (2019: 79-90)
  • learn from design principles, including ‘empathy’, ‘co-creating’ policy with service users or people affected, ‘prototyping’ (2019: 90-1)

[Box 7.1, from Understanding Public Policy]

3. ‘Specify the objectives to be attained in addressing the problem and the criteria to evaluate the attainment of these objectives as well as the satisfaction of other key considerations (e.g., equity, cost, feasibility)’.

Your objectives relate to your problem definition and aims: what is the problem, what do you want to happen when you address it, and why?

  • For example, questions to your client may include: what is your organization’s ‘mission’, what is feasible (in terms of resources and politics), which stakeholders do you want to include, and how will you define success (2019: 105; 108-12)?

In that values-based context, your criteria relate to ways to evaluate each policy’s likely impact (2019: 106-7). They should ensure:

  • Comprehensiveness. E.g. how many people, and how much of their behaviour, can you influence while minimizing the ‘burden’ on people, businesses, or government? (2019: 113-4)
  • Mutual Exclusiveness. In other words, don’t have two objectives doing the same thing (2019: 114).

Common criteria include (2019: 116):

  1. Effectiveness. The size of its intended impact on the problem (2019: 117).
  2. Equity (fairness). The impact in terms of ‘vertical equity’ (e.g. the better off should pay more), ‘horizontal equity’ (e.g. you should not pay more if unmarried), fair process, fair outcomes, and ‘intergenerational’ equity (e.g. don’t impose higher costs on future populations) (2019: 118-19).
  3. Feasibility (administrative, political, and technical). The likelihood of this policy being adopted and implemented well (2019: 119-21).
  4. Cost (or financial feasibility). Who would bear the cost, and their willingness and ability to pay (2019: 122).
  5. Efficiency. To maximise the benefit while minimizing costs (2019: 122-3).


4. ‘Assess the outcomes of the policy options in light of the criteria and weigh trade-offs between the advantages and disadvantages of the options’.

When explaining objectives and criteria,

  • ‘label’ your criteria in relation to your policy objectives (e.g. to ‘maximize debt reduction’) rather than using generic terms (2019: 123-7)
  • produce a table – with alternatives in rows, and criteria in columns – to compare each option
  • quantify your policies’ likely outcomes, such as in relation to numbers of people affected and levels of income transfer, or a percentage drop in the size of the problem, but also
  • communicate the degree of uncertainty related to your estimates (2019: 128-32; see Spiegelhalter)
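Such a table is easy to mock up in code. The sketch below (with invented alternatives, criteria, and 1-5 scores, not taken from the book) shows the basic alternatives-in-rows, criteria-in-columns layout:

```python
# Build a simple comparison matrix: alternatives in rows, criteria in columns.
# All names and scores are illustrative placeholders, not from the book.
criteria = ["Effectiveness", "Equity", "Feasibility", "Cost"]
alternatives = {
    "Status quo":     [2, 3, 5, 5],   # scored 1 (worst) to 5 (best)
    "Subsidy scheme": [4, 4, 3, 2],
    "New regulation": [5, 3, 2, 4],
}

# Print a plain-text matrix for discussion with the client.
header = f"{'Alternative':<16}" + "".join(f"{c:>14}" for c in criteria)
print(header)
for name, scores in alternatives.items():
    print(f"{name:<16}" + "".join(f"{s:>14}" for s in scores))
```

In practice you would label the criteria columns in relation to your specific objectives (as Meltzer and Schwartz advise), and add a row or note conveying the uncertainty attached to each score.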

Consider using cost-benefit analysis to identify (a) the financial and opportunity cost of your plans (what would you achieve if you spent the money elsewhere?), compared to (b) the positive impact of your funded policy (2019: 141-55).

  • The principle of CBA may be intuitive, but a thorough CBA process is resource-intensive, vulnerable to bias and error, and no substitute for choice. It requires you to make a collection of assumptions about human behaviour and likely costs and benefits, decide whose costs and benefits should count, turn all costs and benefits into a single measure, and imagine how to maximise winners and compensate losers (2019: 155-81; compare Weimer and Vining with Stone).
  • One alternative is cost-effectiveness analysis, which quantifies costs and relates them to outputs (e.g. number of people affected, and how) without trying to translate them into a single measure of benefit (2019: 181-3).
  • These measures can be combined with other thought processes, such as with reference to ‘moral imperatives’, a ‘precautionary approach’, and ethical questions on power/ powerlessness (2019: 183-4).
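To make the contrast between cost-benefit and cost-effectiveness analysis concrete, here is a minimal sketch. All figures, the programme profile, and the 3% discount rate are invented for illustration:

```python
# Hedged illustration of cost-benefit vs cost-effectiveness analysis.
# Figures and the discount rate are invented for the example.

def npv(flows, rate=0.03):
    """Net present value of yearly cash flows at a given discount rate."""
    return sum(f / (1 + rate) ** t for t, f in enumerate(flows))

# Cost-benefit analysis: force all impacts into one monetary measure.
costs    = [100_000, 20_000, 20_000]   # yearly programme costs
benefits = [0, 70_000, 90_000]         # yearly monetised benefits
net_benefit = npv(benefits) - npv(costs)
print(f"CBA net benefit: {net_benefit:,.0f}")

# Cost-effectiveness analysis: relate cost to outputs instead of
# translating benefits into money.
people_reached = 1_500
print(f"Cost per person reached: {npv(costs) / people_reached:,.0f}")
```

The hard (and political) work hidden by these few lines is exactly what Meltzer and Schwartz flag: deciding whose costs and benefits count, and how to monetise them in the first place.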


5. ‘Arrive at a recommendation’.

Predict the most likely outcomes of each alternative, while recognising high uncertainty (2019: 189-92). If possible,

  • draw on existing, comparable, programmes to predict the effectiveness of yours (2019: 192-4)
  • combine such analysis with relevant theories to predict human behaviour (e.g. consider price ‘elasticity’ if you seek to raise the price of a good to discourage its use) (2019: 193-4)
  • apply statistical methods to calculate the probability of each outcome (2019: 195-6), and modify your assumptions to produce a range of possibilities, but
  • note Spiegelhalter’s cautionary tales and anticipate the inevitable ‘unintended consequences’ (when people do not respond to policy in the way you would like) (2019: 201-2)
  • use these estimates to inform a discussion on your criteria (equity, efficiency, feasibility) (2019: 196-200)
  • present the results visually – such as in a ‘matrix’ – to encourage debate on the trade-offs between options
  • simplify choices by omitting irrelevant criteria and options that do not compete well with others (2019: 203-10)
  • make sure that your recommendation (a) flows from the analysis, and (b) is in the form expected by your client (2019: 211-12)
  • consider making a preliminary recommendation to inform an iterative process, drawing feedback from clients and stakeholder groups (2019: 212).
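The advice above, on combining elasticity assumptions with a range of possibilities, can be sketched as follows (the baseline consumption, price rise, and elasticity range are invented assumptions, not figures from the book):

```python
# Sketch of a price-elasticity prediction with explicit uncertainty.
# Baseline figures and the elasticity range are assumptions for illustration.

baseline_consumption = 1_000_000   # units per year
price_increase = 0.10              # a 10% price rise

# Rather than one point estimate, vary the elasticity assumption to
# produce a range of possible outcomes (low / central / high).
for label, elasticity in [("low", -0.2), ("central", -0.5), ("high", -0.8)]:
    change = elasticity * price_increase          # proportional change in demand
    predicted = baseline_consumption * (1 + change)
    print(f"{label:>7}: {predicted:,.0f} units ({change:+.0%})")
```

Presenting the low/central/high range, rather than a single number, is one simple way to communicate the uncertainty that Spiegelhalter's cautionary tales warn about.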


Policy analysis in a wider context

Meltzer and Schwartz’s approach makes extra sense if you have already read some of the other texts in the series, including:

  1. Weimer and Vining, which represents an exemplar of an X-step approach informed heavily by the study of economics and the application of economic models such as cost-benefit analysis (compare with Radin’s checklist).
  2. Geva-May on the existence of a policy analysis profession with common skills, heuristics, and (perhaps) ethics (compare with Meltzer and Schwartz, 2019: 282-93)
  3. Radin, on:
  • the proliferation of analysts across multiple levels of government, NGOs, and the private sector (compare with Meltzer and Schwartz, 2019: 269-77)
  • the historic shift of analysis from formulation to all notional stages (contrast with Meltzer and Schwartz, 2019: 16-7 on policy analysis not including implementation or evaluation)
  • the difficulty in distinguishing between policy analysis and advocacy in practice (compare with Meltzer and Schwartz, 2019: 276-8, who suggest that actors can choose to perform these different roles)
  • the emerging sense that it is difficult to identify a single client in a multi-centric policymaking system. Put another way, we might be working for a specific client but accept that their individual influence is low.
  4. Stone’s challenge to
  • a historic tendency for economics to dominate policy analysis,
  • the applicability of economic assumptions (focusing primarily on individualist behaviour and markets), and
  • the pervasiveness of ‘rationalist’ policy analysis built on X-steps.

Meltzer and Schwartz (2019: 1-3) agree that economic models are too dominant (identifying the value of insights from ‘other disciplines – including design, psychology, political science, and sociology’).

However, they argue that critiques of rational models exaggerate their limitations (2019: 23-6). For example:

  • these models need not rely solely on economic techniques or quantification, a narrow discussion or definition of the problem, or the sense that policy analysis should be comprehensive, and
  • it is not problematic for analyses to reflect their client’s values or for analysts to present ambiguous solutions to maintain wide support, partly because
  • we would expect the policy analysis to form only one part of a client’s information or strategy.

Further, they suggest that these critiques provide no useful alternative to help guide new policy analysts. Yet such guides are essential:

‘to be persuasive, and credible, analysts must situate the problem, defend their evaluative criteria, and be able to demonstrate that their policy recommendation is superior, on balance, to other alternative options in addressing the problem, as defined by the analyst. At a minimum, the analyst needs to present a clear and defensible ranking of options to guide the decisions of the policy makers’ (Meltzer and Schwartz, 2019: 4).

Meltzer and Schwartz (2019: 27-8) then explore ways to improve a 5-step model with insights from approaches such as ‘design thinking’, in which actors use a similar process – ‘empathize, define the problem, ideate, prototype, test and get feedback from others’ – to experiment with policy solutions without providing a narrow view on problem definition or how to evaluate responses.

Policy analysis and policy theory

One benefit to Meltzer and Schwartz’s approach is that it seeks to incorporate insights from policy theories and respond with pragmatism and hope. However, I think you also need to read the source material to get a better sense of those theories, key debates, and their implications. For example:

  1. Meltzer and Schwartz (2019: 32) note correctly that ‘incremental’ does not sum up policy change well. Indeed, Punctuated Equilibrium Theory shows that policy change is characterised by a huge number of small and a small number of huge changes.
  • However, the direct implications of PET are not as clear as they suggest. Baumgartner and Jones have both noted that they can measure these outcomes and identify the same basic distribution across a political system, but not explain or predict why particular policies change dramatically.
  • It is useful to recommend to policy analysts that they invest some hope in major policy change, but also sensible to note that – in the vast majority of cases – it does not happen.
  • On this point, see Mintrom on policy analysis for the long term, Weiss on the ‘enlightenment’ function of research and analysis, and Box 6.3 (from Understanding Public Policy), on the sense that (a) we can give advice to ‘budding policy entrepreneurs’ on how to be effective analysts, but (b) should note that all their efforts could be for nothing.

[Box 6.3, from Understanding Public Policy]

  2. Meltzer and Schwartz (2019: 32-3) tap briefly into the old debate on whether it is preferable to seek radical or incremental change. For more on that debate, see chapter 5 in the 1st ed of Understanding Public Policy, in which Lindblom notes that proposals for radical/incremental changes are not mutually exclusive.
  3. Perhaps explore the possible tension between Meltzer and Schwartz’s (2019: 33-4) recommendation that (a) policy analysis should be ‘evidence-based advice giving’, and (b) ‘flexible and open-ended’.
  • I think that Stone’s response would be that phrases such as ‘evidence based’ are not ‘flexible and open-ended’. Rather, they tend to symbolise a narrow view of what counts as evidence (see also Smith, and Hindess).
  • Further, note that the phrase ‘evidence based policymaking’ is a remarkably vague term (see the EBPM page), perhaps better seen as a political slogan than a useful description or prescription of policymaking.


Finally, if you read enough of these policy analysis texts, you get a sense that many are bunched together even if they describe their approach as new or distinctive.

  • Indeed, Meltzer and Schwartz (2019: 22-3) provide a table (containing Bardach and Patashnik, Patton et al, Stokey and Zeckhauser, Hammond et al, and Weimer & Vining) of ‘quite similar’ X-step approaches.
  • Weimer and Vining also discuss the implications of policy theories and present the sense that X-step policy analysis should be flexible and adaptive.
  • Many texts – including Radin, and Smith (2016) – focus on the value of case studies to think through policy analysis in particular contexts, rather than suggesting that we can produce a universal blueprint.

However, as Geva-May might suggest, this is not a bad thing if our aim is to generate the sense that policy analysis is a profession with its own practices and heuristics.


16 Comments

Filed under 750 word policy analysis, agenda setting, Evidence Based Policymaking (EBPM), public policy

Policy Analysis in 750 words: Michael Mintrom (2012) Contemporary Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary. This summary is not 750 words. I can only apologise.

Michael Mintrom (2012) Contemporary Policy Analysis (Oxford University Press)

Mintrom (2012: xxii; 17) describes policy analysis as ‘an enterprise primarily motivated by the desire to generate high quality information to support high-quality decisions’ and to stop policymakers ‘from making ill-considered choices’. It is about giving issues more ‘serious attention and deep thought’ than busy policymakers can, rather than simply ‘an exercise in the application of techniques’ to serve clients (2012: 20; xxii).

It begins with six ‘Key Steps in Policy Analysis’ (2012: 3-5):

  1. ‘Engage in problem definition’

Problem definition influences the types of solutions that will be discussed (although, in some cases, solutions chase problems).

Define the nature and size of a policy problem, and the role of government in solving it (from maximal to minimal), while engaging with many stakeholders with different views (2012: 3; 58-60).

This task involves a juggling act. First, analysts should engage with their audience to work out what they need and when (2012: 81). However, second, they should (a) develop ‘critical abilities’, (b) ask themselves ‘why they have been presented in specific ways, what their sources might be, and why they have arisen at this time’, and (c) present ‘alternative scenarios’ (2012: 22; 20; 27).

  2. ‘Propose alternative responses to the problem’

Governments use policy instruments – such as to influence markets, tax or subsidize activity, regulate behaviour, provide services (directly, or via commissioning or partnership), or provide information – as part of a coherent strategy or collection of uncoordinated measures (2012: 30-41). In that context, try to:

  • Generate knowledge about how governments have addressed comparable problems (including, the choice to not intervene if an industry self-regulates).
  • Identify the cause of a previous policy’s impact and if it would have the same effect now (2012: 21).
  • If minimal comparable information is available, consider wider issues from which to learn (2012: 76-7; e.g. alcohol policy based on tobacco).

Consider the wider:

 

  3. ‘Choose criteria for evaluating each alternative policy response’

There are no natural criteria, but ‘effectiveness, efficiency, fairness, and administrative efficiency’ are common (2012: 21). ‘Effective institutions’ have a marked impact on social and economic life and provide political stability (2012: 49). Governments can promote ‘efficient’ policies by (a) producing the largest number of winners and (b) compensating losers (2012: 51-2; see Weimer and Vining on Kaldor-Hicks). They can prioritise environmental ‘sustainability’ to mitigate climate change, the protection of human rights and ‘human flourishing’, and/or a fair allocation of resources (2012: 52-7).

  4. ‘Project the outcomes of pursuing each policy alternative’

Estimate the costs of a new policy, in comparison with current policy, and in relation to factors such as (a) overall savings to society, and/or (b) benefits to certain populations (any policy will benefit some social groups more than others). Mintrom (2012: 21) emphasises ‘prior knowledge and experience’ and ‘synthesizing’ work by others alongside techniques such as cost-benefit analyses.

  5. ‘Identify and analyse trade-offs among alternatives’

Use your criteria and projections to compare each alternative in relation to their likely costs and benefits.

  6. ‘Report findings and make an argument for the most appropriate response’

Mintrom (2012: 5) describes a range of advisory roles.

(a) Client-oriented advisors identify the beliefs of policymakers and anticipate the options worth researching (although they should not simply tell clients what they want to hear – 2012: 22). They may only have the time to answer a client’s question quickly and on their own. Or, they need to create and manage a team project (2012: 63-76).

(b) Other actors, ‘who want to change the world’, research options that are often not politically feasible in the short term but are too important to ignore (such as gender mainstreaming or action to address climate change).

In either case, the format of a written report – executive summary, contents, background, analytical strategy, analysis and findings (perhaps including a table comparing goals and trade-offs between alternatives), discussion, recommendation, conclusion, annex – may be similar (2012: 82-6).

Wider context: the changing role of policy analysts

Mintrom (2012: 5-7) describes a narrative – often attributed to Radin – of the changing nature of policy analysis, comparing:

  1. (a) a small group of policy advisors, (b) with a privileged place in government, (c) giving allegedly technical advice, using economic techniques such as cost-benefit analysis.
  2. (a) a much larger profession, (b) spread across – and outside of – government (including external consultants), and (c) engaging more explicitly in the politics of policy analysis and advice.

It reflects wider changes in government, (a) from the ‘clubby’ days to a much more competitive environment debating a larger number and wider range of policy issues, subject to (b) factors such as globalisation that change the task/context of policy analysis.

If so, any advice on how to do policy analysis has to be flexible, to incorporate the greater diversity of actors and the sense that complex policymaking systems require flexible skills and practices rather than standardised techniques and outputs.

The ethics of policy analysis

In that context, Mintrom (2012: 95-108) emphasises the enduring role for ethical policy analysis, which can relate to:

  1. ‘Universal’ principles such as fairness, compassion, and respect
  2. Specific principles to project the analyst’s integrity, competence, responsibility, respectfulness, and concern for others
  3. Professional practices, such as to
  • engage with many stakeholders in problem definition (to reflect a diversity of knowledge and views)
  • present a range of feasible solutions, making clear their distributional effects on target populations, opportunity costs (what policies or outcomes would not be funded if this one were), and impact on those who implement policy
  • be honest about (a) the method of calculation, and (b) uncertainty, when projecting outcomes
  • clarify the trade-offs between alternatives (don’t stack up the evidence for one)
  • maximise effective information sharing, rather than exploiting the limited attention of your audience (compare with Riker).
  4. New analytical strategies (2012: 114-15; 246-84), which consider:
  1. the extent to which social groups are already ‘systematically disadvantaged’,
  2. the causes (such as racism and sexism) of – and potential solutions to – these outcomes, to make sure
  3. that new policies reduce or do not perpetuate disadvantages, even when
  4. politicians may gain electorally from scapegoating target populations and/or
  5. there are major obstacles to transformative policy change.

Therefore, while Mintrom’s (2012: 3-5; 116) ‘Key Steps in Policy Analysis’ are comparable to Bardach and Weimer and Vining, his emphasis is often closer to Bacchi’s.

The entrepreneurial policy analyst

Mintrom (2012: 307-13) ends with a discussion of the intersection between policy entrepreneurship and analysis, highlighting the benefits of ‘positive thinking’, creativity, deliberation, and leadership. He expands on these ideas further in So you want to be a policy entrepreneur?


Policy Analysis (usually) in 750 words: David Weimer and Adrian Vining (2017) Policy Analysis

Please see the Policy Analysis in 750 words series overview before reading the summary.

Please note that this book is the longest in the series (almost 500 pages), so a 750 word summary would have been too heroic.

David Weimer and Adrian Vining (2017) Policy Analysis: Concepts and Practice 6th Edition (Routledge)

Weimer and Vining (2017: 23-8; 342-75) describe policy analysis in seven steps:

  1. ‘Write to Your Client’

Having a client such as an elected policymaker (or governmental or nongovernmental organization) requires you to: address the question they ask, by their chosen deadline, in a clear and concise way that they can understand (and communicate to others) quickly (2017: 23; 370-4).

Their sample documents are 18 pages, including an executive summary and summary table.

  2. ‘Understand the Policy Problem’

First, ‘diagnose the undesirable condition’, such as by

  • placing your client’s initial ‘diagnosis’ in a wider perspective (e.g. what is the role of the state, and what is its capacity to intervene?), and
  • providing relevant data (usually while recognising that you are not an expert in the policy problem).

Second, frame it as ‘a market or government failure (or maybe both)’, to

  • show how individual or collective choices produce inefficient allocations of resources and poor outcomes (2017: 59-201 and 398-434 provides a primer on economics), and
  • identify the ways in which people have addressed comparable problems in other policy areas (2017: 24).
  3. ‘Be Explicit About Values’ (and goals)

Identify the values that you seek to prioritise, such as ‘efficiency’, ‘equity’, and ‘human dignity’.

Treat values as self-evident goals. They exist alongside the ‘instrumental goals’ – such as ‘sustainable public finance or political feasibility’ – necessary to generate support for policy solutions.

‘Operationalise’ those goals to help identify the likely consequences of different choices.

For example, define efficiency in relation to (a) the number of outputs per input and/or (b) a measurable or predictable gain in outcomes, such as ‘quality-adjusted life years’ in a population (2017: 25-6).

Weimer and Vining describe two analyses of efficiency at length:

  • Cost Benefit Analysis (CBA) to (a) identify the most efficient outcomes by (b) translating all of the predicted impacts of an alternative into a single unit of analysis (such as a dollar amount), on the assumption (c) that we can produce winners from policy and compensate losers (see Kaldor-Hicks) (2017: 352-5 and 398-434).
  • Public Agency Strategic Analysis (PASA) to identify ways in which public organisations can change to provide more benefits (such as ‘public value’) with the same resources (2017: 435-50).
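The Kaldor-Hicks assumption behind CBA reduces to a one-line test: a policy passes if winners gain enough, in monetised terms, to compensate losers in principle. A minimal sketch, with hypothetical groups and dollar figures:

```python
# Minimal sketch of the Kaldor-Hicks compensation test behind CBA.
# Groups and monetised impacts are hypothetical.

def kaldor_hicks_pass(impacts):
    """A policy passes if total gains exceed total losses, so that
    winners could in principle compensate losers and still be better off."""
    return sum(impacts.values()) > 0

impacts = {
    "consumers":  +120_000,   # monetised gain
    "producers":   -80_000,   # monetised loss
    "government":  -15_000,   # administration cost
}
print("Passes Kaldor-Hicks test:", kaldor_hicks_pass(impacts))
# Note: passing says nothing about whether compensation actually occurs,
# which is the distributional question raised under 'Whose Costs and
# Benefits Count'.
```

The simplicity of the test is the point of contention: it treats a dollar of gain to one group as interchangeable with a dollar of loss to another.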
  4. ‘Specify Concrete Policy Alternatives’

Explain potential solutions in sufficient detail to predict the costs and benefits of each ‘alternative’ (including current policy).

Compare specific and well-worked alternatives, such as from ‘academic policy researchers’ or ‘advocacy organizations’.

Identify the potential to adopt and tailor more generic policy instruments (see 2017: 205-58 on the role of taxes, expenditure, regulation, staffing, and information-sharing; and compare with Hood and Margetts).

Engage in ‘borrowing’ proposals or models from credible sources, and ‘tinkering’ (using only the relevant elements of a proposal) to make sure they are relevant to your problem (2017: 26-7; 359).

  5. ‘Predict and Value Impacts’

Ideally, you would have the time and resources to (a) produce new research and/or (b) ‘conduct a meta-analysis’ of relevant evaluations to (c) provide ‘confident assessments of impacts’ and ‘engage in highly touted evidence-based policy making’ (see EBPM).

However, ‘short deadlines’ and limited access to ‘directly relevant data’ prompt you to patch together existing research that does not answer your question directly (see 2017: 327-39; 409-11).

Consequently, ‘your predictions of the impacts of a unique policy alternative must necessarily be guided by logic and theory, rather than systematic empirical evidence’ (2017: 27) and ‘we must balance sometimes inconsistent evidence to reach conclusions about appropriate assertions’ (2017: 328).

  6. ‘Consider the Trade-Offs’

It is almost inevitable that, if you compare multiple feasible alternatives, each one will fulfil certain goals more than others.

Producing, and discussing with your clients, a summary table allows you to make value-based choices about trade-offs – such as between the most equitable or efficient choice – in the context of a need to manage costs and predict political feasibility (2017: 28; 356-8).

  7. ‘Make a Recommendation’

‘Unless your client asks you not to do so, you should explicitly recommend one policy’ (2017: 28).

Even so, your analysis of alternatives is useful to (a) show your work (to emphasise the value of policy analysis), and (b) anticipate a change in circumstances (that affects the likely impact of each choice) or the choice by your client to draw different conclusions.

Policy analysis in a wider context: comparisons with other texts

  1. Policy analysis requires flexibility and humility

As with Smith (and Bardach), note how flexible this advice must be, to reflect factors such as:

  • the (unpredictable) effect that different clients and contexts have on your task
  • the pressure on your limited time and resources
  • the ambiguity of broad goals such as equity and human dignity
  • a tendency of your clients to (a) not know, or (b) choose not to reveal their goals before you complete your analysis of possible policy solutions (2017: 347-9; compare with Lindblom)
  • the need to balance many factors – (a) answering your client’s question with confidence, (b) describing levels of uncertainty and ambiguity, and (c) recognising the benefit of humility – to establish your reputation as a provider of credible and reliable analysis (2017: 341; 363; 373; 453).
  2. Policy analysis as art and craft as well as science

While some proponents of EBPM may identify the need for highly specialist scientific research proficiency, Weimer and Vining (2017: 30; 34-40) describe:

  • the need to supplement a ‘solid grounding’ in economics and statistics with political awareness (the ‘art and craft of policy analysis’), and
  • the ‘development of a professional mind-set’ rather than perfecting ‘technical skills’ (see the policy analysis profession described by Radin).

This approach requires some knowledge of policy theories (see 1000 and 500) to appreciate the importance of factors such as networks, institutions, beliefs and motivation, framing, lurches of attention, and windows of opportunity to act (compare with ‘how far should you go?’).

Indeed, pages 259-323 have useful discussions of (a) strategies including ‘co-optation’, ‘compromise’, ‘rhetoric’, Riker’s ‘heresthetics’, (b) the role of narrative in ‘writing implementation scenarios’, and (c) the complexity of mixing many policy interventions.

  3. Normative and ethical requirements for policy analysis

Bacchi’s primary focus is to ask fundamental questions about what you are doing and why, and to challenge problem definitions that punish powerless populations.

In comparison, Weimer and Vining emphasise the client orientation which limits your time, freedom, and perhaps inclination to challenge so strongly.

Still, this normative role is part of an ethical duty to:

  • balance a ‘responsibility to client’ with ‘analytical integrity’ and ‘adherence to one’s personal conception of the good society’, and challenge the client if they undermine professional values (2017: 43-50)
  • reflect on the extent to which a policy analyst should seek to be an ‘Objective Technician’, ‘Client’s Advocate’ or ‘Issue Advocate’ (2017: 44; compare with Pielke and Jasanoff)
  • recognise the highly political nature of seemingly technical processes such as cost-benefit analysis (see 2017: 403-6 on ‘Whose Costs and Benefits Count’), and
  • encourage politicians to put ‘aside their narrow personal and political interests for the greater good’ (2017: 454).
