Classic studies suggest that the most profound and worrying kinds of power are the hardest to observe. We often witness highly visible political battles and can use pluralist methods to identify who has material resources, how they use them, and who wins. However, key forms of power ensure that many such battles do not take place. Actors often use their resources to reinforce social attitudes and policymakers’ beliefs, to establish which issues are policy problems worthy of attention and which populations deserve government support or punishment. Key battles may not arise because not enough people think they are worthy of debate. Attention and support for debate may rise, only to be crowded out of a political agenda in which policymakers can only debate a small number of issues.
Studies of power relate these processes to the manipulation of ideas or shared beliefs under conditions of bounded rationality (see for example the NPF). Manipulation might describe some people getting other people to do things they would not otherwise do. They exploit the beliefs of people who do not know enough about the world, or themselves, to know how to identify and pursue their best interests. Or, they encourage social norms – in which we describe some behaviour as acceptable and some as deviant – which are enforced by the state (for example, via criminal justice and mental health policy), but also social groups and individuals who govern their own behaviour with reference to what they feel is expected of them (and the consequences of not living up to expectations).
Such beliefs, norms, and rules are profoundly important because they often remain unspoken and taken for granted. Indeed, some studies equate them with the social structures that appear to close off some action. If so, we may not need to identify manipulation to find unequal power relationships: strong and enduring social practices help some people win at the expense of others, by luck or design.
In practice, these more-or-less-observable forms of power co-exist and often reinforce each other:
Example 1. The control of elected office is highly skewed towards men. Male incumbency, combined with social norms about who should engage in politics and public life, signals to women that their efforts may be relatively unrewarded and routinely punished – for example, in electoral campaigns in which women face verbal and physical misogyny – and the oversupply of men in powerful positions tends to limit debates on feminist issues.
Example 2. ‘Epistemic violence’ describes the act of dismissing an individual, social group, or population by undermining the value of their knowledge or claim to knowledge. Specific discussions include: (a) the colonial West’s subjugation of colonized populations, diminishing the voice of the subaltern; (b) privileging scientific knowledge and dismissing knowledge claims via personal or shared experience; and (c) erasing the voices of women of colour from the history of women’s activism and intellectual history.
It is in this context that we can understand ‘critical’ research designed to ‘produce social change that will empower, enlighten, and emancipate’ (p51). Powerlessness can relate to a visible lack of material economic resources, and to factors such as the lack of opportunity to mobilise and be heard.
I thank James Georgalakis for inviting me to speak at the inaugural event of IDS’ new Evidence into Policy and Practice Series, and the audience for giving extra meaning to my story about the politics of ‘evidence-based policymaking’. The talk (using PowerPoint) and Q&A are here:
James invited me to respond to some of the challenges to my talk raised in his summary of the event, so here it is.
I’m working on a ‘show, don’t tell’ approach, leaving some of the story open to interpretation. As a result, much of the meaning of this story – and, in particular, the focus on limiting participation – depends on the audience.
For example, consider the impact of the same story on audiences primarily focused on (a) scientific evidence and policy, or (b) participation and power.
Normally, when I talk about evidence and policy, my audience is mostly people with scientific or public health backgrounds asking ‘why do policymakers ignore scientific evidence?’ I am usually invited to ruffle feathers, mostly by challenging a – remarkably prevalent – narrative that goes like this:
In that context, I suggest that there are many claims to policy-relevant knowledge, policymakers have to ignore most information before making choices, and they are not in control of the policy process of which they are ostensibly in charge.
Limiting participation as a strategic aim
Then, I say to my audience that – if they are truly committed to maximising the use of scientific evidence in policy – they will need to consider how far they will go to get what they want. I use the metaphor of an ethical ladder in which each rung offers more influence in exchange for dirtier hands: tell stories and wait for opportunities, or demonise your opponents, limit participation, and humour politicians when they cherry-pick to reinforce emotional choices.
It’s ‘show, don’t tell’, but I hope that the take-home point for most of the audience is that they shouldn’t focus so much on one aim – maximising the use of scientific evidence – to the detriment of other important aims, such as wider participation in politics beyond a reliance on a small number of experts. I say ‘keep your eyes on the prize’ but invite the audience to reflect on which prizes they should seek, and the trade-offs between them.
Limited participation – and ‘windows of opportunity’ – as an empirical finding
I did suggest that most policymaking happens away from the sphere of ‘exciting’ and ‘unruly’ politics. Put simply, people have to ignore almost every issue almost all of the time. Each time they focus their attention on one major issue, they must – by necessity – ignore almost all of the others.
For me, the political science story is largely about the pervasiveness of policy communities and policymaking out of the public spotlight.
The logic is as follows. Elected policymakers can only pay attention to a tiny proportion of their responsibilities. They delegate the rest to bureaucrats at lower levels of government. Bureaucrats lack specialist knowledge, and rely on other actors for information and advice. Those actors trade information for access. In many cases, they develop effective relationships based on trust and a shared understanding of the policy problem.
Trust often comes from a sense that everyone has proven to be reliable. For example, they follow norms or the ‘rules of the game’. One classic rule is to contain disputes within the policy community when actors don’t get what they want: if you complain in public, you draw external attention and internal disapproval; if not, you are more likely to get what you want next time.
For me, this is key context in which to describe common strategic concerns:
Where is the power analysis in all of this?
I rarely use the word power directly, partly because – like ‘politics’ or ‘democracy’ – it is an ambiguous term with many interpretations (see Box 3.1). People often use it without agreeing its meaning and, if it means everything, maybe it means nothing.
However, you can find many aspects of power within our discussion. For example, insider and outsider strategies relate closely to Schattschneider’s classic discussion in which powerful groups try to ‘privatise’ issues and less powerful groups try to ‘socialise’ them. Agenda setting is about using resources to make sure issues do, or do not, reach the top of the policy agenda, and most do not.
These aspects of power sometimes play out in public, when:
However, they are no less important when they play out routinely:
In other words, the word ‘power’ is often hidden because the most profound forms of power often seem to be hidden.
In the context of our discussion, power comes from the ability to define some evidence as essential and other evidence as low quality or irrelevant, and therefore define some people as essential or irrelevant. It comes from defining some issues as exciting and worthy of our attention, or humdrum, specialist and only relevant to experts. It is about the subtle, unseen, and sometimes thoughtless ways in which we exercise power to harness people’s existing beliefs and dominate their attention as much as the transparent ways in which we mobilise resources to publicise issues. Therefore, to ‘maximise the use of evidence’ sounds like an innocuous collective endeavour, but it is a highly political and often hidden use of power.
In policy studies, there is a profound difference between uncertainty and ambiguity:
Both concepts relate to ‘bounded rationality’: policymakers do not have the ability to process all information relevant to policy problems. Instead, they employ two kinds of shortcut:
I make an artificially binary distinction, uncertain versus ambiguous, and relate it to another binary, rational versus irrational, to point out the pitfalls of focusing too much on one aspect of the policy process:
Actors can try to solve uncertainty by: (a) improving the quality of evidence, and (b) making sure that there are no major gaps between the supply of and demand for evidence. Relevant debates include ‘what counts as good evidence?’, focusing on the criteria used to define scientific evidence and its relationship with other forms of knowledge (such as practitioner experience and service user feedback), and ‘what are the barriers between supply and demand?’, focusing on the need for better ways to communicate.
Actors try to solve ambiguity by exercising power to increase attention to, and support for, their favoured interpretation of a policy problem. You will find many examples of such activity spread across the 500 and 1000 words series:
A focus on reducing uncertainty gives the impression that policymaking is a technical process in which people need to produce the best evidence and deliver it to the right people at the right time.
In contrast, a focus on reducing ambiguity gives the impression of a more complicated and political process in which actors are exercising power to compete for attention and dominance of the policy agenda. Uncertainty matters, but primarily to describe the role of a complex policymaking system in which no actor truly understands where they are or how they should exercise power to maximise their success.
For a longer discussion, see Fostering Evidence-informed Policy Making: Uncertainty Versus Ambiguity (PDF)
Or, if you fancy it in French: Favoriser l’élaboration de politiques publiques fondées sur des données probantes : incertitude versus ambiguïté (PDF)
Here is the relevant opening section in UPP:
‘Framing’ is a metaphor to describe the ways in which we understand, and use language selectively to portray, policy problems. There are many ways to describe this process across disciplines, including research in communication, psychology, and sociology. There is also more than one way to understand the metaphor.
For example, I think that most scholars have in mind this image (from litemind) of someone deciding on which part of the world to focus.
However, I have also seen colleagues use this image, of a timber frame, to highlight the structure of a discussion which is crucial but often unseen and taken for granted:
The first kind of framing relates to bounded rationality or the effect of our cognitive processes on the ways in which we process information (and influence how others process information):
In that context, you can see one meaning of framing: other actors portray information selectively to influence the ways in which we see the world, or which parts of the world capture our attention (here is a simple example of wind farms).
In policy theory, framing studies focus on ambiguity: there are many ways in which we can understand and define the same policy problem (note terms such as ‘problem definition’ and a ‘policy image’). Therefore, actors exercise power to draw attention to, and generate support for, one particular understanding at the expense of others. They do this with simple stories or the selective presentation of facts, often coupled with emotional appeals, to manipulate the ways in which we process information.
Think about the extent to which we take for granted certain ways to understand or frame issues. We don’t begin each new discussion with reference to ‘first principles’. Instead, we discuss issues with reference to:
(a) debates that have been won and may not seem worth revisiting (imagine, for example, the ways in which ‘socialist’ policies are treated in the US)
(b) other well-established ways to understand the world which, when they seem to dominate our ways of thinking, are often described as ‘hegemonic’ or with reference to paradigms.
In such cases, the timber frame metaphor serves two purposes:
(a) we can conclude that the frame is difficult but not impossible to change.
(b) if it is hidden by walls, we do not see it; we often take it for granted even though we should know it exists.
Framing the social, not physical, world
These metaphors can only take us so far, because the social world does not have such easily identifiable physical structures. Instead, when we frame issues, we don’t just choose where to look; we also influence how people describe what we are looking at. Or, ‘structural’ frames relate to regular patterns of behaviour or ways of thinking which are more difficult to identify than in a building. Consequently, we do not all describe structural constraints in the same way even though, ostensibly, we are looking at the same thing.
In this respect, for example, the well-known ‘Overton window’ is a sort-of helpful but also problematic concept, since it suggests that policymakers are bound to stay within the limits of what Kingdon calls the ‘national mood’. The public will only accept so much before it punishes you in events such as elections. Yet, of course, there is no such thing as the public mood. Rather, some actors (policymakers) make decisions with reference to their perception of such social constraints (how will the public react?) but they also know that they can influence how we interpret those constraints with reference to one or more proxies, including opinion polls, public consultations, media coverage, and direct action:
They might get it wrong, and suffer the consequences, but it still makes sense to say that they have a choice to interpret and adapt to such ‘structural’ constraints.
Framing, power and the role of ideas
We can bring these two ideas about framing together to suggest that some actors exercise power to reinforce dominant ways to think about the world. Power is not simply about visible conflicts in which one group with greater material resources wins and another loses. It also relates to agenda setting. First, actors may exercise power to reinforce social attitudes. If the weight of public opinion is against government action, maybe governments will not intervene. The classic example is poverty – if most people believe that it is caused by fecklessness, what is the role of government? In such cases, power and powerlessness may relate to the (in)ability of groups to persuade the public, media, and/or government that there is a reason to make policy; a problem to be solved. In other examples, the battle may be about the extent to which issues are private (with no legitimate role for government) or public (and open to legitimate government action), including: should governments intervene in disputes between businesses and workers? Should they intervene in disputes between husbands and wives? Should they try to stop people smoking in private or public places?
Second, policymakers can only pay attention to a tiny proportion of the issues for which they are responsible. So, actors exercise power to keep some issues on their agenda at the expense of others. Issues on the agenda are sometimes described as ‘safe’: more attention to these issues means less attention to the imbalances of power within society.