
Principles of science advice to government: key problems and feasible solutions

Q: Can we design principles of science advice to government that are universal, exhaustive, coherent, clearly defined, and memorable?

If not, we need to choose between these requirements. So, who should get to choose and what should their criteria be?

I provide six scenarios to help us make clear choices between trade-offs. Please enjoy the irony of a 2000-word post calling for a small number of memorable heuristics.


In 2015, the World Science Forum declared the value of scientific advice to government and called for a set of principles to underpin the conduct of people giving that advice, based on principles including transparency, visibility, responsibility, integrity, independence, and accountability. INGSA is taking this recommendation forward: initial discussions were led by Peter Gluckman, James Wilsdon and Daniel Sarewitz, built on many existing documents outlining those principles, and followed by consultation and key contributions from people like Heather Douglas and Marc Saner. Here is Marc Saner summing up the pre-conference workshop, and David Mair inviting us to reflect on our aims:


I outline some of those points of tension in this huge table, paraphrasing three days of discussion before and during INGSA’s Science and Policymaking conference in September 2016.

[Table 1]

Here is Dan Sarewitz inviting scientists to reject a caricature of science and the idea that scientists can solve problems simply by producing evidence.

[Table 1, continued]

Note: the links in the table don’t work! Here they are: frame, honest brokers, new and diverse generation

Two solutions: a mega-document or a small set of heuristics

One solution to this problem is a mega-document incorporating the points of all key players. The benefit is that we can present it as a policy solution in the knowledge that (a) very few people will read the document, (b) anyone will be able to find their points in it, and (c) it will be too long and complicated for many people to identify serious contradictions between different beliefs about how to supply and demand science advice. It would literally have weight (if you printed it out) but would not be used as a common guide for scientists and government audiences across the globe, except perhaps as a legitimising document (‘I adhered to the principles’).

Another solution is to produce a super-short document, built on a rationale that should be familiar to anyone giving science advice to policymakers: tell people only the information you expect them to remember, from a brief conversation in the elevator or a half-page document. In other words, the world is complex but we need to simplify it to allow us to act or, at least, to get the attention of our audience. We tell this to scientists advising government – keep it brief and accessible to encourage simple but effective heuristics – but the same point applies to scientists themselves. They may have huge brains, but they also make decisions based on ‘rational’ and ‘irrational’ shortcuts to information. So, giving them a small set of simple and memorable rules will trump a long and worthy but forgettable document.

Producing heuristics for science advice is a political exercise

This is no mean feat because the science community will inevitably produce a large number of different and often-contradictory recommendations for science advice principles. Turning them into a small set of rules is an exercise of power to decide which interpretation of the rules counts and whose experiences they most reflect.

Many scientists would like to think that we can produce a solution to this problem by gathering evidence and seeking consensus, but life is not that simple: we have different values, understandings of the world, priorities and incentives, and there comes a point when you have to make choices which produce winners and losers. So, let’s look at a few options and you can tell me which one you’d choose (or suggest your own in the comments section).

To be honest, I’m finding it difficult to know which principle links to which practices, and whether some principles are synonymous enough to lump together. Indeed, in options 3 and 4 the authors have modified the original principles listed by the WSF (responsibility, integrity, independence, accountability, transparency, and visibility). Note how much easier it is to remember option 1 which, I think, is the most naïve and least useful option. As in life, the more nuanced accounts are harder to explain and remember.

Option 1: the neutral scientist

  • Demonstrate independence by collaborating only with scientists
  • Demonstrate transparency and visibility by publishing all your data and showing your calculations
  • Demonstrate integrity by limiting your role to evidence and ‘speaking truth to power’
  • Demonstrate responsibility and accountability through peer review and other professional mechanisms for quality control

Option 2: the ‘honest broker’

  • Demonstrate independence by working with policymakers only when you demarcate your role
  • Demonstrate transparency and visibility by publishing your data and declaring your involvement in science advice
  • Demonstrate integrity by limiting your role to evidence and influencing the search for the right question or providing evidence-based options, not explicit policy advice
  • Demonstrate responsibility and accountability through peer review and other professional mechanisms for quality control

Option 3: my interpretation of the Wilsdon and Sarewitz opening gambit

  • Demonstrate independence by communicating while free of political influence, declaring your interests, and making your role clear
  • Demonstrate transparency and visibility by publishing your evidence as quickly and fully as possible
  • Demonstrate integrity by limiting your role to the role of intellectually free ‘honest broker’, while respecting the limits to your advice in a democratic system
  • Demonstrate diversity by working across scientific disciplines and using knowledge from ‘civil society’
  • Demonstrate responsibility and accountability through mechanisms including peer review and other professional means for quality control, public dialogue, and the development of clear lines of institutional accountability.

Option 4: my interpretation of the Heather Douglas modification

  • Demonstrate integrity by having proper respect for inquiry (including open-mindedness about the results, and reflection on how one’s values influence interpretation)
  • Take responsibility for the production of advice which is scientifically accurate, explained well and in a transparent way (clearly, and with an openness about the values underpinning judgements), and responsive to societal concerns
  • Demonstrate accountability to the expert community by encouraging other scientists to ‘call you out’ for misjudgements, and to advisees by encouraging them to probe the values underpinning science advice.
  • Demonstrate independence by rejecting illegitimate political interference (e.g. expert selection, too-specific problem definition, dictating or altering scientific results)
  • Demonstrate legitimacy by upholding these principles and complementary values (such as to encourage diversity of participation)

Here is Heather Douglas explaining the links between each principle at the pre-conference workshop (and here is her blog post):


Option 5: the responsible or hesitant advocate

  • Demonstrate independence by working closely with policymakers but establishing the boundaries between science advice and collective action
  • Demonstrate transparency and visibility by publishing your data, declaring your involvement in science advice, and declaring the substance of your advice
  • Demonstrate integrity by making a sincere attempt to link policy advice to the best available evidence tailored to the legitimate agenda of elected policymakers
  • Demonstrate responsibility and accountability through professional mechanisms for quality control and institutional mechanisms to record the extent of your involvement in policy decisions

Option 6: the openly political advocate for evidence-based policy

  • Demonstrate independence by establishing an analytically distinct role for ‘scientific thinking’ within collective political choice
  • Demonstrate transparency and visibility by publishing relevant scientific data, and declaring the extent of your involvement in policy advice
  • Demonstrate integrity by making a sincere attempt to link policy advice to the best available evidence tailored to the legitimate agenda of elected policymakers
  • Demonstrate responsibility and accountability through professional mechanisms for quality control and institutional mechanisms to reward or punish your judgement calls
  • Demonstrate the effectiveness of evidence-based policy by establishing a privileged position in government for mechanisms obliging policymakers to gather and consider scientific evidence routinely in decisions
  • Select any legitimate strategy necessary – telling stories, framing problems, being entrepreneurial when proposing solutions, leading and brokering discussions – to ensure that policies are based on scientific evidence.

See also:

Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology

The article distinguishes between (for example) a purist position on systematic reviews, privileging scientific criteria, and a pragmatic position on rapid reviews, privileging the needs of policymakers.


The INGSA website http://www.ingsa.org/

‘The Brussels Declaration’



What sciences count in government science advice?

One theme of the Science and Policymaking conference (#EUINGSA16) is interdisciplinarity. Most people are calling for joint work to help inform major policy problems, with some criticising a tendency to forget the social sciences and, in particular, the humanities.

The same can be said for the study of science advice to government. Most scientific contributions to the discussion are from people with a ‘hard science’ background describing their personal experiences, without much discussion of the evidence on science advice in policy settings provided by the ‘softer’ disciplines. This is where many of those forgotten disciplines come in, to answer four key questions:

  1. What makes people like policymakers tick?

The obvious discipline is psychology, to understand the links between ‘rational’ and ‘irrational’ policymaking. The other is education, to help explain how adults learn (which is, I think, what scientists expect of politicians).

  2. What messages work?

In this case, we have established disciplines, such as the study of communication, and the ‘science of stories’ in political science, as well as multi-disciplinary approaches to ‘science diplomacy’.

  3. How can we make the process work for us?

We can use psychological insights to identify how to influence policymakers: exploiting ‘fluency’ (people pay attention to things with which they are already familiar) and manipulating people’s cognitive biases to get what we want.

  4. Should we make the process work for us?

We can draw on philosophy to help us decide how far we should go to get what we want. We can also draw on anthropology to help us work out why we are so uncomfortable when talking about crossing the line from impartial adviser to policy actor.

By lucky chance, there is a special issue of articles drawing on these insights (and more) to identify how to ‘maximise the use of evidence in policy’.

See also: The Politics of Evidence-Based Policymaking

The Politics of Evidence-Based Policymaking: 3 messages

