Principles of science advice to government: key problems and feasible solutions

Q: can we design principles of science advice to government to be universal, exhaustive, coherent, clearly defined, and memorable?

If not, we need to choose between these requirements. So, who should get to choose and what should their criteria be?

I provide six scenarios to help us make clear choices between trade-offs. Please enjoy the irony of a 2000-word post calling for a small number of memorable heuristics.

[Image: World Science Forum – the need for principles]

In 2015, the World Science Forum declared the value of scientific advice to government and called for a set of principles to underpin the conduct of people giving that advice, based on principles including transparency, visibility, responsibility, integrity, independence, and accountability. INGSA is taking this recommendation forward: initial discussions, led by Peter Gluckman, James Wilsdon and Daniel Sarewitz, built on many existing documents outlining those principles, and were followed by consultation and key contributions from people like Heather Douglas and Marc Saner. Here is Marc Saner summing up the pre-conference workshop, and David Mair inviting us to reflect on our aims:

[Video: Marc Saner]

I outline some of those points of tension in this huge table, paraphrasing three days of discussion before and during INGSA’s Science and Policymaking conference in September 2016.

[Table 1: points of tension in principles of science advice]

Here is Dan Sarewitz inviting scientists to reject a caricature of science and the idea that scientists can solve problems simply by producing evidence:

[Video: Dan Sarewitz]

[Table 1b: points of tension, continued]

Note: the links in the table don't work! Here they are: frame, honest brokers, new and diverse generation.

Two solutions: a mega-document or a small set of heuristics

One solution to this problem is a mega-document incorporating all of the points of all key players. The benefit is that we can present it as a policy solution in the knowledge that (a) very few people will read the document, (b) anyone will be able to find their points in it, and (c) it will be too long and complicated for many people to identify serious contradictions between different beliefs about the supply of, and demand for, science advice. It would literally have weight (if you printed it out) but would not be used as a common guide for scientists and government audiences across the globe, except perhaps as a legitimising document (‘I adhered to the principles’).

Another solution is to produce a super-short document, built on a rationale that should be familiar to anyone giving science advice to policymakers: tell people only the information you expect them to remember, from a brief conversation in the elevator or a half-page document. In other words, the world is complex, but we need to simplify it to allow us to act or, at least, to get the attention of our audience. We tell this to scientists advising government – keep it brief and accessible to encourage simple but effective heuristics – but the same point applies to scientists themselves. They may have huge brains, but they also make decisions based on ‘rational’ and ‘irrational’ shortcuts to information. So, giving them a small set of simple and memorable rules will trump a long and worthy but forgettable document.

Producing heuristics for science advice is a political exercise

This is no mean feat because the science community will inevitably produce a large number of different and often-contradictory recommendations for science advice principles. Turning them into a small set of rules is an exercise of power to decide which interpretation of the rules counts and whose experiences they most reflect.

Many scientists would like to think that we can produce a solution to this problem by gathering evidence and seeking consensus, but life is not that simple: we have different values, understandings of the world, priorities and incentives, and there comes a point when you have to make choices which produce winners and losers. So, let’s look at a few options and you can tell me which one you’d choose (or suggest your own in the comments section).

To be honest, I’m finding it difficult to know which principle links to which practices, and whether some principles are synonymous enough to lump together. Indeed, in options 3 and 4 the authors have modified the original principles listed by the WSF (responsibility, integrity, independence, accountability, transparency, visibility). Note how much easier it is to remember option 1 which, I think, is the most naïve and least useful option. As in life, the more nuanced accounts are harder to explain and remember.

Option 1: the neutral scientist

  • Demonstrate independence by collaborating only with scientists
  • Demonstrate transparency and visibility by publishing all your data and showing your calculations
  • Demonstrate integrity by limiting your role to evidence and ‘speaking truth to power’
  • Demonstrate responsibility and accountability through peer review and other professional mechanisms for quality control

Option 2: the ‘honest broker’

  • Demonstrate independence by working with policymakers only when you demarcate your role
  • Demonstrate transparency and visibility by publishing your data and declaring your involvement in science advice
  • Demonstrate integrity by limiting your role to evidence – influencing the search for the right question or providing evidence-based options, not explicit policy advice
  • Demonstrate responsibility and accountability through peer review and other professional mechanisms for quality control

Option 3: my interpretation of the Wilsdon and Sarewitz opening gambit

  • Demonstrate independence by communicating while free of political influence, declaring your interests, and making your role clear
  • Demonstrate transparency and visibility by publishing your evidence as quickly and fully as possible
  • Demonstrate integrity by limiting your role to that of the intellectually free ‘honest broker’, while respecting the limits to your advice in a democratic system
  • Demonstrate diversity by working across scientific disciplines and using knowledge from ‘civil society’
  • Demonstrate responsibility and accountability through mechanisms including peer review and other professional means for quality control, public dialogue, and the development of clear lines of institutional accountability.

Option 4: my interpretation of the Heather Douglas modification

  • Demonstrate integrity by having proper respect for inquiry (including open-mindedness about the results, and reflection on how one’s values influence interpretation)
  • Take responsibility for the production of advice which is scientifically accurate, explained clearly and transparently (with openness about the values underpinning judgements), and responsive to societal concerns
  • Demonstrate accountability to the expert community by encouraging other scientists to ‘call out’ misjudgements, and to advisees by encouraging them to probe the values underpinning science advice.
  • Demonstrate independence by rejecting illegitimate political interference (e.g. expert selection, too-specific problem definition, dictating or altering scientific results)
  • Demonstrate legitimacy by upholding these principles and complementary values (such as to encourage diversity of participation)

Here is Heather Douglas explaining the links between the principles at the pre-conference workshop (and here is her blog post):

[Video: Heather Douglas]

Option 5: the responsible or hesitant advocate

  • Demonstrate independence by working closely with policymakers but establishing the boundaries between science advice and collective action
  • Demonstrate transparency and visibility by publishing your data, declaring your involvement in science advice, and declaring the substance of your advice
  • Demonstrate integrity by making a sincere attempt to link policy advice to the best available evidence tailored to the legitimate agenda of elected policymakers
  • Demonstrate responsibility and accountability through professional mechanisms for quality control and institutional mechanisms to record the extent of your involvement in policy decisions

Option 6: the openly political advocate for evidence-based policy

  • Demonstrate independence by establishing an analytically distinct role for ‘scientific thinking’ within collective political choice
  • Demonstrate transparency and visibility by publishing relevant scientific data, and declaring the extent of your involvement in policy advice
  • Demonstrate integrity by making a sincere attempt to link policy advice to the best available evidence tailored to the legitimate agenda of elected policymakers
  • Demonstrate responsibility and accountability through professional mechanisms for quality control and institutional mechanisms to reward or punish your judgement calls
  • Demonstrate the effectiveness of evidence-based policy by establishing a privileged position in government for mechanisms obliging policymakers to gather and consider scientific evidence routinely in decisions
  • Select any legitimate strategy necessary – telling stories, framing problems, being entrepreneurial when proposing solutions, leading and brokering discussions – to ensure that policies are based on scientific evidence.

See also:

Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology

The article distinguishes between (for example) a purist position on systematic reviews, privileging scientific criteria, and a pragmatic position on rapid reviews, privileging the needs of policymakers.

The INGSA website: http://www.ingsa.org/

‘The Brussels Declaration’

Comments

Science At The Local:

    Thanks for the write-up. It’s good to see people talking and writing about this.

    I agree that big documents are unlikely to be read, but I’m still sympathetic to the inclination to write everything down and have it somewhere for people to look up as needed.

    I’m not sure that rules or guidance are the most important thing right now. What we need is to build a movement – which seems to be happening – of interested and empowered scientists, bureaucrats and politicians, to talk about and act on these issues.

    We need – dare I say it – to experiment. To set up different systems, different options as you say above, and to pay close attention to their effectiveness, and to share stories about it. These should originate from the needs of the jurisdiction and its unique local situation.

    In this sense I would say the greatest need is to develop a culture of experimentation and honest reflection amongst bureaucrats. These qualities are far from universal in the public sector (or society) in my experience. There ain’t no Department of Failure Celebration.

    Cheers
