
Stop Treating Cognitive Science Like a Disease


Below is a guest post by Professor Steven Sloman, responding to Professor Daniel Sarewitz’s post in the Guardian called Stop treating science denial like a disease. At the end is Dan Sarewitz’s reply. If you are wondering why this debate is playing out on my website, there is a connection of sorts: (a) the work of the European Commission’s JRC, with Sloman speaking at its annual conference EU4Facts, and (b) the work of INGSA on government-science advice, in which Sarewitz plays a key role.

Modern science has its problems. As reviewed in a recent editorial by Daniel Sarewitz, many branches of science have been suffering from a replication crisis. Scientists are under tremendous pressure to publish, and cutting scientific corners has, in some fields, become normal. This, he thinks, justifies a kind of science denialism, one that recognizes that not every word expressed by a scientist should be taken on faith.

Sarewitz is right on a couple of counts: Not every branch of science has equal authority. And in many areas, too much of too little value is being published. Some of it does not pass even weak tests of scientific care and rigor. But his wild claim in favor of denialism is bluster: Science is making faster progress today than at any time in history.

Sarewitz’s intended victim in his piece is cognitive science. He argues that cognitive science appeals to a deficit model (my term) to explain science denialism. People are ignorant, in Sarewitz’s parody of cognitive science, and therefore they fail to understand science. If only they were smarter, or taught the truth about science, they wouldn’t deny it, but rather use it as a guide to truth, justice, and all things good.

This was a position in cognitive science, especially the cognitive science of the ’70s and ’80s. But even cognitive science makes progress, and today it is a minority view. What does modern cognitive science actually suggest about science denial? The answer is detailed in our book The Knowledge Illusion, the book Sarewitz takes issue with. He would have done well to read it before reviewing it, because what we say is diametrically opposed to his report, and largely consistent with his view, though a whole lot more nuanced.

The deficit model applies to one form of reasoning, what we call intuition. The human brain generates beliefs based on naïve causal models about how the world works. These are often sketchy and flawed (consider racists’ understanding of people of other races). Individuals are quite ignorant about how the world works, not because people are stupid, but because the world is so complex. The chaotic, uncertain nature of the universe means that everything we encounter is a tangle of enormous numbers of elements and interactions, far more than any individual could comprehend, never mind retain. As we show in our book, even the lowly ballpoint pen represents untold complexity. The source of ignorance is not so much about the biology of the individual; it’s about the complexity of the world that the individual lives in.

Despite their ignorance, humans have accomplished amazing things, from creating symphonies to laptops. How? In large part by relying on a second form of human reasoning, deliberation. Deliberation is not constrained wholly by biology because it extends beyond the individual. Deliberative thought uses the body to remember for us and even to compute. That’s why emotions are critical for good decision making and why children use their fingers to count. Thinking also uses the world. We compute whether it’s safe to cross the street by looking to see if a car is coming, and we use the presence of dirty dishes on the counter to tell us whether the dishes need doing.

But more than anything, deliberation uses other people. Whether we’re getting our dishwasher fixed, our spiritual lives developed, or our political opinions formed, we are guided by those we deem experts and those we respect in our communities. To a large extent, people are not the rational processors of information that some enlightenment philosophers dreamed about; we are shills for our communities.

The positive side of this is that people are built to collaborate; we are social entities in the most fundamental way, as thinkers. The negative side is that we can subscribe to ideologies that are perpetuated to pursue the self-interest of community leaders, ideologies that have no rational basis. Indeed, the most fervent adherents of a view tend to know the least about it. Fortunately, we have found (not just assumed, as Sarewitz says) that when people are asked to explain the consequences of the policies they adhere to, they become less extreme as they discover they don’t really understand them.

Scientists live in communities too, and science is certainly vulnerable to these same social forces. That’s why the scientific method was developed: to put ideas to the test and let the cream rise to the top. This takes time, but because science ultimately answers to the truth inherent in nature, human foibles and peer review can only steer it off course temporarily.

Cognitive science has historically bought into the deficit model, treating failures of science literacy as a kind of disease. But Sarewitz should practice the care and rigor that he preaches by reporting correctly: Cognitive science, like many forms of science, is slowly getting it right.

Reply by Dan Sarewitz

Normally I don’t respond to this kind of thing, but a couple of points demand rebuttal.

First: I actually did read their book, cover-to-cover. Neither the Guardian piece nor the longer talk from which it draws is a book review; they are critiques of the larger intellectual program within which The Knowledge Illusion positions itself.

Second, the idea that deliberative and collaborative activities are powerful sources of human creativity that overcome the cognitive limits of the individual is an entirely familiar one that has been well recognized for centuries.  As Professor Sloman indicates, it occupies much of his book, and much of his comment above.  But it was not relevant to my concerns, which were with how Sloman and co-author Fernbach position human cognitive limits as a source of so much difficulty in today’s world.  They write:  “Because we confuse the knowledge in our heads with the knowledge we have access to, we are largely unaware of how little we understand.  We live with the belief that we understand more than we do.  As we will explore in the rest of the book, many of society’s most pressing problems stem from this illusion.” [p. 129, my italics]  They wrote this, I didn’t.

Third, having read their book carefully, I am indeed well aware that Sloman and Fernbach understand the limits of the deficit model.  But as they make clear in a subsection of chapter 8, entitled “Filling the Deficit,” they still believe that IF ONLY people understood more about science, then “many of society’s most pressing problems” could be more effectively addressed:  “And the knowledge illusion means that we don’t check our understanding often or deeply enough.  This is a recipe for antiscientific thinking.” [p. 169] A page later they write: “[P]erhaps it is too soon to give up on the deficit model.”

This is where my posting sought to engage with Professor Sloman’s book.  I don’t think that people’s understanding of scientific knowledge has much at all to do with “many of society’s most pressing problems,” for reasons that I point toward in the Guardian piece, and have written extensively about in many other forums.  Professor Sloman may not agree with this position, but his comments above fail to indicate that he actually recognizes or understands it.

Finally, Professor Sloman writes, both tendentiously and with an apparently tone-deaf ear, that my “wild claim in favor of denialism is bluster: Science is making faster progress today than at any time in history.”  The progress of science is irrelevant to my argument, which addresses the intersection of politics and a scientific enterprise that is always pushing into the realm of the uncertain, the unknown, the unknowable, the contestable, the contingent—even as it also, sometimes and in some directions, makes magnificent progress.

Perhaps there is a valuable discussion to be had about whether poor understanding of science by the public is relevant to “many of society’s most pressing problems.”  My view is that this is an overblown, distracting, and to some extent dangerous belief on the part of some scientists, as I indicate in the Guardian post and in many other writings.  Professor Sloman may disagree, but his complaints here are about something else entirely, something that I didn’t write.
