People Don’t Trust Science When It Becomes Politics

by The Center for Bioethics and Culture on May 5, 2011

By Wesley J. Smith, J.D., Special Consultant to the CBC

Scientific American has a piece out trying to explain why people “don’t trust scientists.” I think the very topic illustrates part of the problem. Hubris. Those in the Politicized Science community — and Scientific American is part of that crowd — intentionally conflate science the technique with scientism, a liberal ideology.

So, how does Daniel T. Willingham answer the question, “Why so many people choose not to believe what scientists say?” From the article:

On public policy issues, Americans believe that science leaders are more knowledgeable and impartial than leaders in other sectors of society, such as business or government. Why do people say that they trust scientists in general but part company with them on specific issues?

I think that’s because people differentiate properly between what are sometimes called bench scientists and the politicized “science” advocates who too often seek to harness our general support for science as the horses to pull their own political and ideological agenda carts. We’ve seen that repeatedly in the global warming debate, the embryonic stem cell issue, environmental controversies, and so on. These advocates also seek to conflate a scientific finding (e.g., the earth has warmed in the last century) with their desired political “solution” (e.g., global warming hysteria).

He sniffs that some of us are fooled into not believing science because different studies sometimes reach different conclusions, noting that science often reaches new conclusions based on updated knowledge. There is some truth here. On the other hand, the idea that a “scientific study” somehow settles things is ludicrous given the number of conflicting studies whose differences stem from methodology rather than from newly gained knowledge. Moreover, people understand that scientific studies are like the Bible: you can get them to say almost anything you want. In other words, too often the answer that is wanted seems to precede the actual study, rather like the Warren Commission conclusions.

Willingham next says we need to teach better how old ways of looking at things — by which he means religion — have interfered with science:

Asking science teachers to impart enough content to understand all the issues may be unrealistic, but they might be able to improve people’s appreciation for the accuracy of scientific knowledge. Through the study of the history of science, students might gain an understanding both of their own motivations for belief and of science as a method of knowing. If a student understands how a medieval worldview could have made a geocentric theory of the solar system seem correct, it is a short step to seeing similar influences in oneself . . . Science may not be the only way of organizing and understanding our experience, but for accuracy it fares better than religion, politics and art. That’s the lesson.

No, the lesson is how “the scientists” too often overreach. Science can, and indeed does, tell us what is. But it can’t tell us what ought to be or not be. It can’t tell us what is right and what is wrong. Too often politicized scientists expand science’s purview from the objective to the subjective. That is when matters go off the track.

And then there is the oft-stated conceit that people who don’t believe “the scientists” are fooled by advocates being paid to have heterodox opinions:

In reconciling our rational and irrational motives for belief, we have become good at kidding ourselves. Because we want to see ourselves as rational beings, we find reasons to maintain that our beliefs are accurate. One or two contrarians are sufficient to convince us that the science is “controversial” or “unsettled.” If people knew that other motives might compromise the accuracy of their beliefs, most would probably try to be on their guard.

Please. In-crowd advocacy science reminds me of a high school clique. The idea that the global warming hysteria (GWH) community and the promoters of embryonic stem cell research (ESCR) and human cloning are “objective,” as opposed to their opponents, is ludicrous on its face. There is too much water under that particular bridge to support its infrastructure.

Willingham uses the parents of autistic children who continue to hold onto the clear falsehood that vaccines caused their kids’ disease as his prime example of public distrust. But one need not be a scientist to understand that phenomenon: We have created a culture in which “someone” is always to “blame” when bad things happen to good people. And the media ballyhoo these intensely emotional narratives. In a parent’s intense pain, it can be hard to accept that sometimes, it isn’t anybody’s fault.

Finally, we should take note of the divisive “war” that some in the Politicized Science community have declared against religion, and the decidedly vocal disrespect these advocates (not most scientists) voice about faith in general and the faithful in particular. People don’t want to follow snobs.

So, let’s distinguish between science the method, which is indeed trustworthy (although certainly not infallible), and “SCIENCE,” which is often political advocacy masquerading as science. Unfortunately, the latter is a corruption of the former and undermines the very enterprise it claims to vigorously support.
