From earliest childhood, I wanted to be a scientist. “Santa” brought me an inexpensive microscope when I was 7. I drove my parents crazy, dragging them over to the microscope to see a succession of samples that I had positioned under the 25x lens. Later in my childhood I had a chemistry set; I collected rocks (too many of which I neglected to take out of my pockets before my shorts went into the wash, to the great consternation of my mother); I pursued frogs and chipmunks with reckless abandon; I had a small telescope and almost froze in the Canadian winters, staring at a clear sky in subzero weather; I had an insect collection.
My mother, who had trained as a nurse, hoped my scientific curiosity would lead me to medicine, but my evolution as a scientist stalled out on insects. They were, then and now, absolutely fascinating to me. But in order to be a scientist, it’s not enough to be curious. Rather, it’s necessary to learn the scientific method: how to pose a hypothesis, how to design an experiment, and especially how to allow facts to dictate conclusions.
Modern science dates only from the 16th and 17th centuries, with the work of Copernicus and Galileo in astronomy, Vesalius in anatomy, Harvey in blood circulation, and Francis Bacon, who promulgated inductive reasoning and the use of experimentation. Prior to that time, natural philosophers (as they were then called) relied on deductive reasoning: Aristotle saw frogs emerging from the mud at the bottom of ponds in the spring and deduced that they arose from the mud itself, through spontaneous generation. Galen, a second-century Greek living in Rome, deduced that blood must be formed from food in the liver and then sweated off as perspiration, rather than recirculated. That conclusion was accepted wisdom until the 17th century, when, with a few simple experiments and observations, Harvey demonstrated just how wrong Galen was, and just how powerful inductive reasoning can be in developing a coherent understanding of the facts.
But acceptance of the scientific method was by no means instantaneous. People are often reluctant to give up their view of things when confronted with new or inconvenient facts. In Galileo’s case, the Catholic Church promptly weighed in and threatened to excommunicate him if he did not renounce his belief that the earth revolved around the sun and not the converse; indeed, a few early scientists were burned at the stake for heresy. Darwin was openly mocked for espousing the belief that species could change over time, in a process now known as evolution. Gregor Mendel, the father of genetics, was simply ignored when he published his findings on the inheritance of certain characteristics of garden peas.
Even within the scientific community, it is not unusual to see reluctance to abandon accepted dogma despite overwhelming evidence to the contrary. Doctors in the 19th century routinely did not wash their hands between examining patients or even before performing surgery. They were confident that microbes arose from diseased tissues rather than causing the disease in the first place. Death rates in hospitals eventually plummeted, but only after a long campaign to convince medical staff of the virtues of antiseptic surgery.
Early in the 20th century, a German scientist named Alfred Wegener, noting the way the coastline of South America mirrors the coastline of Africa, proposed that continents moved, over very long periods of time, in a process called continental drift. He was ridiculed and then ignored for decades — everyone knew that continents were permanently fixed in place — until the gradual accumulation of additional evidence during the 1950s and 1960s compelled scientists to accept Wegener’s theory. (Sadly, Wegener himself died in 1930, never having had the satisfaction of knowing his ideas would eventually prevail.)
The point is: The scientific method requires that accepted wisdom be abandoned in the face of contrary facts. Scientists must be willing to change their minds when confronted with compelling evidence. Dogma and ideology have no place in the world of science.
But dogma and ideology clearly have a place outside the world of science. Religion and science have long had an uneasy relationship, although friction between them is by no means inevitable. Many scientists are also deeply religious and have no problem separating beliefs based on faith from those based on facts. So it is ironic that, as mainstream religion and organized science continue to forge a growing détente, political ideology has entered the picture and recreated the divide. For example, the words “climate change” are now more likely to incite a riot than to start a debate.
This example, along with many others, is a source of great concern to those who had hoped we had reached a point where public policy generally results from fact-based decision-making. But we are reminded that dogma and ideology still lurk in the shadows, out of sight but in no way defeated, awaiting only the opportunity to reassert themselves, as now seems to be happening in our country with disconcerting regularity.
In the June 2018 issue of Scientific American (“Universities are Vital for Bridging the Science Gap”), Peter Salovey, the president of Yale University and an internationally renowned psychologist, called on colleges and universities to teach their students to think like scientists:
“The best way we can transcend ideology is to teach our students, regardless of their majors, to think like scientists.”
“Knowledge is power but only if individuals are able to analyze and compare information against their personal beliefs, are willing to champion data-driven decision making over ideology, and have access to a wealth of research findings to inform policy discussion and decisions.”
Well, that sounds perfectly reasonable. But is it the case that once facts become known, they will invariably triumph over ideology? Every day we see instances, even within the scientific community, where data are challenged because they put long-standing beliefs in jeopardy. Why should we assume that the general public would respond differently were they to have a deeper understanding of how scientists think?
Ironically, in the May 2018 edition of Scientific American (“People who Understand Evolution are More Likely to Accept It”), a short article reports on a survey designed to see whether broader scientific knowledge is correlated with broader acceptance of a sometimes-contentious scientific theory, in this case Darwin’s theory of evolution.
The survey results are revelatory. On the one hand, acceptance of evolution as fact increases markedly as a function of broad knowledge of science, rising from about 25 percent among individuals with almost no scientific knowledge to as much as 100 percent among those with abundant scientific knowledge (as defined by correctly answering a battery of 25 questions about science).
On the other hand, the same survey showed an equally strong correlation between the acceptance of evolution as fact and religious and political beliefs. Although scientifically literate “less religious” or “very liberal” individuals were highly likely to accept evolution as fact, only about 50 percent of highly scientifically knowledgeable “more religious” or “very conservative” individuals did so.
So Dr. Salovey’s optimism that, if colleges teach all students to think like scientists, we can “transcend ideology” may be unrealistic. Equally disturbing is the fact that Americans seem to be drifting to opposite poles in their thinking and beliefs, sharpening the political divide between religious conservatives and liberal secular humanists, which today seems wider than ever.
The larger challenge for colleges and universities is to go beyond teaching all students to think like scientists: to encourage them to explore, in an even-handed manner, why people believe as they do, and to help them understand that people of good will can have very different views on controversial matters and still be worthy of respect. Facts are not enough. Rationality and civility are just as important if we are to expect that data-driven decision-making will eventually shape national public policies.