When Facts and Ideology Collide

From earliest childhood, I wanted to be a scientist. “Santa” brought me an inexpensive microscope when I was 7. I drove my parents crazy, dragging them over to the microscope to see a succession of samples that I had positioned under the 25x lens. Later in my childhood I had a chemistry set; I collected rocks (too many of which I neglected to take out of my pockets before my shorts went into the wash, to the great consternation of my mother); I pursued frogs and chipmunks with reckless abandon; I had a small telescope and almost froze in the Canadian winters, staring at a clear sky in subzero weather; I had an insect collection.

My mother, who had trained as a nurse, hoped my scientific curiosity would lead me to medicine, but my evolution as a scientist stalled out on insects. They were, then and now, absolutely fascinating to me. But in order to be a scientist, it’s not enough to be curious. Rather, it’s necessary to learn the scientific method: how to pose a hypothesis, how to design an experiment, and especially how to allow facts to dictate conclusions.

Modern science dates only from the 16th and 17th centuries, with the work of Copernicus and Galileo in astronomy, Vesalius in anatomy, Harvey in blood circulation, and Francis Bacon, who promulgated inductive reasoning and the use of experimentation. Prior to that time, natural philosophers (as they were then called) relied on deductive reasoning — Aristotle saw frogs emerging from the mud at the bottom of ponds in the spring and deduced that they arose from the mud itself, through spontaneous generation. Galen, a second-century Greek living in Rome, deduced that blood must be formed from food in the liver, and then sweated off as perspiration, rather than recirculated. That conclusion was accepted wisdom until the 17th century, when, with a few simple experiments and observations, Harvey demonstrated just how wrong Galen was — and just how powerful inductive reasoning can be in developing a coherent understanding of the facts.

But acceptance of the scientific method was by no means instantaneous. Many people are reluctant to give up their view of things on the basis of new or inconvenient facts. In Galileo’s case, the Catholic Church promptly weighed in, threatening him with trial for heresy unless he renounced his belief that the earth revolved around the sun and not the converse; indeed, a few early scientists were burned at the stake for heresy. Darwin was openly mocked for espousing the belief that species change over time, through the process now known as evolution. Gregor Mendel, the father of genetics, was simply ignored when he published his findings on the inheritance of certain characteristics of garden peas.

Even within the scientific community, it is not unusual to see reluctance to abandon accepted dogma despite overwhelming evidence to the contrary. Doctors in the 19th century routinely did not wash their hands after examining a patient or even after performing surgery. They were confident that microbes arose from diseased tissues rather than causing the disease in the first place. Ignaz Semmelweis, who showed in the 1840s that simple handwashing dramatically reduced deaths from childbed fever, was ridiculed by his colleagues. Death rates in hospitals eventually plummeted, but only after a long campaign to convince medical staff of the virtues of antiseptic surgery.

Early in the 20th century, a German scientist named Alfred Wegener, noting the way the coastline of South America mirrors the coastline of Africa, proposed that continents moved, over very long periods of time, in a process called continental drift. He was ridiculed and then ignored for decades — everyone knew that continents were permanently fixed in place — until the gradual accumulation of additional evidence during the 1950s and 1960s, notably the discovery of seafloor spreading, compelled scientists to accept Wegener’s theory. (Sadly, Wegener himself died in 1930, never having had the satisfaction of knowing his ideas would eventually prevail.)

The point is: The scientific method requires that accepted wisdom be abandoned in the face of contrary facts. Scientists must be willing to change their minds when confronted with compelling evidence. Dogma and ideology have no place in the world of science.

But dogma and ideology clearly have a place outside the world of science. Religion and science have long had an uneasy relationship, although friction between them is by no means inevitable. Many scientists are also deeply religious and have no problem separating beliefs based on faith from those based on facts. So it is ironic that, as mainstream religion and organized science continue to forge a growing détente, political ideology has entered the picture and recreated the divide. For example, the words “climate change” are now more likely to incite a riot than to start a debate.

This example, along with many others, is a source of great concern to those who had hoped we had reached a point where public policy generally results from a fact-based decision-making process. But we are reminded that dogma and ideology still lurk in the shadows, out of sight but in no way defeated, awaiting only the opportunity to reassert themselves, as now seems to be happening in our country with disconcerting regularity.

In the June 2018 issue of Scientific American (“Universities are Vital for Bridging the Science Gap”), Peter Salovey, the president of Yale University and an internationally renowned psychologist, called on colleges and universities to teach their students to think like scientists:

“The best way we can transcend ideology is to teach our students, regardless of their majors, to think like scientists.”

“Knowledge is power but only if individuals are able to analyze and compare information against their personal beliefs, are willing to champion data-driven decision making over ideology, and have access to a wealth of research findings to inform policy discussion and decisions.”

Well, that sounds perfectly reasonable. But is it the case that once facts become known, they will invariably triumph over ideology? Every day we see instances, even within the scientific community, of data being challenged precisely because they put long-standing beliefs in jeopardy. Why should we assume that the general public would respond differently were they to have a deeper understanding of how scientists think?

Ironically, a short article in the May 2018 edition of Scientific American (“People who Understand Evolution are More Likely to Accept It”) reports on a survey designed to see whether broader scientific knowledge is correlated with broader acceptance of a sometimes-contentious scientific theory, in this case Darwin’s theory of evolution.

The survey results are revelatory. On the one hand, acceptance of evolution as fact increases markedly as a function of broad knowledge of science, rising from about 25 percent among individuals with almost no scientific knowledge to as much as 100 percent among those with abundant scientific knowledge (as defined by correctly answering a battery of 25 questions about science).

On the other hand, the same survey showed an equally strong correlation between the acceptance of evolution as fact and religious and political beliefs. Although scientifically literate “less religious” or “very liberal” individuals were highly likely to accept evolution as fact, only about 50 percent of highly scientifically knowledgeable “more religious” or “very conservative” individuals did so.

So Dr. Salovey’s optimism that we can “transcend ideology” if colleges teach all students to think like scientists may be unrealistic. Equally disturbing is the fact that Americans seem to be drifting toward opposite poles in their thinking and beliefs, sharpening the political divide that today seems wider than ever between religious conservatives and liberal secular humanists.

The larger challenge for colleges and universities is to go beyond teaching all students to think like scientists: to encourage them to explore, in an even-handed manner, why people believe as they do, and to help them understand that people of good will can hold very different views on controversial matters and still be worthy of respect. Facts are not enough. Rationality and civility are just as important if we are to expect that data-driven decision making will eventually shape national public policies.

Why America’s Obsession with “the Best” Is Destroying Higher Education

Animal behaviorists and psychologists tell us that athletic competitions are symbolic warfare. In a simplistic parallel, the Boston Red Sox and the New York Yankees (substitute the names of your favorite teams as you wish) are today what Athens and Sparta were in ancient Greece, or Venice and Genoa were in the Renaissance, or any number of Central and South American city-states were in the years before colonization by Europeans. Sports teams are tribal identifiers, uniting their “fans” (allegedly an abbreviation of “fanatic”) into an “army.” Members of the “army” often display their support by wearing the team’s colors or portions of their uniforms, thereby providing easy visual confirmation of their partisanship.

On one level, identifying with a team is a harmless outlet for the need to belong to something or to identify with like-minded individuals on a particular subject, and most people are able to keep their passion for their team within socially acceptable bounds.

But perhaps because modern technology has made watching an athletic competition at a distance (via television or online streaming) so easy and available, sports have become ever more integrated into our lives. Indeed, based on television viewership, sporting events are now our preferred form of entertainment. It may well be the case that at no time in human history have sports competitions been as socially significant as they are today.

The problem is that the rules and objectives of sports are easily applied to other areas of social endeavor where they don’t belong. In sports, the point of competition, especially in current times, is not merely to compete; it is to win. We want to know which team is “the best,” and we create “playoffs” or “title matches” or “championships” to settle the question of who or what is “the best.” Huge numbers of people are glued to their various media outlets for major championships; enormous amounts of money are wagered on the outcomes; coverage of the big games is so ubiquitous that those not interested find it difficult to identify someone with whom to have a conversation on any other topic.

Regrettably, this obsession with “the best” is undermining higher education in three distinct ways.

First, colleges spend a great deal of money on their athletics programs, and that money is therefore unavailable to support academics or other more central functions of the college. It is not unusual to find a basketball or football coach who is the most highly compensated university employee, often by a factor of five or more. Yet it is rare that an athletics program generates enough revenue to cover its operating budget (let alone the capital costs of arenas and stadiums). In fact, fewer than two dozen universities make a profit on athletics, and it is common for a university in Division I to divert several tens of millions of dollars annually from its general operating budget into the athletic budget to make up the difference between revenue and expenses.

Why are universities so often besotted by their athletics programs?

The most common answer is to build school spirit, by which they mean to create a campus climate that will induce students to enroll and alumni to donate — but that only happens if they have winning teams. Of course, they can’t all be “the best” at the national level — by definition there is only one of those — but they can potentially win their conference, perhaps be selected for post-conference play, and (depending on the sport) even appear on national television. Or they can go broke trying to reach that level of distinction. We might wish that some of these schools would show similar commitment to enhancing the quality of their educational programs.

In sum, college athletics generally contributes significantly to the overall cost of running the university, and therefore adds disproportionately to the rapidly rising costs of tuition and fees, but with no demonstrable educational benefit to the students.

Second, college sports teams are now being used as proxies for the institutions themselves. “Our football team won the national championship”; ergo, we are “the best” in the classroom as well. That reasoning is completely ridiculous, but easily exemplified: the University of Alabama, long a powerhouse in football but not historically highly regarded for its academics (U.S. News & World Report currently ranks it the 110th best national university), has nonetheless parlayed its football success into an expansion of its enrollment. Over the past decade it has grown its undergraduate student body by nearly 13,000 students, almost all of them from out of state, for a total of well over 16,000 students who are not from Alabama. Even though it discounts these students’ tuition by a collective $100 million, the University of Alabama still nets more than twice as much tuition per student from out-of-state students as from in-state students — so this program has been a huge financial success.

The irony, of course, is that the success of the football program has absolutely nothing to do with the quality of the academic program. But our obsession with sports, and especially with winning, creates a visceral desire to be associated with “the best,” even if the reference is to a sports team and not the university itself.

Third, the ranking of sports teams has carried over to the ranking of institutions — not on the basis of wins and losses, obviously, but in an effort to determine “quality” (a synonym for “the best”). It’s easy enough to construct a method of deciding the best football or basketball team (although the methods themselves are often subject to intense criticism, especially if the method chosen appears to put your team at a disadvantage). But what are the metrics that properly determine institutional quality? Universities are not sports teams; they do many things and serve many purposes. How do you construct a fair and objective set of metrics by which to measure educational quality?

The quick answer is that you do not, because there is almost no agreement on what the primary role of college should be, let alone how to measure how well it is doing. If graduation rates are what counts, colleges will preferentially select the academically strongest high school seniors they can find, and that will mean an overwhelming number of students from relatively affluent, mostly college-educated families. First-generation and/or low-income students will be passed over and left behind — not a desirable outcome for a society that claims to believe in meritocracy and the opportunity for social mobility.

The same kind of unintended consequence will befall almost any criterion you might choose.

Accordingly, the default criterion becomes how famous the college is, which is often a surrogate for how much press coverage its sports teams generate, or how large an endowment it has — and the size of the endowment is generally related to the institution’s age (since endowments grow with the passage of time, the colleges that have been around the longest are advantaged). Most such schools are either flagship public universities or private colleges that have served generations of the American aristocracy, who have been both loyal and generous.

But this line of reasoning leads us to the huge problem we face today.

Too many prospective students believe that their long-term success hinges on whether they are admitted to “the best” college or university, although (frustratingly for these overachievers) the identity of the “best” university varies, depending on who or what is doing the ranking. “Best” can also vary from year to year. For example, over the past 20 years, U.S. News has annually named either Harvard or Princeton the best national university; the California Institute of Technology broke through to occupy that spot once, in 2000. While a given student may well have a preference, presumably not too many applicants would become despondent because they were admitted to Harvard but not Princeton (or vice versa).

Yet the fact that there is more than one “best” university (meaning that there are several institutions widely regarded as top universities) does not come close to satisfying the demand from well-qualified students seeking admission to “the best.” Lost in the stampede of prospective students seeking an instantly recognizable name to decorate their diplomas is the critical point that the real “best” campus is not necessarily the most famous one, but the one that best serves the particular student’s interests. Unfortunately, our collective obsession with “winners” and “the best” drives college choice completely in the wrong direction.

There is no easy solution to this dilemma. As long as prospective students (and their families) remain obsessed with gaining entry to one of the handful of very highly ranked colleges and universities, we will continue to read stories of hard-working and conscientious students being denied entry to the college of their choice, and of the heartbreak and sense of failure that results from that denial. We will read of the unfairness when a high-achieving, low-income student, without any resources or support system, misses out because a student from an affluent family, with ready access to tutors, private counselors and summer enrichment programs, was offered admission instead.

It’s like the problem a professional sports team based in a city of moderate size (and limited budget) has in competing with a big city team for the best players — and I use that particular analogy because it reinforces my contention that we have become used to thinking about so much of what happens in our society from the standpoint of how sports are structured. The best players want to play for the best teams, and the best teams are usually those with the biggest budgets. Top students want to attend top universities, and top universities are those with the most recognizable names.

The reality is that the prestige of the school has little to do with the long-term success of the individual student. A diploma from a famous school may help in getting a first job, but very quickly career success will depend on the attributes of the individual, not on the name of the school on the diploma. And many students feel out of place at so-called “best” schools, where everyone is an academic superstar and many are rich as well.

As long as we have 20 times more students trying to fit into a place like Harvard (and that’s the actual ratio of applicants to admits), we can be sure that the great majority of them will face disappointment — even despair. Inasmuch as Harvard will never choose to grow in order to meet the demand of rising numbers of applicants, it’s time to change the narrative. Parents and high school counselors need to assure high school seniors that they will not be seen as failures should they attend anything less than the “best” university. College admission is not the equivalent of winning the 100-meter dash. College admission is the start, not the end, of the race — and that start can happen at any college or university.

Take a deep breath, prospective freshmen. You are not failures at the age of 18 because you were not offered admission to an Ivy League school. Focus instead on enjoying the education you will receive at the college that accepted you.

In the long run, you’ll be just fine.

Students Fight to Free Prisoners

(My essay in the Sunday Providence Journal, June 3, 2018)

By now, many millennials and members of Generation Z are accustomed to being labeled as lazy, entitled, self-absorbed. But as we conclude another graduation season here in Rhode Island, I’m reminded that those generalizations overlook the idealism and drive, the passion and compassion of the young graduates I see crossing the stage to get diplomas.

Case in point: Roger Williams University students are taking on the hard and thankless task of challenging totalitarian governments by advocating for scholars who have been thrown in prison or otherwise silenced for daring to speak their minds.

For seven years now, students in an RWU advocacy seminar have been working in partnership with the nonprofit Scholars at Risk Network. And in the fall, RWU will assume a leadership role in New England, helping other colleges to undertake this unheralded research and advocacy work.

This is a far cry from the facile image of teenagers snapping selfies, seeking instant gratification. This is detailed, long-term organizing work that involves conducting research, gathering signatures on petitions, meeting with federal officials, launching social media campaigns and trying in any way possible to break through the cacophony of daily news to let the world know about little-known academics locked away in far-off lands.

This is also a far cry from the stereotype of ivory tower isolation. This is the essence of RWU’s purpose: To strengthen society through engaged teaching and learning.

For example, RWU students have been advocating for Ilham Tohti, an economics professor in Beijing who is serving a life sentence in China for speaking out against the religious and cultural persecution of the Uyghur people in the country’s northwestern region. He has been called “China’s Mandela.”

Tohti started writing about the tensions in that region in the 1990s, and he set up a Chinese-language website, Uyghur Online, to mediate differences between the minority Uyghurs and the majority Han Chinese.

In 2013, the police detained him at Beijing’s airport as he prepared to leave with his daughter, Jewher, to become a visiting scholar at Indiana University. And in 2014, he was arrested by Chinese authorities and sentenced to life in prison on charges of separatism, sparking worldwide condemnation.

RWU students collected signatures on petitions calling for Tohti’s release. They launched a #FreeTohti campaign on social media. They traveled to Washington, D.C., to speak with officials from the State Department. They met with key supporters, such as U.S. Senators Sheldon Whitehouse and Jack Reed, both Rhode Island Democrats. And they accompanied Tohti’s daughter as she testified before Congress.

RWU students have also advocated for Cuban journalist Normando Hernández González, who was eventually freed and who, while behind bars, was heartened to hear that advocates were calling for his release.

“When he was in jail in Cuba, he knew we were working on his behalf,” said Adam Braver, the RWU associate professor of creative writing and University Library program director who runs the advocacy seminar. “He told us it gave him hope in a way he hadn’t had before, and he got treated a little bit better because they know people are watching.”

By advocating for imprisoned scholars, students aren’t just learning the fundamental skills involved in research and advocacy. They aren’t just learning the intricacies and vagaries of international relations. They are learning to be citizens of the world.

This world needs people of good will who are committed to sustained action. We need people undeterred by the frustrating scarcity of easy solutions or quick results. We need people devoted to defending freedom of expression and fighting for freedom from oppression.

At a time when authoritarian rulers are clamping down on dissent, we need the next generation to shine a light in the darkest corners. In an era of rampant cynicism, we need a youthful burst of energy and optimism.

This is the generation the world needs now. And it gives me hope.