In my four previous posts to this blog, I discussed a series of expectations, concerns and remedies that politicians, parents and the media have for higher education (“Now Everyone Has a Solution for Higher Education,” The Chronicle of Higher Education, Nov. 29, 2013). Taken collectively, this list contains items that are often unrealistic and, at times, contradictory.
Well, that’s easy for me to say. As a university president, I might be expected to be an apologist for the status quo in higher education. But this is an important issue to get right: which aspects of our current economic dilemma properly lie at the feet of higher education, and which are someone else’s responsibility? It does no one any good for society to create expectations of higher education that higher education has neither the capacity nor the intention to meet.
How did we get to this point? Let’s start at the beginning. We can trace the history of higher education back at least as far as Plato, and the fields of study that came to be known as the trivium (grammar, logic and rhetoric) and the quadrivium (arithmetic, geometry, music and astronomy). Collectively, these represented the seven liberal arts and were separate from (although sometimes prerequisite for) fields such as law, medicine and theology. This model persisted in Europe through the Middle Ages and the Renaissance, and well into the 19th century.
In America, by the time of the Revolutionary War, nine degree-granting colleges had been established, all built on the medieval European model, all associated with specific religious denominations, and all serving relatively wealthy white men who were primarily preparing to become members of the clergy.
Several things happened during the latter half of the 19th century that profoundly affected the direction and future of American higher education. First, the Morrill Act of 1862 established state-run land-grant colleges focused in part on agricultural and engineering education – a very different model from the traditional liberal arts. Second, in 1869, Charles W. Eliot became president of Harvard. Eliot introduced the elective system and brought into practice the Jeffersonian ideal of a meritocracy based on talent and competition. Third, in 1876, Johns Hopkins University was established on the German model, devoted not just to the study of the past (teaching) but also to the advancement of knowledge (research). Many other universities promptly adopted this approach, and it remains the most common model among American universities today.
Specialized institutions with focused missions also came into being. Schools for educating women (1742) were established, as were schools for freed slaves (1837). Normal schools (1839) concentrated on teacher preparation; polytechnics, such as MIT (1861), on industrial science and technology; and junior colleges (1901) on giving students the opportunity to complete the first two years of an undergraduate degree in their home communities.
By the beginning of the 20th century, employers had begun preferentially hiring college and university graduates, an outcome that led to a doubling of college attendance between 1920 and 1930.
In 1944, the GI Bill gave almost 4.5 million returning veterans the opportunity to attend college for up to three years at no cost. A college education became the ladder to the middle class, as it remains to this day.
So where are we? This brief history demonstrates that higher education has not been a static industry. To the contrary, it has expanded or reinvented itself repeatedly, as times and circumstances dictated. It is no longer possible to become a doctor or a lawyer by apprenticing oneself to a practitioner. It is not even possible to attend medical or law school without an undergraduate degree. Licensing organizations, in areas as diverse as accounting, architecture, engineering, law and medicine, predicate the awarding of certification in part on an earned degree in those professions. Corporations hire new employees based, at least in part, not just on the possession of a degree, but on the possession of a particular degree.
Possession of a university degree has increasingly become the essential ticket for entry into the world of middle-class jobs.
In that sense, higher education is a victim of its own success. Because our past graduates have been found to be competent employees, businesses and organizations often limit their search for new employees to people with college and university degrees. As a consequence, those without a degree are excluded from a growing list of well-paying jobs. Not surprisingly, demand for higher education has never been greater.
But access to higher education requires a certain level of intelligence, a strong K-12 preparation, a good work ethic – and, often, a considerable amount of money. The absence of any of these precludes either gaining access to, or graduating from, a college or university (or at least certain colleges and universities).
Yet not being admitted to, or not graduating from, a good (ideally, great) college or university is so consequential that society expects higher education institutions to enable students to overcome any, or all, of these shortcomings.
And this is why, when some prospective students are excluded from our world, we face the wrath of politicians, parents and the media.
Is it reasonable to expect colleges and universities:
- to assist students in overcoming a weak K-12 preparation?
- and to ensure that students from historically underrepresented groups are increasingly included in future entering classes (regardless of the quality of their educational foundation)?
- and to accept students without regard to their ability to pay for their education?
- and to guarantee that the large majority of the students we admit not only graduate in a timely fashion, but do so with a well-developed set of highly marketable skills – and with little or no debt?
If colleges and universities can’t do all these things, is it because we are unwilling to employ technology, or unwilling to accept, as equivalent to what a student would have learned with us, evidence of learning acquired elsewhere – even when we are far from certain that the work is anywhere near equivalent? Should federal student-aid dollars be withheld from us if we do not, as individual institutions, achieve a prescribed level of performance on measures of cost, graduation rates, and our graduates’ attainment of well-paying jobs?
So let’s quickly review. Some of our students are in professional programs, learning the skills necessary to succeed in those professions. Pass rates on the bar examination, the CPA examination, or the professional engineer examination provide external validation of the adequacy of our curriculum.
But fully half of today’s students are in the liberal arts, the ancient home of higher education. They are learning to think critically, to reason, to analyze, to synthesize, to frame arguments, to bring order to a chaotic jumble of information, and to communicate effectively, both orally and in writing. These are all socially useful skills, the hallmarks of an educated individual – but they were never intended to prepare people for a specific type of employment. It is employers themselves who have relied on the inherent value of a college education to provide them with smart employees who can then be trained, or taught, the specific skills they need for the specific jobs they have been hired to do.
Yet we are now being told that our education is not good enough. Our students must graduate “job ready.” Employers no longer want to spend their time and money training their new employees. Higher education is being expected to plug the gap – and if that means training our students rather than educating them, so be it.
This is not a uniquely American issue. A recent report from France relates how the government wants to undertake curriculum reform to ensure that college graduates are more employable by French businesses (“French Government Wants Academe to Work More Closely with Business,” Inside Higher Ed, Jan. 24, 2014). Needless to say, this idea is not going over well with many French academics.
And then there is a report on an Australian study of higher education in various countries, published under a provocative headline: “Study Finds That Countries That Fund Freely and Regulate Loosely Have Best Higher Ed” (Inside Higher Ed, Jan. 14, 2014). The ranking was based on resources; regulatory environment; international connectivity between institutions; and teaching and research output. The U.S. ranks highest on output – but here is the warning from the study: “Governments that are tight-fisted and keen to exercise control are least likely to preside over a higher education system of quality.”
The United States has benefitted mightily from having what has long been recognized as the best system of higher education in the world – and the quality of that system stems, in large part, from the absence of a centralized ministry of education and from the presence of considerable resources. Both of these circumstances are now in jeopardy, and so, too, is our reputation for world-class quality.
Higher education is not perfect, and it must adapt to changing circumstances (as it has repeatedly done in the past). There is a conversation to be had between higher education and the state and federal governments – and society as a whole – in order to come to an agreement on where responsibility for particular expectations properly rests. However, the present scapegoating of higher education in order to justify greater governmental intrusion (or fewer governmental resources – or both) has created the single greatest threat the American economy faces today.
This conversation begins with higher education’s speaking out on what it does, and what it takes responsibility for. We cannot allow others to impose their expectations on us without so much as a whimper on our part. Almost 400 years of higher education history in this country depends on our speaking up.
Let’s begin the conversation!