A quick history of education
The history of education is fascinating, if only because it can be seen as evidence of how the powerful have repeatedly restricted access to education, wielding that restriction as a weapon to deny rights and opportunities to the underclass.
Let me restate that point, because it is important: Historically, the expansion and enhancement of literacy rarely resulted from a deliberate policy of benevolence but instead occurred because at various times expanded literacy proved to be in the economic interests of the ruling class. We should never assume that the default position is increased literacy; to the contrary, the default position is continued ignorance of the underclass, and it is shocking to see confirmation of that even today. (Witness current efforts in Washington and in various states to focus public higher education on job preparation, at the expense of a traditional liberal arts education. [1])
In the Middle Ages, before Gutenberg and his printing press created mass-produced books, formal education was essentially nonexistent in most European countries, unless you were a monk. The capacity to read and write was essential for members of the clergy because they needed to be able to read the Mass or to copy manuscripts. But since the Mass was read to the congregation, it was not necessary (and, from the standpoint of the church, not necessarily desirable) for common folk to be able to read and write themselves. They needed to know only what God wanted of them, and the monks, who could read God’s word, were there to tell them. From the church’s perspective, no good could come from universal literacy.
The printing press, the Renaissance and the Age of Enlightenment all worked to change thinking about literacy. And as the Industrial Age began to replace a largely agrarian economy, workers needed to be able to read, write and do basic arithmetic in order to be of greater use to the merchants and factory owners for whom they worked.
During the 19th century, free public education through grade six became a common goal in the United States and in many European countries, and the resulting more literate workforce drove economies upward. In America, the idea of a universal high school education began to spread during the latter years of the 19th century and into the early years of the 20th century. But as late as 1940, only half of American adults had earned a high school diploma. (There are still almost 25 million adult Americans today — nearly 12 percent of all Americans over the age of 25 — who never finished high school.) [2]
Currently, the national high school completion rate stands at 84 percent. But that means 16 percent of the class of 2017 failed to complete high school, and too many of them are entering a world without the skills needed to secure a well-paying job.
During the 1950s and 1960s, as it became increasingly clear that a far brighter economic future awaited those individuals who were able to attend and complete college, parents lobbied to do away with the two-track system in high school (wherein high-achieving students were placed on the “college” track and less successful students were put on the “clerical and trades” track), and instead to create a system that would prepare all students for college. (Whether the students actually chose to attend college was entirely up to them and their parents, but the argument was that they all should be prepared to attend college.)
This was an important change because, until this point, the expansion of educational opportunities had been driven by the economic needs of the state or nation. Now we began to see an argument being made on behalf of the individual: “It’s not enough that we have sufficient numbers of college graduates to meet our country’s economic needs. I want my child to have that opportunity.”
Who should bear the cost of education?
Thus began the tension between the twin beneficiaries of expanded numbers of college graduates: society as a whole, because more college graduates meant increased tax revenues due to higher salaries; and the individual, because the college graduate clearly benefited economically and in so many other ways, in contrast to the person with just a high school education.
This tension has been at the heart of the debate over the last half century about the funding of higher education: Is it the responsibility of society at large, or should it be primarily the responsibility of the individual, and his or her parents?
College funding obligations aside, the problem with the goal of universal college preparation was that it left too many students with a high school diploma in one hand and a college rejection letter in the other (or a letter of acceptance that required the student to do remedial work before being able to enroll in college-level courses). That is, a high school diploma and college readiness are not the same thing. Indeed, in some states more than half of students with a high school credential are not qualified to enroll in credit-bearing courses in college.
One answer to this problem has been to increase the size and number of community colleges, where students with little money and/or a poor K-12 education can begin their post-secondary work. For example, under the California Master Plan of 1960, students in the top eighth of their high school class were guaranteed a seat at one of the campuses of the University of California; students in the top third were admitted to a campus of the California State University; and everyone else was obliged to enroll in a community college. The idea was that students who completed their associate’s degree at the community college would then be offered the chance to transfer to a branch of the University of California or the California State University, where they could complete their undergraduate degree.
The problem in California and other states was that the “open admission” standard for community colleges — where anyone can enroll, even without having completed high school in some states — placed the community colleges in the position of having to remediate the relatively poor K-12 education received by a majority of their students. The students were, in effect, repeating work they ostensibly had been taught in high school but that they obviously had never learned. Even the California State University, which admitted only the top third of the high school graduating class, found itself with enormous numbers of students who required remediation before they could enroll in college mathematics or English courses.
The odds against completing a college degree rise dramatically if a student requires remediation. Six-year graduation rates at the California State University campuses vary considerably, but several have historically been below 40 percent, a figure that is sadly typical for many state colleges in other states as well.
Graduation rates at community colleges are far worse, in large measure because the percentage of college-ready students is even lower than in the state colleges. Three-year graduation rates (for a two-year associate’s degree) often fall below 20 percent, meaning most students never finish their degree, let alone transfer to a four-year school.
How much education is enough?
The cost of providing an opportunity for a college education to students who did not receive a quality K-12 education, or who are “late bloomers,” is enormous. And yet the reason behind creating this opportunity remains as valid today as it was when the “two-track” system was abandoned: How else are people who are born poor, who attended failing K-12 schools, who never “caught fire” in high school (or whose fire was somehow extinguished) to have the chance to improve their lives?
Access to a college education is that chance.
Americans note with disapproval those countries that force all secondary students who aspire to attend college to take a qualifying exam whose outcome determines whether that student will progress toward a well-paying job or be relegated to the ranks of the working poor. America is the land of opportunity! We believe in second chances! Everyone should have the right to a college education if they have the intelligence and the will to succeed!
But there remains the question of the cost of providing such opportunities.
Although places such as New York City and California experimented with free public higher education, it proved too expensive to be sustained. Our collective will to create additional capacity at colleges and universities waned as the tax bills arrived. Besides, we reasoned, since almost half the students who start college today don’t finish, surely that suggests that many college students shouldn’t have been in college in the first place. In fact, the argument goes, we would be better off with fewer college students, and we could do that by raising entrance standards and excluding those who are clearly not capable (or at least not ready now) of handling a college curriculum successfully.
The fallacy of this line of reasoning is that it assumes that our current educational paradigm, at every level, is as good and as equitable as it could possibly be, so anyone failing to complete a particular level of education does so because of personal inadequacy (“poor learner”), not because of inferior or inadequate instruction.
This simple dichotomy ignores the reality that students are not automatons. Many people face crises while they are seeking an education: a family emergency to which they must devote their attention; food or housing insecurity; not enough money for child care, books, transportation or tuition; discrimination because of race, national origin, religion, gender or sexual orientation.
The point is: While it is convenient to blame the unsuccessful student, much of the time that lack of success stems not from inferior intellect or an unwillingness to work hard but from limitations entirely beyond the student’s control.
Moreover, if we are already educating as many people at the college level as are capable of learning at that level, how do we explain the fact that there are many countries in the world that are educating greater proportions of their citizenry than we are? (And be careful about suggesting that their educational standards are lower than ours.)
Is our country doing enough to create successful and equitable educational pathways for students, both at the K-12 and the college levels? How do we meet the economic needs of our country with our current model? Reducing the number of students in college will do nothing to resolve the problems business and industry have in finding a sufficient number of workers with the skills and abilities these businesses require.
Do we educate to meet social needs, or is it about personal development?
In a recent article, [3] Derek Bok, the former president of Harvard University, makes a persuasive argument for a dramatic increase in the production of college and university graduates. Mr. Bok proposes increasing institutional capacity and doubling the number of college graduates by pouring a great many more tax dollars into public institutions. Unfortunately, in today’s anti-tax world, that suggestion will surely fall on the deafest of ears.
What Mr. Bok’s proposal really demonstrates is the utter futility of calls for action that are not directed at anyone in particular, that do not consider the steps that would need to be taken to achieve the results being called for, and that come from someone (and I say this with the greatest respect for Mr. Bok) who is not in a position to deliver any of the outcomes he wishes to see, since he is no longer the president of a university.
Mr. Bok is by no means alone in this position. The Lumina Foundation, which has considerable resources, has been calling for years for a significant increase in the production of college graduates (and would appear to have the means to invest in ideas that might well advance its agenda). And early in his first term, President Obama himself called for a near-doubling of the proportion of college graduates by 2026 (but he never made a strong push to create the political will to achieve this outcome).
Regrettably, we have seen only very modest gains in the percentage of American adults with a post-secondary degree (associate’s or bachelor’s) or high-value certificate over the past decade. We must conclude that calls for action, in and of themselves, have limited value in enhancing the level of higher education attained by adult Americans. We need a more comprehensive and inclusive campaign.
With that in mind, might there be ways by which we could make education more successful for more people, thereby lowering the cost per degree received? And if there are, shouldn’t we start using them?
We will examine these questions in upcoming postings to this blog site. First, however, let’s examine how well colleges and universities are doing with our current model of higher education.
Next week: Is there a disconnect between what America needs and what colleges actually do?
[1] “Mike Lee, Mia Love: It’s time to modernize higher education,” Deseret News, Dec. 15, 2017. (Mike Lee is the junior senator from Utah; Mia Love is a congresswoman from Utah. Together, they have introduced the HERO Act, short for Higher Education Reform Opportunity, in the Senate and House to “open doors to new education opportunities” that do not involve four-year degrees at traditional colleges and universities, but focus instead on massive open online courses, or MOOCs, and certification exams that utilize “alternative accreditation paths.”)
[2] “Educational Attainment in the United States: 2015,” U.S. Census Bureau
[3] The Boston Globe, Sept. 9, 2017