In 2017, the University of Virginia reported an operating budget of almost $3.2 billion, assets of $11.2 billion, and liabilities of more than $7.8 billion. The university includes UVA Global LLC, a wholly owned subsidiary based in Shanghai; an athletics enterprise with 25 programs and $24 million in revenues and expenses; a police force with 67 officers; an investment company that manages resources from 25 tax-exempt foundations, each with its own board; ownership of numerous art, historical, and scholarly collections, including more than five million printed volumes; capital assets in the form of academic buildings, dorms, and a Unesco-recognized World Heritage Site; a top-ranked medical center with several affiliated health companies, more than 12,000 employees, and its own budget of almost $1.5 billion; a concert-and-events venue for everything from monster-truck rallies to the Rolling Stones; a recycling business; a mental-healthcare provider; and a transportation system with a fleet of buses and cars. Incidentally, UVa also educates around 16,000 undergraduates and 6,500 graduate and professional students each year.
In 1963 the University of California’s president, Clark Kerr, famously predicted this state of affairs when he described the postwar American university as a “multiversity” — an institution serving varied, even conflicting, interests and oriented to a range of purposes.
Today, Kerr’s multiversity seems quaint. Universities both public and private contend with an ever-expanding range of demands and expectations: that they satisfy the health-care needs of local populations, that they redress manifold social inequalities, that they serve as engines of economic activity and growth — even as the most elite among them have historically exacerbated some of these very same problems.
The multiversity Kerr described was not the result of any considered plan or coherent philosophy. Rather, it emerged inadvertently as a congeries of historical conceptions of the university. Kerr identified three salient traditions. The first was represented by Cardinal Newman, founding rector of the Catholic University of Ireland, in Dublin, in the mid-19th century. Newman regarded the purpose of the university as the pursuit of knowledge for its own sake, cultivating gentlemen suited to lives of erudition, taste, and intellectual refinement. The second was embodied in Abraham Flexner, an American educational reformer who, in 1930, founded the Institute for Advanced Study, in Princeton, N.J. He invoked a German model that defined the university as an institution devoted to specialized research.
Finally, Kerr described the “American model,” which he saw most strongly reflected in the land-grant movement of the latter half of the 19th century. This distinctly American idea of the university was born of an explicit twinning of higher education and the democratic project, opening the doors of the academy to a broader public and emphasizing such “practical” fields of study as engineering and agriculture. If Newman’s university served the generalist and Flexner’s the specialist, the American model was to serve the demos.
Kerr saw all three models as coexisting in the multiversity. The balance among them varied by institution, but, under the watchful stewardship of presidents, they remained in a general state of homeostasis. In the 55 years since Kerr’s treatise, however, the “American model” has increasingly eclipsed the other two. Regardless of what they do or how they fund and organize themselves, American universities understand themselves as institutions in service to the public.
Yet this shift has proceeded with no clear sense of who that public is or why universities ought to serve it — and, perhaps most important, no clear sense of how they should serve it. While the challenge of Kerr’s multiversity was to balance the interests of students, faculty members, administrators, and governing boards, its contemporary derivative — call it the “omniversity”— is beholden to a public whose often conflicting interests are far more complex and intractable than suggested by the deceptively monolithic ideal of “the public.”
American universities’ democratic commitment has been both empowering and imperiling. It has compelled them to open themselves up to previously excluded publics, but it has also encouraged them to accrue a range of functions they were never imagined for — and are often ill-equipped to take on. The university’s appetite for always doing more could prove to be its undoing.
If the university is to flourish and continue to play a vital role in American life, it needs to reinterpret its democratic legacy. And it needs to do so with a frank acknowledgment of the fragility of the public it purports to serve. The university is what it is today, in part, because of the atrophy of other public institutions, which has left universities to fill a widening void. Higher education is in a precarious position; so too is the American republic. In order not just to save themselves but to fulfill their social role, universities need a more refined understanding of their responsibilities to the public — and of how to meet them in ways that are consistent with their own animating purpose. They also need an honest appreciation of their limits.
We issue this charge from the University of Virginia, an institution that, since its founding 200 years ago, has cast itself as a guardian, as Thomas Jefferson put it in 1818, of “the public prosperity,” even while, for much of its history, consistently limiting that “public” to wealthy white men. One of us is a member of the faculty; the other is a career administrator. If the chatter about higher education is to be believed — faculty members bemoaning “administrative bloat” and parasitical “BS jobs,” and administrators sighing about the vanity and cluelessness of the faculty — surely we represent warring sides in the struggle for the soul of the university. But we see our respective roles as expressions of the university’s admirable if often misdirected interpretation of its public purpose. And on this we agree: In an era of public disinvestment and disintegration, universities need to reconsider how they can most effectively serve “the public” whose flourishing they have long been charged with sustaining.
Until the last third of the 19th century, American higher education was a predominantly private and ecclesiastical endeavor. But the federal government intervened on a grand scale with the passage of the Morrill Act in 1862, granting states federal land for the establishment of public universities to offer training in agriculture, engineering, and the liberal arts.
Until then, most colleges educated a largely East Coast, Protestant, white-male elite, molding Christian gentlemen who would go on to lead the nation. If universities existed “to form the statesmen, legislators and judges, on whom public prosperity and individual happiness are so much to depend,” as Jefferson first described the purpose of UVa, then the newly imagined institutions of the Morrill Act sought to serve that public more directly.
By the end of the century, American universities had begun not only to rival those in Berlin and Göttingen but also to represent a new and distinctly American vision of higher education. The University of Chicago and its founding president, William Rainey Harper, are exemplary. In "The University and Democracy," a speech delivered at Berkeley in 1899, Harper lent the legislative intent of the Morrill Act a moral force. The university, he declared, found its legitimacy in a democratic public; it was "of the people, and for the people, whether considered individually or collectively."
But Harper raised the rhetorical stakes: The university was also the “agency established by heaven itself to proclaim the principles of democracy. … It is the university that, as the center of thought, is to maintain for democracy the unity so essential for its success.” The university was a divine agent entrusted with a historical task. It was the prophet who proclaimed the promise of democracy, the philosopher who reflected on its problems; it was the priest who preserved the practices of human communion and maintained the “religious cultus” of democratic traditions. American higher education did not just educate democratic citizens. It sustained democracy itself.
Harper’s vision became a guiding ideal across America. In 1905, Charles Van Hise, president of the University of Wisconsin, first articulated what would come to be known as the Wisconsin Idea: he would not rest until the “beneficent influence of the University reaches every family of the state.” Addressing Princeton’s faculty and students as early as 1896, Woodrow Wilson declared that “when all is said, it is not learning but the spirit of service that will give a college place in the public annals of the nation.”
Over the course of the 20th century, this democratic self-understanding begot wondrous goods. Universities gradually expanded access, admitting women, Jews, and people of color. From 1890 to 1940, enrollment increased roughly fivefold. After World War II, enrollment rates soared, then stabilized in the 1970s, only to continue their upward advance in the 1980s and 1990s. Universities could no longer legitimately presume that their student bodies of years past — elite, white, male, generally Protestant — represented the public beyond their campuses.
As universities enrolled more students, they also assumed more and more complex social functions. Between 1900 and 1950, they absorbed professional training, establishing schools for business, medicine, dentistry, pharmacy, law, and theology, among others. They built dormitories, hospitals, football stadiums, laboratory schools and child-care facilities, public museums and performance halls, and, as the footprint of the campus expanded, transportation systems and off-campus housing.
During the Cold War, serving the public also turned out to be in universities’ material interests. They took advantage of enormous federal resources. Beginning with the Servicemen’s Readjustment Act of 1944 (the GI Bill) and continuing with the National Defense Education Act of 1958 and the Higher Education Act of 1965, universities were transformed. Enrollments, physical and administrative infrastructure, student financial aid, and federally funded research all expanded. Increasingly, universities turned to their alumni as a significant source of revenue. To invest in higher education was to invest in the democratic project. This was a social compact.
The irony of the success of the “American” model was that it authorized a proliferation of purposes. Harper had described the university as a prophet of democracy. Prophets are chosen instruments, divinely elected to share a god’s clarion message. But whose voice, amid the myriad voices of “the people,” was the university to relay?
The reimagination of the university as fundamentally democratic was glorious and good, but also, as it turned out, perilous. It licensed universities to pursue, relatively unconstrained, activities and functions for which they were often ill-equipped; it also distracted them from their basic strengths in education and research.
There is another reason, just as crucial, that the American university developed into the corporate hydra it is today.
In neighborhoods and communities across the country, universities are not just the largest employer but are often one of the few remaining public institutions. Over the past half-century, and with accelerating speed since the 1980s, universities have attempted to fill a growing vacuum in American public life. They’re not growing (simply) to satisfy the egos of presidents or mollify crazed sports fans, but to address real and pressing social needs such as health care, child care, economic development, and social mobility. The plight of contemporary universities is that they may be the last public institutions left standing.
They can’t do it alone. And the more they try to, the more their authority and potential for public good are eroded — internally by increasingly fractious debates among faculty members and administrators about “mission creep,” resource allocation, “corporatization,” and so on; and externally by government and public perceptions that (ironically) universities are pursuing private interests and should therefore forfeit the rights and privileges, such as tax-exempt status, afforded to entities serving the public good.
In this twilight of American public life, universities have assumed the responsibility not just to address — to study, theorize, research — social problems but also to redress them. They have increasingly assumed the burden of attempting to repair and sustain a society that extends well beyond their campus gates. Consider the large-scale projects that Harvard and the University of Pennsylvania have undertaken in the past several years to expand their campuses and “revitalize” and “renew” local neighborhoods — projects motivated by a broader failure of social institutions. Such efforts at “urban renewal” have hardly been uniformly salutary, often resulting in the destruction of functioning communities — in some cases, ironically, to accommodate new schools of public health or social work. Universities have always served distinct functions and purposes. But how much should we, that ill-defined demos, hold them responsible for repairing a society whose public institutions are unraveling?
The fate of American universities over the course of the 20th and now 21st centuries has been inextricable from the fate of American society more broadly. How can they fulfill their democratic responsibilities but avoid the endless accretion of functions that risks undermining them? How can universities adjudicate among their proliferating purposes?
Scholars such as Christopher Newfield have consistently called for universities to recover a “public good conception” to overcome their capture by private interests. But it is precisely such a vague public commitment that makes the contemporary university’s situation untenable. The conflicting interests of the public, the systematic and long-term disinvestment in public institutions more broadly, the amalgamation of public and private interests — all of these make any return to an unalloyed commitment to an idealized “public” difficult and ill-advised. The university’s democratic commitments have become too centrifugal, pulling apart its interests, energies, and purposes. To save itself and to better serve its democratic purpose, the university needs to be not more but less reactive to public demands.
One consequence of the ascendance of the “American” model is that it forced universities to justify themselves in public terms. But today the only widely shared moral language, the only commonly accepted way to talk and think about ideals and purposes, is the rubric of economic utility. So universities describe themselves in terms of economic value — their contributions to economic development, to technological innovation, to work-force training. The collapse of our public institutions might be matched only by the poverty of our moral imaginations.
What we are calling for is a university whose democratic responsibilities are revivified by its animating purpose: what Daniel Coit Gilman, founding president of the Johns Hopkins University, described as the “acquisition, conservation, refinement, and distribution of knowledge.” Universities cannot sustain the aspirations of a democratic society on their own. But they can — and ought to — serve democratic ideals by more intently focusing on their unique role: to create and share knowledge.
The university’s role in a thriving democracy, however, makes sense only within a coherent and functioning social whole. Its ability to educate and create knowledge depends upon local preschools to care for the young children of faculty and staff members and students; a reliable public transportation system to deliver people to and from campus; a health-care system to tend to our bodies and minds; and local, state, and federal governments to pass relevant legislation and fund civic infrastructure, including the work of universities themselves.
Democracy does not need a prophet; it needs a public. And universities can help sustain, nurture, and establish that public by bringing knowledge out into the world and defending it as a common good. The history of American universities and that of the American republic are interwoven, and so too are their futures. It is not enough to save the university; we must redeem American public life.
Adam Daniel is senior associate dean for administration and planning at the University of Virginia, where Chad Wellmon is a professor of German language and literature.
In a recent Chronicle Review essay with the clickbait headline (which the authors did not write) “Why the University’s Insatiable Appetite Will Be Its Undoing,” Adam Daniel and Chad Wellmon, respectively an administrator and a professor at the University of Virginia, argue that the university should be more focused on what it does best — teaching and research — and less responsive to broad social pressures: “To save itself and to better serve its democratic purpose, the university needs to be not more but less reactive to public demands.”
There are serious problems with arguments like this, much in the air right now, that blame universities for everything: overbuilding, high tuition, teaching too many subjects, incurring too much debt. Universities, according to Daniel and Wellmon, are simply doing too much all around.
Maybe that’s true for UVa, though I suspect not. It is certainly not true for the majority of universities in the United States, which face serious economic problems not of their own making.
Assaults like Daniel and Wellmon’s are worryingly short on specifics, and therefore leave us with few means for finding a constructive solution. Instead, they all too readily echo the drumbeat — most common in conservative circles but not only there — that higher education costs too much and doesn’t do its job. I agree that tuition is too high at many of our universities, and I am an ardent champion for higher-education redesign that better supports our students in a complex world. However, if we do not take seriously the reasons we are in the state we are in right now, we will come up with more spurious and wrong-headed “solutions” that exacerbate rather than remedy the problems in higher education today.
“Always historicize!” isn’t a bad idea if you are looking to find a solution to a problem, rather than a scapegoat. What follows are some key assumptions made by policy makers and the public over the course of the last several decades, beginning with the reversal of the post-World War II investment in U.S. higher education during the governorship and then the presidency of Ronald Reagan. Each of these arguments for educational reform and retrenchment has contributed to the current crisis.
No. 1: “Higher ed should be run like businesses.” Colleges and universities, the thinking goes, need to be entrepreneurial. They need to hire CEOs as presidents, and their boards should be composed of business people. As a consequence, universities end up pursuing big grants and big donors. We know this favors science. We also know, from Christopher Newfield’s work, that it incurs long-term costs — buildings, labs, staff — that persist after the initial massive investment and after the granting organizations or the private donors have moved on to other interests. And it leads to the escalation of administrative salaries, with universities competing with corporations for college presidencies. The move to external funding also requires increased administrative staff (not bloat) to manage the complexities of budget, intellectual property and copyright agreements, income and profit sharing, and many other contingencies.
No. 2: “The public should not need to fund higher education. Higher education should fund itself.” In recent years, we have witnessed massive state cutbacks to higher ed, resulting in a roughly 20-percent-to-50-percent per capita reduction in public subsidy in some states. So tuition rises. Some states, such as Colorado, now subsidize under 5 percent of university costs. The rest comes from private or public funding sources (such as Pell Grants or grants from government research agencies) or, tragically, higher tuition.
No. 3: “Higher ed doesn’t really train students for the future. It’s out of date.” Increasing numbers of Americans think higher education is no longer worth it (although, given the growth in Kaplan-style SAT cram schools and the escalation of applications to elite colleges and universities, it is clear that the affluent are still working to ensure that their own kids go to college). However, many of the attempts to bring college “up to date” are badly misinformed wastes — for instance, MOOCs, which certainly won’t do the trick. They enrich technology entrepreneurs without improving the quality of learning. Lots of bad policy is justified by this one, perhaps most notoriously the California State University system’s hasty 2013 implementation (for an undisclosed sum) of the for-profit Udacity online courses in remedial math, algebra, and statistics at San Jose State. Supposedly these online courses were going to outperform actual classroom teachers. The retreat from that program in the face of poor results was as rapid as its adoption, yet the clarion call for “technology” to solve educational woes remains.
No. 4: “Higher ed costs too much.” It absolutely does. You now need to be rich to afford many universities. But there is huge variation. Community college is still relatively inexpensive — but also lacks the resources to expend on those students facing the biggest challenges. Belt-tightening is hardly necessary in university and community-college systems where costs are already low and resources very scarce, where faculty with full-time jobs teach heavy loads, and where well over half of courses are taught by underpaid adjunct professors with no benefits or security. Belt-tightening? At many public universities (and private too), students are facing food insecurity. And so are adjunct faculty. Institutions are impoverished. They have been robbed.
No. 5: “Make international student visas more difficult to attain.” The recent rise in xenophobia and difficulties in obtaining student visas have led to a diminishing number of students from all around the world coming to the U.S. American higher ed is valued everywhere, and we used to have the international student body to prove it. After a decade of inviting international students (for cultural, social, intellectual, and, one must acknowledge, financial reasons), now such students are going to … Canada. Universities are feeling the effects everywhere, and so will our labor force.
Higher ed needs to change. But accusing it of insatiability will only justify more damaging cutbacks. Where will those be made? Who will make them? And will students and faculty, knowledge and teaching and research, be the winners? Or will this end up being another blame-the-victim assault on higher ed? If we aren’t sufficiently explicit about the pressures that have brought us to this juncture, we undermine any chance for sane, reasoned, innovative reform.
Cathy N. Davidson is a professor of English at the Graduate Center of the City University of New York and the author of The New Education: How To Revolutionize the University To Prepare Students for a World in Flux (Basic Books, 2017).