Research Universities and the Future of Work
Growing national concern about the future of work has three main causes. The first is technological: astonishingly rapid advances in artificial intelligence and the accompanying likelihood that it will take over human tasks in a wide range of occupations. The second is economic: the fragile material situation of millions of people in the United States who have seen their modest fortunes stagnate or decline in recent decades despite unprecedented accumulation of wealth at the top of the income ladder. The third is political: the outcome of the US presidential election of 2016, in which a charismatic candidate tapped the resentments of modestly educated and marginally employed people to win the White House.
US research universities are implicated in all these phenomena. University engineers and computer scientists devise tools foundational to the digital revolution. Their social scientists theorize about the character of modern economies and advise governments on how to regulate them. Their business schools teach entrepreneurs and managers how to best profit from evolving technologies and markets, and their liberal arts programs admonish future tech and finance professionals to embrace the diversity created by ever more rapid and global flows of people and ideas. Obtaining a degree from one of these schools is a virtual prerequisite for being branded a coastal elite in today’s ever more venomous cultural politics. The entire enterprise relies on a steady outlay of tax monies to subsidize university physical plants, fuel research programs, and underwrite tuition grants and loans.
Research universities also are ideally positioned to help the nation adapt to a future in which the relationships between education, work, and economic security will be much different from what they were a generation ago. Then, a four-year university degree was a reliable insurance policy for well-compensated lifelong employment. Then, that degree was handsomely subsidized by state and federal governments and within reasonable financial reach of most people who had finished 12th grade. Now, that degree is a serious personal and family investment, leveraged with loans, whose completion comes with much less assurance of job security throughout adulthood. This transformation is just the sort of wicked problem—complex, multifaceted, implicating virtually every domain of contemporary society—that research universities are built to take on.
When I speak of research universities, I refer to the perhaps one hundred institutions whose variety and overall quality of training and research make them truly comprehensive engines of human capital. These schools constitute the definitive elite of a highly heterogeneous national postsecondary order. Social scientists increasingly use the organic metaphor of ecology to describe this order, in which schools both cooperate and compete with one another for advantageous niches in fiercely contested markets for students, tuition, faculty, research dollars, philanthropic patronage, government subsidy, and prestige. Although research universities are a relatively rare academic species in a national institutional population that includes some 5,000 postsecondary schools, they strongly shape the entire ecosystem. They receive greatly disproportionate shares of research grants; they train the lion’s share of future faculty; and—in a uniquely American phenomenon—their athletics teams produce the content for an entire genre of spectator sports. Both the officially public research universities (Berkeley, Texas) and the officially private ones (Stanford, Yale) take similar organizational forms and rely heavily on government subsidy. Almost all of them have football teams. I speak of them here interchangeably.
To date these great engines of human capital have been only minimally deployed to help the nation prepare for a future in which firms will be distributing work ever more fluidly between humans and machines, and workers will be expected to continually and flexibly prepare themselves for different jobs. Aside from a flurry of activity with massive open online courses (MOOCs) by a number of elite schools beginning in 2012 and some more substantial forays into alternative credentialing by a few others, research universities have been quite conservative in their contributions to national discussions about the future of work. Nor have they been called on to do more. The Obama administration directed its attention at very different parts of the postsecondary ecology: community colleges, which educate the vast majority of students pursuing postsecondary degrees, and the for-profit schools whose share of enrollments and federal student aid grew quickly in the early 2000s. Philanthropists took the same route. The Bill & Melinda Gates Foundation and the Lumina Foundation shaped a decade of policy debate on college persistence and completion at community colleges and other broad-access schools while virtually ignoring research universities. To date the Trump administration has had little use for them either, instead calling for employer-based apprenticeship programs and rolling back Obama-era efforts to shield students and tax dollars from the predatory practices of for-profit schools.
Nevertheless, research universities are ideal levers for positively influencing the future of work. The sheer breadth of their capacity and their strategic location in a peculiar civic space where government, business, and public life intersect enable research universities to catalyze fairly radical new ways of measuring, producing, and sustaining human capital. The question for public policy-makers is how to convince university leaders and influential patrons to define the future of work as their problem.
National service
Research universities are hybrid organizations, commingling elements of government, business, and civil society into a peculiar amalgam of all three. Whether technically public or private, universities receive subsidies and tax exemption from governments on the premise that they provide essential services to the nation: basic research, economic development, workforce training, social mobility. Although universities are not technically businesses, they often maintain revenue-positive programs and use them to cross-subsidize lines of service that burnish prestige. Bottom lines of the overall portfolio matter a lot to chancellors, state legislatures, and boards of trustees. Yet unlike businesses, universities retain very strong norms of intellectual openness and physical porousness. People expect universities to accommodate their own dissenters and oppositional points of view. Quadrangles, libraries, and lecture halls are often open to visitors and contribute vital capacity for public life in communities nationwide.
Research universities sustain extraordinarily varied production functions. It would be no surprise to find internationally recognized offerings in architecture, art history, and astronomy cohabiting the same physical campuses as standard setters in neuroscience, nanotechnology, and Native American studies. This heterogeneity is an artifact of early universities’ formative role in encouraging settlement of the nation’s long-expanding western frontier. Consider the Morrill Acts of 1862 and 1890, through which Congress made gifts of land to support the founding of state institutions that would spur economic development and national solidarity in the wake of the Civil War. These so-called land-grant universities became anchors for financial investment and settler migration to particular regions. The presence of an institution with reputable programs, a decent library, and a photogenic physical campus was a signal to potential settlers and East Coast investors that a place was ambitious, going places, looking up. Since any one place could support only so many academic institutions, early universities had to be organizational jacks-of-all-trades, housing talent and know-how in whatever forms they appeared. Sports, football especially, were central to the business model. In a country constitutionally skeptical of elites and intellectuals, the meritocracy and physicality of intercollegiate sports lent universities an accessible populism and gave voters and taxpayers something to love about “their” universities regardless of academic predilection.
Federal investment in universities continued throughout the twentieth century. The political historian Christopher Loss explains how President Franklin Delano Roosevelt’s administration recognized the great potential of universities to administer New Deal programs. Regional universities were apt vehicles for social provision from Washington: local, trusted, and even beloved institutions already associated with economic and civic improvement. Republicans also recognized the strategic value of universities for government projects. After losing the White House to FDR in 1932, Herbert Hoover returned to his alma mater, Stanford University, a genteel private institution in far-off California, and there helped to seed a “contract” system that enabled federal funding to flow through Stanford to support research while maintaining the formal autonomy of the university.
These early expressions of civic work through universities were scaled up dramatically during World War II. Universities became pivotal mechanisms for the national mobilization of human, material, and intellectual capital for the war and integrated a geographically dispersed economy into a more or less unified military machine. University campuses were training grounds for drafted servicemen, their labs incubated new weaponry and communications technologies, their psychologists and psychometricians created competency tests to sort and assign enlisted servicemen, and their social scientists penned intelligence on foreign enemies. By war’s end universities had accumulated a great deal of approbation for a distinctively American form of civic action. They were not part of government, but they proudly served government, and by extension, the American people. As the inscriptions on the Dexter Gate to Harvard Yard famously read, “Enter to grow in wisdom. Depart to serve better thy country and thy kind.”
The future is not what it used to be
The conclusion of WWII hardly ended government reliance on higher education to take on big problems. Indeed, the war’s end created a new problem with uncanny resonances to the future-of-work anxieties of the present day. The war had decisively ended the Depression by sending battalions of men, most of them white, into battle and enlisting women and racial minorities to serve military supply chains. Soldiers headed home to a highly uncertain domestic economic and political order. Many of the veterans were modestly educated: fully half of them had entered military service without a high school diploma. How would the nation manage?
The solution, drafted jointly by congressional committees and myriad civic and labor organizations, was the Servicemen’s Readjustment Act of 1944, popularly known as the GI Bill. To spur consumption, the GI Bill subsidized veterans’ home mortgages. To manage workforce reentry and build stateside human capital for a postwar economy, it subsidized veterans’ continuing education. It worked.
Young men who had never before imagined themselves to be college material enrolled in academic programs of wide variety. Universities grew to absorb them, often tapping funds from federal agencies and state legislatures to build new dormitories and academic facilities. As the GI Bill ultimately sent two million veterans to college, it also changed the meaning of higher education: it became the nation’s official vehicle of meritocratic and indeed honorable social mobility.
In the subsequent decades the GI Bill was joined by several federal programs that enabled the United States to build the largest and most productive higher education system the world had ever seen. Congress responded to the Soviet Union’s launch of Sputnik in 1957 with the National Defense Education Act (NDEA) the following year, opening a wide new stream of defense-related research funding to universities. Greatly expanded budgets for the nascent National Science Foundation (1950) brought additional billions to universities over time. The Higher Education Act (1965), part of President Lyndon Johnson’s Great Society initiative, absorbed and continued the NDEA and greatly expanded citizen access to federal tuition subsidies. This multifarious investment very quickly made the US higher education system a global model for strategic science, social mobility, and national economic development.
The current political and fiscal climate all but prohibits government investment on that scale. Even so, those who are concerned about the future of work have some spectacular assets developed during those earlier generations of state-building: fully mature research universities that sustain expertise in every field of knowledge and enjoy a great deal of citizen fealty. History has demonstrated that universities are capable of catalyzing effective responses to large-scale national problems. Although the major research universities cannot by themselves educate an entire generation of future workers, they can play an essential role in marshaling the expertise needed to prepare workers for very rapid economic and technological change. They can be most effective by directing their energy along three broad avenues.
Measuring human capital. The Stanford historian David Labaree aptly summarizes US higher education as “a system without a plan.” Universities served the nation well in the twentieth century in part because their leaders were willing to take on many different functions. The system grew haphazardly as it assumed more tasks and constituencies. Now the enterprise is so extraordinarily complicated that it is difficult for any but an academic bureaucrat to navigate. There are hundreds of courses of study and myriad versions of “college,” offered on varied calendars and platforms and at a wide range of price points. Grant and loan programs are similarly byzantine. The most privileged families bypass this complexity by pursuing the gold-standard, full-service option—a four-year bachelor’s degree from an admissions-selective school—but this product is available to only a tiny minority of those seeking college educations, is very expensive, and caters almost exclusively to those who are under 21 years of age. There is a great deal of room for deception and graft. However scrupulously particular schools manage their own recruitment and financial-aid programs, there are others eager to exploit the chronic information asymmetry between prospective students and schools. Consider the for-profit college debacle of the early 2000s, in which publicly traded companies that were universities in name only reaped huge profits by running federal grants and guaranteed loans through the bank accounts of unsuspecting college hopefuls, damaging millions of careers and credit scores along the way. It is only the most vividly catastrophic failure of a postsecondary ecosystem long on marketing but short on ground truth.
A parallel information problem can be found in the mechanisms through which potential employees are matched with jobs. In the wake of the GI Bill and the massive postwar expansion of postsecondary access, a four-year degree became the baseline qualification for virtually all well-compensated white-collar occupations. But given the wildly inconsistent programs underlying them, four-year diplomas don’t signal much in the way of specific skills. Nor does the lack of a bachelor’s degree necessarily mean that a candidate lacks the skills a job requires. College credentials are now a perniciously legitimate mechanism of employment discrimination, systematically favoring those privileged enough to have attained them, regardless of underlying aptitude or ability.
It doesn’t have to be that way. The current anarchy came into being during the era of paper records, when integrating information across organizations was technically laborious and costly. The replacement of paper records with digital files enables previously unimaginable integration and representation of vast quantities of information. What Amazon enables for comparing products in consumer retail markets is now theoretically possible for educational opportunities and job skills. The impediments to getting there are largely political. Many schools benefit when credential and employment markets are calibrated on mystique and reputation rather than dispassionate measurement. Firms such as LinkedIn and Glassdoor see huge profit potential in rationalizing and privatizing these markets.
Research universities are apt mechanisms for bringing ground truth to this domain, because their legacy of government service enables them to be trusted fiduciaries of government data. Additionally, all those decades of public investment in science training have produced some extraordinary academic talent for mining it. University researchers such as Harvard’s Raj Chetty, Michigan’s Jason Owen-Smith, and Stanford’s Sean Reardon already are revolutionizing the measurement of human capital by bringing sophisticated computational methods to bear on tax, census, and school-district data. One can only imagine the kinds of insights their colleagues and students might derive if government-held sources of information were responsibly linked with data from the likes of LinkedIn.
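To make the possibility concrete, the sketch below shows how such a linkage might work in miniature: two data stewards join records on a salted-and-hashed identifier so that raw identifiers never change hands. Everything here (the datasets, the fields, the shared salt) is invented for illustration; a production system would demand far stronger privacy machinery, such as keyed cryptographic hashing, secure enclaves, and formal data-use agreements.

```python
import hashlib

import pandas as pd

SALT = "shared-secret-salt"  # hypothetical value agreed upon by both data stewards

def pseudonymize(identifier: str) -> str:
    """Replace a raw identifier with a salted SHA-256 digest."""
    return hashlib.sha256((SALT + identifier).encode()).hexdigest()

# Hypothetical government-held earnings records.
earnings = pd.DataFrame({
    "ssn": ["111-22-3333", "444-55-6666"],
    "annual_earnings": [42000, 87500],
})

# Hypothetical platform-held career profiles.
profiles = pd.DataFrame({
    "ssn": ["111-22-3333", "444-55-6666"],
    "self_reported_skill": ["data analysis", "logistics"],
})

# Each steward hashes identifiers before sharing; raw SSNs never travel.
earnings["pid"] = earnings.pop("ssn").map(pseudonymize)
profiles["pid"] = profiles.pop("ssn").map(pseudonymize)

# Linked, the two sources can answer questions neither could alone,
# such as how self-reported skills relate to observed earnings.
linked = earnings.merge(profiles, on="pid")
print(linked)
```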
Learning science. Today, the sine qua non of quality scientific research on learning is the randomized controlled trial: researchers use a common instrument to assess what two (or more) samples of demographically comparable people know at one point in time; administer (or withhold) an instructional treatment; and then measure again to observe what has (or has not) been learned. Such designs have the virtues of statistical validity and reliability as well as conceptual concision; however, they typically cannot offer insight into the “how” of learning—the neurological, cognitive, and interactive processes through which people successfully acquire knowledge or skill.
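For readers unfamiliar with the design, here is a minimal simulated sketch of such a pre/post trial; the sample size, score distributions, and three-point treatment effect are all hypothetical.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 500  # learners per arm

# Pretest: two comparable samples measured with a common instrument.
pre_treatment = rng.normal(50, 10, n)
pre_control = rng.normal(50, 10, n)

# Posttest: both arms improve with practice, but the treatment arm
# gains an extra 3 points on average (the hypothetical effect).
post_treatment = pre_treatment + rng.normal(5 + 3, 8, n)
post_control = pre_control + rng.normal(5, 8, n)

# Compare gain scores across arms to estimate the treatment effect.
gain_treatment = post_treatment - pre_treatment
gain_control = post_control - pre_control
t_stat, p_value = stats.ttest_ind(gain_treatment, gain_control)

print(f"mean gain, treatment arm: {gain_treatment.mean():.2f}")
print(f"mean gain, control arm:   {gain_control.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A significant difference in mean gains establishes that the treatment worked; it says nothing about the process by which the treated learners actually acquired the skill, which is precisely the limitation noted above.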
Learning processes are very difficult to observe at scale when instruction happens through physical copresence or written correspondence. Seasoned teachers, tutors, and classroom ethnographers may develop good insight about what enables or inhibits learning for particular persons or situations, but more useful insights will emerge when comparable data can be systematically accumulated from a large number of cases.
Digital media make it possible to observe variation in how millions of individual persons receive and respond to instructional treatments. When teaching and learning are digitally mediated, they leave evidentiary traces of very high fidelity. Keystroke analyses, attention heat maps, and measures of time spent on tasks can all be leveraged to create detailed portraits of how individuals and whole populations learn. This computational learning science is still in its infancy, but it has captured the attention of a growing number of senior researchers in fields as disparate as computer science, communication, decision theory, and psychology. Current hubs of activity are Carnegie Mellon, Harvard, MIT, Michigan, and Stanford, with steadily growing interest and support from philanthropies such as Chan Zuckerberg, Hewlett, and Gates.
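As a small illustration of what such traces afford, the sketch below computes one common measure, time on task, from a hypothetical clickstream log. The event schema and the 30-minute idle threshold are assumptions made for this example, not a standard.

```python
import pandas as pd

# Hypothetical clickstream: one row per learner action on a digital platform.
events = pd.DataFrame({
    "learner": ["a1", "a1", "a1", "b2", "b2"],
    "timestamp": pd.to_datetime([
        "2024-01-08 09:00:00", "2024-01-08 09:04:30", "2024-01-08 09:45:00",
        "2024-01-08 10:00:00", "2024-01-08 10:02:15",
    ]),
    "action": ["open_lesson", "submit_answer", "open_lesson",
               "open_lesson", "submit_answer"],
}).sort_values(["learner", "timestamp"])

# Seconds elapsed between consecutive actions by the same learner.
gap = events.groupby("learner")["timestamp"].diff().dt.total_seconds()

# Gaps longer than 30 minutes likely mean the learner walked away;
# count them as breaks rather than time on task.
on_task = gap.where(gap <= 30 * 60, 0)
time_on_task = on_task.groupby(events["learner"]).sum()
print(time_on_task)  # total engaged seconds per learner
```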
Very large questions remain open on this scientific frontier. To what extent is learning an individual versus an interactive phenomenon? Does the character of learning differ categorically across cultural contexts, human history, and topical domain? Are human and machine learning fundamentally different things? To what extent is capacity to learn innate or acquired? In pursuing answers to these questions, the new learning sciences would parallel the rise of psychology and psychometrics during the middle of the past century, when university researchers substantially created these fields in the process of building tools for assessing and training enlisted servicemen. Then, as now, the task at hand defies easy characterization as basic or applied science, and it is especially well suited to an organizational context in which research, teaching, and learning necessarily happen in tandem. Insights from this new science will be applicable far beyond the confines of conventional schools. Learning happens everywhere, and the boundaries between school, home, work, and play are rapidly blurring through digital media.
New means of instructional provision. The absorption of veterans in the past century transformed the nation’s universities. They became much more porous institutions, open to a wider range of backgrounds and serving a wider range of purposes than ever before. The massive expansion of federal tuition subsidies in the 1960s made it financially possible for millions more to attend college. The entire postsecondary ecosystem expanded, most significantly with the creation of a vast new tier of institutions—community colleges—specifically designed to promote social mobility and occupational advancement. Today, community colleges serve the majority of college-going individuals, but they are in many places under stress. They typically receive lower subsidies per student than public flagship universities, even though they are charged with serving the students most in need of basic academic services.
Digital technologies create conditions under which the functional relationships between different sectors of the postsecondary ecology might be rewired to redress lopsided funding streams. Research universities are inimitable engines of knowledge production, but their faculties are focused largely on research, not teaching, and their physical plants and selective admissions severely limit their ability to expand. But research universities and community colleges might be reimagined as coextensive with each other, enabling complementarities that would benefit both parties.
Imagine an academic ecosystem in which community college faculty were expected to be instructional experts with deep understanding of the needs of particular kinds of learners in specific communities, while research university faculty were expected to contribute usable knowledge to the widest possible constituencies. Movement of personnel between community colleges and research universities would be routine, because faculty on both sides of the relationship would need to remain current with the changing needs of students and employers as well as the moving edge of knowledge. Digital platforms would enable the constant sharing and updating of instructional materials in the wake of change in substantive fields and the accumulating insights of learning science.
What may at first blush sound fanciful is really just an updating of twentieth-century divisions of labor. Research universities have long produced the lion’s share of PhDs and teaching professors. University faculty also write the articles and textbooks that serve as instructional content throughout the entire academic ecosystem. What would be new is to view instruction and knowledge diffusion as explicitly collaborative endeavors. If—and this is a big “if”—the organizational arrangements, participation incentives, and diplomacy were done astutely, benefits of expanded capacity and collegiality could accrue to all parties.
Parallel versions of such instructional collaborations might also be developed between research universities and large employers. The basic logic would be the same: research universities have experts and knowledge whose utility lies in practical application; firms have real-world problems and the people charged with solving them. Again, this would be essentially an updating of a twentieth-century model in which government agencies contracted with universities for training and applied research. So-called industry affiliates programs, common in engineering, in which corporations make gifts to universities to encourage applied science and instruction, provide additional policy precedents.
Mobilization
The long arc of history that brought the nation to its current moment of social change also left some remarkable tools for embracing the future. Jason Owen-Smith calls them “beautiful accidents”: complicated, cumulative mechanisms for producing knowledge and human capacity in every field of human endeavor. The handful of premier institutions cannot by themselves educate the next generation of highly skilled workers, but they can—as they did in the twentieth century—apply their formidable intellectual power to understanding and addressing critical national challenges. Although they are hardly universally beloved, research universities nevertheless enjoy broad respect and even, on game days, affection. Because research universities are so deeply implicated in the evolution of US society and culture, they are especially well positioned to influence its future.
But exploiting that opportunity is hardly inevitable. Universities served the nation in the past because government enlisted them to do so. Congress offered gifts of land, research dollars, virtual cartels on the conferral of academic credentials, subsidized tuition and guaranteed loans for obtaining them, and tax exemption. State legislatures built physical plants and provided subsidies that moderated incentives to seek external revenue streams. Universities flourished through a privileged client-patron relationship charitably defined as national service. If that phrase seems quaint today, it is because the domestic government is now only one among many clients and patrons in universities’ portfolios. They now serve nation-states, corporations, and nongovernmental organizations worldwide, and although they still proudly educate US military veterans and academically accomplished local kids from humble origins, they also enthusiastically recruit applicants from across the globe whose families are wealthy enough to pay out-of-state tuition. In competing for patrons and clients everywhere, research universities look rather less like national servants and rather more like self-interested global firms.
Yet it remains the case that the US federal government has a special hold on university finances. Federal research funding, tuition subsidies, guaranteed student loans, and tax exemption might all be deployed as carrots or sticks to spur fresh attention to the future of work and opportunity. The Obama administration offered carrots to community colleges and wielded sticks at for-profits, but scarcely called on research universities to improve themselves or the postsecondary ecosystem overall. The major higher education philanthropies have also failed to encourage research universities’ involvement in improving the national human capital system. The result is that these formidable solution machines have contributed only modestly to national discussions about the future of work.
Government and philanthropic funding are hardly the only means of mobilizing universities. Where money and cutting edges go, so too go academics. One need not be a professor at Stanford to recognize that the digital revolution already has produced corporations with ample stores of both. Alphabet, Amazon, Facebook, and Uber are among the most influential organizations of our time. Their core technologies have transformed the production and circulation of knowledge, created entirely new means for goods, services, and consumers to find one another, and blurred the boundaries between work, private life, and the public square. They have accumulated enormous intelligence and incalculable wealth in doing so, and they could be the lead patrons of experiments and studies on the future of work.
Such corporate-university joint ventures would have no precise precedent in the twentieth century. Then, the nation’s largest problems were described in civic terms. Winning wars, rewarding veterans, and creating occupational mobility were activities that government and universities did to further the public good. Now, the big problems are at least as likely to be framed as business opportunities. Skill definition, the provision of lifelong learning, and the matching of talented workers with good jobs have already attracted venture capitalists, proprietary platforms, and entrepreneurs. If the goal is just to make money, it is not clear that the future of work really needs universities. But if the mission includes enriching an entire society, the best partners are close at hand.