America has forgotten the point of education

A few months after the fact, the removal of Claudine Gay from the Harvard presidency increasingly looks like a watershed moment. Shortly after Gay’s ousting, the New York Times published a long piece titled “The Misguided War on the SAT”, defending standardised testing, which had fallen out of favour at elite colleges. In the past few weeks, Dartmouth, Yale and Brown have reinstated the SAT. High-profile business leaders such as Bill Ackman (a central figure in the university drama) and Elon Musk have continued their criticism of DEI in the corporate world, and DEI hiring seems set to continue its downward trend. Higher education is following suit: just last week, the University of Florida eliminated all positions associated with the now toxic initialism.

Perhaps these developments merely represent the swinging of the culture war pendulum back from the “Great Awokening” of 2020. But it’s worth exploring another possibility: that something larger is happening, outside the culture wars, motivating shifts in attitudes beyond the usual conservative audiences. It should not be forgotten that the incident precipitating the latest round of campus controversies — Hamas’s October 7 attack against Israel — was a foreign policy issue. In the United States, Israel-Palestine tends to be treated almost entirely as a front in the domestic culture war, as it has been in this case. But the latest round of violence takes place in a context very different from that of the recent past: America no longer has unquestioned military supremacy, primarily because of its declining industrial base. The Russia-Ukraine war has now lasted for two years, and replenishing US ammunition stockpiles used in that theatre could take up to seven. Estimates suggest that, in a military conflict with China, the United States would be out of critical ammunition in just a week. China built 21 submarines last year; the United States struggles to build more than one. In sum, America is not prepared for a major conflict anywhere, and in the face of coordinated attacks from Russia, China, and Iran and its proxies, US and allied forces could quickly run out of materiel.

These statistics may seem far afield from campus politics. But it is worth recalling that foreign policy issues — the Cold War battle for technological supremacy against the Soviet Union — were critical in shaping American higher education as we know it today. Now that great power competition has re-emerged, academic culture wars may take on new dimensions. This historical perspective might also offer some insights into reforming educational institutions, a project extending well beyond culture war sniping on DEI and into the entirety of the American political economy.

The Second World War and the Cold War shaped American education in profound ways, including in its most controversial and contradictory social valences — meritocracy and affirmative action. The need to educate the best and brightest from all backgrounds to ensure technological leadership featured prominently in Vannevar Bush’s Science, the Endless Frontier, the 1945 report which laid the foundations of the modern American research system and gave rise to the National Science Foundation and similar bodies. Bush placed universities at the centre of national technology research, called for increasing their funding, and argued: “If ability, and not the circumstance of family fortune, is made to determine who shall receive higher education in science, then we shall be assured of constantly improving quality at every level of scientific activity.”

Bush’s vision would be realised in the ensuing decades. The combination of the GI Bill of 1944 (which covered tuition for veterans) and the post-Sputnik National Defense Education Act of 1958 greatly expanded attendance at and funding for colleges and universities. And although the SAT rose to prominence in the Thirties under Harvard president James Conant, standardised testing became increasingly prevalent as higher education was “democratised” and “rationalised” during the Cold War. Thus, driven in large part by great power competition, university education went from a narrow, elitist pursuit — led by the Ivy League and “Saint Grottlesex” feeder schools, along with regional replications — to a national and meritocratic endeavour. In 1940, around 5% of the US population had college degrees; that number has increased in every decade since and approaches 40% today. Meanwhile, Cold War competition also played a role in the advent of affirmative action, which has since evolved into “diversity, equity, and inclusion” programmes and the broader phenomenon of “wokeness”. Although foreign policy was certainly not the primary motivation of civil rights legislation, national security arguments were frequently used to justify it. Conservatives tend to locate the origins of wokeness in the critical theory of the Frankfurt School, while progressives focus on centuries-old legacies of racism, as in the “1619 Project”. But perhaps neither side has given sufficient attention to the role of great power competition in the mid-20th century.

To summarise a well-known history, segregation quickly became an international problem in the Cold War context. Soviet propaganda aimed at audiences in the Global South exploited the persistence of Jim Crow, which glaringly undermined America’s efforts to present an image of freedom, opportunity for all, and human rights. For hawkish liberals, civil rights was very much a foreign policy issue. Furthermore, although the origins of affirmative action are controversial — there is still some debate over what President Lyndon Johnson meant by the phrase — administrations of both parties highlighted such policies for international audiences when responding to the Soviets.

To some extent, then, the modern American education system was built to defeat the Soviet Union, in terms of both science and technology leadership and moral authority. It is also not surprising that both meritocratic education and affirmative action would drift from their original intentions as the Cold War faded into the “end of history”.

In the decades since Vannevar Bush’s report, the idea of meritocratic universities, open to all, gradually metamorphosed into “college for all”, and even “Yale or jail”. In light of America’s deeply entrenched small-d democratic passions, it was perhaps inevitable that the notion of universities being open to anyone with ability would eventually come to mean that everyone should attend one. If college is open to anyone with ability, after all, then anyone who does not go to college must be lazy or an imbecile. So American primary and secondary education became explicitly oriented around “preparing students for college”, meaning a four-year liberal arts degree, which came to be seen as essential for simply entering the middle class. Vocational tracks were stigmatised as casting children into intellectual and professional oblivion — and, besides, why go to “trade school” when the trades were in decline?

The neoliberal turn further entrenched the “college for all” mentality. The Chicago school’s notion of “human capital”, which conceptualised training as a form of input capital, gained currency as deindustrialisation progressed. In theory, human capital simply means that the knowledge, skills, and other attributes of workers have economic value. In practice, it promised that education could substitute for physical and investment capital, and conveniently rationalised the notion that “more education” would address the challenges related to deindustrialisation and the “fissuring” of the US economy (and seemingly every other economic problem). For various reasons, politicians and business leaders on both Left and Right embraced the sectoral shift away from capital-intensive “Fordist” production in the late 20th century. The story that they sold to voters — and told themselves — was that, although nothing could or would be done to “bring the factories back”, the government would support more education for “the jobs of the future”. Americans would do the brainwork in the new “information economy”, while “commodity production” was offshored. In this paradigm, of course, the apparent value of university education only increases.

Unfortunately, many of these illusions have not survived contact with reality. The “information economy” jobs of the future never materialised at the levels promised. The tech industry, relative to its profits and market capitalisation, employs comparatively few people — unless one counts driving an Uber — and has delivered notoriously underwhelming productivity gains. Perhaps simply by the logic of supply and demand, the value of college degrees appears to decline as they become more common. While still significant, the economic advantages of a college education have fallen over time, even as student loan debt has exploded. Furthermore, the record of universities — fundamentally medieval institutions — at preparing students for the workforce is questionable at best. Increasing university enrolment has not always meant that more students were meeting higher standards, but rather that academic rigour and quality were being sacrificed to accommodate more tuition-payers. At any rate, it would be hard to argue that the quality of American intellectual life has improved with the expansion of university education. Instead, the entire college apparatus has become a bloated system that seems to badly misallocate resources while heightening inequality. Meanwhile, the original objective for expanding the universities — competition with the Soviet Union — disappeared with the “end of history”.

The affirmative action component of education, likewise, quickly evolved beyond its original justifications. Affirmative action was initially conceived as an understandable if somewhat ham-fisted attempt to rectify the legacies of slavery and segregation; its intended beneficiaries were African American descendants of slaves. This group was essentially the entirety of the black population in the United States in the Sixties, which was then a little over 10% of the total US population. But around the time of major civil rights legislation, new immigration laws were also passed, dramatically expanding immigration. The non-Hispanic-white share of the population fell from about 85% in the Sixties to 58% today (among American children, the share is now less than half). The black population has also shifted, including many African immigrants who were never victims of slavery or segregation; today, Nigerian immigrants have above-median earnings. Nevertheless, affirmative action continued to expand along with the non-white population: from a relatively targeted programme aimed at about 10% of Americans, who had experienced state discrimination in the immediate past, it became an elaborate system for managing “diversity” in an increasingly “majority-minority” country. As globalisation gained momentum, foreign students could also benefit from simplistic racial categorisation, with the added benefit, for universities, that many paid full tuition.

Other emanations of the civil rights regime had unexpected effects on education. The Supreme Court decision Griggs v. Duke Power Co., known mostly for establishing that seemingly neutral employment practices could violate the Civil Rights Act if they resulted in a “disparate impact”, directly concerned employment testing. With the decline of such testing, employers increasingly turned to college degrees (and the relative ranking of universities) as indicators of graduate ability. This solidified the economic importance of the university system and its hierarchies. Meanwhile, other court decisions (and some state referenda) barred hard racial quotas and other forms of affirmative action. Universities did not abandon these programmes, however, but pursued them through ever-more opaque methods (applicants were credited for “overcoming adversity”, or universities were allowed to consider the effects of diversity on the student body). The result was that the substance of the “diversity” regime remained intact, but its logic appeared increasingly obscure and arbitrary.

Decades of mission drift have thus left universities in an impossible position. On the one hand, they function as the main sorting mechanisms for employers at the high end of the economy. On the other, they are supposed to be the main sources of “human capital” for all of society, saving the middle class through “more education”. They also retain, at least in theory, the goal of advancing science and technology, and have added functions around applied, corporate-funded research with the disappearance of the large corporate labs (e.g. Bell Labs). Some consider themselves start-up incubators as well. At the same time, they have appointed themselves to manage the diversity of an increasingly fractured populace, even though they cannot explicitly articulate or openly pursue that purpose. It is probably not a coincidence that wokeness erupted amid this confusion, and after the original rationale for meritocracy — great power competition — seemed to have vanished. To be charitable to academia, one could argue that society expects far too much from universities today, and it is little wonder that they seem to be performing worse and worse at each task.


In light of this history, the reform proposals of both anti-woke liberal and conservative critics seem inadequate. Even if this provisional anti-woke alliance can rein in progressive excesses, the deeper confusion surrounding the universities’ purpose cannot be resolved entirely on the culture war battlefield — or even by reforms limited to the universities themselves.

Harvard professor Steven Pinker, a prominent representative of the liberal camp, illustrates the limitations of the liberal perspective in a tweet proposing five reforms. Some of his recommendations, such as bolstering free speech policies by prohibiting forceful takeovers of buildings and interruptions of lectures, seem like basic common sense. Others, such as disempowering DEI bureaucrats, should be. But two of them are inherently in tension: discouraging the groupthink associated with intellectual “monocultures” will at least occasionally require abandoning his call for institutional political neutrality. Left to its own devices, the Harvard faculty will almost certainly remain a political monoculture, as it already is. To remedy this, Harvard’s administration would have to undertake some direct or indirect affirmative action for conservatives. The question of ideological diversity, then, seems to recapitulate the liberal dilemma over racial diversity. Is actively promoting diversity merely levelling the playing field, or benefiting the whole university community through exposure to under-represented viewpoints? Or is it just unfair discrimination, enforced by, in Pinker’s words, “bureaucrats responsible to no one”?

At the level of intellectual diversity, discouraging monocultures becomes even more complicated. There are no phrenologists at Harvard, for example, and few, if any, climate sceptics, creation scientists, or “human biodiversity” scholars, not to mention orthodox Freudian psychoanalysts or Marxist economists. Should the administration intervene to employ professors with these views? The faculty would doubtless say that the science is settled on these topics, and inviting dissident voices would only promote error over truth. But who gets to decide? How does one determine which monocultures are pernicious and which ones happily represent the triumph of truth over falsehood? Liberals do not really have an answer, which is one reason why their institutions have all deteriorated in recent decades. Ultimately, liberal value neutrality is always a fiction. It may be a salutary and desirable fiction, and probably would be a beneficial one for universities to recover. But such neutrality only functions within a horizon defined by a shared purpose — its own monoculture. Vannevar Bush and the Cold War liberals had such a purpose; today’s liberals do not, or at least have not been able to articulate one.

Anti-woke conservatives, on the other hand, have a clear goal: they seek to use the universities to inculcate conservative values, whether by refashioning red-state universities, starting new ones, or other means. The problem is that this vision is too narrow; it ignores the structural sources of universities’ power and the system’s underlying problems. Although Democrats are the party of education, conservatives often have far more romantic notions of the power of universities and professors. Look no further than the common conservative critiques of wokeness, which emphasise the role of academia. Not surprisingly, then, conservatives tend to imagine that hiring more conservative professors is the main thing required to reshape American culture.

In reality, however, most evidence suggests that professors have relatively little influence on students’ ideologies, and there is little reason to think that more conservative faculty or academic institutions would make much of a difference beyond the campus. It is worth recalling that conservative colleges already exist, such as Hillsdale College, and there are more outside-funded conservative student organisations operating at, say, Harvard today than there were 20 years ago. Yet while Hillsdale is important to conservatives, and its students are well-represented among congressional staff, the brutal fact is that its influence on larger elite opinion is negligible. Likewise, while conservative campus organisations may be important nodes of personal networking, they obviously have not done much to change the trajectory of elite universities.

Conservatives’ inveterate idealism tends to blind them to the real functions and sources of the power of elite universities. The status of top institutions is not derived from their stellar humanities departments or pedagogical commitment, but rather from the signalling value of their credentials, the wealth of their alumni networks, and their relative importance to corporate and government research apparatuses. Everything else is marketing. Thus, changing university culture without changing any of the system’s underlying dynamics, which appears to be the current conservative approach, seems unlikely to succeed. Moreover, even if such an approach were to succeed, it would offer little to those who are not cultural conservatives: substituting Straussian political theory or some other conservative hobbyhorse for identitarian progressivism would do little to create a better prepared workforce or advance science and technology.

Instead of focusing exclusively on superficial culture war issues, educational reformers today should recall the last major, intentional reorganisation of the American university system, which occurred during the Cold War. We must be cognisant of how that legacy has contributed to current problems, but should also recognise its value. Reforms that meet the needs of great-power competition are more likely to be far-reaching and enduring than endless battles over the universities’ symbolic orientation.

The primary difference between then and now involves the shifting demands of great-power competition. After the Second World War, US manufacturing dominated the world, but American primary research and basic science needed to be strengthened. The reforms of Vannevar Bush and the post-Sputnik legislation thus increased the funding for and importance of universities. Today, however, the situation is essentially the opposite. Basic research is still important, of course, but America’s main vulnerabilities lie in its supply chain and manufacturing dependencies on China. Its leadership in pure research is also most at risk in capital-intensive sectors that have been commercially abandoned due to foreign manufacturing subsidies, such as shipbuilding, nuclear reactor components, and perhaps even commercial aviation. Further strengthening universities will do little to address these problems; the focus should shift to other institutions.

Americans in the mid-20th century probably could not have imagined the country’s loss of manufacturing dominance. They also underappreciated the importance of corporate labs, which Bush considered too focused on incremental improvements rather than fundamental breakthroughs. But institutions like Bell Labs — whose scientists won 10 Nobel Prizes — RCA Labs, Xerox PARC, and others made immense scientific contributions. These labs were lost in the late 20th century as a result of increasing foreign competition, the growing capital intensity of research, and shareholder demands for short-term financial returns. Universities, by default, picked up many of their functions, but university research productivity has been on a declining trajectory for decades. Instead of funnelling ever more money into dysfunctional universities, government and corporate funders should look to recreate, at some level, the major corporate labs. Such labs are better suited to work that combines innovation with production and manufacturing, and they could allow for better collaboration between multiple universities as well as researchers from corporations and start-ups. Insulating this research from faculty politics as well as undergraduate admissions and credentialing could also reduce pressures against meritocratic hiring.

The Bell Labs of 1950 is not coming back, of course, because the Bell System is not coming back. Large corporations are no longer incentivised to invest in quasi-independent research labs, and start-ups lack the capital for such projects. Nevertheless, government could work with the private sector to pioneer new labs and research facilities that could fill the holes left by the lost corporate labs. Another model is offered by the Broad Institute, a collaboration between Harvard and MIT, which was a leader in the development of CRISPR technology. Such institutions could be co-located with universities and share personnel, but giving them unified and coherent research missions, along with independent governance and funding structures separated from undergraduate admissions and credentialing, would be an improvement over housing more programmes within already sclerotic universities.

At the same time, policymakers should continue existing efforts to reduce the importance of universities in employment credentialing. States such as Utah and Pennsylvania have already removed bachelor’s degree requirements for state government positions; hopefully other states will follow suit. Moreover, contra Griggs, direct employment testing should be encouraged to open up pathways to hiring at top-tier companies outside of elite college recruiting. Allowing five or 10 universities to be gatekeepers of the highest echelons of the US economy has hardly improved “equity” and “inclusion”. If universities were less central to elite credentialing, the importance of micromanaging the “diversity” of their admissions would also decline.

Finally, instead of focusing on replacing the faculty of liberal arts institutions, or starting new ones, such as UATX, reformers should seek to build institutions that meet workforce or other practical education challenges. Elon Musk has floated the idea of starting new K-12 math and science schools, and this approach also seems more likely to succeed at the higher education level than churning out more underemployed liberal arts graduates, whether woke or anti-woke. To be sure, politicians have lavished praise and funding on community colleges and vocational training for decades, to little avail. The existing community college and apprenticeship tracks, even if some students can go on to do relatively well financially, do not offer access to prestigious professions. But a new set of applied science training institutes might have a better chance, especially if they solve an actual workforce challenge and are part of a larger ecosystem for training and credentialing outside of the universities.


On the whole, “college for all” has been a disaster economically, intellectually, and politically. Too many people, too much money, and too many research functions have been pushed into universities, where they have no business being. It may have been reasonable, in the Forties and Fifties, to believe that strengthening universities was necessary to maintain America’s technological and economic leadership. But today, the opposite course is required. We need the universities to do less, not more. A common objection to this argument is that technology has become so much more complicated and specialised that more education is simply necessary. But the connection between “more education” and the “information economy” has always been overstated. One does not need a four-year degree, much less a postgraduate degree, to become skilled at “coding”. Two of America’s most successful computer technology entrepreneurs — Bill Gates and Mark Zuckerberg — famously dropped out of Harvard. Many successful recipients of the Thiel Fellowship intentionally avoided college.

Conservatives and liberals who remain devoted to more Romantic conceptions of the humanities will doubtless object to this ruthlessly instrumental approach to education. But breaking the university monopoly on credentialing and applied science has benefits for liberal arts purists as well. The “right-sizing” of universities would allow liberal arts colleges of various ideological orientations, whether Hillsdale or Oberlin, to compete with Harvard and Stanford on more equal footing. All of society would benefit if universities lost some of their exalted status. They are not the preservers of civilisation, guardians of truth, or reservoirs of moral purity. The more they are seen as practical institutions with practical functions, the less we need to worry about their ostensible values.

Although today’s crisis of higher education manifests primarily as another culture war battle, the issues go much deeper. Decades of institutional drift have left the system with multiple, contradictory purposes, none of which is being served effectively. If it is to be successfully reformed, American higher education must be reorganised to address the re-emerging exigencies of great power competition.

Source: UnHerd. Read the original article here: https://unherd.com/