If you’ve been watching the latest pitched battles in America’s culture wars, you’ve doubtless heard of the much-ballyhooed and much-denounced field of critical race theory. One thing you may not have gleaned from all the media furore, though, is that critical theory, from which critical race theory is derived, has much to offer. Jason Josephson-Storm’s intriguing study, The Myth of Disenchantment, is a good place to start.

Critical theory was born in Germany between the two world wars. It was founded by a clique of Marxist academics in Frankfurt who were horrified that the grand march toward the communist utopia predicted by Marx wasn’t happening on schedule. On the one hand, communism in the Soviet Union had devolved into a totalitarian nightmare with a reliable habit of mass murder. On the other, the working people of one of the most educated and cultured nations of Europe, who according to Marxist theory should have been flocking to the banners of proletarian revolution, were instead rallying around a weird little man with a toothbrush moustache and an unhealthy obsession with an archaic, bloodthirsty mysticism of race and soil.

Obviously, something had gone wrong, not just with Marxism but with the entire enterprise of Western rationality summed up in the phrase “the Enlightenment”. Consider what that phrase means for a moment. One of the basic credos of the cultural mainstream in Western countries is the rather odd notion that, at a certain point not that many centuries ago, for the very first time in human history, intellectuals in Western Europe saw the universe as it actually is. Before then, despite fumbling attempts in the right direction by ancient Greek philosophers, humanity was hopelessly mired in superstitious ignorance; afterwards, Western intellectuals led a rapid ascent towards true knowledge of humanity and the universe. People still speak of that period using such far-from-neutral terms as “the Age of Reason” and “the Enlightenment”; in Germany, the term is die Aufklärung, literally “the Clearing-Up”.

It is to the credit of the founders of critical theory — Theodor Adorno, Walter Benjamin, Erich Fromm, Max Horkheimer, and Herbert Marcuse — that they didn’t just go on believing in the secular mythology of progress. They grasped that the Enlightenment had failed to accomplish what everyone expected of it, and they set out to understand what had gone wrong. Since they were Marxists, of course, they still framed things in terms of the march toward a utopian society of the future, and critical theory thus set out not just to understand society but to change it. It sought, in Horkheimer’s words, “to liberate human beings from the circumstances that enslave them” — but it tried to do that by understanding the entire panoply of reasons why those circumstances happen to exist at a given place and time.

This is what makes critical theory useful. Treat a belief as though it’s timeless and context-free, and all you can do is accept or reject it; recognise that every belief has a history and a cultural context, and you can understand it instead. Critical theory attempts to do this with the core beliefs of Western society. The first major book to come out of the movement, Horkheimer and Adorno’s Dialectic of Enlightenment, sought to make sense of the way that Enlightenment rationalism had led to the twin tyrannies of Stalin and Hitler. It’s still worth reading today, even though much of what passes for critical theory now is little more than empty propaganda.

In the opening lines of his Guide to Kulchur, Ezra Pound wrote: “In attacking a doctrine, a doxy, or a form of stupidity, it might be remembered that one isn’t of necessity attacking the man, or say ‘founder,’ to whom the doctrine is attributed or on whom it is blamed.” Similarly today, in circles unsympathetic to what critical theory has become, it is common to assail Adorno, Benjamin, et al., because of the current antics of their followers. This is unfair. The founders of critical theory did in fact make a massive mistake, but it’s one that pretty much everyone made in those days and too many people still make today.

That mistake? The failure to recognise that the academic circles to which Adorno and Benjamin belonged — and to which their followers by and large belong today — form a privileged class with an interest in furthering its own influence and grabbing more than its share of wealth and privilege. Critical theory by and large avoids talking about this. A genuine critical race theory would interrogate the discourses concerning race used by Left-wing activists in today’s society, and show how those discourses are used as instruments of hegemony by those activists and the people who pay them. A genuine critical theory would also interrogate the implications of “liberating human beings from the circumstances that enslave them”, and talk about how that rhetoric of liberation is used to replace one set of enslaving circumstances with another. You can read a whole lot of critical theory and never catch the least whisper of this sort of thinking.

This is what makes Jason Josephson-Storm’s work so fascinating. He tiptoes very close to the edge of that forbidden territory by suggesting that one of the most fundamental assumptions of modern thought — the notion that we modern people are disenchanted, freed from the superstitious burdens of the past and venturing heroically forward into a new world free of myth and magic — is simply another myth. He has applied the tools of critical theory to one of the basic assumptions underlying critical theory itself, and shown that belief in disenchantment is just another narrative employed to advantage certain people over others. It is an impressive project.

One of Josephson-Storm’s greatest influences, a scholar he frequently cites, did much the same thing on a bigger scale and to an even more vulnerable set of narratives. This is Bruno Latour, one of the first scholars to study “the social construction of scientific facts”. What does this mean? Well, the mythology of science claims that scientists in their research simply follow where nature leads. In practice, it’s very nearly the other way around.

Consider the steps you’d need to take if you wanted to do some research into any branch of science: reading the relevant literature, crafting a hypothesis, considering the available equipment, designing an experiment, and, of course, finding funding for it. It’s common for such steps to be dismissed as mere details, but in fact they are more significant than that.

For instance, the literature you’ve read is the product of peer review and the evolution of scientific opinion, which has at least as much to do with academic politics as with nature. The hypothesis is a product of your education, and also of current fashions in academia (anyone who thinks that scientists are immune to the blandishments of intellectual fashion has never met a scientist). The equipment available depends on who has invested money in developing certain kinds of experimental gear, and also on what gear is popular and readily available. The experimental design is just as subject to fashion, and it also has to appeal to funding sources. And finally, the decision to grant or withhold funding for an experiment depends entirely on the behaviour of human beings.

On top of this, after conducting an experiment, you’d have to interpret the results, write a paper, get a prestigious co-author or two to sign on, submit it to a journal, wait nervously while it goes through the peer-review process, and revise the paper at least once in response to comments by the anonymous peer reviewers. Then, once the paper is finally published, other researchers will respond to it and potentially adapt their own research projects in light of what you’ve found. All these, again, are social processes.

The end result of such a research project — a half-sentence and a footnote, say, in some future textbook — is thus almost entirely a product of social interactions among human beings. At the centre of those interactions, the flake of grit at the heart of the pearl, is the fact that you asked nature a specific question and got an equally specific answer. That process of question and answer is what makes science as effective a way of making sense of the world as it is, but it does not erase the effect of social processes on the result — it just means that the result has to have some contact somewhere with nature.

Now take that and multiply it by four centuries or so of scientific effort, and the result is a vast social process built atop a relatively narrow foundation of natural facts. Those facts are carefully selected, curated, and assembled by the social process into a model of the world. Ask different questions, use different equipment, give the results a different theoretical spin, and you can quite easily end up with a completely different model of the world. That’s not something most people in the scientific community want to talk about, because it would weaken their claims to influence, wealth and privilege. That’s why so many scientists were shouting “Believe the science!” at the top of their lungs not so long ago: maintaining the cultural prestige of science, and thus their own social status and its perks, took precedence over nearly everything else.

If you want another glimpse at just how far the social enterprise of science veers from its imagined ideal, look up the phrase “replication crisis”. One of the essential principles of science is that any scientifically valid finding has to be replicable; it can’t be some kind of fluke. These days, an astounding number of findings across a very wide range of sciences fail that test. Very few people in the sciences want to talk about how much of this is caused by experimental and statistical fraud, both of which are pervasive in those branches of science where corporate profits are involved and far from rare even in less lucrative fields of research.

If science were really a matter of following nature wherever it leads, the emergence of the replication crisis would have triggered a sudden, frantic search for its causes. We’re talking, after all, about something that challenges the act of faith at the centre of the scientific enterprise. By and large, though, that search hasn’t happened. Instead, scientists have chosen either to ignore the problem or to denounce anyone who dares to draw attention to it — typical behaviour of any elite group faced with a challenge to its legitimacy.

This is the kind of thing Bruno Latour wrote about. In We Have Never Been Modern, he proposed that most people are convinced, or at least act as though they’re convinced, that the modern world is something new and unique in human history because, unlike all others, our sciences really do tell us the objective truth about nature. These people also appear to be convinced that the same thing is true of everything else in our culture.

This is the heart of modernity — the conviction that keeps people today from making use of any of the hard-won lessons of past civilisations, or even learning from our own civilisation’s catastrophic mistakes. It is a fond, false, foolish belief — and Latour and Josephson-Storm have both shown that it can’t be justified except by the most absurd sorts of special pleading and circular logic.

What does it mean if we give up the myth of modernity, the conviction that we — alone of all the human beings who have ever lived — see the world truly? Surprisingly enough, we don’t have to give up science. That our scientific worldview is not given by nature, but assembled out of data points drawn from nature, does not make science meaningless or false. It simply makes the work of the scientist a product of human society and culture, rather than a revelation handed down from on high.

In a very real sense, science stripped of the myth of modernity takes on the same shape as the study of history. It is absurd to think that history is simply an account of what happened; “what happened” in a month in any small town would fill entire libraries. The historian’s task is to craft a narrative which illuminates some part of the past, using actual incidents as building blocks. A scientist without modernist pretensions, similarly, crafts a narrative that illuminates some part of nature, using replicable experimental results as building blocks. Theories along these lines are useful rather than true; they start by accepting the reality that the human mind is not complex enough to understand the infinite sweep of the cosmos, and then go on to say, “but as far as we are capable of making sense of things, this story seems to reflect what happens”.

This sort of thinking is doubtless a bitter pill to swallow for those who have founded their own identities on the notion that humanity is or should be the conqueror of nature. Here again, though, the failure of those notions to create a world fit for human habitation is increasingly clear to many of us. And the sooner we accept that the stories told by today’s industrial societies are just another set of mythologies, and that the technologies they’ve created to manipulate the world are just another set of clever gimmicks — why, the sooner we can get to work discarding those aspects of modernity that have failed abjectly, and picking up those older habits and stories and technologies that are better suited to the world we find ourselves facing. Only then can we begin to do something less inept and foredoomed with our time on Earth.


Source: UnHerd. Read the original article at https://unherd.com/