[The following is a chapter from Dr. Julie Ponesse’s book, Our Last Innocent Moment.]

Every human being of adult years and sound mind has a right to determine what shall be done with his own body. 

Justice Benjamin Cardozo, 
Schloendorff v. Society of New York Hospital (1914)

As my fingers type these words in a corner of my local coffee shop, some simple interactions catch my attention. 

Could I have a tall dark roast, please? Certainly. 

Would you like your croissant warmed up? No, thank you. 

Is the milk organic? Of course.

In a few simple exchanges over a morning coffee order, each customer managed to make more robustly informed choices than most did over the far more impactful health and policy issues of the last four years. 

Why, I wonder, couldn’t we muster the relatively meagre skills of paying attention, asking questions, and expressing a reflective “yes” or “no” when it came to the life-impacting issues of the pandemic — masking, lockdowns, family distancing, and vaccination — when we seem to do it as a matter of course in the more prosaic areas of our lives?   

During the pandemic, informed consent was inverted for all to see. The public health establishment concluded that protecting the “greater good” required exceptional measures, making informed consent expendable in the name of “keeping people safe.”

Physicians refused to sign exemptions and courts refused to hear exemption requests. Patients were fired for questioning vaccination. Families and social groups began to winnow their membership in more and less overt ways, shaming and uninviting until those who remained were pressured into compliance or exile.

And various institutions began to release statements amending their positions on informed consent, claiming that revision was necessitated by the pressures of the pandemic. The FDA and the Office for Human Research Protections, for example, released statements revising their informed consent policies in the wake of the Public Health Emergency Declaration (issued on January 31, 2020, then renewed through May 11, 2023).

In more and less formal ways, Covid was the tool that transformed our supposedly inalienable right to make informed choices about our private lives into a public and readily dispensable good. It was almost as if the network of infinitesimal choices we had built created so powerful an illusion of choice that we didn’t notice when we were asked to give it all up in an instant.

After all, if we can choose to have our coffee prepared and personalized to our liking — if the world is responsive to our needs and desires to that degree — why would it occur to us that we can’t make decisions about what goes into our bodies?

When I look back over the motley collection of oversights and transgressions of the last three years, what surprises me most is that we let it all happen. The government could demand our unquestioning compliance, journalists could spin a one-sided narrative, and fellow citizens could shame us, but we could have resisted it all by simply making our own choices in our own little corners of the world. This should have been the fail-safe that would have put us in a very different place now.

Instead, Covid became a moral litmus test in which we not only showed our capacity for making poor choices but, even more devastatingly, our capacity for complete deference (what some call “public trust”). Covid created an atmosphere in which informed consent simply could not survive. “Free choice” was considered “free riding,” and those who made individual choices that departed from what was perceived to “keep people safe” were seen as benefiting from others’ sacrifices without incurring costs themselves. As Canadian singer-songwriter Jann Arden quipped in a 2023 podcast, “[V]accinated people have enabled everybody on this planet to be having the lives right now that they’re having.”

What I would like to do here is to explore what has happened since 2020 that made us so willing to give up personal choice and informed consent so we can better understand how we got to this place and how to prevent the next moral misstep. The answer may surprise you. 

Why Did We Give Up So Easily?

Though it might feel like we abandoned our right to make choices in the blink of an eye, informed consent started to lose its footing in medicine, and in culture more generally, in the years leading up to 2020.

Almost 20 years before Covid, ethicist Onora O’Neill callously wrote that “informed consent procedures in medicine […] are useless for selecting public health policies.” Her idea was that public health policies must be uniform to be effective, and allowing for personal choice creates the possibility of divergence.

For O’Neill, we cannot have both exceptions for individuals’ masking or vaccination choices, for example, and success at limiting the spread of a lethal virus. You can have either safety or individual choice and, when the two conflict, informed consent must give way to the more important value of safety.

When I was a graduate student studying medical ethics in the early 2000s, the value of informed consent was so obvious that it was treated almost as a prima facie good, as something with great moral weight. Its value was grounded in the fundamental belief – a belief with deep philosophical roots – that all humans are rational, autonomous (or self-governing) persons who deserve respect. And one of the basic ways of respecting a person is to respect the choices persons make.

As the President’s Commission for the Study of Ethical Problems in Medicine and Biomedical and Behavioral Research stated: “Informed consent is rooted in the fundamental recognition — reflected in the legal presumption of competency — that adults are entitled to accept or reject health care interventions on the basis of their own personal values and in furtherance of their own personal goals.”

In medical ethics, informed consent became the principal mechanism to prevent some of the most deplorable abuses of human rights: the Tuskegee Syphilis Experiment, the Skid Row Cancer Study, the Stanford Prison Experiment, the GlaxoSmithKline and U.S. Military hepatitis E vaccine study, and of course the Nazi Party’s medical experimentation and sterilization programs.

With these cautions and philosophical views of personhood in mind, informed consent became the cornerstone of medical ethics with the requirements that the patient (i) must be competent to understand and decide, (ii) receives full disclosure, (iii) comprehends the disclosure, (iv) acts voluntarily, and (v) consents to the proposed action.

These conditions came to be repeated, more or less, in every major bioethics document: the Nuremberg Code, the Declarations of Geneva and Helsinki, the 1979 Belmont Report, and the Universal Declaration on Bioethics and Human Rights. The Canadian Medical Protective Association document on informed consent says, for example, “For consent to serve as a defence to allegations of either negligence or assault and battery,…[t]he consent must have been voluntary, the patient must have had the capacity to consent and the patient must have been properly informed.”

By this standard, how many physicians in Canada were guilty of “negligence or assault and battery” by pushing Covid vaccination on their patients? For how many was the act of Covid vaccination truly voluntary? How many Canadians received full disclosure about the benefits and harms of wearing masks and locking down?

More generally, what if we had just asked more questions? What if we paused to think? What if we listened more than we talked? What if we worked our own way through the evidence instead of simply trusting the ‘experts?’ As it was, we masked enthusiastically, we locked down hard, and we lined up for hours to get our chance at a shot we knew little about. And amidst it all, there was an eerie absence of questioning and choice.


To understand how we got to where we are, it is helpful first to appreciate that informed consent is a relatively recent trend in the history of medicine. Two ancient ideas, which are now exerting a renewed pull on our healthcare system, helped to resist it for a long time.

The first is the idea that the physician or “expert” always knows best (what is referred to in healthcare as “medical paternalism”). The second is the related idea that the value of “the greater good” sometimes supersedes that of patient choice. Both allow that there are things of moral value that can, in principle, override patient choice. 

Dating back to Ancient Greece, the dominant trend in patient care was paternalism, which left little room for informed consent and even justified deception. For thousands of years, medical decision-making was almost exclusively the domain of the physician, whose responsibility it was to inspire confidence in his or her patients. It was the physician who decided whether to withhold a course of antibiotics, to consider a newborn with birth defects a stillbirth, or to give one patient rather than another access to surgery when resources were scarce. Even during the Enlightenment, when new theories of personhood framed patients as rational beings with the capacity to understand their medical options and make their own choices, deception was still felt to be necessary to facilitate patient care.

It wasn’t until the 1850s that English Common Law started to reflect worries about injuries incurred from surgery without proper consent. The courts increasingly interpreted a physician’s failure to provide adequate information to the patient about his or her treatment as a breach of duty. This trend culminated in the 1914 case of Schloendorff v. Society of New York Hospital, which was the first to establish that the patient is an active participant in the treatment decision process. The judge in the case, Justice Benjamin Cardozo, stated:

…every human being of adult years and sound mind has a right to determine what shall be done with his own body; and a surgeon who performs an operation without his patient’s consent commits an assault, for which he is liable in damages.

In spite of all of this progress on the autonomy front, informed consent lost its footing in recent years due to an increasingly impersonal health care system congested by a growing number of stakeholders (including public health agencies and the pharmaceutical industry), overworked clinicians, financial conflicts of interest, and shifts in moral and political ideologies. Gradually, almost imperceptibly, the traditional relationships of trust between particular physicians and patients wore thin, and the expectation of explicit consent gave way first to more tacit understandings of the concept and then to its near total erosion.

How could this happen? Why did we experience such a wholesale amnesia for the ethical framework that we had worked so hard to build? What could have made us abandon it all so quickly and so completely?

Scientism in the Age of Covid

It is said that ours is an age of entitlement, or at least that millennials — the “Me, Me, Me” generation — have an attitude of entitlement. Our culture caters and markets so fully to every whim that the desire to make our own choices is the last thing you might expect us to give up. So why did we give up on it?

I believe that the decline of informed consent has coincided not just with the specific events related to Covid-19, but more generally with the ascent of a particular scientific ideology called “scientism.”

It’s important to be clear that scientism is not science. In fact, it has very little to do with science, itself. It is an ideology, a way of viewing the world that reduces all complexities, and all knowledge, to a single explanatory approach. At its most benign, scientism offers a complete view of the human condition, appealing to science to explain who we are, why we do what we do, and why life is meaningful. It is a meta-scientific view about what science is capable of and how it should be viewed relative to other areas of inquiry including history, philosophy, religion, and literature.

Scientism has become so ubiquitous that it now influences every sphere of life from politics to economic policy to spirituality. And, like every dominating ideology that has imposed itself on the world, scientism has its own shamans and wizards.

The practical upshot of this is that, because scientism uses science to resolve conflicts outside its proper domain, conversations about whether it is right to disinvite an unvaccinated sibling from Thanksgiving dinner, for example, frequently devolve into the rhetorical “What, don’t you believe in science?”

The question assumes that science, by itself, can answer all relevant questions, including those about etiquette, civility, and morality. Hurt feelings, broken relationships, and moral missteps are all justified by appealing to the fact that the shunned individual excused herself from moral consideration by not following “the science.”

One particularly devastating feature of scientism is that it obliterates debate and discussion, ironically hallmarks of the scientific method. Think of the frequent invocation of “#Trustthescience” or even just “#Science” in social media communications, used not as a prelude to argument and the presentation of scientific evidence but as a stand-in for them, rendering alternative viewpoints impotent and heretical. 

Political scientist Jason Blakely identifies the locus of this feature of scientism as the “overextension of scientific authority.” As Blakely wrote in his cover story for Harper’s Magazine in August 2023, “scientific expertise has encroached on domains in which its methods are unsuited to addressing, let alone resolving, the issue at hand.” The fact that a microbiologist understands the elements of DNA is today taken, without question, to grant that person supreme authority in matters of morality and public policy.

The emergence in 2020 of a viral crisis, the proper domain of science, meant the overextension of scientific principles into the sociopolitical and moral domains, and therefore the suspension of all basic ways of treating one another. The assertion made by officials that the pandemic necessitated a specific policy response was a way of suppressing the more complicated ethical and political disagreements that underlay them. Having suspended our civility, Yale sociologist and physician Nicholas Christakis remarked, “We allowed thousands of people to die alone,” and we baptized and buried people by Zoom while the compliant dined out and went to Maroon 5 concerts.

As this transition unfolded, scientism’s fundamentalist nature was gradually being exposed. Having emerged as a reaction against what some perceived as dogmatic, often faith-based ways of viewing the world, scientism called for a return to science to unseat these purportedly “outmoded” systems of belief. But, in so doing, scientism demanded perfect adherence to its own orthodoxy, which ironically led to a resurgence of the paternalism that defined the dark ages of medicine.

A sign of this is the near perfect global homogeneity of the Covid response. If individual jurisdictions had been allowed to debate and develop their own Covid strategies, we would undoubtedly have seen more varied pandemic responses based on their unique histories, population profiles, and what sociologists call “local knowledge.” Communities with young families and university students, where the risk of Covid was low but the risk to mental health from lockdowns, closures, and distancing was high, might have opted for more minimal Covid policies.

A religious community might have accommodated more risks to attend worship services while commuter-belt communities could have more easily embraced work-from-home restrictions with little negative impact. Every Canadian community would have been allowed to wrestle with the scientific realities of a viral threat balanced against their own values, priorities, and demographics. And the result, varied as it surely would have been, would have created control groups that would have shown the relative successes of different strategies.

As it was, we had little opportunity to understand what things would have been like if we had acted differently, and therefore little opportunity to improve our strategies for the future. And, where those opportunities did exist (e.g. in Sweden and Africa), the responses didn’t register because they were simply assumed, as a matter of principle, to be unsuccessful for departing from the narrative.

As it was, the pandemic response ignored and silenced dissenters in all sectors of society: whistleblowing professionals, concerned parents, and hesitant citizens. We were simply informed of the ‘scientifically’ appropriate policy, and then nudged and pressured until we complied with it.

There was no attempt to engage with the population within the parameters of the pandemic restrictions; no outdoor town hall meetings, no phone polls or online referenda to increase engagement between public servants and those they were supposed to represent. I don’t think it would be an exaggeration to say that population lockdown without presentation of evidence, and without discussion and debate, meant not only the dissolution of representative government but the loss of any semblance of a robust democracy.

One thing that is crucial to understand about the effects of scientism on the Covid narrative is that those holding ‘correct,’ pro-narrative views were not as protected by those views as it seemed. Those who followed ‘the narrative’ enjoyed only the facade of respect because their views weren’t conspicuous in the landscape of conformity. The opinions of your friends who masked, distanced, and got boosted to the precise tempo of public health orders were only coincidentally acceptable. If the narrative had changed, those views would have become — and will become, if the narrative changes — immediately unacceptable, and their holders shamed and rejected. 

In all this we got so much so very wrong. As the philosopher Hans-Georg Gadamer observed, the chief task of a humanistic approach to politics is, first, to guard against “the idolatry of scientific method.” Science should inform public health policy, to be sure. But there are important differences between facts and values, the humility with which a scientist tests a hypothesis and the certainty with which a politician asserts a claim. And we must be careful not to conflate our obligations as citizens with our obligations as spouses, parents, siblings, and friends.

Furthermore, science offers no special insight into matters of ethical and political significance. There is no branch of science — no immunology or microbiology — that can determine what makes life meaningful, no way for scientists to prioritize the moral values we ought to have just as there is no scientific ‘key’ capable of unlocking answers to questions about what it means to be good and live well.

Your Choice

“Your.” “Choice.”

Who could have guessed, prior to 2020, just how controversial these two little words would become? Simple on their own but, put together, they create an affirmation of yourself, your worth and your abilities, and a declaration of your right to be the author of your own life. They give you the confidence to reflect, consider, question, and resist, and in so doing, make yourself and your place in the world.

To choose is not just to randomly opt for one option over another. It is not an act of indulgence nor is it selfish. It defines who and what we are, as individuals and as a people. In one act of choice, we bring to fruition a lifetime of self-development. In one act of choice, we become human.

As it is, our scientism has put us into a moral deficit that is destroying our own moral capacities and the moral bonds between us.

Though we think being scientific means leaving the insights of the humanities and social sciences behind, we forget that not even 200 years after the Scientific Revolution came the Enlightenment, the 17th- and 18th-century intellectual movement that asserted the natural and inalienable rights to life, liberty, and property, and especially personal autonomy and the capacity for choice. The capacity for choice was seen by Enlightenment thinkers not just to serve individual interests but to be able to produce societies that are more equitable and just, and unbeholden to the unchecked powers of misguided and corrupt leaders.

Unfortunately, the lessons of the Enlightenment didn’t stick. 

We find ourselves now in desperate need of a 21st century Enlightenment, a renaissance of informed consent and personal choice. Such a renaissance will mean the coexistence of choices that are different from one another, and therefore messy and varied. But, in being so, they will also be perfectly imperfect. They will be, as Friedrich Nietzsche wrote, “human, all too human.”


Source: Brownstone Institute Read the original article here: https://brownstone.org/