Welcome To The Postmodern World

At the airport last week, I heard a traveler opining that he was glad he didn’t get vaccinated for COVID-19 because, he said, it causes more harm than it prevents and it “isn’t even a vaccine.” These situations, for the scientifically literate, offer endless opportunity for humor, shock, dismay, and incredulity at the lack of rational thought among our fellow citizens.

We live now in a world of anti-vaxxers, Flat Earthers and voracious consumers of ghost-hunting TV shows: Ghost Hunters, Ghost Adventures and Paranormal Investigators are a few of the most popular of these shows, pulling in between 1.3 and 2.7 million weekly viewers in the U.S. alone. Tom Nichols published an excellent treatise on this called “The Death of Expertise,” and all of the above is why I’ve termed this current era we inhabit “Postmodern.”


What is Postmodernism?

To reduce this term to its root, let me explain a little bit about its history. Postmodernism was a philosophical movement that began in the late 1800s; the term itself was first used around 1870. It came into stronger use in the early 1900s, when it represented a skepticism toward the grand narratives of modern culture. It sought to label the state of the time as one in which the problems of the world had been resolved to such a degree that it became fashionable to reject clear-minded, rational thinking. Postmodernism distinguishes itself by rejecting universal truths and objective reality, often at the expense of the critical thinking skills of its practitioner.


Facts do NOT equal Truth

There’s a difference between Fact and Truth, and it’s more than just semantics. Facts are those things supported by data and evidence, whereas truth is a principle in which one must believe. I’ll quote my friend Neil deGrasse Tyson here, who says, “The good thing about science is that it’s true, whether or not you believe it.”

We principally DO science (as a verb) in order to elucidate those approximations of the universe within which we live, to give us these objective facts. Unfortunately, and largely because of social media over the past 15 years or more, public beliefs have become more about ‘Likes,’ word-of-mouth virality and opinion. This is why this current era is sometimes called “post-truth.”

So, if it’s true that facts don’t matter, we find ourselves in quite a predicament in this civilization. Whether we think about the Mesopotamians inventing the idea of the number Zero, which was first in use in roughly the third century BC; Muhammad ibn Musa al-Khwarizmi inventing what we know as Algebra (the term taken from the title of his book, Kitab al-Jabr); or James Lind pioneering a prescient idea for the first clinical trial, for scurvy, on board the HMS Salisbury, everything we know and do on a daily basis rests upon the pillars of knowledge (social epistemology) derived from past and current scientific research.

And, as with everything in the universe, chaos is more ubiquitous than order, and entropy always increases. So, it’s easier to “undo” our successes as a civilization than it is to add to and build upon them, easier to tear down an idea without evidence than it is to defend it with evidence. Pulling ourselves out of the intellectual darkness has required a tremendous amount of effortful study over millennia. But it’s been possible to undo it in just over a decade.


How do we know we’re in a postmodern world?

The evolution of society’s knowledge base requires agreement on what things are facts, and what are fiction. As John Adams observed, “Facts are stubborn things; and whatever may be our wishes, our inclinations, or the dictates of our passions, they cannot alter the state of facts and evidence.”

Similar to the goldfish, which (probably) doesn’t realize it’s in a fishbowl because that’s all it knows, how can we know that we’re now inhabiting a postmodern era when the fabric of everything around us is simply what it is? We’re embedded within it, so how do we know what it really is? By looking for simple pieces of evidence: rejection of expertise, rejection of facts and objective truth, and the proliferation of antiscientific, non-deductive knowledge frameworks are all strong hallmarks of, in fact, being part of a postmodern civilization.

There is a branch of philosophy, social epistemology, that concerns itself with how the evolution of knowledge occurs in a society: basically, how do all members of a civilization come to occupy and retain a similar worldview, based on the foundational knowledge elements built before them? That shared, defining worldview is what we call the Zeitgeist.

So, we occupy a particular place at a particular time in the evolution (or devolution, as the case may sometimes be!) of civilization, and the defining spirit or mood of this particular period is shown by the ideas and beliefs of the moment. Knowing we face such potent anti-knowledge headwinds, which insidiously find their way into political and cultural approaches, should keep us on our toes so that we don’t get swept up in the inertia of the zeitgeist.


How the pharmaceutical industry is susceptible to postmodernism

Any facet of study and any industry is susceptible to postmodernism, which challenges the very pillars and principles upon which its body of knowledge rests. As I mentioned in this column last month, Big Pharma is considered one of the Ultimate Evils of the world (the world outside of Pharma, that is). An example is the lay public challenging or disbelieving clinical trial results, even though they have no training in statistics, biology, toxicology, pharmacokinetics, or otherwise. They don’t let facts get in the way of a good viral anecdote! And actual, high-quality research demonstrates that, in the mind of an outside appraiser, it’s much easier to erode belief in a fact than it is to establish one.

The public distrusts the safety and efficacy of pharmaceutical therapies, a distrust reflected in record-setting consumer spending on wholly unproven and unscientific “natural” remedies and alternative treatments, which, by the way, led to the deaths of Steve Jobs and Steve McQueen, for example. The public also distrusts the cost of drug treatments, as well as the scientific acumen that exists within the pharmaceutical industry itself. A comment strongly supportive of this idea was recently posted on a news article about the U.S. Government appointing a few pharma execs to its panel in pursuit of more effective monkeypox treatments. The comment read, “Great, now Big Pharma is so far in the government’s pockets that the feds are going to recruit their ranks to help them study another fake disease that they can profit from. The inmates are running the asylum.” So deep is the fear and suspicion of knowledge that even the best and brightest for a particular role can be actively discredited.


What’s the antidote to anecdote?

Anecdotes come in all forms, and most of them are harmful. Anecdotes are small-sample-size nonsense. And remember, “The plural of ‘Anecdote’ isn’t ‘Data.’” This carries forward into the practice of medicine, where, in my estimation, the three most dangerous words that exist in clinical medicine are: “In my experience _____.” That statement is usually used clinically to usher in and defend a treatment suggestion that is unfounded, supported by roughly four patients who were never part of a rigorous randomized controlled study of the proposed treatment.

The only antidote to anecdote is to require more, and require better. Better data, better science, better scientific practice. The problem is, science itself can be seductive, and the practitioner can continue “science-ing” ad infinitum, always looking for more and more data. This becomes a separate challenge, and it’s also why science’s greatest strength is also its greatest public relations weakness: The general public wants “an” answer to a scientific question, not a range of answers with uncertainty and confidence intervals attached, and not answers that change over time. So, the self-refining, error-correcting nature of multiple scientific studies over time tends to contribute to the very public distrust that we seek to avoid in the first place.
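To make the sample-size point concrete, here is a minimal, illustrative sketch in Python (my own, not drawn from any study cited here). It uses a standard normal-approximation (Wald) confidence interval for a proportion to show why “it worked for my four patients” tells us almost nothing, while a trial-sized sample pins the estimate down; the response counts are entirely hypothetical.

    import math

    def wald_ci(successes, n, z=1.96):
        # Approximate 95% confidence interval for a proportion
        # (normal approximation; crude, but fine for illustration).
        p = successes / n
        half_width = z * math.sqrt(p * (1 - p) / n)
        return max(0.0, p - half_width), min(1.0, p + half_width)

    # An anecdote: 3 of 4 patients "responded."
    print(wald_ci(3, 4))      # roughly (0.33, 1.0): consistent with almost anything
    # A controlled-study-sized sample with the same observed rate: 300 of 400.
    print(wald_ci(300, 400))  # roughly (0.71, 0.79): a usefully narrow range

The width of that interval, not the headline number, is what separates an anecdote from evidence.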

It doesn’t help that some scientists placed in top positions of authority have abused their posts, or lacked a strong grasp of basic science in the first place. For example, in her new memoir, former White House coronavirus response coordinator Deborah Birx talks of using “flatten the curve guidance” to manipulate “political, nonmedical members” of the U.S. government. She also claims that her manipulation was to avoid a personal situation where “they would have campaigned to lock me down and shut me up.”

By the way, she also knew that “two weeks to slow the spread” was fiction, and she said, “No sooner had we convinced the Trump administration to implement our version of a two-week shutdown than I was trying to figure out how to extend it.” Birx and her team also gamed their documents continuously to keep the lockdown measures in place. She writes, “After the heavily edited documents were returned to me, I’d reinsert what they had objected to, but place it in those different locations. I’d also reorder and restructure the bullet points so the most salient—the points the administration objected to most—no longer fell at the start of the bullet points. Our Saturday and Sunday report-writing routine soon became: write, submit, revise, hide, resubmit.”

Combine this level of scientific malpractice and impropriety with the post-truth crucible of social media, which was already well in place, and science itself ends up with its reputation tarnished by both bad actors and bad tech platforms.

All that we can do to preserve our industry in this era of postmodernism is to demand objective facts and good evidence in the work that we do. Evidentiary standards cannot falter, because those evidence bases are the only things upon which a rational defense against unsubstantiated attacks can rest. It’s also important not to let social media opinions and anecdote extinguish the flame of knowledge. When we know things, we must say them and defend them; otherwise we’re complicit by omission in the scourge of anti-knowledge.


Ben Locwin is Contributing Editor and author of Clinically Speaking. He is a healthcare executive, philanthropist and helps with strategic direction and investments across the healthcare industry. Past work has included working on development and allocation of vaccines for COVID-19, nationally and internationally, as well as improvement of clinical practice and other patient-facing endeavors. He has been featured in top-tier media including The Wall Street Journal, Forbes, USA Today, Der Spiegel, Associated Press, and many more.
