Earlier this week, in an interview with Drs Dean and Ayesha Sherzai, clinical psychologist Dr Jonathan Stea canvassed the problem of pseudoscience and the grifters out to confound the world of health and wellness.
Listening to it, I couldn’t help but be reminded of two ‘maxims’ I’ve quoted before – where there’s need, there’s greed, and the late Dr Hans Diehl’s wise observation: “there’s a pill for every ill and a bill for every pill.”
Stea is an expert in addiction and mental health. He is a passionate advocate for science-based practices and the use of the right words to combat misinformation, particularly on social media. He is an assistant professor at the University of Calgary and is keen to improve public mental health literacy through his writing and speaking engagements.
It’s a subject that fits well with the Sherzais. Dean considers misinformation one of the biggest challenges of our time, and in his field, “It’s not just from people that are not scientists, but even scientists and how they do research is becoming problematic,” he says.
Although not naming the study, he said, “We’ve seen that just recently with a dementia study where it was completely contrived. The results were almost engineered.”
Tough even for experts
Stea recognises the challenges here and says this was the point of his new book Mind the Science: Saving Your Mental Health from the Wellness Industry. “It’s hard even for experts in their narrowly defined niches to navigate scientific literature, so it’s hard for the general public to syphon through this stuff, especially on social media and in pop culture where it’s just so pervasive,” he says. “Even at the level of scientific journals, misinformation exists at so many different levels.”
But it was the pseudoscience that bothered him the most.
“We all know that scientific journals can vary in their quality. We have top-tiered journals like Nature or The Lancet, and then we have less-tiered journals. But there also exists an entire industry of pseudo-scientific journals.
“What I mean by this are journals that will publish on unequivocal pseudoscientific topics. For example, a randomised controlled trial evaluating whether someone sending positive vibes to water can make its ice crystals look more beautiful. Or another that entertains the idea that demonic possession is a causal factor in schizophrenia.
“These are published in journals dressed up as science; that’s what pseudoscience is. It looks very official, is peer-reviewed, and is on PubMed and in major databases. That, I think, is very dangerous for the general public.”
As Stea sees it, publication of this kind provides ideal opportunities for what he calls “wellness grifters” to draw from these repositories and claim they now have evidence to support their pseudoscientific treatments or ideas.
Ayesha Sherzai lights up on the subject of grifters and what they bring to health care.
“Neurology is still a growing field. There are still a lot of conditions we don’t know or have a definitive treatment for, and so when we come across patients who are experiencing these conditions, and they are suffering … well, it’s a mix of having a very inconvenient health care system as well as not having very well defined treatments or therapies for a condition.
“Unfortunately, while they’re on this journey, they are losing hope, and they are suffering and experiencing so much pain from their condition. They seek out alternative therapies, and usually, the verbiage that comes across from my patients is, ‘Well, it’s not going to hurt.’ Then you see these grifters and individuals who are essentially just making a profit out of the hopelessness their conditions bring with them.
“They sell them vitamins and concoctions and brain scans that don’t mean anything. You see them becoming almost behemoths in that field. They’re just taking space, not giving opportunities for true science, physicians and therapists who stick to the science to come across as legitimate experts in the field.
Grifters without conscience
“It becomes so confusing because these grifters are very well-versed in marketing as well. They know what to say and what testimonials to give. They bring in the right amount of amygdala stimulation and emotion, and basically this concoction is dangerous. It doesn’t provide any health benefit to the patients, and yet you see them growing and growing.”
For Dean Sherzai, the term “peer-reviewed” has become particularly unsettling because it is “so misused.”
“I was the director of research education for all the residents at Loma Linda, and when I would send papers with students to certain journals, they would say, ‘Send me a list of your peers who would review your paper.’ I’m like, oh wow, that’s a nice way to game the system. These are my friends. No matter what I write, they’re going to say this was a great paper. So, I see the flaws even in the peer-reviewed system.”
Overall, Dean sees that problem as part of increasing complexity “and whether we are designed cognitively to be able to manage greater and greater complexity. This is not putting down humans in general; it’s just the reality of a society that’s becoming complex structurally, yet cognitively, we are still palaeolithic man.”
The points raised in this discussion strike at the heart of the infotech-biotech revolution and raise the question: what on earth is happening here?
The problem of our ability to manage greater and greater complexity was dealt with in depth by historian Yuval Noah Harari in his book 21 Lessons for the 21st Century, where he proffered a similar warning.
Today, where the spread of misinformation is concerned, the internet has become one of our best but most misused tools, a fact that would hardly surprise people like Harari, who points out that humans have always been far better at inventing tools than using them wisely.
Complexity and confusion
“In the coming century,” Harari wrote, “biotech and infotech will give us the power to manipulate the world inside us and reshape ourselves, but because we don’t understand the complexity of our own minds, the changes we will make might upset our mental system to such an extent that it too might break down.”
For large sections of the public, however, the other options involve either complete disengagement or the construction of a whole new reality. For those gripped by influencer culture, that’s not hard to do. As Harari says, if we don’t want to work it out, we can always find a cat video to distract us.
It’s the bullshit factor that sticks in Stea’s craw the most.
When it comes to evaluating people’s susceptibility to conspiracy theories, he says, “there’s a variable called pseudo-profound bullshit. It was pioneered by Gordon Pennycook and relates to someone’s tendency to fall for superficially deep but vacuous language.
“For example, my ability to fall for the idea that I’m living in the midst of a high-frequency, blossoming quantum soup. That’s meaningless, but it sounds profound.”
On one level, it might sound amusing, but on the other hand, it seems sad when you consider events like the occupation of Parliament grounds in Wellington, New Zealand, where some people sold tinfoil hats apparently to protect themselves from dark government agencies reading their thoughts.
The bullshit asymmetry principle
Stea says the internet misinformation problem they were discussing can easily be likened to Alberto Brandolini’s bullshit asymmetry principle, which captures how much more effort it takes to debunk misinformation than to create it in the first place.
Stea: “The amount of energy needed to refute bullshit is an order of magnitude bigger than to produce it.”
It seems the biggest problem we all face when misinformation is regularly promulgated is that our brains are not wired to recognise and deal with it easily.
“One of the reasons misinformation can go viral so easily is that it speaks to our personalities and emotions. It can be entertaining at times but incredibly dangerous. We know from another psychological phenomenon known as the illusory truth effect that when misinformation gets repeated over and over and over, our brains don’t do so well in differentiating truthful information from familiarity. It has to do with the way our brains are wired.”
Stea says the people more likely to fall for misinformation claptrap are those who think intuitively rather than analytically.
The solution is to sit back, pause, reflect, slow down and back up, and really try to think about it. “That act, in and of itself, is protective.”