Tuesday, February 18, 2025

Understanding confirmation bias, and why it keeps beating us up

For a moment, I felt a twinge of something I couldn’t quite identify when I recently deleted a LinkedIn follower from the low-carb fraternity. I like to keep my LinkedIn contacts broad because, normally, I’m keen to hear a wide variety of views. Still, after another round of preachy bashing on the ‘benefits’ of low-carb intake, I was over it.

The outcome was a clear decision, certainly, but I couldn’t help but wonder if I’d just fallen victim to the problem so many of us seem to be infected with these days: confirmation bias.

Perhaps I should start with an explainer and upfront declaration of where I stand before we plumb the depths of this subject.

Firstly, I’m not so much opposed to low-carb advocates as I am against carb confusion generally. That’s the camp my previous LinkedIn contact was clearly in. He couldn’t (or wouldn’t) distinguish between highly processed carbs and complex carbohydrates: potato chips, say, as compared to the whole potato. To him, anything carb-based was the source of all evil; there was no distinction.

He even disputed the AI overview I put forward, which states that complex carbohydrates “are a more stable source of energy than simple carbohydrates, which are quickly digested and cause blood sugar levels to spike.” Anyway, I’ll come back to how AI fits into this later.

The Diet War bias

Secondly, I suspect that the problem of confirmation bias sits at the source of the diet wars we’re fighting out there, with the low-carb sector claiming that high-carbers (like me) are living on information that’s 10 years out of date, and our lot pointing out that doctors don’t prescribe beef steak as a cure for heart attacks.

Thirdly (and I probably just demonstrated it): yes, I live in the high-carb camp and am happy to be here, but I like to keep an ear out for what ‘the others’ are saying. The fact is, though, I could be just as biased as the camp I’m complaining about. Neither side is ever exempt from the problem of confirmation bias. But what is it?

Technically, confirmation bias is a cognitive phenomenon in which individuals favour information that aligns with their pre-existing beliefs or hypotheses while disregarding or minimising evidence that contradicts them. It manifests in various contexts—from everyday decision-making to scientific research—shaping how we interpret data and form conclusions.

At its core, confirmation bias can be broken down into two main processes: selective exposure and biased interpretation. Selective exposure occurs when individuals actively seek information supporting their views while avoiding contradictory evidence.

Biased interpretation, on the other hand, involves processing ambiguous information in a way that reinforces existing beliefs. For example, if someone believes that a particular diet is effective, they might only read success stories about it while ignoring or downplaying studies showing its ineffectiveness.
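The interplay of selective exposure and biased interpretation can be sketched in a toy simulation (a hypothetical illustration, not drawn from any study): if a person weights confirming evidence more heavily than disconfirming evidence, a mild initial leaning hardens into a strong conviction even when the evidence itself is evenly split. The update rule and weights below are assumptions chosen purely for illustration.

```python
import random

def update_belief(belief, evidence_supports, confirm_weight=1.0, disconfirm_weight=0.3):
    """Nudge a belief (a number from 0 to 1) toward the evidence,
    but give evidence that contradicts the current leaning less
    weight -- the 'biased interpretation' step."""
    agrees = evidence_supports == (belief >= 0.5)
    weight = confirm_weight if agrees else disconfirm_weight
    target = 1.0 if evidence_supports else 0.0
    return belief + weight * 0.1 * (target - belief)

random.seed(42)
belief = 0.6  # a mild initial leaning toward the claim
for _ in range(200):
    supports = random.random() < 0.5  # the evidence itself is a 50/50 coin flip
    belief = update_belief(belief, supports)

print(round(belief, 2))  # settles well above 0.5, despite balanced evidence
```

With equal weights the belief would hover around 0.5; the lopsided weighting alone is enough to manufacture conviction.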

The mechanisms

Delving more deeply, I found that the roots of confirmation bias can be traced to various psychological and social factors. A significant driver is cognitive dissonance, a psychological discomfort experienced when one’s beliefs clash with new evidence.

To resolve this discomfort, individuals often reinterpret or reject the new evidence to maintain consistency with their current viewpoint. This self-justificatory mechanism makes it easier to embrace information that aligns with one’s beliefs while dismissing conflicting data.

Another contributing factor, and a big one for many, is social identity. People often derive a sense of belonging and identity from their beliefs, particularly in political or religious contexts. This identification can lead individuals to become defensive when encountering dissenting opinions. Rather than reevaluating their stance, they may become more entrenched in their beliefs, further perpetuating confirmation bias.

As Dr Francis Collins says in his book The Road to Wisdom, a dissertation on truth, science, faith and trust, it’s a “tribal thing”.

I like Collins because he provides a kind of microbiome-type take on the subject, which appeals to me. After all, in the world of WFPB (i.e. my world), the microbiome is a really big deal. Note how that tribal thinking can be so easily appealed to. Collins says:

“‘Trust your gut’ might at times be a reasonable starting place, but it’s the most likely to be coloured by cognitive bias, so it should almost never be the endpoint. Listen to your gut, but then try to verify it. Trust deserves time, exploration of facts, and reflection. That reflection should include a serious effort to weigh the factors that are feeding into the deliberation, though not all of those are conscious.”

Unfortunately, these days, and probably because of the internet, we’re far more in danger of trusting those with marginal competence but shared tribal values.

Publication bias

Then, as Collins further explains, there is also the problem of publication bias, “where poorly designed studies that seem to show an association with a bad outcome are much more likely to be published than well-designed studies that show no relationship.”
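Collins’ point about publication bias can be made concrete with a small simulation (an illustrative sketch, not taken from his book): run many studies of an effect whose true size is zero, ‘publish’ only the ones whose results look impressive, and the published record reports a sizeable effect that doesn’t exist. The sample size and publication threshold are arbitrary assumptions.

```python
import random
import statistics

def run_study(n=30, true_effect=0.0):
    """One simulated study: the observed mean effect across n noisy
    measurements of an effect whose true size is zero."""
    return statistics.mean(random.gauss(true_effect, 1.0) for _ in range(n))

random.seed(1)
observed = [run_study() for _ in range(2000)]

# 'Publish' only the studies whose observed effect looks impressive.
published = [e for e in observed if abs(e) > 0.35]

print(round(statistics.mean(observed), 3))                   # close to the true value: zero
print(round(statistics.mean(abs(e) for e in published), 3))  # the 'literature' sees a real-looking effect
```

Averaged over all the studies, the effect is essentially zero; averaged over only the published ones, it looks real. That gap is publication bias.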

That’s clearly a big problem for the “legacy press”, as Mark Zuckerberg calls it, but how can we trust him when he decides to dispense with fact-checking on Facebook, Instagram and Threads in a bid to curry favour with the new White House regime?

If you’re anything like me, you may have noticed that, for a while now, things have been going a little crazy, and this ‘dizziness’ encompasses much more than the ongoing battle over food, food additives, political persuasion, and general discontent over almost anything you can name. Or so it seems.

You can imagine my relief when, over the holidays, I discovered one word that explains it all: enshittification. This term, according to its creator, Cory Doctorow, is a theory about what happens when you have power without consequence. He sees it as a kind of insidious, creeping process that eventually makes people feel powerless and angry.

Room for hope

For Doctorow, there’s room for hope because “people are so pissed off about monopolies” and “if they can make a coalition, it’ll be like when we discovered the word ‘ecology’ in the ’70s — we realised that just because you care about owls and I care about the ozone layer, it doesn’t mean we’re not caring about the same thing.”

In other words, despite our tribal divisions, there’s a certain unity in both our diversity and discontent.

“Every time you see the world change all of a sudden, it’s because a new coalition has popped up,” he told CNN.

What we can clearly observe right now is that we’ve got a lot of shouting on both sides of the fence, and it doesn’t seem to matter whether the facts are right or not. We’ve separated into tribal camps and seemingly lost the willingness to even consider that we might be biased. With this in mind, I offer the following thoughts.

How to fight it

Recognising the existence of confirmation bias is probably the first step toward mitigating its effects. Here are some strategies you might employ to reduce your susceptibility:

1. Seek Diverse Perspectives: Actively seek out information and opinions that challenge your beliefs. Engaging with a variety of sources can provide a more balanced view and help identify potential flaws in your reasoning.

2. Practice Critical Thinking: Cultivating a habit of critical thinking involves questioning your assumptions and analysing evidence rigorously. Instead of accepting information at face value, consider the credibility of the sources and the strength of the arguments presented.

3. Embrace Contradictory Evidence: Make a conscious effort to consider and understand evidence that contradicts your beliefs. This can provide insight into the complexity of issues and help refine your understanding.

4. Engage in Constructive Discussions: Participate in discussions with individuals who hold different viewpoints. Constructive dialogue can illuminate potential biases and encourage a more nuanced perspective.

5. Limit Echo Chambers: Be aware of environments (like social media influencers or specific news outlets) that reinforce your beliefs. Engaging with diverse media can help break down echo chambers that perpetuate confirmation bias.

6. Reflect on Motivations: Consider why you hold specific beliefs. Reflecting on the motivations (yours and theirs) can help identify biases and encourage openness to new information.

Fact, fiction or opinion

There’s a considerable difference between what we can class as fact, what might be considered fiction, and what is nothing more than an opinion. The confounding factor here is that opinions can be built from fiction and, if repeated often enough, become ‘fact’ for those who hear them continually.

Historian Professor Yuval Noah Harari, a vegan and self-declared atheist, maintains that the Bible, the Koran and the Vedas are all works of fiction, “like Harry Potter,” he says. It’s a position which, despite his enormous following, hasn’t made him popular with Christians, Muslims or Hindus, who dismiss this view as nothing but misguided grandstanding designed to rack up clicks.

Dr Francis Collins, on the other hand, very clearly illustrates how we can distinguish fact from opinion. In his view, a fact might be something like this: the speed of light in a vacuum is exactly 299,792,458 metres per second (approximately 300,000 km/s). This is a fundamental constant in physics and forms the basis of Einstein’s theory of relativity. Newton’s law of universal gravitation would be another example.

By contrast, an opinion is someone saying, ‘I think cats make better pets than dogs.’ One you can clearly prove with rigorous testing and checking; the other will always remain a matter of opinion.

As mentioned above, not all our considerations are “conscious.” For instance, I’ve already declared my own bias and mentioned where Harari sits, and if you aren’t already aware, does it affect your view of Dr Collins to know that he was Dr Anthony Fauci’s boss over the time the Covid vaccine was developed and is also an evangelical Christian?

Confirmation bias in science

While science is often viewed as an objective pursuit of truth, confirmation bias can infiltrate scientific research. Like all individuals, researchers are susceptible to biases that can influence their hypotheses, data interpretation, and even the design of experiments. As I understand it, it works like this:

1. Hypothesis Generation: Scientists may unconsciously formulate hypotheses based on their beliefs or prior findings. As a result, they might overlook alternative explanations or fail to consider hypotheses that contradict their existing theories.

2. Data Interpretation: When interpreting results, researchers may favour interpretations that align with their hypotheses. This bias can lead to selective reporting of results (where only supportive data is published), thereby skewing the scientific literature.

3. Peer Review and Publication: The peer review process is intended to ensure the quality and integrity of scientific research. However, reviewers may also exhibit confirmation bias. Papers that align with popular theories or prevailing beliefs may receive more favourable evaluations, while those challenging the status quo might face harsher scrutiny.

4. Reproducibility Crisis: Confirmation bias can contribute to the broader reproducibility crisis in science, where many studies fail to produce consistent results upon re-evaluation. Influenced by biases, researchers might not replicate studies adequately or may overlook important variables.

Preventative steps

Science does, however, take certain preventative steps to mitigate the problem. The following points explain how.

1. Replication Promotion: Encouraging the replication of studies by independent researchers can help validate findings and ensure that results are robust against biases.

2. Open Practices: Sharing raw data and methodologies can allow other researchers to scrutinise and replicate studies, helping to uncover any biases in the original research.

3. Diverse Research Teams: Promoting diversity within research teams can introduce varied perspectives and reduce the likelihood of groupthink and confirmation bias influencing research outcomes.

4. Training in Scientific Rigor: Programs aimed at educating researchers about cognitive biases and encouraging critical thinking can equip scientists with tools to recognise and counteract their biases.

5. Protocols for Bias Awareness: Establishing guidelines that encourage researchers to state their hypotheses explicitly and how they might select data can encourage awareness of potential biases from the outset.

Essentially, confirmation bias is a pervasive phenomenon that significantly influences how all of us perceive and interpret information. From everyday decision-making to the rigour of scientific research, its effects can be profound.

Can AI overcome this?

You might think AI will eventually help mankind overcome this problem. However, research released before Christmas 2024 from the University of British Columbia suggests otherwise.

While AI chatbots may seem like neutral tools, researchers found that they often contain biases that can shape discourse in unhelpful ways. In this case, the research team examined how four leading AI chatbots responded to questions about environmental issues—the findings are surprising.

“It was striking how narrow-minded AI models were in discussing environmental challenges,” said lead researcher Hamish van der Ven, an assistant professor in the faculty of forestry who studies sustainable business management.

“We found that chatbots amplified existing societal biases and leaned heavily on past experience to propose solutions to these challenges, largely steering clear of bold responses like degrowth or decolonisation.”

Alarmingly, they found that the chatbots leaned heavily on Western scientific perspectives, marginalised the contributions of women and scientists outside of North America and Europe, largely ignored Indigenous and local knowledge, and rarely suggested bold, systemic solutions to problems like climate change.

The researchers hope the findings will encourage AI developers to prioritise transparency in their models.

 “A ChatGPT user should be able to identify a biased source of data the same way a newspaper reader or academic would,” Dr van der Ven said.

Gosh, did I hear that correctly, or was there an echo somewhere?

Peter Barclay
http://www.wholefoodliving.life
Peter Barclay has a professional background in journalism, photography and design. He is a passionate Kiwi traveller and an ardent evangelist for protecting all the good things New Zealand is best known for. With his wife Catherine, he is also the co-owner of Wholefoodliving.