Features

The Spread of Misinformation


Conspiracy theories about COVID-19 continue to proliferate online

When NFL’s Buffalo Bills safety Damar Hamlin collapsed suddenly on the field during a game against the Cincinnati Bengals on Jan. 2, 2023, Imran Ahmed braced himself for the inevitable response from the vaccine conspiracy theorists.

Mr. Ahmed, founder of the Center for Countering Digital Hate (CCDH), didn’t have long to wait. Even before the football player was put into an ambulance, social media began to buzz with posts claiming, without evidence, that complications from COVID-19 vaccines had led to his health emergency.

“This is a tragic and all too familiar sight right now: Athletes dropping suddenly,” tweeted Charlie Kirk, a far-right activist, conspiracy theorist and U.S. radio show host.

Marjorie Taylor Greene, the U.S. congressperson from Georgia, drew an even more direct link. “Before the covid vaccines we didn’t see athletes dropping dead on the playing field like we do now. And we never saw the CDC say things like this. How many people are dying suddenly? Time to investigate the covid vaccines,” she tweeted.

A defining characteristic of this pandemic has been the spread of misinformation — indeed the World Health Organization coined the term “infodemic” to describe the fast-moving distortion of facts about COVID-19. So, while the stark opportunism of the anti-vaccination conspiracy theorists may horrify Mr. Ahmed, he is no longer surprised by it.

The response to the football player’s collapse was straight out of the COVID-19 conspiracy playbook: exploit the absence of information surrounding the death or health emergency of a public figure — Bob Saget, Lisa Marie Presley, or even Queen Elizabeth, to name just a few — and apply the narrative that best suits one’s own purpose.

Unlike credible sources, the spreaders of disinformation capitalize on the information void. More than a week after Mr. Hamlin’s collapse, doctors still hadn’t determined what led to his health emergency.

“How do you falsify their information while the doctors still had not worked out what is wrong with him?” asked Mr. Ahmed, whose organization is dedicated to the disruption of the architecture of online hate.

“The tactic of overwhelming people with different types of lies and then forcing them to fact check it and saying, ‘you aren’t allowed to decide on the truth until you falsified our untruths,’ is problematic. It’s like throwing a spanner into the gears of a machine because it just grinds to a halt … It destroys the ability of people who are well-meaning to have productive discourse. Trying to falsify a non-falsifiable statement is like wrestling with a chimney sweep — you’re going to get covered in soot.”

Despite the efforts of scientists and health officials to combat disinformation, conspiracies about the vaccine continue to proliferate, including: it contains a microchip that allows Bill Gates to monitor our whereabouts; it causes infertility; it alters one’s DNA; and, of course, that it’s responsible for a growing number of deaths.

A sub-narrative of this last assertion has cut particularly deep into the heart of the Canadian medical community. A bogus theory — promoted by a small group of Canadian doctors — is that the vaccine played a role in the recent deaths of more than 80 Canadian physicians. Global News spent months investigating the deaths of the doctors on the list and found no connection to the vaccine. Where the media outlet could determine the most likely cause of death, it was most often cancer, suicide or cardiac arrest. One physician died climbing one of the world’s most dangerous mountains and at least one doctor was never vaccinated.

The Cost to the Canadian System

Conspiracy theories and incorrect scientific information about COVID-19 took root early in Canada. According to a survey conducted by Carleton University in May 2020, nearly half of Canadians (46 per cent) reported believing at least one of four prominent COVID-19 conspiracy theories and myths.

“The numbers are shocking because it highlights the degree to which misinformation has been normalized. It just washes over us,” said Professor Timothy Caulfield, a Canada Research Chair in Health Law and Policy at the University of Alberta.

But it can’t be emphasized enough that the core messages pushed by COVID-deniers or antivaxxers are simply wrong, said Prof. Caulfield.

And when they frame the scientific community’s refuting of their claims as an infringement on their freedom of speech? Utter nonsense, he says. “Their positions haven’t been silenced. Just the opposite, in fact. They have been considered, thoroughly researched and found to be incorrect.”

Prof. Caulfield is a member of an expert panel that authored a recently released report that laid out the costly consequences of misinformation. Commissioned by the Council of Canadian Academies, the report, called Fault Lines, states that as science and health misinformation become increasingly fused with ideology and identity, it contributes to deepening divisions in our society and takes an extraordinary financial and human toll across Canada’s communities and systems.

Considerable and mounting evidence shows that misinformation has led to illness and death from unsafe interventions and products, vaccine-preventable diseases, and a lack of adherence to public health measures, with the most vulnerable populations bearing the greatest burden, states the Expert Panel on the Socioeconomic Impacts of Science and Health Misinformation in its Fault Lines report.

Misinformation about COVID-19 is estimated to have cost the Canadian health care system at least $300 million in hospital and ICU visits between March 1 and November 30, 2021, states the report. This doesn’t include the cost of outpatient medication, physician compensation, or long COVID. It also does not include broader societal costs, such as delayed elective surgeries, social unrest, moral injury to health care workers, and the uneven distribution of harms borne by communities.

And if those who reported believing COVID-19 is a hoax had been vaccinated when they became eligible, over 2.3 million additional people in Canada would have been vaccinated, resulting in roughly 198,000 fewer cases, 13,000 fewer hospitalizations, and 2,800 fewer deaths from COVID-19 between March 1 and November 30, 2021, states the report.

Disinformation — The Operationalizing of Hate

During a recent meeting of CPSO Council, Mr. Ahmed discussed how digital spaces have been colonized and their unique dynamics exploited by malignant actors that operationalize hate to sow doubt and division.

Mr. Imran Ahmed

Mr. Ahmed became fascinated with the dynamics of hate and how disinformation could be weaponized during the events of 9/11. He was a young investment banker in England at the time — having earlier dropped out of medical school — and decided to study social and political sciences at the University of Cambridge. After graduating, he worked for the British Labour Party in Opposition.

During this period, he was becoming increasingly uncomfortable with the toxicity proliferating online — specifically, the hatred being spewed against Jews, Muslims and Blacks. And then a young Labour MP, with whom he worked very closely, was murdered. Jo Cox was killed by a white supremacist who appeared to have been radicalized online. Ms. Cox was a Remainer in the campaign leading to the 2016 referendum on the United Kingdom’s membership in the European Union. According to eyewitnesses, her attacker yelled “Britain First” as he stabbed her.

Mr. Ahmed immediately recognized his colleague’s killing as proof of social media’s power to galvanize real-world violence. Her murder, coupled with his unease about the increasingly malignant hatred appearing online, changed the course of his life — he founded the CCDH shortly thereafter and committed himself to countering online hatred.

Mr. Ahmed now lives in Washington, D.C., and since CCDH’s inception, his small team has produced extensive reports on how disinformation has been weaponized to encourage misogyny, antisemitism, and hatred against Muslim, Black and LGBTQ+ communities; to deny climate change; to claim election fraud; and to undermine Ukraine’s war effort. Every day, Mr. Ahmed finds a new atrocity in the online harm landscape, but he has particular scorn for anti-vaccine conspiracy theorists and the extraordinary toll their disinformation has taken on society.

“The net result is that, so far, more than 300,000 unvaccinated people in the U.S. have died of COVID and that was avoidable,” he told Dialogue. “Those deaths happened after the vaccine became available and every one of those people died a horrible death. People continue to die horrible deaths — right now someone, in their last few moments on Earth, is telling their doctor, ‘I thought the vaccine would harm me.’ I know the impact these deaths have on families, on ICU staff, the physicians, nurses, students, the cleaners, everyone is impacted by it. That scale of death — it’s just so overwhelming to me.

“I look at it this way. On 9/11, fewer than 10,000 people died and we spent trillions of dollars fighting wars in the Middle East and in South Asia. And then I look at our response to 300,000 Americans being killed in the pandemic because of disinformation about vaccines. How on Earth is our response to say, ‘Hey Facebook, Twitter, Google — carry on as you were. You are doing great!’ ” he said.

The Big Chill: Silencing Science

In tallying the costs incurred by COVID-19 disinformation, one must count the toll exacted on the scientific and medical communities through intimidation. Dozens of researchers, for example, told the journal Nature they received death threats, or threats of physical or sexual violence, after speaking about COVID-19 and advocating for vaccines on social media.

“It is that prevalence of abuse — the ease with which trolls and malignant actors can terrorize good people — that will see our societies begin to atrophy,” said Mr. Ahmed. “It will lead to unnecessary death with tragedies occurring because good people will think, ‘Why on Earth would I ever be a vaccinologist?’”

In a March 2021 report, the CCDH identified 12 people responsible for the bulk of the misleading claims and outright lies about COVID-19 vaccines that proliferate on Facebook, Instagram and Twitter. The report, “The Disinformation Dozen,” found these activists produce 65 per cent of the anti-vaccine lies on social media platforms. Leading the pack of the Disinformation Dozen is Dr. Joseph Mercola, a Florida-based osteopathic physician who peddles dietary supplements and false cures as alternatives to vaccines. According to testimony he provided in court, he has a net worth of more than $100 million (U.S.).

Big money can be made in disinformation, and it is that pursuit of profit that is keeping Facebook, Twitter and Google from banishing the bad actors, said Mr. Ahmed.

“The game is about eyeballs: how many eyeballs you can get on your site,” Mr. Ahmed explained. The longer platforms like Facebook and Twitter can keep users scrolling, the more advertisements they can serve up. And providing content that promotes division and controversy — at the expense of facts and truth — increases the tension and keeps people engaged.

The absence of laws that would force big tech to become responsible stewards of their technology has put us on a terrifying trajectory of chaos, said Mr. Ahmed.

The laws that seek to regulate internet communications, for the most part, were created before social media companies existed. As a result of this permissive regulatory environment, tech companies have been emboldened to adopt aggressive, profit-driven business strategies. Mr. Ahmed references a Facebook memo in which co-founder and CEO Mark Zuckerberg described his company’s “move fast and break things” maxim.

“That has meant that social media companies are free of normal negligence law — the only industry in the world that cannot be held liable for negligence. That is a real problem,” said Mr. Ahmed. “That, I would say, is the source of our problem.”

He said if the big tech companies could do just one thing to demonstrate good faith, it would be to quietly enforce their own rules. But even when instances of hate are reported to platforms using their own tools, nine out of 10 times they fail to take it down. This includes posts promoting the Great Replacement conspiracy theory, violating pledges they made in the wake of the 2019 Christchurch Mosque attacks. That conspiracy theory inspired the Christchurch attacks, as well as the Tree of Life Synagogue attacks in Pennsylvania and probably many others, including last year’s mass shooting in Buffalo.

CCDH has lobbied governments for change and promoted its STAR Framework, which sets out a plan to reset our relationship with technology companies and collectively legislate to address the systems that amplify hate.

In May 2022, Mr. Ahmed spoke to a committee of MPs at the Canadian House of Commons about the need to put laws in place. Member of Parliament Peter Julian (NDP) has introduced Private Member’s Bill C-292, which aims to promote algorithm transparency. But Mr. Ahmed appears not to have been encouraged by the leadership demonstrated by Canadian politicians, to date, against digital hate. He says too many appear to be terrified of being trolled.

Yet, voters appear to want laws enacted that protect them from disinformation when they use social media. Public polling in the U.S. and U.K. has overwhelmingly found that most people believe no one should have the right to pump disinformation into millions of households when that information causes immense harm.

Mr. Ahmed says he believes we are at a pivotal moment. “We have a question that needs to be answered now. Do we want to let Facebook decide that we never go back to normal? Because that’s what is happening right now … Our prosperity is being challenged every day by the prevalence of dangerous, fringe elements elevated to the centre by the weird mechanics of algorithmic amplification on platforms that now control what we see and how we think.”

The authors of the Fault Lines report agree that we are at a vulnerable moment in our history, with societal trends of declining trust, increasing polarization, and the delegitimization of our knowledge institutions, just as social media is pulling us all even more tightly, uncomfortably, together with its astonishing powers of connectivity.

“The enormity of the misinformation problem can feel overwhelming and impossible, but we cannot afford to turn away. The future health and well-being of people in Canada, and around the world, depend on our recognizing and responding to science and health misinformation today,” it stated.

For more, listen to our “In Dialogue” podcast with Mr. Ahmed: Health Disinformation — Consequences & Solutions.


How to Debunk

As you scroll through your Twitter feed, you see a tweet asserting “facts” about COVID-19 vaccines that bears no resemblance to science. Should you correct it?

Yes, says Professor Timothy Caulfield, who has been studying and writing about misinformation since well before the emergence of COVID-19. In fact, he says fighting the spread of misinformation should be viewed as a critical priority for health and science policy.

“I do think we should all correct misinformation when we see it,” says Prof. Caulfield, who uses “misinformation” as an umbrella term that includes disinformation. “And physicians, particularly, are such trusted voices that I think they, especially, need to correct misinformation when they see it.” Prof. Caulfield is a Canada Research Chair in Health Law and Policy, a Professor in the Faculty of Law and the School of Public Health, and Research Director of the Health Law Institute at the University of Alberta.

That, however, does not mean engaging directly with the person who tweeted. “I don’t think it’s worth engaging with trolls and misinformation mongers because that is what they want. It just gives their false claims oxygen and a broader platform.”

But what if it looks like someone is just asking an honest question? Is it safe to engage then? Prof. Caulfield still counsels caution. “The problem is that a lot of misinformation mongers have gotten good at asking what looks like an honest question. And — I’ve fallen for this too — once you engage, it becomes clear that it’s a trap and they just wanted to get in your feed.”

Prof. Caulfield says when he sees incorrect information, he takes a screenshot of the misleading tweet or post and shares it with his correction clearly added.

He also thinks it’s important to move quickly to falsify a particular assertion before it is able to get a foothold. “We need to correct things quickly and clearly before it takes on an ideological spin, before it becomes part of in-group signaling for a particular community. We need to have clear messaging that destabilizes their positioning, because it doesn’t take much for something to reach a tipping point and become an ideological flag.”

In his research paper, “Does Debunking Work? Correcting COVID-19 Misinformation on Social Media,” Prof. Caulfield writes that the data surrounding effective debunking strategies “is messy and context-dependent. More research on how best to deal with misinformation is clearly needed, but there is little doubt that countering misinformation can have a positive impact.”

In fact, he writes that silence in the face of misinformation may be the very worst strategy. A 2019 study, for example, found that not responding to misinformation “has a negative effect on attitudes towards behaviours favoured by science.”

In his paper, Prof. Caulfield puts forward some evidence-informed principles that physicians can adopt to help counter misinformation:

  • Use facts. Most studies have found providing corrective information can be effective. When appropriate and possible, provide a causal explanation. Prof. Caulfield writes that this approach can also nudge people to generally think more critically, which may help to shield them against related forms of misinformation.
  • Make the facts the hook, not the misinformation. Frame debunking in a manner that makes the correct information — not the misinformation — the memorable part of the messaging. Make sure the misinformation is clearly flagged as wrong.
  • Provide clear, straightforward and shareable content. Studies have shown that the use of scientific jargon will cause people to disengage, even if explanatory language is also provided.
    To that end, Prof. Caulfield helped found the ScienceUpFirst website for people who want to cut through the noise of misinformation with science. The website is a national initiative that works with a collective of independent scientists, researchers, health care experts and science communicators to produce content that is useful, accessible, shareable and designed to amplify science-informed messaging.
  • Use trustworthy and independent sources. Evidence perceived to be removed from an agenda, especially a profit-driven agenda, is more likely to be trusted and persuasive.
    If applicable and available, emphasize the scientific consensus. If appropriate, acknowledge that science evolves and, as such, the consensus can change.
  • Be nice. An aggressive language style intended to shame is perceived to be less credible and less trustworthy.
  • Don’t play to the hard-core denier. It’s a waste of energy. The World Health Organization has stated the likelihood of changing the mind of a vocal science-denier is close to zero.

Prof. Caulfield says, in general, most people want to be told the truth. There are still enough Canadians in “the movable middle” who will respond positively to scientific reasoning when it is presented to them. “You won’t convince everyone of course, but anything that helps move the needle when it comes to refuting something as problematic as health disinformation is a good thing.”

Running Afoul of Regulatory Bodies

When physicians choose to disseminate misinformation online, medical organizations charged with the responsibility of protecting patients can’t afford to look away. 

Those physicians who choose to promote narratives unsupported by evidence create a din of doubt and confusion, making it harder for the actual science and public health messages to break through the noise.

Dr. Richard Baron, President of the American Board of Internal Medicine, explained it this way: “Medicine has a truth problem. In the era of social media and heavily politicized science, ‘truth’ is increasingly crowdsourced: if enough people like, share or choose to believe something, others will accept it as true,” he wrote in an editorial in the New England Journal of Medicine.

“Growing allegiance to crowd-endorsed ‘facts’ poses a serious challenge for the institutions and structures that the medical enterprise has developed to protect the public and ensure that people can tell who can or cannot be trusted as medical professionals, or relied on for scientific knowledge.

“We, physicians, need to use the institutions we’ve created for professional self-regulation to maintain public trust by establishing some recognizable boundaries,” he wrote.

Indeed, in July 2021, the Federation of State Medical Boards, the umbrella organization of U.S. state and territorial licensing boards, issued a policy statement warning that “Physicians who generate and spread COVID-19 vaccine misinformation or disinformation are risking disciplinary action by state medical boards, including the suspension or revocation of their medical license.”

Addressing arguments that reject scientific evidence and “seek to rouse emotions over reason” has also been a priority for CPSO. In a letter to the membership in the spring of 2021, the College stated it was concerned about the behaviour of physicians who were publicly contradicting public health orders and recommendations online. The statement recognized the important role physicians can play by advocating for change in a socially accountable manner, but noted that their unique position of trust in society comes with a professional responsibility to not communicate unsupported anti-vaccine messaging.

“Physicians who put the public at risk may face an investigation by CPSO and disciplinary action, when warranted. When offering opinions, physicians must be guided by science, the law, regulatory standards, and the code of ethics and professional conduct.”

The statement, CPSO said, was not intended to stifle physicians from engaging in healthy public discourse about measures aimed at addressing public safety during the pandemic, but to encourage messages based on peer-reviewed, scientifically validated information.

The recently released Fault Lines report states that while the damage caused by misinformation about health care interventions is most visible and immediate when it negatively affects individual health care decision-making, there are also more insidious impacts on the erosion of trust and relationships among patients, health care providers, and the wider health care system. “This trust is already fragile or severely eroded in some groups, especially those dealing with the effects of colonialism, systemic racism, or other forms of exclusion,” it states.