‘In Dialogue’ Episode 14: Health Disinformation — Consequences and Solutions

April 2023
Reading Time 79 min.

In episode 14 of “In Dialogue,” CPSO Council President Dr. Rob Gratton speaks to Mr. Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, about how health disinformation spreads, attempts to displace scientific authorities in public discourse, the negative impacts of disinformation on society, and how to best address it.

https://soundcloud.com/cpso_ca/episode-14-health-disinformation-consequences-solutions
Also available on Apple Podcasts, Google Podcasts, and Amazon Music.

Mr. Ahmed is an authority on social and psychological malignancies on social media, such as identity-based hate, extremism, disinformation, and conspiracy theories. He regularly appears in the media and documentaries as an expert in how bad actors use digital spaces to harm others and benefit themselves, as well as how and why bad platforms allow them to do so. He holds an MA in Social and Political Sciences from the University of Cambridge. Listen to Mr. Ahmed describe the tragedy that inspired him to start the Center in his introduction to CPSO Council.

The Center for Countering Digital Hate opposes hate and disinformation by disrupting the online architecture enabling its rapid, worldwide growth. Its work encompasses research, campaigns and policy to increase the economic, reputational and political costs of all parts of the infrastructure — the actors, systems and culture — that support and often profit from this negativity. The Center works with academics and practitioners in diverse fields to develop strategies that strengthen tolerance and democracy, and counterstrategies to new forms of hate and disinformation.

CPSO has made addressing health misinformation a priority, and is committed to sharing the information and solutions we learn along the way.


Interview with Mr. Imran Ahmed

Introduction:
CPSO presents “In Dialogue,” a podcast series where we speak to health system experts on issues related to medical regulation, the delivery of quality care, physician wellness, and initiatives to address bias and discrimination in health care.

Dr. Rob Gratton (RG):
Welcome to “In Dialogue.” I'm Dr. Rob Gratton, an OBGYN practicing in London, Ontario and the 2023 CPSO Council President. We're pleased to be joined today by Mr. Imran Ahmed, the founder and CEO of the Center for Countering Digital Hate [CCDH]. Mr. Ahmed is an authority on the social and psychological malignancies on social media, such as identity-based hate, extremism, disinformation and conspiracy theories.

This is an important issue affecting health care providers and institutions all over the world. As a medical regulator, our goal is to shine a light on some of the ways in which health misinformation can affect the care provided by physicians, the doctor-patient relationship, and the well-being of healthcare providers. And, ultimately, shine a light on the effect of disinformation on the health and well-being of the people that we serve.

Welcome, Imran. Thanks so much for joining us today.

Mr. Imran Ahmed (IA):
It's good to be with you, Rob.

RG: At present, it feels as if we're encountering disinformation everywhere — online, on TV, in the news. What are the key sources of health and science disinformation, and how is it spread?

IA: The truth is that the prevalence, the sheer scale of health disinformation is driven by a number of factors. One of them is just people who are misinformed communicating with each other on social media platforms, sending each other misinformation. The second is very organized actors who spread disinformation for ulterior motives. And they're the ones who CCDH have really focused on in the past few years, particularly in the pandemic. Those are the people who profiteer by undermining faith in science-based mechanisms, in physicians, in the scientific establishment.

But there's a third set of reasons why we see this increased amount of disinformation, and that's because the primary means by which we now share information and negotiate what we believe collectively is social media. And social media is a space in which not all information is given the same visibility. Everyone can post there, but what gets the most amplification is content that is — in their terminology — “engaging”: stuff that basically gets people talking, that induces an emotional reaction, that quite often triggers our desire to defend science. So, disinformation — because it's erroneous, because it's surprising, because it's often conspiracist in nature — drives an emotional reaction, but it also drives the corrective reaction. And quite often that engagement is what ensures disinformation is not just produced, but also amplified into millions of timelines around the world.

RG: Thanks. I just want to pick up on something you mentioned in your work — you speak of “the anti-vaxx industry” and I wonder if that speaks a little bit to what you've been telling us about some of the motivations and drivers that are behind this misinformation?

IA: You know, for some time now, we've known that anti-vaxx is an industry, all the way back to Andrew Wakefield, when he was producing his paper in the late 90s, in England, on the MMR vaccine (the measles, mumps, rubella vaccine) and its purported link to autism. And we know now that he was working with personal injury lawyers to promulgate this notion that there was a linkage between the two, so they could take action on behalf of parents of kids with autism. And from then, there has been this real strong linkage between disinformation about vaccines and essentially profiteering. But there is an even older precedent in the snake oil salesmen who would go around towns and say, “Hey, look, I've got this miracle cure that can cure your aches and your pains. It can cure your warts. It can cure everything! All you got to do is give me $20 and I'll give you this magical vial of good stuff.” And the modern anti-vaxx industry is just the same thing. It uses theatricality, disinformation, outright lies in order to sell fake cures — and there are people like Joe Mercola, who's been repeatedly sanctioned by medical authorities in the U.S. for claiming that the vitamin C pills and vitamin D pills he sells on his branded Amazon webstore can solve all manner of ailments. But he's made $100 million out of his career spreading disinformation and selling cures with questionable scientific basis, among other things, of course.

You've got people who do it for political gain. So, they spread disinformation because they think it will give them a political advantage in undermining faith in the current authorities. And we've seen that, for example, with some Republican politicians in the United States, but also around the world. We have the same problem in Europe — in Britain and Germany and France. And there's this industry of people who essentially profit from the spread of disinformation, from persuading people of it. We thought that was a really important sector to look at when we started looking at the pandemic as an organization. We started in March 2020, and in May 2020 we published our first report, “The Anti-Vaxx Industry.” And I think a real feature of this pandemic has been the organized, sustainable economies around disinformation. But they're only possible because of the digital platforms, which give them access to enormous numbers of people, all for free.

RG: This misinformation, disinformation affects a wide range of critical issues from health and science to climate change to elections. But why is it so widespread? We've talked a little bit about the motivation, but maybe I can ask this: How is it allowed to expand and to run rampant, and apparently without safeguard or regulation?

IA: Well, part of the reason at a fundamental level is that the nature of how we communicate with each other has fundamentally changed, and we have interfaced 4.5 billion human beings — more than half of all human beings on Earth — into networks in which people can spread a message for zero marginal cost. So, that changes the economics of communication. It is the nuclear age of communications: the creation of vast amounts of communications with virtually no effort, on platforms that have changed the rules of what gets the most attention. It used to be that in order to be taken seriously for your health information, you had to go and study, get a degree, practice, think, be recognized by your peers, and then you might be cited in a newspaper or on a TV show that gives you access to millions of people. Now, the way that you are given the right to speak to millions of people about health is by going onto a platform and saying the craziest thing possible, because that will induce reactions, it will get people talking, it will get people arguing and shouting at you.

And the way that these platforms work is quite different. The economics and motivations of the communications mechanisms that we use to transmit information in our society are quite different. It used to be that people competed on different factors than they do now, because, actually, the reality of these platforms is that they only care about one thing, and that is whether or not you stay on that platform, arguing, shouting and consuming ads for as long as possible. And this is a species-wide shift in how we communicate. Communication, we know, is what differentiates us from other species on Earth. We are a social species that managed to achieve so much more than the limits of our quite fragile, corporeal beings by meshing ourselves in society and community, and using language to communicate with each other. And if you can hack into and change the fundamental rules of how we communicate as a species, and reweight that towards disinformation over good information, then you can start to rip apart the fundamentals of how we think and bring ourselves to the world, and start to undo a lot of the civilizational achievements that we've made.

RG: In bringing it back to health and science disinformation, I think we've seen evidence of that profound impact. And maybe I can just get you to comment on how it has influenced people's interaction with the healthcare system and with their care providers, and that trust relationship on which it's really founded.

IA: It was really interesting during the pandemic. I mean, we, like everyone else, came to this not realizing the scale of change that had occurred. And keep in mind that these changes have been wrought over maybe 15 years. Social media is one of the fastest technological adoptions in human history, in the way billions of people have taken it up. And we realized very early on that the scale and speed of change made it very difficult to perceive how significant the changes were. When it comes to science, I think ironically — despite the fact that scientists, say life scientists and mathematicians, are very empirical, very numerate — the difficulty is understanding what it is these people are seeking to do, because there are lots of big numbers involved. Getting down to the brass tacks: there are billions of memes about vaccines, but there are only three themes, for example.

So, the three themes are: COVID isn't dangerous, vaccines are dangerous, and you can't trust doctors. And it's that last one that I think is most important. What we are talking about here is not just communications or people trying to sell goods, but also something that we call “epistemic replacement.” So, this is about replacing authority with new authorities. This is not populism. This is not about the people having a voice. This is about different people having a bigger voice than scientists and physicians. Because if you can replace scientists and physicians as trusted relationships, as a source of authority, as the people that you go to when you're scared and you want someone to tell you what can I do, what advice would you give me, your expertise is important to me; if you can replace those people, that is a very powerful thing to be able to do. And that's what knits together those other movements that you talked about earlier. So, whether it is fringe political actors, say white supremacists or those who seek to overturn the order, what they all need to do is attack the existing order in whatever form they find it.

So, replacement of the epistemic authority of physicians, of surgeons, of the medical profession as a whole — undermining those people — is the prime directive of these malignant anti-vaxx movements. As we know, there are people who say that you can cure cancer using different mechanisms. In fact, two of the most famous proselytizers of anti-vaxx disinformation, the Bollingers, started out with a website called “The Truth About Cancer,” which was a lot of nonsense about how cancer can be cured, again, if you just follow their prescriptions and give them lots of money.

RG: And I wonder when it comes to COVID misinformation, we've seen the very practical impact that it’s had on vaccine hesitancy, on patients seeking care, on health outcomes — can we get you to reflect on how you've seen that play out in your broad observations?

IA: Yeah, I mean, the pandemic itself was a clarifying moment across a range of the issues that we talked about. So, conspiracism — a belief in non-falsifiable, conspiracist theories — is driven by epistemic anxiety. And that's an anxiety not just about what's true or not, but about how to find out what's true or not. It's this kind of deeper angst, this yearning for certainty. And another sort of psychological factor that's very linked to conspiracism is a need for closure, a need for certainty. I cannot imagine a more profound trigger of that sense in people than a pandemic that has the potential to take millions of lives, a new virus that is spreading rapidly across the world and terrorizing us. And in response to which, we're being asked to take extraordinary measures. It is difficult now to remember that two years ago, the entire world, human beings collectively, decided to isolate from each other, which is anathema to us as a species, as people — to isolate and to hide away from those we love. And we did that for two years, for crying out loud. It's an extraordinary thing that we did collectively as an act of self-preservation. And of course, it was driving high levels of epistemic anxiety.

So, you've got a fertile psychological environment for conspiracist-thinking. You compound that with the fact that we're physically dislocated, but the primary way that we can communicate with each other remains digital channels. So, digital channels achieve a real significance in our lives. And those digital channels are weighted — I repeat, this is what's critical — are weighted in favour of disinformation. So, the prevalence with which we see disinformation is higher relative to its actual incidence compared to good information. And at a very simple level, who goes and retweets the CDC on a regular basis? Or the CPSO? Who goes and quote tweets someone who's saying nonsense and goes, “This is ridiculous. Stop saying it!” So, the problem is that we're literally signaling to the algorithm that we'd prefer bad information because we like to shout at them rather than good information, which we just acknowledged with a nod. And that's no use to those platforms, because they don't want you to just nod at stuff. They want you to stick around and start arguing, start getting stuck in, so that you can consume ads.

And so, we have this moment in which people were particularly reliant on digital environments for information sharing and for understanding the world around them, interpreting what they thought was normal, what they thought people were feeling. Then you have an anti-vaxx industry that's had 20 years, because it's really been outside the mainstream for 20 years. So, non-mainstream movements have been really fast at picking up on digital channels for communications, because it's very low cost to use; zero marginal cost for every additional message to every additional person. That is a great opportunity for fringe movements, because they can do what they do very, very cheaply. And you also have, I think, a public health industry that hadn't gamed this out in their heads and had no idea what they were doing. That still thought they were in the 1930s, where they could send out a missive from central headquarters and say, “Excuse me, citizen, we require you to go and hide yourself in your home for two years. Please accept our advice. It’s based on something that you couldn’t possibly understand. I understand it, because I’m very clever.” And that is not an effective mechanism for persuading people to take extraordinary action.

So, I think the public health industry didn't realize they were in a fight with disinformation actors, that the information ecosystem had changed fundamentally. They went and begged the platforms. In fact, what we found quite often is that major international health organizations were going to speak to the platforms and begging them for free ads. And actually, the problem is that the entire system was weighted against them; the ads were never going to be effective on their own. And so, there was no game plan, and there certainly was no game plan for contact with the enemy. And I think the combination of those factors is what led to the utter chaos that we saw over the last two years, including the enormous loss of life, after vaccines were available, of people who refused to take a vaccine. And we need to learn a ton of lessons from those moments. That said, having now spoken to many of those international health organizations since the pandemic, I am sad to report that I don't think they've learned the lessons they needed to learn, for the main part.

RG: It's a sobering insight on your part. I just wonder, though, for us, I mean, we're going to have to partner in this, given the context that you described. So, when we encounter these falsehoods online, when we feel we should correct or address it, we also fear bringing further attention to the issue or bringing attention to ourselves. What is the best approach for healthcare providers, researchers, social media users to address disinformation when they come across it?

IA: So, let me split this into two parts. And let me remind you again of something I said earlier about epistemic replacement. This is about replacement of physicians, surgeons, scientists, people who put in their hard yards in understanding their issue as sources of authority. The single most powerful counter to that is the trust that you already have and behaving in a way that continuously earns that trust, the highest levels of ethics, of probity, of honesty, of quality of communications, of reaching out and showing worth and trust to those people that already trust you. The most powerful inoculation against disinformation is the trust, the love that most people feel for their physician because that physician has protected them through thick and thin. And I think that the CPSO, in particular, is to be commended for having taken action against those people who have not lived up to the extraordinarily high standards that you set for yourselves as a profession. That saves lives. Those rules are not about having rules arbitrarily. They're about protecting the lives of your patients. And nowhere can that be clearer than in a war for epistemic authority in which there are usurpers, seeking to destroy those bonds of trust. And so that's the first thing.

The second thing, if you're talking specifically about the online environment: disinformation's aim is to make itself more seen and therefore normalized. So, it's about frequency bias, about normalization, resocialization of society to trust these fringe actors rather than people who've earned that trust. And they use all these tactics, whether it's gaming the algorithm to work for them or these crazy networks of people that troll — trolling is a very good example of this. And why do bad actors troll? They do it to silence you, because the greatest power you have is the power to authoritatively communicate the truth to people on social media. And I think you should not allow yourself to be distracted by the prevalence of disinformation — leave the systemic disinformation analysis to those of us who spend our lives being very cautious not to amplify it while finding ways to remove it. You need to be thinking very hard about how you can get better at spreading your message.

There are tactics you can learn — things like building up not only your own channels, but also supporting each other, because on systems that work on the network effect and engagement, having a network of people who are constantly reinforcing, communicating with each other, retweeting, talking, getting into conversation drives visibility, and visibility is the name of the game on social media. So, it's about thinking about your own communications and getting those tip-top, building the best possible networks and creating the biggest possible noise you can, and remembering at all times that if this is a game about trust, the second you think the game is to ape the behaviour of the bad guys — the trolling, the nastiness, the clickbait nonsense — you lose. If you allow them to make that the nature of the game, you're going to lose, because you will never be as nasty, as base, as valueless, as willing to embrace clickbait disinformation as they are. And so, it's about ensuring that you learn the right lessons from them, which is: confidently proselytize your case, build networks, use high-quality video-based information, communicate with emotion, be yourself, but remember that people need you to be trustworthy when they are scared and they're looking for someone to trust.

RG: That aligns nicely with our mission at the CPSO, which is trusted doctors providing great care.

IA: Yeah.

RG: Another area of concern: unfortunately, with this disinformation-fuelled hate, many young learners and physicians think twice before entering public health fields or vaccine research. How do institutions overcome this hesitancy, and support those who want to contribute and pursue careers in these important fields?

IA: Well, we kind of need you, Gen Z, because actually, yes, the industry is under short-term assault right now, but your creativity, your familiarity with digital spaces, your digital-native understanding — the truth is, to the younger people out there listening to this, I do have to go and give these lectures to a lot of people around the world, but the ones who kind of just sit there nodding and going, “Yeah, totally,” are always the younger ones. I'm telling them platforms work on engagement and they're like, “Yeah, whatever, Boomer,” as if I'd discovered it for the first time. (I'm not a Boomer, by the way, young people; I'm Gen X, I think, or something.) But I think it's really important to have people who are cognizant of the way that digital environments work, the way in which communications has changed, and who understand that the short war we're in now, where trolls are given impunity to operate, is actually changing.

I've just been back in London and Brussels in the last couple of weeks talking to legislators — in the UK Parliament and in the European Union, both of which are passing or have just passed legislation. Europe has got the Digital Services Act, the UK has got the Online Safety Bill. I know that Canada has legislation as well; I've just been recording something for a new bill, C-292, which has been introduced. There are different things being tried around the world to change the way in which social media platforms are allowed to get away with creating harm, with building algorithms and artificial intelligence that over-promote disinformation, making it look more prevalent and normal. And I don't think that you will be in the same information ecosystem in 10 years' time that you are in today. And in that time, you are going to know better than anyone else how to use it to spread good information and spread help, trust, and pro-social messaging that will be of enormous benefit to our societies.

RG: When you spoke to us at Council, one of the things in your powerful talk that really struck us was the importance of the work that you and your team are doing to support organizations, on a big scale, in this change that we need to effect.

IA: You know, I used to apologize for it at the start of every speech I gave, to say, “You may be wondering why the CEO of the Center for Countering Digital Hate is speaking to this medical body or this scientist body.” And I'd always apologize, but I think we are increasingly aware, as you sort of intimated with one of your first questions, that we are in a common fight here, that so much of the world of science, the world of our values as societies, as tolerant scientific societies, is being undermined. And we do a ton of work, for example, on climate change, looking at disinformation about the way our physical ecosystem is degrading through human greed — the burning of hydrocarbons, the release of carbon dioxide and other greenhouse gases, which is causing change to our climate — and the undermining of the scientific consensus on that by the amplified spread of disinformation on digital platforms. Then look at the hard-earned work we've done on LGBTQ+ tolerance, on the advancement of women's rights, of their fundamental sexual and reproductive rights — of even societies in which people like me — and, you know, my name is Imran Ahmed, my family are Muslim — there would have been a time when people like me wouldn't get an opportunity in public life, because our societies were retrograde and didn't see the value and didn't see the equality that I deserve, just as anyone else does: the chance to succeed, the chance to contribute to the society that I'm part of, that I love.

So, all of those are being fundamentally undermined. And we see the commonality in those fights as being the greed, the indifference, the playbook of digital platforms that have privatized the public discourse, made it run to rules which are actually designed to advantage them. So, why are they doing all this stuff? Why are they amplifying disinformation? Why do they prioritize engagement? It's really simple. For every 6.7 posts on Twitter that you read, they can show you an ad, and they get money for that. And so, they don't want you to read two or three posts, [see] the most relevant information for you and then go away. They want you to reach 400 by getting into arguments, by essentially presenting you with something surprising or disturbing or upsetting that makes you stay on there. And the cynicism of that business model underpins so many of the harms that we see around the world. It also drives why young girls, 13-year-old girls, are presented with images that drive body dysmorphia, that drive concern about mental health, about a whole range of issues.

And what we do as an organization is help other sectors understand what the problem is and how digital environments are changing the way that that sector works, the harms and the threats there, and then coalescing all of those concerns into a drive for change, because this is an industry that just requires us to harness all the good things about it. And social media is fantastic in many, many ways, but also make it act in a slightly more responsible, transparent, accountable way in which they think about safety. So, we've got something called the “Star Framework,” it’s on our website counterhate.com, which is a basic framework in which we can make this industry not just work for its own profits, but also in part to service the needs of our societies and of people themselves. And that work, I think we have found people to be enormously receptive to it.

RG: And I appreciate the commonality in many of these issues and, certainly, in our addressing of misinformation and disinformation. I appreciate how you've highlighted what we can do: we can't do what you're doing, but we can do the networking, reinforce the positive message, and reinforce the trust that we have with the patients we care for.

As we bring this to a close, I wonder if you would share with us the most important takeaway you'd like to impart to our listeners today.

IA: I think the most important thing is not to be fearful of the fact that things have changed. Too often, our reaction to what's been happening online has been to get angry and fearful, and then to go online and start screaming and shouting at the people who spread disinformation and hate — ironically trapping us, because what then happens is we diminish the perceived professionalism, probity and trust that people have in us. And we also then amplify disinformation, hate, intolerance. The key is to understand the mathematics that underpins these platforms and to work it for yourself.

So, the first thing is go out there — do not engage with disinformation and hate, it just amplifies it — go out and find some good information and engage with that, go and produce some for yourself. And then text your mates, WhatsApp them, DM them and say, “I'd love it if you gave this a share,” to start to build networks to amplify good information, because I guarantee you that's precisely how the bad guys did it, how they got started doing this. They built mutual networks to reinforce their messaging, initially online, but using a variety of mechanisms. And so, today, if you can do one good thing, don't engage with any BS, go and find some good information, and make sure that gets more attention. And that way we can rebalance the information ecosystem away from disinformation and towards good information. And I don't need to tell you the final bit of that, which is that that, of course, is how we protect lives.

RG: Amazing. Thank you for that very practical takeaway. And thank you so much, Imran, for sharing your insights and thoughts on this important issue, and for all the work that you and your team do to inform and support organizations in this united stand against disinformation. And thanks to each of you for joining us today, “In Dialogue.”

IA: Thank you.

Closing:
Please visit CPSO Dialogue for more in-depth discussions about health care.



