COVID-19: The Threat of "Fake News"

In this episode, theoretical and evolutionary biologist and professor at the University of Washington in Seattle, Washington, Carl Bergstrom talks with students from the UC Riverside School of Public Policy about the era of "fake news" and how to combat disinformation.

 
FEATURING Carl Bergstrom
February 26, 2021
40 MINUTES AND 21 SECONDS

About Carl Bergstrom:

Bergstrom uses mathematical models and computer simulations to study a wide range of problems in population biology, animal behavior, and evolutionary theory. He is interested in how current norms and institutions shape scientific knowledge. While researchers may be driven by intrinsic curiosity, they are constrained by the realities of the scientific ecosystem in which they operate and motivated by the other incentives — monetary and otherwise — with which they are confronted. He is also the author of the book "Calling Bullshit."

Learn more about Carl Bergstrom via https://www.biology.washington.edu/people/profile/carl-bergstrom

 

Podcast Highlights:

“Over the last few years we have seen increasing weaponization of disinformation. Social networks are very vulnerable to the exponential spread of misinformation, but also to injection of disinformation by parties who want to disinform.”

-       Carl Bergstrom on the topic of why "fake news" seems to be at an all-time high today.

“Something like QAnon can not only contradict, but directly challenge the authority of not only media, but also state government agencies.”

-       Carl Bergstrom on the topic of the deterioration of trust in established institutions.

“Anti-vax sentiment can ultimately undermine our ability to fight the pandemic...”

-       Carl Bergstrom on the topic of disinformation during the COVID-19 pandemic.

 

Guest:

Carl Bergstrom (Professor at the University of Washington in Seattle, Washington)

 

Interviewers:

Maddie Bunting (UCR Public Policy Major, Dean’s Chief Ambassador)

Kevin Karami (UCR Public Policy Major, Dean’s Ambassador)

 

Music by:

C Codaine

https://freemusicarchive.org/music/Xylo-Ziko/Minimal_1625

https://freemusicarchive.org/music/Xylo-Ziko/motif-remix/imagery

https://freemusicarchive.org/music/Xylo-Ziko/Phase

 

Commercial Links:

Lizbeth Abeln Webinar

Marisol Franco Webinar

https://spp.ucr.edu/mpp

This is a production of the UCR School of Public Policy: https://spp.ucr.edu/

Subscribe to this podcast so you don’t miss an episode. Learn more about the series and other episodes via https://spp.ucr.edu/podcast.

Transcription

  • COVID-19: The Threat of "Fake News"

    Introduction: Welcome to Policy Chats, the official podcast of the School of Public Policy at the University of California, Riverside. I’m your host, Maddie Bunting. Join me and my classmates as we learn about potential policy solutions for today’s biggest societal challenges.

     

    Joining us today is theoretical and evolutionary biologist and professor at the University of Washington in Seattle, Carl Bergstrom. My fellow classmate Kevin Karami and I chatted with professor Bergstrom about the era of fake news and how to combat disinformation. 

     

    Maddie Bunting: Professor Bergstrom, you are a theoretical and evolutionary biologist and a professor at the University of Washington in Seattle, Washington. You are also an outspoken critic of low quality or misleading scientific research. Lack of trust in the American government appears to be at an all-time high recently with citizens questioning experts, and perhaps unknowingly spreading misinformation. Your book, Calling Bullshit, discusses how to dismantle misinformation and think clearly in a world of fake news and bad data. Can you talk to us a little bit about this concept?

     

    Carl Bergstrom: Sure. The book that we wrote, Calling Bullshit, is really about how to make sense of a world where bad information more and more comes in quantitative form. We all have a fairly good sense of how to parse the sort of false promises or corporate weasel words or advertising claims that people make in qualitative form. But when we start to get hit with a barrage of quantitative information, data graphics or statistics and a lot of numbers and that kind of thing, a lot of us feel like we can't challenge it. We feel like, well, I don't really know the statistical techniques. Or maybe we feel like, wow, those numbers are these real things that come from nature, so you can't argue with them, this sort of "the data don't lie" perspective. They sure do mislead, though, even if they are the right data. And so what the book is really trying to do is empower readers to see that you don't have to have a whole lot of technical background in any of these areas to be able to ask these questions for yourself, to be able to see through misinformation that comes in quantitative form, and to be able to navigate the landscape of information that we're all in now, which has become increasingly quantified because everything in the world around us is increasingly quantified. So I guess that's the basic aim of the book. 

     

    Maddie Bunting: I, for one, know that myself and many of my peers and family and friends, when we see a statistic from what we believe to be a trusted source, we automatically take it in and may tell others without knowing the background of where it came from. So I definitely learned that in my public policy courses as well. I am wondering about this term, "fake news." Is this the new norm? It seems to have popped up in about the last five years, and I don't know if that's just me coming into college and adulthood, if it's always been there, or even if, in this past year, the pandemic has exacerbated the situation. From your experience, is this a relatively new concept, or has it been around for quite some time? 

     

    Carl Bergstrom: Now, that's a good question. The basic notion of fake news, which we typically talk about as misinformation and disinformation: misinformation being information that is incorrect but not necessarily created for the purpose of misinforming people, and disinformation being deliberately created, deliberately propagated false information. That's been around for a very, very long time. You can look back, for example, at the tabloids in the 19th century that would run outlandish stories, sort of Weekly World News kind of stuff, about civilizations on the moon that people had seen through telescopes, all kinds of wild stuff like that. So we've always had misinformation out there. We've seen, for example, how big tobacco found other ways to push misinformation and seed doubt about whether there's really a tobacco-cancer link, and that sort of thing. So we've always had that. One of the things we talk about in the book, because we deal with this quantitative stuff, is: what's going on, what's happening in our information world? And I think the way that information is shared has a really driving effect on what kind of information is out there in the first place. That's something we see for sure with the current environment. We had this great idea to take all the computers in the world and link them up together, right? And that, coupled with various methods of digital typesetting, has allowed essentially every single person with access to a computer to become an information producer and to produce high-quality output that they could then share worldwide at essentially zero marginal cost. And that creates a situation where, you know, this could be really great, right? 
You no longer need the sort of social capital and financial capital and so forth that was needed before in order to broadcast your story. We were all tremendously excited about this in the 1990s; this was going to be this marvelous democratizing force and so on. To some extent it has been, and to other extents we've gotten hit by this disinformation problem and misinformation problem that I think we didn't adequately think through. A lot of what happened there, I think, is that we get this huge volume of information coming through. We have a sort of accelerating news cycle where we want the latest news up to the minute, and we keep seeking it, doomscrolling, right? And with all of that, we can't possibly sort through this enormous volume of information the same way. When you had three networks, you could click across the three, and they all were saying the same thing anyway. Now we have this enormous volume. What are you going to do about that? Well, one solution is to take on collaborative filtering: let everybody kind of collectively decide what's worth looking at. And so we've all been pushed into this role of editor for one another. When I read information in the evening, if I'm reading it on Twitter or Facebook or something like that, I'm no longer getting it from a professional editor at The New York Times or a professional editor at Random House or from producers at CBS. I'm getting it from, you know, Uncle Rick, who has some really strange ideas and puts that out. So what's happened is we've all become these editors, and we're not good editors, because we're not trained at that. We don't have the incentives to get these things right. A lot of what we share, and this is Judith Donath's idea, is probably about social signaling. So I have a conspiracy story. I share that story. I'm not necessarily trying to get you to even believe it's true, but I want to show you that I'm on that side. 
I'm in your tribe, I'm part of that group. And so for all these reasons, the information that gets pushed through a social network is very, very different from the information that is provided through traditional print and broadcast media. That creates an additional set of challenges. And then I think over the last few years we've also seen increasing weaponization of disinformation. You set up this big social network, and it's very vulnerable to the sort of exponential spread of accidental misinformation, but also to injection of disinformation by parties that want to misinform. And then there were some governmental trends as well, where a steady stream of disinformation was bolstered with this message of: you can't trust the real media, you can't trust the mainstream media, they'll tell you this isn't true, but don't listen to them because they're lying to you. And that leaves everybody uncertain about what to trust. And this relates to, and I'll wrap up here, the fact that through this ecosystem we've inadvertently stumbled our way toward one of the main aims of modern propaganda, this sort of firehose-of-falsehood strategy, which is not to get people to believe specific wrong things, but rather to flood the information channels with so much mutually contradictory, misleading information that people give up any hope of actually getting to an objective truth. So I think those are all challenges that we're dealing with right now. And the fact that it feels worse than ever, well, there are new elements to what we're dealing with that haven't been there before, so there's some legitimacy to that feeling.

     

    Commercial: Lizbeth Abeln of the Inland Coalition for Immigrant Justice talks about COVID-19’s impact on detention centers on March 2nd at 4 PM Pacific Time. Learn more about this UCR School of Public Policy Seminar at spp.ucr.edu. You can also find the RSVP link in our shownotes. 

     

    Kevin Karami: Going off that idea, where you just open up your phone and you get this flood of information, you have so much trouble navigating what's right, what's wrong, what's an opinion, what's a fact. So based on that idea, would you say that social media has threatened established institutions, or would you say it has bolstered them? Or maybe it's a bit of both?

     

    Carl Bergstrom: Yeah, so certainly social media challenges traditional media sources. Obviously there's a very strong competition for people's attention that plays out across both of those, and traditional media sources have learned to adapt to the social media environment; an awful lot of what we share is put out through traditional media sources. I think one of the really big differences to think about there is the shift from a subscription-based model of consuming information to a click-based model. When you're under a subscription-based model, you're looking for a newspaper that you think is going to provide you with a long-term, informative view of what's going on in the world. So you make this kind of decision about what you're going to read based on what you think you're going to get out of it over the next year or something like that. And within The New York Times, or whatever you choose, the individual stories aren't competing with one another for your attention. As you move to a click-based model, you open up your phone, and in the news app or whatever you've got stories from eight different outlets that are just right there, and you've got to choose between them. You may have some in-depth analysis of how the vaccine rollout is going, but below that you've got nine cats that look like Disney princesses, and you click and you're off and going. Or you've got these hype headlines for things that just basically aren't true but sound really, really spectacular or shocking. 
And so because of this sort of head-to-head competition between individual articles that's driven by the click-based advertising model, you end up getting a bit of a race to the bottom in terms of the way articles are written, but also the way they're sold, the way they're presented in terms of headlines. We are moving toward this place where the unvarnished truth is no longer good enough; it doesn't bring in revenue the way that it used to. So in that way you do see social media, though it's not just social media, it's the entire distribution technology, and really the innovation of click-based advertising, doing that. And if you want to take your notion of institutions a little bit more broadly and expand out to government institutions and other kinds of trusted institutions, government agencies, non-profits, that sort of thing, I think, again, there are ways in which these are challenged by social media. We've seen that play out in quite dramatic form over the last year, where you can find other people that are interested in a certain angle of misinformation and spread that misinformation, and something like QAnon can not only contradict but directly challenge the authority of not only media but also, say, government agencies and things like that. So I think in all of these ways that has happened. There's also been this more general trend toward distrust of experts, which may be facilitated by the social media environment; I wouldn't say it's necessarily driving it. But if you don't have professional editors and the like gatekeeping the voices that are out there in the media, then you have these populist alternatives, if you will, that can be used to undercut expert voices and challenge them and so on. 
You know, anybody can download a bunch of data about COVID, put it through a spreadsheet, and make some pretty graphs. Then you see the sort of thing we were seeing last March and April, where you had people claiming it was all a hoax and the data proved it. So I think that's another thing that we definitely see. 

     

    Maddie Bunting: I would love to talk more about not trusting experts, and why people maybe believe their personal practitioner, their doctor, or their friend over, say, a public health official. I think this pandemic is a really great example. Is there elitism? Why do you believe things have turned out this way, as research has shown? Dr. Fauci or the CDC or the WHO says wear a mask, and we still see so many people not wearing a mask.

     

    Carl Bergstrom: There's a lot going on in that question. In a very COVID-specific way, a lot happened to undermine the credibility of US governmental agencies in March and April, and in late February as well, in really disturbing ways. I wrote about this a fair bit at the time: you saw various branches of government and various agencies giving mutually contradictory messages on the same day. You have Trump standing up saying there are 15 cases, it's going down, this is going to be gone soon. You see Kudlow saying something similar: this is under control, it's not going to be a big deal. And the same day you've got the CDC giving a press conference saying Americans need to be prepared for a major epidemic in this country. So this is inadvertently playing into that falsehood-firehose approach again. And then, because of political expediency, there is also this rhetoric of: you can't trust the deep state, the Democrats control the CDC, all of that. That, I think, creates a certainty vacuum, and we see a lot of certainty vacuums around the pandemic. There's a very fundamental one, which is that when this pandemic started, we were looking at a virus that had never been in humans before November or December of 2019. We didn't know anything about it, and early on it's very hard to figure things out. I'm an epidemiologist by training, so this has been something I've been working on, not only tracking misinformation around COVID; lately I'm spending most of my time actually just doing very basic epidemiological modeling with nothing to do with misinformation. Anyway, it's very hard to figure out what's the case fatality rate, how many people are dying from this, what's the infection fatality rate, how fast is it spreading? 
And what is R naught, and what's the generation interval? We don't know any of those things early on. But of course everyone wants to know, because we want to know where this thing is heading. So you ask experts, and the experts often will say, well, we don't know. I remember this conversation, in March I think, maybe April, with a reporter at a leading national paper, and the reporter said, I think something's going on at the CDC, because they know things they won't tell us, because I couldn't pin anyone down on the infection fatality rate to anything narrower than somewhere between half a percent and 5%. And I said, no, there's nothing going on there; the one person you're talking to is telling you the truth about uncertainty. And into that uncertainty vacuum, what comes flooding in is the fact that now you have people who are willing to just say a number. They are very likely to get attention, because it's more compelling, even in mainstream media but especially on social media, to say, oh, the case fatality rate is 2% or whatever. So those messages get amplified. And then, because the whole thing is politicized, and also because strong claims are sticky and attract attention, people who are willing to say extreme things have their voices disproportionately amplified. So people saying, oh, the infection fatality rate is lower than the seasonal flu, those get amplified a lot, because they're not only specific but also extreme. Same thing on the other side: people saying, well, it's 5%, this is going to be the worst calamity that humankind has ever seen, and those get amplified. 
So you have this situation where, instead of having reliable messaging that appropriately accounts for uncertainty coming out of centralized authorities, with consistency across government agencies, you've got the government agencies talking at cross purposes, often essentially saying you can't trust each other, even within the government. And you've got this wide range of different claims out there. That makes it very hard to believe anything, and then it's very easy to say, well, what do these experts know? They don't know anything; I'm hearing completely different things. And then of course everything is highly politicized, and some of that is expected, right? If you had told me in 2018, you're going to have a pandemic, and the Democrats are going to want to protect the health care system and the Republicans are going to want to protect the economy, I'd be like, yeah, no kidding. But if you'd said the Democrats want you to wear masks and the Republicans want you to take hydroxychloroquine, I'd say, why? What's Democratic about masks or Republican about hydroxychloroquine? Nothing! But it just happened that that's how things fell. And when that happens, as you get these highly politicized scenarios, people can cherry-pick from the wide range of evidence out there. Because in an emerging scientific area with all of this uncertainty, there will be a wide range, even of honest, legitimate science, in terms of the answers people are getting initially, and people cherry-pick from that. People on one side push one story and use it to attack the other. So I think that was the ecosystem we got ourselves into fairly early on. To some degree that's being resolved, but of course we see related things playing out on the anti-vaccine front.

    Commercial: Marisol Franco of the Women's Foundation of California talks about gender equity and policy on March 4th at 3:30 PM Pacific Time. Learn more about this UCR School of Public Policy Seminar at spp.ucr.edu. You can also find the RSVP link in our shownotes. 

     

    Kevin Karami: I know earlier in 2020 the World Health Organization basically admitted that they're battling a massive infodemic: the misinformation, the rumors, the theories. What do you think this tells us about free speech and freedom of the press, both during the pandemic and moving forward into the future?

     

    Carl Bergstrom: Yeah. I mean, it's a challenge: how do you deal with misinformation while abiding by fundamental principles of free speech? In some countries, people take different approaches. They don't have these strong traditions around First Amendment rights and the like, and so you see efforts, in parts of the EU and around the world, to do things like criminalize disinformation. We have extremely high bars for that, and I think for largely good reason. I don't want laws against fake news when we've already seen that you can elect a president who will call anything he doesn't like fake news. So in principle, yes, you might not want people aggressively sharing fake news, and you might want to do something about it. But the fact is, we don't want to trust the government to decide what's fake and what isn't, because that can be fickle. So what do you do? I don't think we have good solutions yet about how to handle this. One of the big shifts, of course, is that the large social media platforms, and also search engines and things like that, are major conduits of information about the world to the citizenry. And so one place where I think there might be some leeway for a little bit of regulatory support of truth would be in providing consumers with more choice over the way that they get their information. One of the big problems, and it's by no means the only problem, is that what we see online is being determined by these algorithms, and these algorithms are designed to maximize engagement, not to maximize truth. So we're in this ecosystem that is trying to pull us in and keep us glued. Twitter is a marvelous outrage machine, right? It gets you riled up about things, and then you share your outrage with people, and then you rile each other up. And factual stuff might be pretty outrageous, but misinformation can be even more outrageous. 
Or, you know, take YouTube and the way that it drives people toward more radical content over time. The recommendation engine, the algorithm, has learned and discovered that people are more likely to stay on if you push them toward more and more radical content. So you have this sort of radicalization process that happens algorithmically. People don't necessarily want to be given that content, but they don't have a choice, because of the way the search algorithms work, or whatever chooses what I see on my Facebook feed, or whatever it is. And so requiring platforms to offer people the ability to opt out of algorithmic filtering could help somewhat. It's not going to make the problem go away, of course, because people are still going to gravitate to like-minded people; they're easier to find on the Internet. If I have some crazy belief about lizard people or something, I can find the other people that believe in it and form a filter bubble around myself and so on. But it could certainly help with some of the problems there. And some of the argument for doing that, I think, is that having access to good information is sort of a sine qua non for democracy. We had the Fairness Doctrine up to about 1987 or so in broadcasting, where if you had a broadcast license you had to cover meaningful issues of public importance, and you had to present fair and balanced coverage of them. That's what the Fairness Doctrine was about. Reagan got rid of it; Congress tried to enact something like it and passed a law, and Reagan vetoed it. So the Reagan-era idea there was: we don't want the government interfering in this kind of speech at all, in any way, shape, or form. 
And that was coming up against this earlier notion that it was important for people in a democracy to have access to important and accurate information about what was actually going on in the world. I think returning to those kinds of sentiments, for which there's ample precedent, could provide reasonably strong arguments for giving users more control over what they see online. I don't know what to do about the problem of platforms becoming monopolies or near-monopolies and then choosing to, you know, basically filter or censor, or whatever you want to call it, information. There's a whole suite of challenges around all of this, and I think we're only starting to think about what to do. The bottom line is that we created this ecosystem of information exchange by turning things over to private platforms that were basically maximizing financial return given click-based advertising, instead of it being designed in any kind of way, or stewarded in any kind of way, that provides useful and broadly scoped information to the public. We got ourselves into a position where we don't have access to the reliable information that we would like, and I don't think that in the information world we can count on there being something like an invisible hand where it all just sorts itself out and consumer demand leads people to get better information. There's no particular reason to believe that. So I think we need to start thinking more about what kind of stewardship is necessary for digital communities, especially when they get created on a worldwide scale. 

     

    Commercial: Social injustice, health disparities, climate change. Are you interested in solving pressing challenges like these currently facing our region in the world? Then consider joining the next cohort of future policy leaders like me, by applying for the UCR Master of Public Policy program. Learn more at mpp.ucr.edu. You can also find the link in our show notes.

     

    Maddie Bunting: Definitely. It is a complex issue as you've explained and yeah, I don't know if anyone currently agrees on one, you know, really massive perfect solution. I think there's a lot to be done. You did mention earlier the anti-vaccine movement, and I would love to touch upon that a little bit more. Vaccine hesitancy, as well as the defense of individual liberties have had a vocal and outspoken following for quite some time now. And this is no different when discussing the distribution of the Pfizer or Moderna COVID-19 vaccines. But I'm curious, how does this “anti-vaxxer” movement connect to misinformation or disinformation?

     

    Carl Bergstrom: Well, I think an awful lot of it is weaponizing disinformation. One thing that you have, I think, is that there is very likely state-level involvement by states that are hostile to the United States in pushing anti-vaccine information into our media channels, because it's highly disruptive. Anti-vaccine sentiment can ultimately undermine our ability to fight the pandemic and prolong what we're all dealing with, and so forth. It also generates a lot of person-against-person, neighbor-against-neighbor hostility, which has been a major tactic, right? You've had Russian bots aggressively pushing arguments on both sides, whether it's around Black Lives Matter or abortion or gun rights or whatever. You have foreign actors that are trying to fan the flames of those controversies, because if you can convince people in a democracy that their neighbors are unreasonable, even evil, can't be compromised with, can't be reasoned with, then you undermine the foundations of why we think a democracy should work. So I think one thing that's going on is that this is a fruitful place to sow chaos in terms of people not trusting each other, and another thing that's going on is that it's a fruitful way to sow chaos in terms of slowing economic recovery. So I do think, and there have been some documented examples of this, that there probably is foreign involvement in the US anti-vaccine movement. That does not mean, of course, that it's wholly a foreign, you know, psyop or something like that. It's not; it wouldn't be effective if that were all it was. There are these legitimate sentiments that are very strongly held within the United States around these issues. And vaccines are interesting in the very strong emotions that they elicit among people. 
There is something fundamental about injecting something into your body that creates a response that makes people uncomfortable. And it's unbelievably lifesaving technology. I mean, we don't know a single person who's died of smallpox. Any of us? Hopefully we don't know anyone who's died of measles, and so forth. So it's an amazing technology. But for some reason it makes people anxious. And then, you know, then you have misinformation like the false claims of a link between vaccines and autism. And that plays also into really deeply held fears about the safety of our children and their well-being and things like that. And so, you know, a lot of these things target the really deep emotional responses that we have to the world. And that makes it effective. There are these elements of the anti-vaccine movement that build very, very heavily on community, right? I mean, it's not just that you don't want a vaccine, but it's often very tightly linked to a lifestyle. And it's interesting to see the lifestyles vary. I mean, this is stereotyping or caricaturing, but some of them might be on the sort of prepper end of things, where you don't trust the government for anything. And then others are very much on this holistic, you know, live-natural lifestyle that would be completely different. Geographically, of course, you see pockets of the different types of anti-vaccine sentiment in different parts of the country and so on. And so what happens is people then form communities around this, and when that happens, it's particularly hard to change people's minds, change people's beliefs, because it becomes a part of personal identity. 
It's easier for me to change your mind about something that you don't think of as part of your personal identity than about something that you do, and something that, when you do change your mind, will estrange you from a lot of the social support that you're receiving. And so I think those are all important pieces of why anti-vaccine sentiment spreads, why it's challenging to combat anti-vaccine misinformation. Misinformation, of course, is important here because so much of it is based on false claims about the dangers of vaccination, about how they work, what an mRNA vaccine does, whether they've ever been tried before, how the immune system works, connections to autism, what the ingredients are, all of these things. So, you know, all of this comes into play. And then on the social media side, as I was talking about community and so forth, it does help people connect and find one another, and again, form these kinds of filter bubbles around themselves so that they're only really hearing one type of information. And all of this, I think, allows something like anti-vaccine sentiment to get a really strong foothold. And then what we start to see is, as it does that, there's this natural tendency to try to expand its scope. And so then you start to see misinformation that's not targeting hardcore members of that community, but rather trying to generate vaccine hesitancy among people that maybe have always taken vaccines in the past. You know, people that would never think of not getting their kids vaccinated for measles, mumps, rubella may now be trying to figure out, well, wow, are these mRNA vaccines safe? I don't know. I say, as an immunologist and epidemiologist, they are safe, and they're amazing lifesaving technologies. But you can push these questions, and people are pushing this kind of disinformation broadly through a big swath of the American public. 
And that leads to the sorts of rising levels of vaccine hesitancy that we've seen. 

     

    Maddie Bunting: I agree. And I think what I've taken away from this conversation is that this disinformation specifically, but also misinformation, causes an individual to really start to question things that maybe they might not have questioned before. And maybe that's a good thing sometimes.

     

    Carl Bergstrom: It can be a very good thing, right? And one of the interesting things is that much of the rhetoric of many of these conspiracy theories and so forth is quite similar to the rhetoric that we would use with students that we're trying to educate in the sciences or something like that. You know, don't just believe other people, think about it yourself, examine the evidence. You want to be an independent, critical thinker! And that rhetoric often gets picked up and used. It's just that that drive is then applied where the information that you're supposed to look at is a small and highly biased subset. But you can see the attraction of this, the attraction of, you know, figuring things out for yourself. And we see this so strongly with QAnon, right? I mean, it's like this gigantic puzzle game or something for people, where they're going to try to figure out these clues and ask the questions that the sheep don't dare ask, and all of that. Critical thinking is a really good thing, and doing a lot of it is a really good thing, but there needs to be, and I don't know how to do it, this sort of, you know, fundamental notion of where to find either reliable information or a broad enough scope of information that you can make your own decisions about the world based on adequate evidence. 

     

    Maddie Bunting: Hopefully, that's something we can try to navigate these next few years for the betterment of society. 

     

    Carl Bergstrom: You know, that’s essentially the mission. At the University of Washington, we founded a center called the Center for an Informed Public. That's a cross-disciplinary center with people from computer science and law and the information school and communications, and I'm from biology, and there are a lot of us working on exactly these problems. And the mission, really, is to try to understand: how can we steward a healthy social media environment? How can we help the public adapt to the changes that we're seeing in society, in the way that information is provided, so that we can continue to have a healthy democracy? The work we do is part research, but a huge part of it is also just outreach and public education, whether that's working with high school students around the state or with the AARP nationwide or whatever. And that's, I think, really fundamental, something we need in this country. 

     

    Maddie Bunting: Well, thank you for speaking with us and our audience today. This has been so informative, I've learned a lot, and I just wanted to thank you so much for joining us today. 

     

    Carl Bergstrom: Well, thank you so much for having me. It was a real treat. 

     

    Outro: This podcast is a production of the UC Riverside School of Public Policy. Our theme music was produced by C Codaine. I'm Maddie Bunting, till next time.