210205-ne-facts
>>BOB ZALTSBERG: This is Noon Edition on WFIU. I'm your host, Bob Zaltsberg. I'm co-hosting today with WFIU news bureau chief Sara Wittmeyer. And this week, we're talking about public distrust and the spread of misinformation. We have three guests with us today. We have Lori Robertson, who is with FactCheck.org. She's the managing editor. Mike Gruszczynski is assistant professor at the IU Media School. And Joseph Vitriol is the senior researcher at Stony Brook University in the Political Science Department. You can follow us on Twitter @NoonEdition and send us your questions there. And you can also send your questions to news@indianapublicmedia.org. We're doing the show over Zoom, so you can't call in with your questions. Well, thank you so much for being here to all of our guests. And I'm going to start with Mike Gruszczynski first because he's there at IU, so I'll start with the local guy. So Mike, if you could frame where we are now, it seems that we're in a situation where we just have a difficult time even agreeing on basic facts. So you know what's allowed us to get to this point?
>>MIKE GRUSZCZYNSKI: Well, thanks, Bob. It's a complicated question with a lot of answers, potentially. But in a lot of ways, you know, since the 2016 election, for me anyway, it's kind of been like watching a steamroller come from a long distance. We saw, really, the first kind of widespread use of what was then, you know, called fake news, and now we would call mis- and disinformation, to try to swing an election. And it's only gotten worse. So you know, coming into the 2020 election, I think we knew that this was going to be an issue. And that's one of the reasons why, through the support of the Knight Foundation, the media school teamed up with the people at the Luddy School of Informatics to set up or extend their use of analysis tools to understand kind of the prevalence of fake news and misinformation, disinformation in the election. And I think a lot of it, you know, I mean, it's kind of a confluence of events. You have just the ever-quickening news cycle. You have just an immense variety of social media platforms that up until, you know, just a couple of months ago, took a more or less hands-off approach to controlling their platforms. And then, you know, obviously the increased polarization in the electorate, as you mentioned, where especially over the last 10 years, but really over the last 30 years, we've been becoming increasingly separated in terms of partisanship, region, you know, religion, all of those things. And it's all kind of come together in this perfect storm of, you know, an environment in which these false narratives can not only be put out there very easily and quickly but also propagate through the population very quickly.
>>BOB ZALTSBERG: I want to follow up with Joe Vitriol from Stony Brook. I mean, we're talking about, you know, a lot of this has come to light in terms of political misinformation and how that's driven a lot of behavior. So could you just, you know, talk about that a little bit, about how people are willing to buy into this misinformation and then act on it?
>>JOSEPH VITRIOL: Well, thank you so much for having me. I'm happy to talk about the implications of political belief in misinformation for actual behavior. It's important to note, however, that belief in conspiracies and fake news is pretty widespread, but that does not necessarily indicate that people genuinely believe those things. And it doesn't therefore mean, even among those who do believe those things, that they're going to act on the basis of those beliefs. So when we look at public polling data, asking people their opinions or beliefs about a range of groups or ideas that are false or inconsistent with facts and evidence, we sometimes overestimate the prevalence of misinformation and misperceptions. Often, people express what political scientists refer to as symbolic support for some of these ideas. People don't actually genuinely believe some extraordinary or extreme claims. For example, QAnon is gaining a lot of attention but it's still not a widely popular or highly supported group. Many people simply express belief or superficial commitment to ideas we describe as false or fake, not because they genuinely believe them, but because they express some deeper truth about the way they see the world, deeper truth about their identities, or they're using this as an opportunity to signal their values and their beliefs to others. And from that perspective, it may be the case that while misinformation and misperceptions have certainly increased over the years, their implications for behavior are not always as clear. We don't always see beliefs about conspiracies or fake news actually be consequential for political behavior. Often, these beliefs or ideas are common among people who are going to act in that way anyway. So if you're somebody who disliked Barack Obama, for example, you may be more willing to believe or express the belief that he had faked his birth certificate. But you may not genuinely believe it.
For those kind of individuals, those who are expressing symbolic support for some of these ideas, conspiratorial beliefs and fake news may be epiphenomenal. In other words, those are beliefs that exist independent of their behavior. It's not causing their behavior. However, there's concern that even among a subset of folks who express commitment to these ideas and these beliefs, that they genuinely believe it, that they have internalized those ideas, they view it as an important basis for seeing the world. And over time, it may form a moral imperative to act on the basis of those beliefs. And that's a process of radicalization, moving away from superficial or symbolic support for ideas and groups and beliefs to a much stronger commitment that we see, for example, among the individuals that showed up on January 6 in Washington, D.C., with the intent to commit violence against elected officials and to interfere with a peaceful transfer of power. It's not always the case that these beliefs translate into action but for a subset, those who genuinely come to believe these things and internalize them, it can radicalize them to act in ways that they think is morally and ethically necessary.
>>BOB ZALTSBERG: Before I move on to Lori Robertson, I want to follow up with you just briefly. Marjorie Taylor Greene has been in the news a lot lately. And when you talk about how people might not believe what somebody is saying but that they might, you know, follow them anyway. You know, before she was elected, she talked extensively about how, you know, some of the school shootings just didn't happen. And there's evidence that she said something about 9/11 not happening or planes didn't fly into the Twin Towers. So - but people supported her and voted for her. So is that part of what you're saying, that people might not actually believe what she's saying but they still thought she, you know, she's somehow struck a chord with them?
>>JOSEPH VITRIOL: Yes. So it's certainly the case that a fair share of her supporters likely believe or have some sympathy for those perspectives, but they may not necessarily agree with all aspects of them, assuming they're even aware of some of those more controversial comments. And it seems that much of her base may not have been until more recent times. And so the great concern is when political leaders and people who have influence are sort of validating or advancing these ideas and giving them credibility and legitimacy. That over time can lead to increased support and endorsement and more internalization of those beliefs that might lead to action. As to whether or not the supporters of somebody like Congressperson Greene genuinely believe all of those ideas or instead perceive her as somebody who is representing their group, who is engaged in sort of real politics with those they perceive to be hostile to their interests - that would be the Democratic Party - my expectation is that, yes, they may not agree with all the claims that she makes, but she nonetheless speaks to their sort of belief that something is awry, that they're being misled, they're being maltreated by a political group that they don't support and don't identify with, and their political leader is someone they want to combat and challenge those individuals. Now, whether that represents a worldview that believes in the deep state and some pedophilic ring among Democrats is separate from just the general belief that you want to put in power somebody who represents the interests of your group, who is willing to fight for those interests and to do essentially what it takes to protect you from the negative political outgroup. But over time, those ideas can become validated in the minds of constituents who might not otherwise have been willing to support them. And so that's a great concern.
Top-down influence from political leaders - validating, and increasing knowledge of and exposure to, problematic perceptions - can over time lead to radicalization and increased detachment from reality.
>>BOB ZALTSBERG: Lori Robertson is the managing editor of FactCheck.org. And Lori, I am just fascinated by what your organization must have been going through. Can you explain a little bit about how, you know, how this year may have differed from previous years at FactCheck.org? And you can just talk a little bit about, you know, the business model and how you, you know, how you do what you do.
>>LORI ROBERTSON: Sure. Thanks for having me on the program. So FactCheck.org, we're a nonpartisan, nonprofit project of the Annenberg Public Policy Center at the University of Pennsylvania. We launched in 2003. So we've been around for a little while. I think we're actually the first independent fact-checking organization in the US. And our mission is to increase public knowledge and understanding and to reduce the level of deception and confusion in US politics. We focus on federal officeholders - so the president, the executive branch, Congress - and, you know, campaigns for those offices. So you know, we've seen a lot of changes over the years. One big change is the way information is spread and the types of things that we're monitoring. When we launched in 2003, we were largely monitoring political ads, TV ads or radio ads, as well as major speeches by the candidates or, you know, the president and officeholders. But now that has really shifted over the years. We still look at TV ads. They still play a prominent role in campaigns. But more and more of that communication from politicians has been through, you know - it can be through stump speeches, but they're streamed live on the Internet or, you know, you can watch them on YouTube. You didn't used to be able to do that. It used to be somebody would go to Indiana and give a stump speech, and the only people who really heard about it were, you know, local readers and listeners of local media that would cover it. And then, of course, social media. And in the Trump administration, you know, Twitter just became such a focus of ours. We had never seen a president use Twitter the way that Donald Trump did. And that became something that we looked at much more closely than we had in past years. And then 2020 overall was just, you know, I think such a unique year for everyone in the country. COVID-19 became such a focus for us from about February all the way through the year.
And we really hadn't seen one issue take over the site like that for us. Our site has three main aspects. We have the political fact-checking of politicians. We have a project called SciCheck - we have a science writer, she has a Ph.D. in immunology, and she writes about science-based claims. So she had a tremendous amount of work to do this year with COVID-19. And then sort of the third aspect of our website is covering viral social media claims, you know, Facebook posts and memes, and the same kind of stuff often shows up on Twitter and Instagram and other platforms. And with COVID-19, all three of those efforts were really involved in this one topic for many months. And then, of course, there were the voter-fraud claims that we dealt with. Beginning in April, I believe, was the first story we wrote on some claims that Trump started making questioning mail-in ballots. And we had seen voter-fraud claims before - we've written about many over the years - but this was really the first time there was a campaign essentially against the election, a campaign aimed at undermining confidence in the election. You know, and then on top of all of that, it was just really a strange presidential campaign in that it was largely virtual. You know, we hadn't seen something like that before. The Trump campaign, you know, started to look like a normal campaign holding events in the last few months, but the Biden campaign was still overwhelmingly virtual. So for us, you know, 2020 was quite the unique year.
>>BOB ZALTSBERG: We're talking about public distrust and the spread of misinformation today on Noon Edition. You can follow us on Twitter @NoonEdition and send your questions there. And you can also send us questions to news@indianapublicmedia.org. I wanted to follow up with Lori and then Mike and Joe can jump in on this, too. I was a newspaper editor for many years, and I wrote actually thousands of opinions over the years and I quoted FactCheck.org dozens, if not hundreds of times. And in my later years in the job, it was as if FactCheck.org and the other fact-checking sites, Snopes and other fact-checking sites became to some people the enemy, too. It's like I would quote something that, you know, that you had said, Fact Check had said, and there was a certain group in the population that would say, well, they're all a bunch of liberals, too. So it seems like the fact-checking institution even became suspect, just like, you know, the mainstream media and everybody else. And I wondered if you could just address that and how you try to maintain that nonpartisan position that you have.
>>LORI ROBERTSON: Sure. Well, you know, I'd say a couple of things about that. And certainly, we get, you know, emails criticizing our work, and we get this from, you know, proponents of both political parties. You know, first, I'd point to the coverage that we've done on other politicians that perhaps, you know, some readers would be more apt to agree with, let's say. But you know, I think that we also ask people, you know, if you see a factual error in our stories, please tell us. You know, we want to have all the facts in our stories correct, and, you know, we will correct things if there's some kind of problem. You know, and I think those are really the two things - I understand, maybe you don't like this story because it says that something a politician you like said is wrong, you know, but what is wrong with that story? Do you see a factual problem with the story? And then, you know, here's coverage that we've done on the other side, on the other political party - you know, what is your reaction to that? And, you know, you can't like the coverage we do of one person and then say we're biased when we do the same kind of coverage on the other side. You know, we really do try to cover both sides with the same set of standards.
>>BOB ZALTSBERG: I guess I was a little naive in my thinking because I thought facts were facts you know? And I would quote a fact-checking group and I thought I was safe but then it became not so much. So Mike and Joe, could you react to this notion? And it really goes deeper into the distrust of institutions. Joe?
>>JOSEPH VITRIOL: Yes, sure. So you can think of social and political beliefs as serving a wide range of psychological functions. The beliefs that we adopt are often intended to satisfy the motivations that we have to feel like important, good people in good standing within our communities. Among those motivations might be accuracy, in which case facts and evidence become indispensable for arriving at sound judgments about the world. But other motivations include the need to belong, the need to feel as though one has meaning and purpose within one's community and that one has control over the environment in which one lives - that the world is intelligible, it's predictable, and that uncertainty is at a low level. And so what happens with fact-checking is what political scientists often refer to as backfire effects, or what psychologists would refer to as motivated reasoning. Facts and evidence that are in conflict with deeply held values and beliefs are not just perceived as information that one should use to update one's views; they're often also perceived as an attack against one's self, one's identity, one's values, one's worldview, one's community. And as a result, when we encounter facts and evidence that threaten our identities, our beliefs, we're not always open to updating and revising our perspectives. Often, instead, we argue against those ideas and argue against those facts. By motivated reasoning, what I mean to say is that people are often motivated to defend their beliefs, to defend those identities. And that's not always done in a factual, evidence-based way. And this motivated reasoning can lead to backfire effects when people are presented with facts and evidence like that provided by FactCheck.org - by counterarguing, by challenging that information, by derogating the source of that information.
And over time, that counterargumentation can actually strengthen, in the mind of the person, their confidence in those beliefs - that they have defended their views against attack, that these experts don't know what they're talking about. And through that motivated reasoning dynamic, beliefs can radicalize. They can crystallize. They can harden against new information. And so what is often the case is that people who hold misinformation and misbeliefs do so because it represents some important part of their identities, some important part of their worldview. And facts and evidence that are in conflict with that, instead of being assimilated into those beliefs, are subject to attack. And that can be true for the fact-checker. That can be true for a scientist. That could be true for a scholar. That could be true for the average Joe on the street. People are often motivated to protect their views, especially views that are an important part of their self-concept and their identities, from disconfirmation. And that tendency to counterargue information, to derogate the source of information that threatens those identities, can either lead to no change in attitudes in the face of evidence or, more concerningly, to increased entrenchment and extremism.
>>MIKE GRUSZCZYNSKI: Yeah, I wanted to just say I agree with Joseph completely on this stuff. The thing I always think about when I'm researching, whether it's disinformation or just how people pay attention to issues in the media - trying to conceptualize and understand and measure, you know, what is it that people are actually reading or watching or listening to, internalizing maybe or ignoring - is that, you know, to be motivated to do something, you have to have the time, you have to be motivated to spend the time. And in an information environment that is just saturated constantly with so much stuff - you know, so many of us live on social media feeds now with, you know, just the endless scroll, what some people during the pandemic have called the doom scroll, right? It's just hugely easy, very easy, to scroll past something that you don't agree with. You experience that hit of cognitive dissonance, it makes you uncomfortable, and you just keep on scrolling. And at the same time, you also have to be motivated, if you encounter information, to think about the source, right? One of the big problems with all of this information that's out there is that we've gone from a source-driven media environment where you have The New York Times, you have CNN, Fox News - and those are still in there - but people are encountering these things in a social media feed where the sourcing might not be immediately apparent. Where is this coming from? You can very easily ape a website. I've gotten tricked more than a few times on Twitter by the fake Donald Trump account. And you know, just like Joseph said, when people have these identities that are important to them and opinions that are important to them, which many opinions are, the motivation there is going to be to continue scrolling past the things that maybe are contrary to that worldview or that identity.
>>SARA WITTMEYER: I want to ask Lori, you know, a lot of this we do seem to pin on former President Trump. I guess I just want your opinion on how fair that is. You know, the Biden administration has not been in office very long, but are they guilty of some of the same things?
>>LORI ROBERTSON: Oh, of course. You know, we launched, like I said, back in 2003 because we felt there was a need to fact-check the messages that were coming from politicians. So this has, you know, always exist - has always existed. And don't worry, we're going to do plenty of stories on the Biden administration, just as we did during the Obama administration. You know, I think that there were obviously some unique aspects of the Trump administration. I mentioned, you know, Twitter, the president using Twitter in a way that we had never seen before. You know, one thing that Joseph mentioned earlier about the sort of top-down validation of certain beliefs, including conspiracy theories, you know, that was very unique during the Trump campaign. I mean, these conspiracy theories have existed for years. They've kind of bubbled below the surface. You know, we might see them in, you know, social media claims or what used to be viral emails. Viral emails have kind of been taken over by being able to spread that much faster on social media. But another reason that these conspiracy theories have really come to the fore is that Donald Trump himself, either explicitly pushed or elevated, highlighted through retweets, many conspiracy theories, both before and after he took office. So there's definitely some unique aspects there. At the same time, as you say, you know, we've been dealing with misinformation for years and that's, you know, that was not unique to the Trump administration.
>>SARA WITTMEYER: I'm curious, Mike, you know, if there's this increasing distrust of media organizations, why is there such an issue of misinformation being spread?
>>MIKE GRUSZCZYNSKI: Well, you know, I think a big part of it's just, you know, kind of similar to what scholars of Congress have long found: people dislike Congress as an institution but like their member of Congress. Their member of Congress brings benefits to their districts or their states, in the same way that, you know, people might distrust this monolith in their mind of the media but then have their trusted media sources that they love. I mean, you see it on Fox News quite a bit, where they talk about the media, right? And they are the media. But they're denigrating the very industry that they're a part of. And I think that's probably a big part of it - it's almost like kind of a hyperpolarization within the media environment as well. You know, you have your sources that you trust, and I'm certainly prone to this as well. And there are some that you see and you don't like them. I have a supreme dislike of a lot of the standard ways that we report issues, you know, in terms of reporting two sides - this idea that it's not biased if you just cover both sides, as if there are only two sides to the issue. And I think that's one of the things that has led us down this path - I'm kind of getting off track here a little bit, but we've kind of treated politics as a game that's fought between two sides. And so I look at a lot of, you know, the monolith of the media - you know, of course, I'm in a media school, so I have to be a little bit careful here - but seeing it that way is kind of problematic. But I still have trusted sources, and I think a lot of people are like that. You know, they might trust Breitbart but despise the media, this thing in their mind. And I think that's a big part of it - kind of, you know, a picking-sides sort of thing.
>>SARA WITTMEYER: Yeah. I mean, Joseph, do you want to weigh in on this, too?
>>JOSEPH VITRIOL: Yes, just to echo Mike's insights about this. I think what we find is that there's a disconnect between people's general attitudes about the media, about government, about institutions, and their actual behavior. And so folks will describe, of course, that the media is biased against their interests, and yet they have a lot of confidence in specific media outlets. They may be distrustful of Washington, D.C. in general, and yet they have a lot of confidence in specific individuals or specific leaders or their own political party. So it's important to understand that perceptions of the credibility of sources of information, confidence in institutions, are largely derived from psychological dynamics. It's not necessarily grounded in objective features or criteria of the information source or of the actions of Congress. It's subject to some of these motivated reasoning dynamics I was describing earlier in how people perceive these institutions and these organizations. We have a lot of confidence in news outlets that generally resonate with our worldviews, that confirm our beliefs and our assumptions, and that sort of feed our perceptions that our group is good and desirable and the outgroup is hostile and undeserving. And that can lead, over time, to a distorted perception of the media landscape and of Congress in general. If our perception of source credibility or information quality is grounded not in the objective features by which that information is generated - whether it's subject to editorial oversight, whether it reflects expertise and a commitment to objectivity - but instead in whether it resonates with or confirms our worldviews, one implication of that is increased polarization, in which people select into information environments that confirm their beliefs and avoid, or are otherwise motivated to reject, alternative sources of information.
And over time, this can lead to distrust of information sources that are in conflict with your belief. That can lead to distrust towards institutions or political actors that are behaving in a way that one doesn't understand or that one is unable to evaluate because a person might be unwilling or unable to seek out information that can inform them about what's going on in an objective and comprehensive way.
>>SARA WITTMEYER: So what gives rise to so much of this and the spread of misinformation? Why does it seem like it's able to just spread like wildfire right now?
>>JOSEPH VITRIOL: Well, it spreads like wildfire because the Internet makes possible the ability to disseminate all kinds of ideas and information without much gatekeeping, and morally outrageous or sensational content often goes viral and spreads more easily than sort of boring, you know, standard, fair, technical information about Congress or public policy. So one reason we're seeing widespread dissemination of false information is because people are able to share their beliefs and their ideas, and there aren't strong gatekeepers on social media platforms of the kind that would traditionally have made sure that claims were grounded in facts and evidence or fairly characterized the range of perspectives. So that's one big issue - we have the Internet and social media. And we also have what I was describing earlier, this sort of selective exposure, where people choose to participate in certain information environments. And that's true not just of the information we seek out but of the kind of people we affiliate with. So we have widespread dissemination of information that's not necessarily regulated by standards of evidence. We select information environments and social networks that resonate with our values and our beliefs. And more than anything, people are very busy in their lives, and making sense of the political environment, making sense of public policy, is very difficult. Public knowledge about political actors, public policy, the law, the function of institutions, the impact of policies is relatively low. People are busy with their own lives. They're focusing on their families. They're focusing on how to make do in a pandemic. And so the ability to engage rigorously, exhaustively, deeply with information sources is limited just by virtue of people having busy lives.
And we know that the willingness to support, symbolically or otherwise, or to share and spread epistemically problematic beliefs or ideas - that is, fake news - goes way up when people are acting in a somewhat superficial way. And that, I think, characterizes much of how we engage on Twitter and on Facebook. We are somewhat thoughtlessly scrolling through our media platforms, somewhat thoughtlessly liking and retweeting and sharing things that vaguely resonate with our beliefs and our attitudes. And we're not necessarily thinking carefully or critically about that information source. And so what we do find is that when people are motivated to be accurate, either because they're accountable to others or they want to arrive at a good judgment, or we induce a motivation to be accurate by reminding them, for example, of the importance of being fair-minded and evidence-based in their thinking, that accuracy motivation actually greatly reduces the willingness to accept and spread fake news. So a lot of the misinformation that is spread simply reflects superficial levels of engagement, kind of an intellectual laziness that we're all guilty of in how we interact and engage with information content in the social media space, which is made even more possible in the absence of gatekeepers and editorial standards that regulate the content that's being shared.
>>BOB ZALTSBERG: I want to rejoin this conversation and follow up on that briefly, Joe. It seems like - I've seen social media posts where someone will share an outrageous claim and then somebody who's following them on Facebook will maybe challenge that and say this has been debunked. And then the person who shared the claim would say, well, I didn't say I agreed with it. I just, you know, I didn't check. I just decided I wanted to share it.
>>JOSEPH VITRIOL: Yes.
>>BOB ZALTSBERG: That's sort of what you're talking about with...
>>JOSEPH VITRIOL: That's exactly - that's a great example of what I'm talking about. It also can be understood, depending on the goal of the communicator, as a sort of strategic posturing to maintain plausible deniability. To say, well, I'm just sharing something because it seemed true or because it's funny or because I saw other people expressing views about it, is a way of licensing oneself to not be a critical and engaged thinker about public events and current affairs. And I think that's highly problematic. But I think what's more common is not the disowning of false claims when one is called out on social media, but that those false claims go unchallenged, unchecked. And over time, people come to believe that they're true simply by having been exposed to them regularly within their social network.
>>BOB ZALTSBERG: All right. If you want to join our conversation today about misinformation and the public distrust of institutions, you can follow us on Twitter @NoonEdition. You can also send us questions for the show at news@indianapublicmedia.org. I wanted to follow up about - with Mike, about maybe a little more local view of this because we have lost a lot of local news sources and local news sources have been, you know, cut back on the number of journalists they have. There are a lot fewer people at the Statehouse covering state news. There are fewer people in local news. Lori's organization is a fact-checker for, you know, mostly the - she outlines the three different areas. But what about local news? Are we in jeopardy of having a lot of misinformation spread around with our next set of local elections, for instance?
>>MIKE GRUSZCZYNSKI: Well, that's a good question. And I will say, we're actually researching this quite a bit right now in terms of news deserts, as they're known. The Bloomington area almost represents a news desert. You know, we have a local newspaper that's really done poorly in the new millennium. And I think a lot of this comes from looking at news and information environments as ecologies rather than as marketplaces. Communication scholars used to think of media environments as a marketplace - a marketplace of ideas, where the best ideas would bubble up and people would, quote, unquote, "purchase them" and the bad ideas would go away. Now, we conceptualize it more like an ecological system. And in the ecology of a lot of these smaller areas - Bloomington is kind of a mid-sized, small city - there's not a lot of information out there. And people have a need for information about their environment, about what's going on. And so they'll turn to what they can find to rectify or mitigate that need for information. And we see this around Bloomington. Just to use a local example, I'm a part of a few local social media groups. There's a Bloomington roads group on Facebook, there's a Bloomington subreddit. And you can kind of see that drive for information. We had The Bloomingtonian as well, which is a kind of nontraditional, community-oriented news source. And so that can fulfill those needs. But at the same time, I think a lot of people end up stitching it together - you know, word of mouth, hearsay, this person told me this, this is what's going on. And with that kind of gap in reportage all the way from the Statehouse down to the local level, people will fill that need.
And I think that's especially problematic for the spread of, you know, fake news and misinformation, just because it homes in on that gap. You know, to use an example that is especially suitable right now, like a virus - right now, a lot of people's information environments are focused on things like whether to wear a mask to the grocery store, and they're going to be needing information. They'll take it where they can get it. And if the only information out there is this bad information, that's going to possibly hit harder than in other environments. And you know, to bring up Joseph's point from earlier as well, about people being busy - that's absolutely right. Given that most people have a huge crunch on their time - everybody has a crunch for time - the information environment has gotten fuller of information generally, but we have not gained hours in the day. So in some ways, the kind of accuracy motivation that he talks about is decreased. And I think that's especially true when you can't even be selective in the information you get, if there's a lack of it in your area.
>>BOB ZALTSBERG: Lori, I wanted to ask you about the different types of misinformation. I mean, I think that a lot of times, people will confuse satire for something from a source they think is trying to be accurate. But in fact, they're just being satirical. I mean, where does that fit into this?
>>LORI ROBERTSON: Yeah, I mean, it's definitely a type of misinformation that we see. You know, there's a well-known New Yorker satire column, and a lot of readers would ask us about the stories and say, oh, could this be true? You know, in a lot of the messages we would get, readers would indicate that they were skeptical of it. And you know, we'd write back, well, no, this is a satire column, and they'd be kind of relieved, actually, to find out that this wasn't the case. But one of the problems that we've seen is that these satire pieces are picked up by other websites or, you know, Facebook pages, and they are putting that information out there again without an indication - you know, a label or something telling people that it's satire. We've also seen some websites crop up in the last, you know, four or five years that say they're satire - whether it's funny or not, I guess, is a matter of opinion. But really, their goal is to fool people. They're trying to fool people. And that information, again - if you took the time, if you saw that information and it was clearly sourced to the original website that claims to be satire, and you looked on the about us page of that website, it would tell you right then and there, this is satire and you shouldn't believe any of this stuff. But as we've been talking about, people don't have time and they don't do that. And then on top of that, other websites pick that information up and put it out there without that disclaimer. So even if you did take the time to look for it, you're not going to find it. You know, we have several tips on spotting misinformation that we've provided to our readers. And unfortunately for our readers, most of those involve taking more time with the information that you're being confronted with. You know, our first tip is: consider the source. I think Mike mentioned this earlier.
You know, when you see something as you're scrolling through social media, or something a friend sends you, ask, well, where did this information come from? Is it from a trustworthy source? Is it from, you know, something that I've heard of before? You know, another of our tips is to read beyond the headline. Even in a perfectly accurate story, the headline doesn't always capture the full story, and the headline could leave you with an impression that's different from what you would come away with if you read the whole story or even more of the story. You know, another thing that you can ask is, well, what's the support for the claim? If there are figures and statistics being put forth, where did those come from? You know, you should be able to look up that information. And I know all of that takes time, but I would advise people, you know, don't share information until you've done a few of those things. And at the very least, you can consult the fact-checkers. You know, we are paid to do all of those things. If FactCheck.org hasn't written about a particular claim that you see, it's quite likely that PolitiFact.com, the Washington Post fact-checker, Snopes.com - you know, one of these other organizations may well have already investigated the claim that you're seeing.
>>BOB ZALTSBERG: Mike?
>>MIKE GRUSZCZYNSKI: Yeah, I wanted to follow up on what Lori said because I wanted to share an anecdote from my own life. I got sucked in by some misinformation recently. After the insurrection at the Capitol, a spoof account for Chiquita Bananas posted about how they believe in a peaceful transition of power. And I was like, well, that's rich, because, you know, if you read about the history of the United Fruit Company and what they did in Central and South America, of course, that's really problematic. But you know, I think it just goes to show - for a lot of this stuff, we often treat it as an educational issue, which is certainly part of it. And I think I'm contractually obligated to say that as an educator. But even people who are well-educated can be sucked in by this. And a lot of that, like Lori said, like Joseph said, is a function of time and taking the time. We find time and time again that it's not necessarily education, it's not necessarily even political awareness or knowledge that drives this, but something else. And I think it's a difficult thing to slow down, like Lori said, on social media. But I really think we have to do it.
>>LORI ROBERTSON: Yeah, if I could just jump in quickly. I think that, you know, check your biases is another one of our tips. And you know, I think that's why people do get sucked into certain things because, you know, you're scrolling through your Twitter feed or your news feed and you're seeing things and you're not reading beyond the headline or looking into it more because you don't have time. And your - things that are resonating with your biases, you're accepting, as Joseph has been talking about as well, and the things that don't, you're rejecting. You know, and it's - it is hard for people to kind of step back and say, well, wait a second, let me look into that. Is that really the case? But you know, skepticism, it's a good thing. And I would advise people to use a little more of that in their daily life.
>>BOB ZALTSBERG: Lori, this question was sent in to you, but I think any of the three of you might have an answer for it. And that is, how does misinformation affect different communities like minority groups? Have there been studies on that?
>>LORI ROBERTSON: Well, our other two guests, Joseph and Mike, might have a better response to that in terms of studies. But one thing I'll mention here on this topic, we've actually launched a new project just in the last couple of weeks, looking at misinformation about COVID-19 and, in particular, COVID-19 vaccines, and misinformation that is targeting communities of color, Black community, Hispanic community, Native American communities, you know, that have hesitancy toward the vaccine. And we are trying to do - we actually have stories now on our website in both English and Spanish, aimed at debunking some of that misinformation.
>>BOB ZALTSBERG: Mike?
>>MIKE GRUSZCZYNSKI: And I'll just add to that and say that, you know, a lot of scholars are really playing catch-up with this stuff right now. But I really do think it shows a problem with kind of our white blinders as well, because there was a lot of evidence, especially in South Florida, of misinformation targeting Latino, Latinx communities. And we don't do a good job of studying that. A lot of it is because we're limited in the platforms we can study. Twitter is the one that everybody uses because it's accessible. But there are all sorts of platforms that are used differentially by different populations. And there's just a lot we don't know. But definitely with these kind of subgroups within the population, particularly given that they're especially prone to being in places that are information-lean - like, there's not as much local news or even regional news - I really do think we need to do a better job of understanding the impacts of it.
>>BOB ZALTSBERG: Joe, did you have something you wanted to add?
>>JOSEPH VITRIOL: Yes. Misperceptions, fake news - these things can have real pernicious consequences for a wide range of individuals and communities. They can impair our ability to understand the way in which a disease is transmitted and what consequence that might have for our own health. They can undermine our ability to hold accountable corrupt and criminal political actors. They can undermine our ability to identify good policies that are intended to solve important problems that we face. And so, misinformation minimally obfuscates our understanding of reality, which makes it very difficult to elect public officials who are grounded in evidence-based thinking, and makes it difficult for us to hold them accountable. But among the more common misperceptions that we see are not just those about specific political events or specific political actors but those about our broader social and political conditions. And in fact, the most common misperception, I would suggest, is the way in which we perceive or make attributions about inequalities within society. For example, the majority of Americans greatly overestimate the level of equality, and misjudge the causes of inequality, between racial minorities and white populations. And part of that is some of the motivated reasoning dynamics we talked about earlier. But also, it's the case that people simply have a difficult time understanding these factors, these conditions, and are not always able or willing to think carefully and critically about them, and are not always able to overcome the pernicious consequences of misinformation. And so we're often misinformed, and that can produce polarization, extremism, hostility towards others, a willingness to support policies that are problematic, that reinforce an inequitable status quo, that lead us to be dismissive of or ignore the perspectives of members of marginalized communities, to misunderstand our political moment.
There's a large segment of the population, for example, who characterize the Black Lives Matter protests as a violent form of political protest when the facts and the evidence clearly suggest the opposite. So misinformation and misperceptions and fake news cut at the heart of what a democracy is - one that is evidence-based, rational, in which political leaders and institutions are accountable to an informed, engaged public. Misinformation obfuscates those things. It creates obstacles between citizens and groups of individuals, and it allows political actors to act with impunity. And that can, over time, lead to disastrous consequences.
>>BOB ZALTSBERG: Lori, I want to ask you about technology because I don't think we've touched on that yet. It used to be that, you know, a picture was worth a thousand words. Is it anymore? Or can photos and videos be manipulated?
>>LORI ROBERTSON: Oh, yes, definitely. You know, we've definitely seen an increase, particularly on social media, of the use of images, you know, what we call memes, an image with some words overlaid or videos that have been edited in a deceptive way. That is definitely something we've seen more of. And you know, that's really just a result of this technology improving to the point where people can send those kinds of things. You know, it wasn't that long ago where it was difficult to have a video play, you know, correctly all the way through without a lot of buffering or something. So that's definitely something that we've seen more of, particularly on social media, but also political campaigns using video in a way that doesn't - either doesn't tell the whole story or, you know, portrays a falsehood.
>>BOB ZALTSBERG: So we have less than two minutes to go. So Mike, I wanted to ask you, is there any last point that you wanted to - that you haven't been able to get in, that you wanted to sum up here for us?
>>MIKE GRUSZCZYNSKI: Oh, sure. I'll try to make it quick so everybody else has time. You know, I think a lot of this stuff, like Lori just said - the idea of photos and videos being untrustworthy is the scariest thing to me, and I think to a lot of people, because we tend to think, well, if you can take a photo of it, then it happened. Pics or it didn't happen, right? And I don't even think we know the start of this. Deepfakes, where you can change what somebody says - that's already starting. Once those become something that I can do on my home computer, we're going to have to have this whole conversation all over again, you know, about how we know what we know. This is really threatening even our idea of what it means to know things. And I think in kind of an odd way, our traditional media platforms have struggled to catch up. Academics have struggled to catch up. And I think we really have to take a good, hard look at how we report on information, how we keep from spreading this misinformation. Don't just report it in a way where, you know, this person said this and then it becomes kind of truth, or truthy, because a traditional media platform reported on it. Instead, take a step back - it's not just the normal, quote, unquote, "citizens" who need to slow down on their media feeds. Maybe we need to slow the news process down a little bit and think about, you know, what are we giving voice to? Who are we giving platforms to, and what's the possible harm in giving platforms to these entities?
>>BOB ZALTSBERG: Yeah, we have plenty to talk about, and we're going to have to come back and do it again some time. So we are out of time today. I want to thank our three guests, Joseph Vitriol, who's a senior researcher at Stony Brook University, Mike Gruszczynski, assistant professor at the IU Media School, and Lori Robertson, the managing editor of FactCheck.org. I want to also thank my co-host, Sara Wittmeyer, our producer, Bente Bouthier, and our engineer, John Bailey. I'm Bob Zaltsberg. Thanks for listening to Noon Edition.