WEBVTT 00:00.049 --> 00:04.430 JUDY WOODRUFF: The attack on the U.S. Capitol nearly one year ago was based on a big lie 00:04.430 --> 00:09.430 about election fraud in 2020 and the hope of supporters of former President Trump that 00:10.340 --> 00:13.630 they could stop the certification of electoral vote results. 00:13.630 --> 00:18.630 But starting that day, there's been a new misinformation campaign to recast, downplay, 00:19.570 --> 00:22.829 and misrepresent the events that unfolded at the Capitol. 00:22.829 --> 00:25.019 Amna Nawaz reports. 00:25.019 --> 00:30.019 AMNA NAWAZ: They broke through barricades, assaulted police, smashed their way into the 00:31.689 --> 00:36.689 Capitol, and sent lawmakers into hiding. 00:36.870 --> 00:41.860 Yet, even as the attack was playing out, there were already alternative narratives being 00:41.860 --> 00:43.399 spun about who was to blame. 00:43.399 --> 00:47.949 LAURA INGRAHAM, FOX News: There are some reports that Antifa sympathizers may have been sprinkled 00:47.949 --> 00:49.019 throughout the crowd. 00:49.019 --> 00:52.629 DREW HERNANDEZ, Investigative Reporter: Possibly Antifa insurrectionists possibly could have 00:52.629 --> 00:55.840 infiltrated some of these movements and maybe instigated some of this. 00:55.840 --> 01:00.109 REP. MATT GAETZ (R-FL): The Washington Times has just reported some pretty compelling evidence 01:00.109 --> 01:04.570 from a facial recognition company showing that some of the people who breached the Capitol 01:04.570 --> 01:09.570 today were not Trump supporters. They were masquerading as Trump supporters and, in fact, 01:10.720 --> 01:13.430 were members of the violent terrorist group Antifa. 01:13.430 --> 01:16.571 DAVID GRAHAM, Staff Writer, "The Atlantic": In the first hours and days afterward, you 01:16.571 --> 01:21.571 could see Trump and his allies and supporters sort of groping for what the appropriate narrative 01:21.960 --> 01:22.960 was. 
01:22.960 --> 01:25.470 AMNA NAWAZ: David Graham is a staff writer at "The Atlantic" magazine. 01:25.470 --> 01:29.560 DAVID GRAHAM: So, on the one hand, you had Trump coming out with his video on the day 01:29.560 --> 01:33.190 of saying: We love you, but now go home. 01:33.190 --> 01:37.880 But you also saw people saying, oh, this is agitators, it was Antifa, it was Black Lives 01:37.880 --> 01:38.880 Matter. 01:38.880 --> 01:43.760 AMNA NAWAZ: That despite contemporaneous texts between pundits on FOX and the White House 01:43.760 --> 01:46.960 showing they thought Trump supporters were responsible. 01:46.960 --> 01:51.960 When subsequent arrests confirmed that publicly, the narrative on the right shifted to downplay 01:52.180 --> 01:53.910 the violence that day. 01:53.910 --> 01:55.840 Here's former President Trump on FOX in March. 01:55.840 --> 01:58.540 DONALD TRUMP, Former President of the United States: Right from the start, it was zero 01:58.540 --> 02:03.540 threat. Look, they went in. They shouldn't have done it. Some of them went in and they're 02:05.300 --> 02:07.480 hugging and kissing the police and the guards. 02:07.480 --> 02:11.700 REP. ANDREW CLYDE (R-GA): There was no insurrection. And to call it an insurrection, in my opinion, 02:11.700 --> 02:13.020 is a bold-faced lie. 02:13.020 --> 02:16.450 AMNA NAWAZ: Republican Congressman Andrew Clyde at a hearing in May. 02:16.450 --> 02:19.420 REP. ANDREW CLYDE: You know, if you didn't know the TV footage was a video from January 02:19.420 --> 02:22.670 the 6th, you would actually think it was a normal tourist visit. 02:22.670 --> 02:27.560 DAVID GRAHAM: It was strange to see somebody like Congressman Andrew Clyde, who -- of Georgia, 02:27.560 --> 02:32.560 who we saw in videos and footage from January 6 helping to bar the doors, suddenly saying, 02:33.040 --> 02:35.190 well, these were just tourists, they were walking through. 
02:35.190 --> 02:40.190 AMNA NAWAZ: Another recurrent theme, shifting focus away from January 6 and towards protests 02:41.720 --> 02:44.600 for Black Lives Matter the year before. 02:44.600 --> 02:46.310 Republican Congressman Clay Higgins of Louisiana: 02:46.310 --> 02:51.171 REP. CLAY HIGGINS (R-LA): Nineteen people died during BLM riots last year. Hundreds 02:51.171 --> 02:56.171 and hundreds were injured; 2,000 police officers were injured from BLM riots last year. 02:57.910 --> 03:02.540 AMNA NAWAZ: Voices on the right have also recast those awaiting trial for their part 03:02.540 --> 03:05.290 in the attack as political prisoners. 03:05.290 --> 03:08.260 Here's Republican Congressman Paul Gosar of Arizona last month: 03:08.260 --> 03:13.260 REP. PAUL GOSAR (R-AZ): These are dads, brothers, veterans, teachers, all political prisoners 03:14.040 --> 03:17.620 who continue to be persecuted and endure the pain of unjust suffering. 03:17.620 --> 03:22.620 AMNA NAWAZ: So too with the death of Ashli Babbitt, the Air Force veteran shot by Capitol 03:23.750 --> 03:27.120 Police as she attempted to breach the speaker's lobby. 03:27.120 --> 03:29.840 Here's Republican Representative Jody Hice of Georgia in May: 03:29.840 --> 03:34.840 REP. JODY HICE (R-GA): In fact, it was Trump supporters who lost their lives that day, 03:35.170 --> 03:38.000 not Trump supporters who were taking the lives of others. 03:38.000 --> 03:41.980 AMNA NAWAZ: Former President Trump reinforced that in a July interview on FOX. 03:41.980 --> 03:46.980 DONALD TRUMP: Who was the person who shot an innocent, wonderful, incredible woman, 03:50.740 --> 03:52.110 a military woman? 
03:52.110 --> 03:56.660 DAVID GRAHAM: The idea that they were all motivated by these good intentions, they believed 03:56.660 --> 04:00.270 the election was stolen, which, of course, was false -- it was a lie that had been peddled 04:00.270 --> 04:04.060 to them by the president and many of his allies -- but they were going in and they wanted 04:04.060 --> 04:08.770 to stand up for what was right, that they were sort of like the American revolutionaries 04:08.770 --> 04:12.730 or like the Confederate rebels, who wanted to really uphold the best of the Constitution. 04:12.730 --> 04:17.730 AMNA NAWAZ: In an October piece in "The Atlantic," Graham explored this idea, how those who committed 04:18.329 --> 04:23.329 criminal acts to stop a democratic process have been recast by the far right as heroes, 04:24.539 --> 04:29.539 patriots and martyrs for a just cause, much like the Confederate soldiers celebrated by 04:30.129 --> 04:32.759 the mythology of the Lost Cause. 04:32.759 --> 04:36.790 The fact that those people are referred to by some in these circles as patriots, what 04:36.790 --> 04:38.139 does that do to the narrative? 04:38.139 --> 04:41.900 DAVID GRAHAM: It makes them into the heirs of what was right. It turns something that 04:41.900 --> 04:46.889 was one of the darker moments in American history into one of the brighter ones, into 04:46.889 --> 04:51.889 a moment of unity and rebellion against what's wrong and standing up for what's right, which 04:52.919 --> 04:54.349 I think is really dangerous. 04:54.349 --> 04:58.849 If we can turn something that's an assault on a constitutional process into a moment 04:58.849 --> 05:03.849 of triumph and a moment of -- a sort of lodestar for what's to come, I think that doesn't bode 05:04.020 --> 05:05.970 well for American democracy. 05:05.970 --> 05:08.910 AMNA NAWAZ: These efforts could be working. 
05:08.910 --> 05:13.910 An NPR/"NewsHour"/Marist poll conducted last month showed a sharp partisan divide over 05:14.080 --> 05:19.080 how Americans view what happened on January 6, the legitimacy of investigations into it, 05:20.080 --> 05:25.080 and decreasing blame for President Trump, even as the former president continues to 05:25.270 --> 05:28.249 push the lie at the heart of January 6. 05:28.249 --> 05:33.249 The durability of that lie, where does that fit into sort of the larger misinformation 05:34.650 --> 05:38.520 campaign, the very thing that brought people out on January 6 in the first place? 05:38.520 --> 05:43.520 DAVID GRAHAM: Well, it's essential to the legitimacy of Trump as a political actor today. 05:44.069 --> 05:48.460 If he's somebody who had the election stolen from him, that makes him still a sort of heroic 05:48.460 --> 05:53.460 figure and a more legitimate leader perhaps than Joe Biden, in the eyes of his supporters. 05:53.870 --> 05:57.249 And that makes it -- that enables a lot of other information. 05:57.249 --> 06:02.020 AMNA NAWAZ: Information or, more accurately, misinformation questioning or undermining 06:02.020 --> 06:07.020 everything from measures to stop the spread of COVID-19, to the safety and efficacy of 06:07.270 --> 06:12.270 vaccines, from bogus stories about vaccines tracking and controlling Americans, to campaigns 06:13.430 --> 06:17.259 to stop teachers from talking about race or racism in schools. 06:17.259 --> 06:21.830 DAVID GRAHAM: So, when people in the Trumpist orbit spread misinformation about Joe Biden, 06:21.830 --> 06:26.590 or they spread misinformation about vaccines or about COVID, all of these spring from his 06:26.590 --> 06:31.590 legitimacy as the real elected leader, which depends on the lie of the election being stolen. 
06:32.839 --> 06:37.409 AMNA NAWAZ: For more on the misinformation surrounding January 6 and how it's spread 06:37.409 --> 06:42.389 and evolved, I'm joined by two people who track and study just that. 06:42.389 --> 06:46.900 Jennifer Kavanagh is a senior political scientist at the RAND Corporation. She co-authored the 06:46.900 --> 06:51.900 book "Truth Decay" about the rise of misinformation. And Claire Wardle is the U.S. director of 06:52.819 --> 06:57.249 First Draft. That's a nonprofit that tracks misinformation online. 06:57.249 --> 06:59.460 Welcome to you both, and thank you for being here. 06:59.460 --> 07:00.750 Claire, I will begin with you. 07:00.750 --> 07:05.750 As we just saw, immediately after the Capitol attack, there were already alternative narratives 07:06.979 --> 07:11.979 being spun, despite live pictures, live reports, people seeing it in real time. 07:12.119 --> 07:17.119 In our latest "NewsHour"/NPR/Marist poll, it shows a divide on how Americans saw that 07:17.550 --> 07:22.400 day; 89 percent of Democrats say January 6 was an insurrection, was a threat to democracy, 07:22.400 --> 07:26.539 but only 10 percent of Republicans agree with that. 07:26.539 --> 07:27.550 How does that happen, Claire? 07:27.550 --> 07:30.159 CLAIRE WARDLE, U.S. Director, First Draft: Because there was a foundation being laid 07:30.159 --> 07:34.749 all the way through 2020, and then from Election Day onwards. 07:34.749 --> 07:39.229 This Stop the Steal narrative was emerging, this idea that the election was not safe, 07:39.229 --> 07:43.910 that the election was stolen. There was this drip, drip, drip throughout November and December. 07:43.910 --> 07:48.910 And so, when we had the events of January, very quickly, very smart people began shaping 07:50.440 --> 07:55.050 these narratives that already had a foundation that made sense to people who wanted to believe 07:55.050 --> 07:56.159 a certain world view. 
07:56.159 --> 08:01.159 AMNA NAWAZ: Jennifer, talk to me about the role of news and journalism in all this, because 08:01.679 --> 08:05.960 you have studied this about the declining trust in news, Americans' skepticism around 08:05.960 --> 08:06.960 news. 08:06.960 --> 08:11.229 How much do you think that contributed to people being willing to say, what you're reporting, 08:11.229 --> 08:12.949 what you're showing me, I don't believe? 08:12.949 --> 08:14.679 JENNIFER KAVANAGH, RAND Corporation: I think it played a big role. 08:14.679 --> 08:19.679 I mean, people get their information from specific sources. And when they see information 08:20.259 --> 08:24.889 coming to them from sources that they don't trust, they tend to discard that information. 08:24.889 --> 08:28.429 It's also really hard to change people's minds once they have made it up. So, when people 08:28.429 --> 08:32.509 see additional information coming at them that contradicts that, they're not ready to 08:32.509 --> 08:35.430 discard what they have been believing for months or what they have been hearing from 08:35.430 --> 08:36.430 their trusted figures. 08:36.430 --> 08:41.430 So, the fact that people have such low trust in media plays a big role in their lack of 08:43.099 --> 08:47.819 -- their lack of ability to change their mind, and the difficulty that we face in trying 08:47.819 --> 08:50.470 to spread accurate information after the fact. 08:50.470 --> 08:54.430 AMNA NAWAZ: Claire, we know one of the main ways in which that information was spread 08:54.430 --> 08:57.880 even well before the Capitol attack was on social media, right? 08:57.880 --> 09:02.090 We saw even leading up to that day the whole Stop the Steal narrative, how those groups 09:02.090 --> 09:07.090 not only organized online, but then mobilized online, got people to show up in real life 09:07.730 --> 09:10.670 to commit criminal acts after that organization. 
09:10.670 --> 09:14.610 What responsibility lies with the companies behind those social media platforms? 09:14.610 --> 09:19.230 CLAIRE WARDLE: When you look back at the timeline, it was only September of 2020 when Twitter 09:19.230 --> 09:24.130 started marking as false tweets from the president, for example, saying that the votes couldn't 09:24.130 --> 09:25.130 be trusted. 09:25.130 --> 09:29.260 So, I think the platforms were -- absolutely weren't ready for this. And then, as we saw 09:29.260 --> 09:33.980 on essentially January 7 and 8, they panicked and, like dominoes, they all started changing 09:33.980 --> 09:36.339 their policies and deplatforming. 09:36.339 --> 09:41.130 But the disinformation ecosystem is really participatory and engaging. And that's what's 09:41.130 --> 09:44.579 happening on these platforms. Not that much has changed in a year. And that's what we 09:44.579 --> 09:48.450 should be more worried about, not to see it as a one-off, and what changes have the platforms 09:48.450 --> 09:49.779 made? And I would say, not enough. 09:49.779 --> 09:54.779 AMNA NAWAZ: So, Jennifer, you have used this phrase truth decay in your work, and nowhere 09:55.010 --> 09:59.900 have we seen that more potently than when it comes to the pandemic and disinformation 09:59.900 --> 10:04.900 on social media and other places around the efficacy of vaccines and the efficacy of mitigation 10:05.100 --> 10:06.100 measures. 10:06.100 --> 10:10.660 And these are all things that are backed by science. They're backed by data. But, as you 10:10.660 --> 10:15.660 lay out, there's declining trust in those two things. So, can that decay, as you lay 10:17.270 --> 10:18.410 it out, can it be reversed? 10:18.410 --> 10:22.079 JENNIFER KAVANAGH: Well, the challenge is that disinformation tends to have an emotional 10:22.079 --> 10:26.800 component. As Claire described, it's participatory. It becomes part of the believer's identity. 
10:26.800 --> 10:31.800 And so, trying to reverse the decay, as you described, is not simple. It's very, very 10:32.150 --> 10:35.699 challenging, because you're actually having to break into people's world view and change 10:35.699 --> 10:40.160 how they see the world. This is a challenge for a whole range of stakeholders. 10:40.160 --> 10:45.160 Social media companies are one. Researchers and scientists are another. How do we make 10:45.600 --> 10:50.600 data, whether it's about vaccines or COVID or election integrity, how do we make that 10:51.510 --> 10:56.459 data, that narrative compelling to people who are not inclined to believe it? 10:56.459 --> 11:01.190 One piece of that is thinking about who provides the messages. There's a concept of strategic 11:01.190 --> 11:06.190 messengers, trusted people within communities that are vulnerable or at risk for believing 11:07.069 --> 11:08.290 conspiracies and disinformation. 11:08.290 --> 11:13.290 I think election integrity is one of those cases where identifying allies within the 11:13.449 --> 11:18.449 communities that are vulnerable to that information is a challenge. And I don't think it's a challenge 11:18.460 --> 11:23.230 that has been addressed yet, which is why this -- the conspiracies and disinformation 11:23.230 --> 11:25.870 around the 2020 election continue to thrive. 11:25.870 --> 11:29.980 AMNA NAWAZ: Claire, you have also done some work on this about how people can arm themselves, 11:29.980 --> 11:34.769 right, how they can outsmart misinformation or disinformation campaigns, whether it is 11:34.769 --> 11:39.130 around elections or political candidates or vaccines or the pandemic. 11:39.130 --> 11:41.310 What are some of those tactics? What should people know? 
11:41.310 --> 11:45.540 CLAIRE WARDLE: What the research shows is, whilst it's important to have fact-checking, 11:45.540 --> 11:50.410 what we should be doing is actually, rather than focusing on the individual rumor or conspiracy, 11:50.410 --> 11:54.019 teaching people the tactics of those who are trying to manipulate them, because what the 11:54.019 --> 11:59.019 research shows is, whoever you are, whatever your political persuasion or even education 11:59.100 --> 12:02.149 level, nobody wants to believe that they're being hoaxed or fooled. 12:02.149 --> 12:06.750 So, the more that communities can work with each other to teach them, well, if you see 12:06.750 --> 12:10.560 a text message that says, my brother works for the government and he's telling me, dot, 12:10.560 --> 12:15.180 dot, dot, an anecdote, as Jennifer just said, that, in itself, teaching people, well, just 12:15.180 --> 12:18.100 be a little bit more savvy about that, because that's a known tactic. 12:18.100 --> 12:22.329 So, the more we can teach people tactics and techniques, rather than waiting for the rumor 12:22.329 --> 12:25.540 and then kind of playing Whac-A-Mole, we're actually seeing the research show that's a 12:25.540 --> 12:30.540 much more effective way of building the resilience that means that, when they see misinformation, 12:30.639 --> 12:32.389 they're more likely to identify it as that. 12:32.389 --> 12:35.240 AMNA NAWAZ: Claire, I have to ask, after all the work you have done -- and, Jennifer, I 12:35.240 --> 12:40.240 will ask the same thing of you -- with misinformation and disinformation so prolific, now being 12:40.629 --> 12:45.300 pronounced and perpetuated from even the highest office in the land at times, do you have hope 12:45.300 --> 12:48.070 that that can be brought back under control? 12:48.070 --> 12:51.269 CLAIRE WARDLE: I still have hope. Otherwise, I wouldn't get up every day. 
12:51.269 --> 12:55.889 But I think what we have to realize is, this is a very long game. I'd say, this is the 12:55.889 --> 13:00.329 battle of our lives for the next 20 to 30 years around climate, elections, vaccines, 13:00.329 --> 13:05.329 health. And we need to start thinking that this is a long game. There's no quick fix. 13:05.430 --> 13:08.000 We can't just shift the Facebook algorithm and make it all go away. 13:08.000 --> 13:09.000 AMNA NAWAZ: Jennifer, what about you? 13:09.000 --> 13:10.170 JENNIFER KAVANAGH: I agree with Claire. 13:10.170 --> 13:14.069 I think it's important to recognize that this -- that the challenge that we face now has 13:14.069 --> 13:19.069 evolved over several decades. And it's going to take just as long to figure out a way to 13:19.480 --> 13:24.480 manage the situation, so really thinking about this as a -- from a holistic perspective, 13:24.580 --> 13:28.590 and understanding that, whatever future we work to, that's hopefully better than what 13:28.590 --> 13:30.139 we face today. 13:30.139 --> 13:34.389 It's not going to look the same as 20 or 30 years ago. The goal isn't to put the cat back 13:34.389 --> 13:39.389 in the bag. The goal is to figure out sort of what we want online spaces to look like, 13:39.720 --> 13:44.089 what we want our society to look like, and how we want to interact in that way. 13:44.089 --> 13:49.089 I guess that's what gives me hope, is thinking that we can -- we can work towards that better 13:49.589 --> 13:52.720 future, rather than thinking about how we make things go back to the way they were. 13:52.720 --> 13:55.720 AMNA NAWAZ: That is Jennifer Kavanagh and Claire Wardle. 13:55.720 --> 13:57.889 Thank you so much to both of you for joining us. 13:57.889 --> 13:58.889 CLAIRE WARDLE: Thank you. 13:58.889 --> 13:59.600 JENNIFER KAVANAGH: Thanks for having me.