JUDY WOODRUFF: The attack on the U.S. Capitol nearly one year ago was based on a big lie about election fraud in 2020 and the hope of supporters of former President Trump that they could stop the certification of electoral vote results.

But starting that day, there's been a new misinformation campaign to recast, downplay, and misrepresent the events that unfolded at the Capitol.

Amna Nawaz reports.

AMNA NAWAZ: They broke through barricades, assaulted police, smashed their way into the Capitol, and sent lawmakers into hiding.

Yet, even as the attack was playing out, there were already alternative narratives being spun about who was to blame.

LAURA INGRAHAM, FOX News: There are some reports that Antifa sympathizers may have been sprinkled throughout the crowd.

DREW HERNANDEZ, Investigative Reporter: Possibly Antifa insurrectionists possibly could have infiltrated some of these movements and maybe instigated some of this.

REP. MATT GAETZ (R-FL): The Washington Times has just reported some pretty compelling evidence from a facial recognition company showing that some of the people who breached the Capitol today were not Trump supporters. They were masquerading as Trump supporters and, in fact, were members of the violent terrorist group Antifa.

DAVID GRAHAM, Staff Writer, "The Atlantic": In the first hours and days afterward, you could see Trump and his allies and supporters sort of groping for what the appropriate narrative was.

AMNA NAWAZ: David Graham is a staff writer at "The Atlantic" magazine.

DAVID GRAHAM: So, on the one hand, you had Trump coming out with his video on the day of saying: We love you, but now go home.

But you also saw people saying, oh, this is agitators, it was Antifa, it was Black Lives Matter.

AMNA NAWAZ: That despite contemporaneous texts between pundits on FOX and the White House showing they thought Trump supporters were responsible.

When subsequent arrests confirmed that publicly, the narrative on the right shifted to downplay the violence that day.

Here's former President Trump on FOX in March.

DONALD TRUMP, Former President of the United States: Right from the start, it was zero threat. Look, they went in. They shouldn't have done it. Some of them went in and they're hugging and kissing the police and the guards.

REP. ANDREW CLYDE (R-GA): There was no insurrection.
And to call it an insurrection, in my opinion, is a bold-faced lie.

AMNA NAWAZ: Republican Congressman Andrew Clyde at a hearing in May.

REP. ANDREW CLYDE: You know, if you didn't know the TV footage was a video from January the 6th, you would actually think it was a normal tourist visit.

DAVID GRAHAM: It was strange to see somebody like Congressman Andrew Clyde of Georgia, who we saw in videos and footage from January 6 helping to bar the doors, suddenly saying, well, these were just tourists, they were walking through.

AMNA NAWAZ: Another recurrent theme: shifting focus away from January 6 and toward the Black Lives Matter protests the year before.

Republican Congressman Clay Higgins of Louisiana:

REP. CLAY HIGGINS (R-LA): Nineteen people died during BLM riots last year. Hundreds and hundreds were injured; 2,000 police officers were injured from BLM riots last year.

AMNA NAWAZ: Voices on the right have also recast those awaiting trial for their part in the attack as political prisoners.

Here's Republican Congressman Paul Gosar of Arizona last month:

REP. PAUL GOSAR (R-AZ): These are dads, brothers, veterans, teachers, all political prisoners who continue to be persecuted and endure the pain of unjust suffering.

AMNA NAWAZ: So too with the death of Ashli Babbitt, the Air Force veteran shot by Capitol Police as she attempted to breach the Speaker's Lobby.

Here's Republican Representative Jody Hice of Georgia in May:

REP. JODY HICE (R-GA): In fact, it was Trump supporters who lost their lives that day, not Trump supporters who were taking the lives of others.

AMNA NAWAZ: Former President Trump reinforced that in a July interview on FOX.

DONALD TRUMP: Who was the person who shot an innocent, wonderful, incredible woman, a military woman?

DAVID GRAHAM: The idea that they were all motivated by these good intentions, they believed the election was stolen, which, of course, was false -- it was a lie that had been peddled to them by the president and many of his allies -- but they were going in and they wanted to stand up for what was right, that they were sort of like the American revolutionaries or like the Confederate rebels, who wanted to really uphold the best of the Constitution.
AMNA NAWAZ: In an October piece in "The Atlantic," Graham explored this idea: how those who committed criminal acts to stop a democratic process have been recast by the far right as heroes, patriots and martyrs for a just cause, much like the Confederate soldiers celebrated by the mythology of the Lost Cause.

The fact that those people are referred to by some in these circles as patriots, what does that do to the narrative?

DAVID GRAHAM: It makes them into the heirs of what was right. It turns something that was one of the darker moments in American history into one of the brighter ones, into a moment of unity and rebellion against what's wrong and standing up for what's right, which I think is really dangerous.

If we can turn something that's an assault on a constitutional process into a moment of triumph and a moment of -- a sort of lodestar for what's to come, I think that doesn't bode well for American democracy.

AMNA NAWAZ: These efforts could be working.

An NPR/"NewsHour"/Marist poll conducted last month showed a sharp partisan divide over how Americans view what happened on January 6, the legitimacy of investigations into it, and decreasing blame for President Trump, even as the former president continues to push the lie at the heart of January 6.

The durability of that lie, where does that fit into sort of the larger misinformation campaign, the very thing that brought people out on January 6 in the first place?

DAVID GRAHAM: Well, it's essential to the legitimacy of Trump as a political actor today. If he's somebody who had the election stolen from him, that makes him still a sort of heroic figure and a more legitimate leader perhaps than Joe Biden, in the eyes of his supporters.

And that makes it -- that enables a lot of other information.

AMNA NAWAZ: Information or, more accurately, misinformation questioning or undermining everything from measures to stop the spread of COVID-19, to the safety and efficacy of vaccines, from bogus stories about vaccines tracking and controlling Americans, to campaigns to stop teachers from talking about race or racism in schools.

DAVID GRAHAM: So, when people in the Trumpist orbit spread misinformation about Joe Biden, or they spread misinformation about vaccines or about COVID, all of these spring from his legitimacy as the real elected leader, which depends on the lie of the election being stolen.
AMNA NAWAZ: For more on the misinformation surrounding January 6 and how it's spread and evolved, I'm joined by two people who track and study just that.

Jennifer Kavanagh is a senior political scientist at the RAND Corporation. She co-authored the book "Truth Decay" about the rise of misinformation. And Claire Wardle is the U.S. director of First Draft. That's a nonprofit that tracks misinformation online.

Welcome to you both, and thank you for being here.

Claire, I will begin with you.

As we just saw, immediately after the Capitol attack, there were already alternative narratives being spun, despite live pictures, live reports, people seeing it in real time.

Our latest "NewsHour"/NPR/Marist poll shows a divide in how Americans saw that day; 89 percent of Democrats say January 6 was an insurrection, was a threat to democracy, but only 10 percent of Republicans agree with that.

How does that happen, Claire?

CLAIRE WARDLE, U.S. Director, First Draft: Because there was a foundation being laid all the way through 2020, and then from Election Day onwards.

This Stop the Steal narrative was emerging, this idea that the election was not safe, that the election was stolen. There was this drip, drip, drip throughout November and December.

And so, when we had the events of January, very quickly, very smart people began shaping these narratives that already had a foundation that made sense to people who wanted to believe a certain world view.

AMNA NAWAZ: Jennifer, talk to me about the role of news and journalism in all this, because you have studied this, the declining trust in news, Americans' skepticism around news.

How much do you think that contributed to people being willing to say, what you're reporting, what you're showing me, I don't believe?

JENNIFER KAVANAGH, RAND Corporation: I think it played a big role.

I mean, people get their information from specific sources. And when they see information coming to them from sources that they don't trust, they tend to discard that information.

It's also really hard to change people's minds once they have made it up. So, when people see additional information coming at them that contradicts that, they're not ready to discard what they have been believing for months or what they have been hearing from their trusted figures.
So, the fact that people have such low trust in media plays a big role in their lack of -- their lack of ability to change their mind, and the difficulty that we face in trying to spread accurate information after the fact.

AMNA NAWAZ: Claire, we know one of the main ways in which that information was spread even well before the Capitol attack was on social media, right?

We saw even leading up to that day the whole Stop the Steal narrative, how those groups not only organized online, but then mobilized online, got people to show up in real life to commit criminal acts after that organization.

What responsibility lies with the companies behind those social media platforms?

CLAIRE WARDLE: When you look back at the timeline, it was only September of 2020 when Twitter started marking as false tweets from the president, for example, saying that the votes couldn't be trusted.

So, I think the platforms were -- absolutely weren't ready for this. And then, as we saw on essentially January 7 and 8, they panicked and, like dominoes, they all started changing their policies and deplatforming.

But the disinformation ecosystem is really participatory and engaging. And that's what's happening on these platforms. Not that much has changed in a year. And that's what we should be more worried about -- not to see it as a one-off. And what changes have the platforms made? And I would say, not enough.

AMNA NAWAZ: So, Jennifer, you have used this phrase truth decay in your work, and nowhere have we seen that more potently than when it comes to the pandemic and disinformation on social media and other places around the efficacy of vaccines and the efficacy of mitigation measures.

And these are all things that are backed by science. They're backed by data. But, as you lay out, there's declining trust in those two things. So, can that decay, as you lay it out, can it be reversed?

JENNIFER KAVANAGH: Well, the challenge is that disinformation tends to have an emotional component. As Claire described, it's participatory. It becomes part of the believer's identity.

And so, trying to reverse the decay, as you described, is not simple. It's very, very challenging, because you're actually having to break into people's world view and change how they see the world. This is a challenge for a whole range of stakeholders.

Social media companies are one. Researchers and scientists are another.
How do we make data, whether it's about vaccines or COVID or election integrity, how do we make that data, that narrative compelling to people who are not inclined to believe it?

One piece of that is thinking about who provides the messages. There's a concept of strategic messengers, trusted people within communities that are vulnerable or at risk for believing conspiracies and disinformation.

I think election integrity is one of those cases where identifying allies within the communities that are vulnerable to that information is a challenge. And I don't think it's a challenge that has been addressed yet, which is why the conspiracies and disinformation around the 2020 election continue to thrive.

AMNA NAWAZ: Claire, you have also done some work on this about how people can arm themselves, right, how they can outsmart misinformation or disinformation campaigns, whether it is around elections or political candidates or vaccines or the pandemic.

What are some of those tactics? What should people know?

CLAIRE WARDLE: What the research shows is, whilst it's important to have fact-checking, what we should be doing is actually, rather than focusing on the individual rumor or conspiracy, teaching people the tactics of those who are trying to manipulate them, because what the research shows is, whoever you are, whatever your political persuasion or even education level, nobody wants to believe that they're being hoaxed or fooled.

So, the more that communities can work with each other to teach people -- well, if you see a text message that says, my brother works for the government and he's telling me, dot, dot, dot -- an anecdote, as Jennifer just said -- that, in itself, teaching people to just be a little bit more savvy about that, because that's a known tactic.

So, the more we can teach people tactics and techniques, rather than waiting for the rumor and then kind of playing Whac-A-Mole, we're actually seeing the research show that's a much more effective way of building the resilience that means that, when they see misinformation, they're more likely to identify it as that.

AMNA NAWAZ: Claire, I have to ask, after all the work you have done -- and, Jennifer, I will ask the same thing of you -- with misinformation and disinformation so prolific, now being pronounced and perpetuated from even the highest office in the land at times, do you have hope that that can be brought back under control?

CLAIRE WARDLE: I still have hope.
Otherwise, I wouldn't get up every day.

But I think what we have to realize is, this is a very long game. I'd say this is the battle of our lives for the next 20 to 30 years, around climate, elections, vaccines, health. And we need to start thinking that this is a long game. There's no quick fix. We can't just shift the Facebook algorithm and make it all go away.

AMNA NAWAZ: Jennifer, what about you?

JENNIFER KAVANAGH: I agree with Claire.

I think it's important to recognize that the challenge that we face now has evolved over several decades. And it's going to take just as long to figure out a way to manage the situation, so really thinking about this from a holistic perspective, and understanding that whatever future we work toward is hopefully better than what we face today.

It's not going to look the same as 20 or 30 years ago. The goal isn't to put the cat back in the bag. The goal is to figure out sort of what we want online spaces to look like, what we want our society to look like, and how we want to interact in that way.

I guess that's what gives me hope, is thinking that we can work towards that better future, rather than thinking about how we make things go back to the way they were.

AMNA NAWAZ: That is Jennifer Kavanagh and Claire Wardle.

Thank you so much to both of you for joining us.

CLAIRE WARDLE: Thank you.

JENNIFER KAVANAGH: Thanks for having me.