YAMICHE ALCINDOR: Welcome to the Washington Week Extra. I'm Yamiche Alcindor. Tonight let's continue the conversation about Facebook whistleblower Frances Haugen's testimony before Congress. She told lawmakers the company put profits before public safety.

FRANCES HAUGEN: (From video.) The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people.

ALCINDOR: She also told lawmakers that the company, Facebook, knows their products harm kids and teens.

HAUGEN: (From video.) It's just like cigarettes. Teenagers don't have good self-regulation. They say explicitly, I feel bad when I use Instagram and yet I can't stop. We need to protect the kids.

ALCINDOR: Now, Facebook founder Mark Zuckerberg responded in a note to his employees posted on his public Facebook page. He wrote, quote, "We care deeply about issues like safety, well-being and mental health...it's very important to me that everything we build is safe and good for kids."

Now joining us remotely, Cecilia Kang, technology reporter for The New York Times and co-author of An Ugly Truth: Inside Facebook's Battle for Domination. And with me here at the table, Nancy Cordes, CBS News chief White House correspondent; Eamon Javers, CNBC's senior Washington correspondent; and Marianna Sotomayor, congressional reporter for The Washington Post. Thank you all for being here.

Cecilia, you get the first question because, of course, you're our Facebook expert at this table. Tell us a little bit more about this whistleblower. Who is she? What did she say? And what's motivating her, do you think?

CECILIA KANG: Yeah. Frances Haugen spent nearly two years at Facebook on a team called the civic integrity team. That is a team that basically tries to fight off misinformation and other harmful content. Her background and her expertise is actually in the technology behind the newsfeed, in how the company determines what it wants to rank highest and lowest in terms of engagement. So she's really deep into the system. She understands the technology. And she's also a Silicon Valley veteran. She's worked at Google, Pinterest, and Yelp as well.

And what motivated her was a decision in December 2020, when Facebook decided to disband her civic integrity team. This was right after the election, and certainly when there was still a lot of unrest in the country about the election results.
And to her, that was the clear sign that the company was not serious enough about protecting its users and making sure that misinformation about the election, as well as a slew of other types of harmful content, was not on the site. And she was seeing internally practices and a struggle with really important issues that the company was not admitting to the public. So what she did was she quit in December. And before she left she copied off tens of thousands of documents of internal research that's actually available to many, many employees. But she copied it off. And this is the kind of research, like the teens and Instagram research that you mentioned, Yamiche, earlier. And she decided that she would take those documents once she left, and she brought them to a reporter at The Wall Street Journal. And The Wall Street Journal has since begun a series of stories. They and other journalists are now continuing to report on all these documents that the whistleblower has brought to the public.

ALCINDOR: And, Cecilia, one of the first times I really understood the sort of backdoor things that happen in Facebook is when you started reporting on it, and when you wrote your amazing book - that everyone, of course, should get. I wonder if you can talk a little bit about how your reporting connects to what this whistleblower's saying.

KANG: Yeah. We really feel like the whistleblower's testimony, certainly, and the reporting from her documents confirm absolutely the main theme of our book. The book theme and the book title, An Ugly Truth, comes from a memo by a very senior executive named Andrew Bosworth - it's called "The Ugly" - where he says: Facebook believes so much in connecting the world that even though there will be a lot of collateral damage because of that quest - that damage can be terrorist attacks, it can be bullying, it can be deaths even - in the end the goal of connecting the world will be better for the world, it will be net-net good, and we're willing to absorb those costs. That's the calculus that the company has. That's sort of the thrust of what the whistleblower's documents show, is that growth is the most important thing. Because the memo said "connecting the world," but we've come to realize that that's actually sort of a euphemism for growth, growth in engagement, and growth in profits.
And the whistleblower's main argument is that the company is so bent on growing and keeping its site very relevant that it is making decisions that have not just small collateral damage, but enormous collateral damage.

ALCINDOR: And, you know, Cecilia's talking about this sort of idea of Facebook putting everything - putting profit before everything. Eamon, Cecilia's also talking about how much we rely on Facebook. What did this outage this week - which, if people don't really realize it, was Instagram, it was WhatsApp, it's Facebook, so when we say "Facebook" we're talking about multiple platforms - what did that outage show about how much people rely on Facebook, especially around the world?

EAMON JAVERS: Well, and multiple countries around the world, and also you're talking about businesses that do all their advertising on Facebook, that communicate with their customers through WhatsApp. I mean, I think of Facebook as the service that we use to keep in touch with those people that we went to high school with, who we're too lazy to actually pick up the phone and call. But actually, a lot of business is done on Facebook. And you saw this enormous impact globally on all of those people.

And take a minute to step back and realize the impact of what the whistleblower did here. I mean, first of all, serving as sort of an undercover anti-Facebook agent inside the company, stealing those documents - Facebook says those are stolen documents. Then leaking them out to The Wall Street Journal in a very tactical way for a devastating series of blockbuster articles in the Journal, day after day after day with revelations. Then coming forward on 60 Minutes with a big reveal of her own identity. And then, two days later, Capitol Hill testimony that riveted the country. This rollout of what the whistleblower did, this undercover operation inside of Facebook, was devastating for Facebook. This was a very tough week for them.

ALCINDOR: And Nancy, you're nodding your head. I want to bring you in here. I was going to ask you what President Biden thinks about all this, but Eamon just also talked about this PR rollout that I hadn't really even put together. What do you make of all - (laughter) - he just said?

NANCY CORDES: It was impressive, and I want to know who was behind it, because they're going to get a lot more business.
JAVERS: The reporting is Bill Burton was behind it, right? So, I mean, there's some Washington insiders who might have had a hand in this.

CORDES: Ah, right, who know - you know, they know how the Washington ecosystem works, certainly. You know, I think the president and the White House have made no secret of their disdain for Facebook, right? I mean, didn't the president kind of have to walk back his comments after he said that they were killing people, you know? And then he clarified; he said, well, no, it's not Facebook itself that's killing people, it's people who post on Facebook. But you know, they've been very outspoken about the fact that they think that a lot of social media platforms, but Facebook in particular, have a responsibility that they're not meeting right now.

The problem is, and Marianna really hit on it earlier, that they've got a very crowded agenda. They've got a lot of things they'd like to accomplish. And so while this is one of those issues on which Democrats and Republicans agree something needs to be done, you wonder when it is going to rise to the top of the agenda, especially because - I don't know if you've noticed, but lawmakers, some of them, tend not to be all that technologically savvy - (laughter) - you've noticed that?

JAVERS: That's a very generous way of putting that. (Laughter.)

CORDES: - in some of their questioning at hearings before. So it seems that they know something needs to be done, but they're sometimes a little bit tentative to say definitively, this is what I think should be done, these are the new regulations I want to see.

JAVERS: When are you going to ban "finsta," was one of the questions. Right? (Laughs.)

CORDES: Right, exactly. So that's another reason why you'll continue to see a lot of agreement that something should happen; when we will actually see that happen, that's an open question.

ALCINDOR: Marianna, what are you hearing on Capitol Hill from these lawmakers about Facebook, their timeline for trying to regulate this, and, also, just their understanding of what needs to be done?

MARIANNA SOTOMAYOR: Yeah, you know, there's been many years where there's been these kinds of oversight hearings, not as blockbuster as this one, where you do have members, you can tell, and senators - they don't really know which way to question someone; like, they get there -

ALCINDOR: In that exact tone. (Laughter.)
SOTOMAYOR: Yeah, exactly, there's a lot of hesitancy of, like, I hope I'm getting this right. (Laughter.) But then you get the "finsta" commentaries and things like that. So there's still a lot of people who are looking at this.

And one thing to note, too, is that there's probably going to be more investigations or hearings before there will be any kind of legislation proposed. And one thing to note is the January 6 committee, for example; they really want to talk to this Facebook whistleblower, because she has also mentioned the fact that Facebook had a role in potentially allowing, or, you know, not doing enough oversight, allowing these people, these insurrectionists, to communicate on all these different devices and social media networks. So that is something we might see in a couple weeks or so: she might come back and testify before that committee behind closed doors.

ALCINDOR: And Cecilia, it's a question that my producers and I were thinking through: What makes Facebook so different from other social media platforms, when you think about Twitter or other things? What sets them apart? What possibly makes them worse than these other platforms?

KANG: Well, I think one very distinguishing factor is that the company is basically Mark's company. It's Mark Zuckerberg's company. He owns 55 percent of voting shares. He makes the decisions. And Frances Haugen, the whistleblower, said the buck stops with Mark. And I think that's absolutely true in my reporting.

The other thing that's really different, in relation to the research that you mentioned, Yamiche, on teens and Instagram and the harms, the toxic harms and sort of the negativity that a lot of teenagers feel from using the platform: One really interesting finding from that research, Facebook's own internal research, is that Facebook believes that Instagram is different and, in some ways, worse than TikTok and Snapchat, and in a very small, interesting way. Instagram has these sort of beauty filters, and there's also this culture of trying to curate this vision of who you are in your life. There's a lot of focus on the full body. And, by the way, TikTok and Snapchat definitely have their problems; they're not completely, you know, immune to problems.
But TikTok is much more of a sort of performance-based fun app, is what a lot of the teenagers who took the surveys for Facebook said; they feel like it's a little bit more humorous - sort of different kinds of challenges, dances, a lot more lighthearted. Snapchat, interestingly, has these face filters that are really sort of goofy, cartoon-animated filters that are just supposed to also be fun, and the focus is on the face.

And so there are the kinds of body-image issues that Instagram users reported to Facebook in its own research: One out of three teenagers said that because of using Instagram they feel worse about their body image. Fourteen percent of teens in the U.K. said that they had suicidal ideations and they could trace it back to Instagram use. I mean, those are the kinds of feelings and anxieties and really, really harmful kinds of responses that didn't exist with these other apps, and I thought that was a really important distinguishing factor.

The other, last thing I would say is, Twitter is, very interestingly, more willing to experiment with ways to try to fight misinformation and also to try to protect its users. And one thing that they do is - I'm sure we've all gone through this - when you try to retweet a story that you haven't read and actually opened up, you get a popup box that says: Are you sure you really want to retweet this? Looks like you haven't read it.

Facebook doesn't have that kind of feature, and that feature is known as friction. It provides friction between you and sharing - in other words, between you and amplifying more of that content - and Facebook just doesn't do that. So they're not making the same kinds of decisions as some of their competitors, decisions that arguably could be good solutions to at least start solving this misinformation problem.

ALCINDOR: It's such a comprehensive answer and one that I think so many people really need to hear, about just the difference of Facebook from all the other social media platforms. Eamon, I'm going to come to you for the last word here: Is this all about money? Does this all, at the end of the day, come down to profits, and where do we go from here?

JAVERS: Yeah, look, Facebook has grown so fast over such a relatively short period of time - you know, you think of the past 15 years or so. The question for Facebook is, how can they keep growing?
I mean, the law of big numbers suggests once you have almost everybody on planet Earth who's connected to the internet as part of your service, how can you continue to grow, right? And so one of the things that they're trying to do is keep all those people on the service for even longer amounts of time. That's what engagement is. And the idea is that all these angry things that we are seeing on Facebook are enticing people to stay on the service for a longer period of time. That represents more ad dollars, more revenue for Facebook. So the more engagement they get, the more profit they make. And in a world where it's going to be very hard for them to find new customers because they already have just about everybody on the planet, well, engagement is the answer. And so if they dialed back on some of these things and dialed back on some of the angry content, they're also going to be dialing back on profits, and that's a real problem for a public company.

ALCINDOR: Yeah, yeah. Well, we'll have to leave it there tonight. Thank you so much to Cecilia, Nancy, Eamon, and Marianna for joining us and sharing your reporting. And make sure to sign up for the Washington Week newsletter on our website. We will give you a look at all things Washington. Thank you so much for joining. I'm Yamiche Alcindor. Good night.