YAMICHE ALCINDOR: Welcome to the Washington Week Extra. I'm Yamiche Alcindor. Tonight, let's continue the conversation about Facebook whistleblower Frances Haugen's testimony before Congress. She told lawmakers the company put profits before public safety.

FRANCES HAUGEN: (From video.) The company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people.

ALCINDOR: She also told lawmakers that the company, Facebook, knows its products harm kids and teens.

HAUGEN: (From video.) It's just like cigarettes. Teenagers don't have good self-regulation. They say explicitly, "I feel bad when I use Instagram, and yet I can't stop." We need to protect the kids.

ALCINDOR: Now, Facebook founder Mark Zuckerberg responded in a note to his employees posted on his public Facebook page. He wrote, quote, "We care deeply about issues like safety, well-being and mental health...it's very important to me that everything we build is safe and good for kids."

Now joining us remotely, Cecilia Kang, technology reporter for The New York Times and co-author of An Ugly Truth: Inside Facebook's Battle for Domination. And with me here at the table, Nancy Cordes, CBS News chief White House correspondent; Eamon Javers, CNBC's senior Washington correspondent; and Marianna Sotomayor, congressional reporter for The Washington Post. Thank you all for being here.

Cecilia, you get the first question because, of course, you're our Facebook expert at this table. Tell us a little bit more about this whistleblower. Who is she? What did she say? And what's motivating her, do you think?

CECILIA KANG: Yeah. Frances Haugen spent nearly two years at Facebook on a team called the civic integrity team. That is a team that basically tries to fight off misinformation and other harmful content. Her background and her expertise are actually in the technology behind the newsfeed - in how the company determines what it wants to rank highest and lowest in terms of engagement. So she's really deep into the system. She understands the technology. And she's also a Silicon Valley veteran; she's worked at Google, Pinterest, and Yelp as well. And what motivated her was a decision in December 2020, when Facebook decided to disband her civic integrity team.
This was right after the election, and certainly when there was still a lot of unrest in the country about the election results. And to her, that was the clear sign that the company was not serious enough about protecting its users and making sure that misinformation about the election, as well as a slew of other types of harmful content, was not on the site. And she was seeing internal practices and a struggle with really important issues that the company was not admitting to the public. So what she did was she quit in December. And before she left, she copied off tens of thousands of documents of internal research that's actually available to many, many employees. And this is the kind of research, like the research on teens and Instagram that you mentioned earlier, Yamiche. She decided that she would take those documents once she left, and she brought them to a reporter at The Wall Street Journal. And The Wall Street Journal has since begun a series of stories. They and other journalists are now continuing to report on all these documents that the whistleblower has brought to the public.

ALCINDOR: And, Cecilia, one of the first times I really understood the sort of backdoor things that happen at Facebook is when you started reporting on it, and when you wrote your amazing book - which everyone, of course, should get. I wonder if you can talk a little bit about how your reporting connects to what this whistleblower's saying.

KANG: Yeah. We really feel like the whistleblower's testimony, certainly, and the reporting from her documents absolutely confirm the main theme of our book. The book's theme and title, An Ugly Truth, come from a memo called "The Ugly" by a very senior executive named Andrew Bosworth, where he says: Facebook believes so much in connecting the world that, even though there will be a lot of collateral damage because of its quest to connect the world - that kind of damage can be terrorist attacks, it can be bullying, it can be deaths even - in the end, the goal of connecting the world will be better for the world, and it will be net-net good. And we're willing to absorb those costs. That's the calculus that the company has. That's sort of the thrust of what the whistleblower's documents show: that growth is the most important thing.
The memo said "connecting the world," but we've come to realize that that's actually sort of a euphemism for growth - growth in engagement, and growth in profits. And the whistleblower's main argument is that the company is so bent on growing and keeping its site relevant that it is making decisions that have not just small collateral damage, but enormous collateral damage.

ALCINDOR: And, you know, Cecilia's talking about this idea of Facebook putting profit before everything. Eamon, Cecilia's also talking about how we rely on Facebook. Now, this outage this week - people don't always realize it took down Instagram and WhatsApp as well as Facebook, so when we say "Facebook" we're talking about multiple platforms. What did that outage show about how much people rely on Facebook, especially around the world?

EAMON JAVERS: Well, in multiple countries around the world. And you're also talking about businesses that do all their advertising on Facebook, that communicate with their customers through WhatsApp. I mean, I think of Facebook as the service that we use to keep in touch with those people we went to high school with, who we're too lazy to actually pick up the phone and call. But actually, a lot of business is done on Facebook. And you saw this enormous impact globally on all of those people.

And take a minute to step back and realize the impact of what the whistleblower did here. I mean, first of all, serving as sort of an undercover anti-Facebook agent inside the company, stealing those documents - Facebook says those are stolen documents. Then leaking them to The Wall Street Journal in a very tactical way for a devastating series of blockbuster articles in the Journal, day after day after day with revelations. Then coming forward on 60 Minutes with a big reveal of her own identity. And then, two days later, Capitol Hill testimony that riveted the country. This rollout of what the whistleblower did, this undercover operation inside of Facebook, was devastating for Facebook. This was a very tough week for them.

ALCINDOR: And, Nancy, you're nodding your head. I want to bring you in here. I was going to ask you what President Biden thinks about all this, but Eamon just also talked about this PR rollout that I hadn't really even put together. What do you make of all - (laughter) - he just said?
NANCY CORDES: It was impressive, and I want to know who was behind it, because they're going to get a lot more business.

JAVERS: The reporting is Bill Burton was behind it, right? So, I mean, there are some Washington insiders who might have had a hand in this.

CORDES: Ah, right - they know how the Washington ecosystem works, certainly. You know, I think the president and the White House have made no secret of their disdain for Facebook, right? I mean, didn't the president kind of have to walk back his comments after he said that they were killing people? And then he clarified; he said, well, no, it's not Facebook itself that's killing people, it's people who post on Facebook. But they've been very outspoken about the fact that they think that a lot of social media platforms, but Facebook in particular, have a responsibility that they're not meeting right now.

The problem is, and Marianna really hit on it earlier, that they've got a very crowded agenda. They've got a lot of things they'd like to accomplish. And so while this is one of those issues on which Democrats and Republicans agree something needs to be done, you wonder when it is going to rise to the top of the agenda, especially because, I don't know if you've noticed, but some lawmakers tend not to be all that technologically savvy - (laughter) - you've noticed that?

JAVERS: That's a very generous way of putting that. (Laughter.)

CORDES: - in some of their questioning at hearings before. So they know something needs to be done, but they're sometimes a little bit tentative to say, definitively, this is what I think should be done, these are the new regulations I want to see.

JAVERS: "When are you going to ban finsta?" was one of the questions. Right? (Laughs.)

CORDES: Right, exactly. So that's another reason why you'll continue to see a lot of agreement that something should happen; when we will actually see that happen, that's an open question.

ALCINDOR: Marianna, what are you hearing on Capitol Hill from these lawmakers about Facebook, their timeline for trying to regulate this, and also just their understanding of what needs to be done?
MARIANNA SOTOMAYOR: Yeah, you know, for many years there have been these kinds of oversight hearings, not as blockbuster as this one, where you do have members and senators who, you can tell, don't really know which way to question someone; like, they get there -

ALCINDOR: In that exact tone. (Laughter.)

SOTOMAYOR: Yeah, exactly, there's a lot of hesitancy of, like, I hope I'm getting this right. (Laughter.) But then you get the "finsta" commentaries and things like that. So there are still a lot of people who are looking at this. And one thing to note, too, is that there are probably going to be more investigations or hearings before there will be any kind of legislation proposed.

Another thing to note is the January 6 committee, for example; they really want to talk to this Facebook whistleblower, because she has also mentioned that Facebook had a role in potentially allowing - or, you know, not doing enough oversight to stop - these insurrectionists communicating on all these different devices and social media networks. So it's likely that in a couple of weeks or so she might come back and testify before that committee behind closed doors.

ALCINDOR: And, Cecilia, it's a question that my producers and I were thinking through: What makes Facebook so different from other social media platforms, when you think about Twitter or other things? What sets them apart? What possibly makes them worse than these other platforms?

KANG: Well, I think one very distinguishing factor is that the company is basically Mark's company. It's Mark Zuckerberg's company. He owns 55 percent of voting shares. He makes the decisions. And Frances Haugen, the whistleblower, said the buck stops with Mark. And I think that's absolutely true in my reporting.

The other thing that's really different, in relation to the research that you mentioned, Yamiche, on teens and Instagram and the harms - the toxic harms and sort of the negativity that a lot of teenagers feel from using the platform: One really interesting finding from that research, Facebook's own internal research, is that Facebook believes that Instagram is different from and, in some ways, worse than TikTok and Snapchat, and in a very small, interesting way. Instagram has these sort of beauty filters, and there's also this culture of trying to curate this vision of who you are in your life.
There's a lot of focus on the full body. And, by the way, TikTok and Snapchat definitely have their problems; they're not completely, you know, immune to problems. But TikTok is much more of a sort of performance-based fun app, is what a lot of the teenagers who took the surveys for Facebook said; they feel like it's a little bit more humorous - sort of different kinds of challenges, dances, a lot more lighthearted. Snapchat, interestingly, has these face filters that are really sort of goofy, cartoon-animated filters that are also just supposed to be fun, and the focus is on the face.

And so on the kind of body-image issues that Instagram users reported to Facebook in its own research: one out of three teenagers said that because of using Instagram they feel worse about their body image. Fourteen percent of teens in the U.K. said that they had suicidal ideations and could trace them back to Instagram use. I mean, those are the kinds of feelings and anxieties and really, really harmful responses that didn't exist with these other apps, and I thought that was a really important distinguishing factor.

The last thing I would say is, Twitter is, very interestingly, more willing to experiment with ways to try to fight misinformation and to try to protect its users. One thing that they do - I'm sure we've all gone through this - is when you try to retweet a story that you haven't actually read and opened up, you get a popup box that says: Are you sure you really want to retweet this? Looks like you haven't read it. Facebook doesn't have that kind of feature, and that feature is known as friction. It provides friction between you and sharing - in other words, between you and amplifying more of that content - and Facebook just doesn't do that. So they're not making the same kinds of decisions as some of their competitors, decisions that arguably could be good solutions to at least start solving this misinformation problem.

ALCINDOR: It's such a comprehensive answer, and one that I think so many people really need to hear - just the difference between Facebook and all the other social media platforms. Eamon, I'm going to come to you for the last word here: Is this all about money? Does this all, at the end of the day, come down to profits, and where do we go from here?
JAVERS: Yeah, look, Facebook has grown so fast over such a relatively short period of time - you know, think of the past 15 years or so. The question for Facebook is, how can they keep growing? I mean, the law of large numbers suggests that once you have almost everybody on planet Earth who's connected to the internet as part of your service, how can you continue to grow, right? And so one of the things that they're trying to do is keep all those people on the service for even longer amounts of time. That's what engagement is. And the idea is that all these angry things that we are seeing on Facebook are enticing people to stay on the service for a longer period of time. That represents more ad dollars, more revenue for Facebook. So the more engagement they get, the more profit they make. And in a world where it's going to be very hard for them to find new customers, because they already have just about everybody on the planet, well, engagement is the answer. And so if they dialed back on some of these things, dialed back on some of the angry content, they would also be dialing back on profits, and that's a real problem for a public company.

ALCINDOR: Yeah, yeah. Well, we'll have to leave it there tonight. Thank you so much to Cecilia, Nancy, Eamon, and Marianna for joining us and sharing your reporting. And make sure to sign up for the Washington Week newsletter on our website. We will give you a look at all things Washington. Thank you so much for joining. I'm Yamiche Alcindor. Good night.