YAMICHE ALCINDOR: Welcome to the Washington Week Extra. I'm Yamiche Alcindor.

Tonight let's continue the conversation about Facebook whistleblower Frances Haugen's

testimony before Congress. She told lawmakers the company put profits before public safety.

FRANCES HAUGEN: (From video.) The company's leadership knows how to make Facebook and

Instagram safer but won't make the necessary changes because they have put their

astronomical profits before people.

ALCINDOR: She also told lawmakers that the company, Facebook, knows their products harm kids and teens.

FRANCES HAUGEN: (From video.) It's just like cigarettes. Teenagers don't have good

self-regulation. They say explicitly I feel bad when I use Instagram and yet

I can't stop. We need to protect the kids.

ALCINDOR: Now, Facebook founder Mark Zuckerberg responded in a note to his employees

posted on his public Facebook page.

He wrote, quote, "We care deeply about issues like safety, well-being and mental

health...it's very important to me that everything we build is safe and good for kids."

Now joining us remotely, Cecilia Kang, technology reporter for The New York Times and

co-author of An Ugly Truth: Inside Facebook's Battle for Domination.

And with me here at the table, Nancy Cordes, CBS News chief White House correspondent;

Eamon Javers, CNBC's senior Washington correspondent; and Marianna Sotomayor,

congressional reporter for The Washington Post. Thank you all for being here.

Cecilia, you get the first question because, of course, you're our Facebook expert at

this table. Tell us a little bit more about this whistleblower.

Who is she? What did she say? And what's motivating her, do you think?

CECILIA KANG: Yeah. Frances Haugen spent nearly two years at Facebook on a team

called the civic integrity team. That is a team that basically tries to fight off

misinformation and other harmful content.

Her background and her expertise are actually in the technology behind the newsfeed -

in how the company determines what it wants to rank highest and lowest in terms of

engagement. So she's really deep into the system. She understands the technology.

And she's also a Silicon Valley veteran. She's worked at Google, Pinterest, and Yelp as well.

And what motivated her was a decision in December 2020, when Facebook decided to disband

her civic integrity team. This was right after the election, and certainly when there

was still a lot of unrest in the country about the election results.

And to her, that was the clear sign that the company was not serious

enough about protecting its users and making sure that misinformation about the election,

as well as a slew of other types of harmful content, was not on the site. And she was

seeing internal practices, and struggles with really important issues, that

the company was not admitting to the public. So what she did was she quit in December.

And before she left she copied off tens of thousands of documents of internal research

that's actually available to many, many employees.

And this is the kind of research, like the teens-and-Instagram research that you

mentioned, Yamiche, earlier. And she decided that she would take those documents once

she left, and she brought them to a reporter at The Wall Street Journal. And The Wall

Street Journal has since begun a series of stories. They and other journalists are now

continuing to report on all these documents that the whistleblower has brought to the public.

ALCINDOR: And, Cecilia, one of the first times I really understood the sort of backdoor

things that happen in Facebook is when you started reporting on it, and when you wrote

your amazing book - that everyone, of course, should get. I wonder if you can talk a

little bit about how your reporting connects to what this whistleblower's saying.

KANG: Yeah. We really feel like the whistleblower's testimony, certainly, and the

reporting from her documents confirm absolutely the main theme of our book.

The book theme and the book title, An Ugly Truth, come from a memo from a very senior

executive named Andrew Bosworth; it's called "The Ugly." In it he says: Facebook

believes so much in connecting the world that it accepts there will be

a lot of collateral damage in its quest to connect the world - that kind of

damage can be terrorist attacks, it can be bullying, it can be deaths even.

But in the end, the goal of connecting the world will be better for the world, and it

will be net-net good. And we're willing to absorb those costs. That's the calculus that

the company has. That's sort of the thrust of what the whistleblower's documents show,

is that growth is the most important thing.

Because the memo said "connecting the world," but we've come to realize that that's

actually sort of a euphemism for growth, growth in engagement, and growth in profits.

And the whistleblower's main argument is that the company is so bent on growing and

keeping its site very relevant, that it is making decisions that have not just small

collateral damage, but enormous collateral damage.

ALCINDOR: And, you know, Cecilia's talking about this idea of Facebook

putting profit before everything. Eamon, with Cecilia also

talking about how we rely on Facebook - this week's outage, if people

don't realize, took down Instagram and WhatsApp as well as Facebook.

So when we say "Facebook" we're talking about multiple platforms. What did that

outage show about how much people rely on Facebook, especially around the world?

EAMON JAVERS: Well, and multiple countries around the world, and also you're talking

about businesses that do all their advertising on Facebook, that communicate with their

customers through WhatsApp.

I mean, I think of Facebook as the service that we use to keep in touch with those people

that we went to high school with, who we're too lazy to actually pick up the phone and

call. But actually, a lot of business is done on Facebook.

And you saw this enormous impact globally on all of those people.

And take a minute to step back and realize the impact of what the whistleblower did here.

I mean, first of all, serving as sort of an undercover anti-Facebook agent inside the

company, stealing those documents. Facebook says those are stolen documents.

Then leaking them out to The Wall Street Journal in a very tactical way for a devastating

series of blockbuster articles in the Journal, day after day after day with revelations.

Then coming forward on 60 Minutes with a big reveal of her own identity.

And then two days later Capitol Hill testimony that riveted the country.

This rollout of what the whistleblower did, this operation under cover inside of

Facebook, was devastating for Facebook. This was a very tough week for them.

ALCINDOR: And Nancy, you're nodding your head. I want to bring you in here.

I was going to ask you what does President Biden think about all this, but really Eamon

just also talked about this PR rollout that I hadn't really even put together.

What do you make of all - (laughter) - he just said?

NANCY CORDES: It was impressive and I want to know who was behind it because they're

going to get a lot more business.

JAVERS: The reporting is Bill Burton was behind it, right, so I mean, there's some

Washington insiders who might have had a hand in this.

CORDES: Ah, right - they know how the Washington

ecosystem works, certainly. You know, I think the president and the White House have

made no secret of their disdain for Facebook, right? I mean, didn't the president kind

of have to walk back his comments after he said that they were killing people, you know?

And then he clarified; he said, well, no, it's not Facebook itself that's killing people,

it's people who post on Facebook. But you know, they've been very outspoken about the

fact that they think that a lot of social media platforms, but Facebook in particular,

have a responsibility that they're not meeting right now.

The problem is, and Marianna really hit on it earlier, that they've got a very crowded

agenda. They've got a lot of things they'd like to accomplish.

And so while this is one of those issues on which Democrats and Republicans agree

something needs to be done, you wonder when it is going to rise to the top of the agenda,

especially because, I don't know if you've noticed, but lawmakers, some of them, tend not

to be all that technologically savvy - (laughter) - you've noticed that?

JAVERS: That's a very generous way of putting that. (Laughter.)

CORDES: - in some of their questioning at hearings before.

So it seems that they know something needs to be done, but they're sometimes a

little bit hesitant to say definitively: this is what I think should be done, these

are the new regulations I want to see.

JAVERS: When are you going to ban "finsta," was one of the questions. Right? (Laughs.)

CORDES: Right, exactly, so that's another reason why you'll continue to see a lot of agreement

that something should happen; when we will actually see that happen, that's an open question.

ALCINDOR: Marianna, what are you hearing on Capitol Hill from these lawmakers about

Facebook, their time for trying to regulate this, and, also, just their understanding of

what needs to be done?

MARIANNA SOTOMAYOR: Yeah, you know, there have been many years of these

kinds of oversight hearings, not as blockbuster as this one, where you do have members

and senators who, you can tell, don't really know which way to question someone; like, they get there -

ALCINDOR: In that exact tone. (Laughter.)

SOTOMAYOR: Yeah, exactly, there's a lot of hesitancy of, like, I hope I'm getting this

right. (Laughter.) But then you get the "finsta" commentaries and things like that.

So there's still a lot of people who are looking at this.

And one thing to note, too, is that there's probably going to be more investigations or

hearings before there will be any kind of legislation proposed.

And one thing to note is the January 6 committee, for example; they really want to talk

to this Facebook whistleblower, because she has also mentioned that Facebook had

a role in potentially allowing - or not doing enough oversight to prevent - these

insurrectionists communicating on all these different devices and social

media networks. So it's likely that in a

couple weeks or so she might come back and testify before that committee behind closed doors.

ALCINDOR: And Cecilia, it's a question that my producers and I were thinking through:

What makes Facebook so different than other social media platforms, when you think about

Twitter or other things? What sets them apart?

What possibly makes them worse than these other platforms?

KANG: Well, I think one very distinguishing factor is that the company is basically

Mark's company. It's Mark Zuckerberg's company. He owns 55 percent of voting shares.

He makes the decisions. And Frances Haugen, the whistleblower, said the buck stops with Mark.

And I think that's absolutely true in my reporting.

The other thing that's really different, in relation to the research that you mentioned,

Yamiche, on teens and Instagram and the harms, the toxic harms and sort of the negativity

that a lot of teenagers feel from using the platform: One really interesting finding from

that research, Facebook's own internal research, is that Facebook believes that Instagram

is different and, in some ways, worse than TikTok and Snapchat, and just in a very small,

interesting way. Instagram has these sort of beauty filters and there's also this culture

of trying to curate this vision of who you are in your life. There's a lot of focus on

the full body. TikTok - and, by the way, TikTok and Snapchat definitely have their

problems; they're not completely, you know, immune to problems.

But TikTok is much more of a sort of performance-based fun app, is what a lot of the

teenagers who took the surveys for Facebook said; they feel like it's a little bit more

humorous, like, sort of like, just different kinds of challenges, dances, a lot more

lighthearted. Snapchat, interestingly, has these face filters that are really sort of goofy,

cartoon-animated filters that are just supposed to also be fun, and the focus is on the face.

And so on the kind of body-image issues that Instagram users reported to Facebook in its own

research: one out of three teenagers said that because of

using Instagram they feel worse about their body image. Fourteen percent of teens in the U.K.

said that they had suicidal ideations and they could trace it back to Instagram use.

I mean, those are the kinds of feelings and anxieties and really, really harmful kind of

responses that didn't exist with these other apps, and I thought that was a really

important distinguishing factor.

The other, last thing I would say is, Twitter is, very interestingly, more willing to

experiment with ways to try to fight misinformation and also to try to protect its users,

and one thing that they do is - I'm sure we've all gone through this - when you try to

retweet a story that you haven't read and actually opened

up, you get a popup box that says: Are you sure you really want to retweet this?

Looks like you haven't read it.

Facebook doesn't have that kind of feature, and that feature is known as friction.

It provides friction between you and sharing - in other

words, between you and amplifying more of that content - and Facebook just doesn't do that.

So they're not making the same kinds of decisions as some of their competitors are that

arguably could be good solutions to at least start solving this misinformation problem.

ALCINDOR: It's such a comprehensive answer, and one that I think so many people really

need to hear about the differences between Facebook and other social media platforms.

Eamon, I'm going to come to you for the last word here: Is this all about money?

Does this all, at the end of the day, end up about profits, and where do we go from here?

JAVERS: Yeah, look, Facebook has grown so fast over such a relatively short period of

time, you know, and you think of the past 15 years or so. The question for Facebook

is, how can they keep growing? I mean, the law of large numbers suggests once you have

almost everybody on planet Earth who's connected to the internet as part of your

service, how can you continue to grow, right? And so one of the things that they're

trying to do is keep all those people on the service for even longer amounts of time.

That's what engagement is. And the idea is that all these angry things that we are

seeing on Facebook are enticing people to stay on the service for a longer period of time.

That represents more ad dollars, more revenue for Facebook.

So the more engagement they get, the more profit they make.

And in a world where it's going to be very hard for them to find new customers because

they already have just about everybody on the planet, well, engagement is the answer.

And so if they dialed back on some of these things and dialed back on some of the angry

content, they're also going to be dialing back on profits, and that's a real problem for a public company.

ALCINDOR: Yeah, yeah. Well, we'll have to leave it there tonight. Thank you so much

to Cecilia, Nancy, Eamon, and Marianna for joining us and sharing your reporting.

And make sure to sign up for the Washington Week newsletter on our website.

We will give you a look at all things Washington. Thank you so much for joining.

I'm Yamiche Alcindor. Good night.