HARI SREENIVASAN: Facebook founder Mark Zuckerberg broke his silence about what he acknowledged was a breach of trust with the public. It came after news investigations found Cambridge Analytica, a firm used by the Trump campaign, improperly obtained data on 50 million Facebook users.

In his statement on Facebook, Zuckerberg wrote: "We have a responsibility to protect your data, and if we can't, then we don't deserve to serve you." He said steps had been taken to prevent these problems before, but added: "We also made mistakes. There's more to do."

Those changes will include auditing apps that use Facebook data and investigating apps that used large amounts of data before the company changed its policies in 2014. Facebook will also try to restrict some access to data in the future.

Tim Wu of Columbia Law School joins us now for reaction. He writes extensively about the Web, privacy, and data collection, and is the author of "The Attention Merchants." Thanks for joining us. First, your reaction to the statement.

TIM WU, Columbia Law School: Sure. You know, I think it was good that they took responsibility, but I still think they're not coming fully clean about what happened and what they're going to do here.

One thing that's very notable is, they agreed to do all this stuff back in 2011, and it looks like they didn't live up to the promises then. So the question is, what makes us believe them now?

HARI SREENIVASAN: And this was when they were under a consent decree by the Federal Trade Commission.

TIM WU: Yes, that's exactly right. So, in 2011, the Federal Trade Commission -- I was working there at the time -- found that they had let the apps take all kinds of data from people and do whatever they liked.

And Facebook agreed, as you said, in the consent decree, that they'd no longer allow this to happen. Now it turns out it has happened, and it's happened repeatedly. So I'm just not as reassured as you might think, given that they have already broken similar promises, that they will keep these promises in the future.

HARI SREENIVASAN: All right, we have a piece of video from "Frontline," an upcoming film featuring one of the company's former employees. Let's take a listen to what he said.

SANDY PARAKILAS, Former Facebook Platform Operations Manager: I ended up in an interesting situation where, because I had been the main person who was working on privacy issues with respect to the Facebook platform, which had many, many, many privacy issues -- it was a real hornet's nest of problems, because they were giving access to all this Facebook data to developers with very few controls.

And because I had been one of the only people who was really focused on this issue, we ended up in a situation a few weeks before the IPO where the press had been calling out these issues over and over again, and they had been pointing out the ways in which Facebook had not been meeting its obligations.

And I ended up in a meeting with a bunch of the most senior executives of the company. And they sort of went around the room and said, well, you know who's in charge of fixing this huge problem which has been called out in the press as one of the two biggest problems for the company going into the biggest tech IPO in history? And the answer was me.

HARI SREENIVASAN: Tim, that was Sandy Parakilas. He was a platform operations manager there between 2011 and 2012. Obviously, the company is much bigger now and has far more resources, but, as you say, they have said before that they're going to clean up their act.

TIM WU: Yes, I mean, that's the problem, is that they keep saying this, but, you know, there's this recidivism problem. They keep not really doing anything.

And I think that the problem is that their model depends on accumulating data and giving it to advertisers. And anything that comes close to threatening that business model, they don't really seem that interested in doing something serious about it. You know, I understand that, but I think the time of "trust us" has got to be over.

HARI SREENIVASAN: Are any of the changes that they're proposing today going to fundamentally change the business model you're talking about?

TIM WU: No, I don't think so at all. You know, fundamentally, Facebook is a surveillance machine. They get as much data as they can, and they promise advertisers that they're able to manipulate us, and that is at the core.

And so, you know, they started this by saying, well, this wasn't really a data breach, this is our normal business model, which I think should tell you something, and then later said, well, it's not so great, and so forth. But they're really showing an unwillingness to do something more serious about this problem.

And it keeps happening over and over again. This time, it's the app platform. Other times, it's Russians buying ads. There is just something not right here with this company and their unwillingness to come clean. And I think the idea that, just because Zuckerberg wrote a message on Facebook, everything is going to be fine is really something government investigators cannot trust.

HARI SREENIVASAN: This is after the fact, but they're saying now that they're willing to have app developers be audited, or to require that kind of layer of verification or authentication. But in the case of Cambridge Analytica, or the particular app developer, that person was supposed to certify that the data was gone.

TIM WU: Yes, and I will add to that. In the 2011 settlement, they agreed that they'd set up a verification system for apps to make sure apps never did the kinds of things they were doing before. That was in 2011. And now we're talking about stuff happening afterwards.

And so whatever verification systems are in place, it seems they amount to little more than accepting promises from the app developers. They're not really taking measures.

And once again, I think the concern in Facebook's heart is that, at some point, this will hurt their advertising revenue and the promises they have made to investors. And so they're unwilling to take serious steps.

HARI SREENIVASAN: So, Tim, at scale, what can actually be done, if we abstract out to Facebook, to Google, to Twitter, to a lot of the tech platforms that have so much information about us?

TIM WU: You know, it is a great question. And I think the fundamental problem is, they're all dependent on this pure advertising model -- nothing but trying to get as much data out of us and sell as much of our time and attention as they can to other people. And that just leads in very dark directions.

I think we need to start patronizing subscription-based services, and they need to start rethinking these business models, because they have really reached an intolerable level for American society. And it's starting to threaten American democracy and other values we hold dear.

HARI SREENIVASAN: This is also prompting government to take a look and say, perhaps we need to take a more active role in regulating the space. Does government even have the capacity and the tools to figure out how to monitor, or to set up the rules of the road for how these companies can operate?

TIM WU: I mean, we thought we did at the FTC when we put in that consent decree, but obviously it didn't really do anything. So, yes, I think there's a serious problem here.

And I think part of the problem is, we haven't wanted, like Europe, to get serious, because we're worried about hurting these businesses, which are, after all, American darlings. But, you know, when the costs become this serious, where it starts to be about the viability of our republic and about the manipulation of people, I think we need to take a much more serious look and, for example, see what the Europeans are doing and whether there's something to learn.

HARI SREENIVASAN: Yes. All right, Tim Wu of Columbia Law School, thanks so much.

TIM WU: Yes. It's a pleasure.

HARI SREENIVASAN: Online, we discuss what Facebook knows about you and how you can adjust your privacy settings. That's at Facebook.com/NewsHour. And you can watch more of "Frontline"'s Facebook insider story at PBS.org/Frontline.