WEBVTT

00:02.533 --> 00:04.433 align:left position:10%,start line:71% size:80%
HARI SREENIVASAN: Facebook founder Mark Zuckerberg broke his silence about what he acknowledged

00:04.433 --> 00:07.100 align:left position:20%,start line:83% size:70%
was a breach of trust with the public.

00:07.100 --> 00:12.100 align:left position:10%,start line:71% size:80%
It came after news investigations found Cambridge Analytica, a firm used by the Trump campaign,

00:13.533 --> 00:16.700 align:left position:10%,start line:83% size:80%
improperly obtained data on 50 million Facebook users.

00:16.700 --> 00:21.366 align:left position:10%,start line:77% size:80%
In his statement on Facebook, Zuckerberg wrote: "We have a responsibility to protect your

00:21.366 --> 00:24.866 align:left position:10%,start line:83% size:80%
data, and if we can't, then we don't deserve to serve you."

00:24.866 --> 00:29.433 align:left position:10%,start line:77% size:80%
He said steps had been taken to prevent these problems before, but he said: "We also made

00:29.433 --> 00:30.433 align:left position:30%,start line:89% size:60%
mistakes.

00:30.433 --> 00:32.466 align:left position:20%,start line:89% size:70%
There's more to do."

00:32.466 --> 00:36.433 align:left position:10%,start line:77% size:80%
Those changes will include auditing apps that use Facebook data and investigating apps that

00:38.466 --> 00:41.566 align:left position:10%,start line:77% size:80%
used large amounts of data before the company changed its policies in 2014.

00:41.566 --> 00:45.133 align:left position:10%,start line:83% size:80%
It will also try to restrict some access to future data.

00:45.133 --> 00:48.766 align:left position:10%,start line:83% size:80%
Tim Wu of Columbia Law School joins us for reaction now.

00:48.766 --> 00:52.200 align:left position:10%,start line:83% size:80%
He writes extensively about the Web, privacy, and data collection.
00:52.200 --> 00:55.000 align:left position:20%,start line:83% size:70%
He's the author of "The Attention Merchants."

00:55.000 --> 00:56.266 align:left position:20%,start line:89% size:70%
Thanks for joining us.

00:56.266 --> 00:57.533 align:left position:20%,start line:83% size:70%
First, your reaction to the statement.

00:57.533 --> 00:59.666 align:left position:20%,start line:83% size:70%
TIM WU, Columbia Law School: Sure.

00:59.666 --> 01:04.000 align:left position:10%,start line:77% size:80%
You know, I think it was good that they took responsibility, but I still think that, you

01:06.033 --> 01:09.900 align:left position:10%,start line:77% size:80%
know, they're not coming fully clean about what happened and what they're going to do here.

01:11.900 --> 01:14.266 align:left position:10%,start line:77% size:80%
One thing that's very notable is, they agreed to do all this stuff back in 2011, and it

01:14.266 --> 01:16.166 align:left position:10%,start line:83% size:80%
looks like they didn't live up to the promises then.

01:16.166 --> 01:18.266 align:left position:10%,start line:83% size:80%
So the question is, what makes us believe them now?

01:18.266 --> 01:21.933 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: And this was when they were in -- under a consent decree by the Federal

01:21.933 --> 01:23.100 align:left position:20%,start line:89% size:70%
Trade Commission.

01:23.100 --> 01:25.133 align:left position:20%,start line:83% size:70%
TIM WU: Yes, that's exactly right.

01:25.133 --> 01:28.966 align:left position:10%,start line:77% size:80%
So, in 2011, the Federal Trade Commission -- I was working there at the time -- found

01:28.966 --> 01:33.966 align:left position:10%,start line:77% size:80%
that they had let the apps take all kinds of data from people and do whatever they like.
01:36.000 --> 01:39.266 align:left position:20%,start line:71% size:70%
And Facebook agreed, as you said, in the consent decree, that they'd no longer allow this to

01:39.266 --> 01:40.966 align:left position:40%,start line:89% size:50%
happen.

01:40.966 --> 01:43.033 align:left position:20%,start line:77% size:70%
Now it turns out it has happened, and it's happened repeatedly.

01:43.033 --> 01:47.400 align:left position:10%,start line:77% size:80%
So I'm just not as reassured as you might think, given that they have already broken

01:47.400 --> 01:50.900 align:left position:20%,start line:77% size:70%
similar promises, that they will keep these promises in the future.

01:50.900 --> 01:54.666 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: All right, we have a piece of video from "Frontline," an upcoming film

01:54.666 --> 01:57.633 align:left position:10%,start line:83% size:80%
that's going to come out with one of the former employees.

01:57.633 --> 01:59.666 align:left position:20%,start line:83% size:70%
Let's take a listen to what he said.

01:59.666 --> 02:01.833 align:left position:10%,start line:77% size:80%
SANDY PARAKILAS, Former Facebook Platform Operations Manager: I ended up in an interesting

02:01.833 --> 02:06.233 align:left position:10%,start line:77% size:80%
situation where, because I had been the main person who was working on privacy issues with

02:08.866 --> 02:13.733 align:left position:10%,start line:77% size:80%
respect to the Facebook platform, which had many, many, many privacy issues -- it was

02:13.733 --> 02:18.733 align:left position:20%,start line:71% size:70%
a real hornet's nest of problems, because they were giving access to all this Facebook

02:19.833 --> 02:23.066 align:left position:20%,start line:83% size:70%
data to developers with very few controls.
02:23.066 --> 02:28.066 align:left position:10%,start line:77% size:80%
And because I had been one of the only people who was really focused on this issue, we ended

02:30.066 --> 02:33.166 align:left position:10%,start line:77% size:80%
up in a situation a few weeks before the IPO where the press had been calling out these

02:35.200 --> 02:37.466 align:left position:10%,start line:77% size:80%
issues over and over again, and they had been pointing out the ways in which Facebook had

02:37.466 --> 02:41.066 align:left position:30%,start line:83% size:60%
not been meeting its obligations.

02:41.066 --> 02:45.500 align:left position:10%,start line:77% size:80%
And I ended up in a meeting with a bunch of the most senior executives of the company.

02:45.500 --> 02:49.000 align:left position:10%,start line:77% size:80%
And they sort of went around the room and said, well, you know who's in charge of fixing

02:49.000 --> 02:54.000 align:left position:10%,start line:77% size:80%
this huge problem which has been called out in the press as one of the two biggest problems

02:55.466 --> 02:57.666 align:left position:10%,start line:83% size:80%
for the company going into the biggest tech IPO in history?

02:57.666 --> 02:59.166 align:left position:20%,start line:89% size:70%
And the answer was me.

02:59.166 --> 03:01.900 align:left position:10%,start line:83% size:80%
HARI SREENIVASAN: Tim, that was Sandy Parakilas.

03:01.900 --> 03:05.766 align:left position:10%,start line:83% size:80%
He was a platform operations manager there between 2011 and 2012.

03:05.766 --> 03:10.733 align:left position:10%,start line:77% size:80%
Obviously, the company is much bigger now, has far more resources, but, as you say, they

03:10.733 --> 03:12.800 align:left position:10%,start line:83% size:80%
have said before that they're going to clean up their act.
03:12.800 --> 03:17.633 align:left position:10%,start line:77% size:80%
TIM WU: Yes, I mean, that's the problem, is that they keep saying this, but, you know,

03:18.700 --> 03:20.166 align:left position:10%,start line:89% size:80%
there's this recidivism problem.

03:20.166 --> 03:22.433 align:left position:20%,start line:83% size:70%
They keep not really doing anything.

03:22.433 --> 03:27.433 align:left position:10%,start line:77% size:80%
And I think that the problem is that their model depends on accumulating data and giving

03:28.166 --> 03:30.266 align:left position:20%,start line:89% size:70%
it to advertisers.

03:30.266 --> 03:34.466 align:left position:10%,start line:77% size:80%
And anything that comes close to threatening that business model, they don't really seem

03:35.866 --> 03:37.866 align:left position:10%,start line:83% size:80%
that interested in doing something serious about it.

03:37.866 --> 03:42.400 align:left position:10%,start line:77% size:80%
You know, I understand that, but I think the time of "trust us" has got to be over.

03:44.366 --> 03:47.833 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: Are any of the changes that they're proposing today going to fundamentally

03:49.233 --> 03:50.433 align:left position:10%,start line:83% size:80%
change the business model you're talking about?

03:50.433 --> 03:52.366 align:left position:20%,start line:83% size:70%
TIM WU: No, I don't think so at all.

03:52.366 --> 03:57.233 align:left position:10%,start line:77% size:80%
You know, the -- fundamentally, Facebook is a surveillance machine.

03:57.233 --> 04:01.966 align:left position:10%,start line:71% size:80%
They get as much data as they can, and they promise advertisers that they're able to manipulate

04:01.966 --> 04:04.766 align:left position:10%,start line:89% size:80%
us, and that is at the core.
04:04.766 --> 04:08.033 align:left position:20%,start line:71% size:70%
And so, you know, they started this by saying, well, this wasn't really a data breach, this

04:08.033 --> 04:11.933 align:left position:10%,start line:77% size:80%
is our normal business model, which I think should tell you something, and then later

04:11.933 --> 04:14.633 align:left position:20%,start line:83% size:70%
said, well, it's not so great, and so forth.

04:14.633 --> 04:19.633 align:left position:10%,start line:77% size:80%
But they're really showing an unwillingness to do something more serious about this problem.

04:20.800 --> 04:22.833 align:left position:20%,start line:83% size:70%
And it keeps happening over and over again.

04:22.833 --> 04:24.200 align:left position:20%,start line:83% size:70%
This time, it's the app platform.

04:24.200 --> 04:26.200 align:left position:20%,start line:83% size:70%
Another time, it's Russians buying ads.

04:26.200 --> 04:30.433 align:left position:10%,start line:77% size:80%
There is just something not right here with this company and their unwillingness to come

04:30.433 --> 04:32.466 align:left position:40%,start line:89% size:50%
clean.

04:32.466 --> 04:36.100 align:left position:10%,start line:77% size:80%
And I think that the idea, well, just trust us because Zuckerberg wrote a message on Facebook,

04:36.100 --> 04:40.066 align:left position:10%,start line:77% size:80%
that everything is going to be fine is really something government investigators cannot

04:40.066 --> 04:42.033 align:left position:40%,start line:89% size:50%
trust.

04:42.033 --> 04:44.500 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: This is after the fact, but they're saying now that they're willing

04:44.500 --> 04:48.100 align:left position:10%,start line:71% size:80%
to have app developers be audited or require that kind of layer of verification or authentication.
04:50.133 --> 04:53.933 align:left position:10%,start line:77% size:80%
But in the case of Cambridge Analytica or the particular app developer, that person

04:55.133 --> 04:56.200 align:left position:20%,start line:83% size:70%
was supposed to certify that the data was gone.

04:56.200 --> 04:57.233 align:left position:30%,start line:89% size:60%
TIM WU: Yes.

04:57.233 --> 04:59.666 align:left position:20%,start line:89% size:70%
No, I will add to that.

04:59.666 --> 05:03.600 align:left position:20%,start line:71% size:70%
In the 2011 settlement, they agreed that they'd set up a verification system for apps to make

05:03.600 --> 05:06.333 align:left position:10%,start line:83% size:80%
sure apps never did the kinds of things they were doing before.

05:06.333 --> 05:07.966 align:left position:20%,start line:89% size:70%
That was in 2011.

05:07.966 --> 05:10.366 align:left position:10%,start line:83% size:80%
And now we're talking about stuff happening afterwards.

05:10.366 --> 05:15.133 align:left position:10%,start line:71% size:80%
And so whatever verification systems are going on, I guess they're like, well, it's something

05:15.133 --> 05:19.566 align:left position:20%,start line:77% size:70%
like whatever -- they're accepting promises from the app developers.

05:19.566 --> 05:22.133 align:left position:20%,start line:83% size:70%
They're not really taking measures.

05:22.133 --> 05:26.533 align:left position:10%,start line:71% size:80%
And once again, I think the concern in Facebook's heart is that, at some point, this will hurt

05:28.433 --> 05:30.966 align:left position:10%,start line:77% size:80%
their advertising revenue and the promises they have made investors.

05:30.966 --> 05:33.033 align:left position:20%,start line:83% size:70%
And so they're unwilling to take serious steps.
05:33.033 --> 05:37.133 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: So, Tim, at scale, what can actually be done, if we sort of abstract

05:37.133 --> 05:41.900 align:left position:10%,start line:77% size:80%
larger to Facebook, to Google, to Twitter, a lot of the tech platforms that have so much

05:41.900 --> 05:43.533 align:left position:20%,start line:89% size:70%
information about us?

05:43.533 --> 05:46.366 align:left position:20%,start line:83% size:70%
TIM WU: You know, it is a great question.

05:46.366 --> 05:51.366 align:left position:10%,start line:71% size:80%
And I think the fundamental problem is, they're all dependent on this pure advertising model,

05:53.366 --> 05:57.166 align:left position:10%,start line:77% size:80%
you know, nothing but trying to get as much data out of us and sell as much as they can

05:58.466 --> 06:00.133 align:left position:10%,start line:83% size:80%
of our time and attention to other people.

06:00.133 --> 06:02.600 align:left position:20%,start line:83% size:70%
And that just leads in very dark directions.

06:02.600 --> 06:07.600 align:left position:10%,start line:71% size:80%
I think we need to start patronizing subscription-based services, and they need to start rethinking

06:09.633 --> 06:13.500 align:left position:10%,start line:77% size:80%
these business models, because they have really reached an intolerable level for American

06:14.300 --> 06:16.233 align:left position:40%,start line:89% size:50%
society.

06:16.233 --> 06:18.300 align:left position:10%,start line:77% size:80%
And it's starting to threaten American democracy and other values we hold dear.

06:18.300 --> 06:22.333 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: This is also prompting government to take a look and say perhaps we need to

06:22.333 --> 06:25.033 align:left position:20%,start line:83% size:70%
take a more active role in regulating the space.
06:25.033 --> 06:29.133 align:left position:10%,start line:77% size:80%
Does government even have the capacity and the tools to try to figure out how to monitor

06:29.133 --> 06:32.633 align:left position:20%,start line:77% size:70%
or set up the rules of the road on how these companies can operate?

06:32.633 --> 06:37.166 align:left position:10%,start line:71% size:80%
TIM WU: I mean, we thought we did at the FTC when we put in that consent decree, but obviously

06:37.166 --> 06:38.500 align:left position:10%,start line:89% size:80%
it didn't really do anything.

06:38.500 --> 06:41.366 align:left position:20%,start line:83% size:70%
So, yes, I think there's a serious problem here.

06:41.366 --> 06:46.366 align:left position:10%,start line:77% size:80%
And I think part of the problem is, we haven't wanted, like Europe, to sort of get serious

06:48.333 --> 06:51.066 align:left position:10%,start line:77% size:80%
because we're worried about hurting these businesses, which are, after all, American

06:51.066 --> 06:53.133 align:left position:30%,start line:89% size:60%
darlings.

06:53.133 --> 06:57.066 align:left position:10%,start line:77% size:80%
But, you know, when the costs become this serious, where it starts to be about the viability

06:59.533 --> 07:03.100 align:left position:10%,start line:71% size:80%
of our republic and about, you know, the manipulation of people, I think that we need to take a

07:05.833 --> 07:09.100 align:left position:10%,start line:77% size:80%
much more serious look and understand and, for example, look at what the Europeans are

07:09.100 --> 07:10.766 align:left position:20%,start line:83% size:70%
doing and see if there's something to learn.

07:10.766 --> 07:12.266 align:left position:20%,start line:89% size:70%
HARI SREENIVASAN: Yes.

07:12.266 --> 07:13.433 align:left position:10%,start line:83% size:80%
All right, Tim Wu of Columbia Law School, thanks so much.
07:13.433 --> 07:14.666 align:left position:30%,start line:89% size:60%
TIM WU: Yes.

07:14.666 --> 07:16.666 align:left position:30%,start line:89% size:60%
It's a pleasure.

07:16.666 --> 07:21.566 align:left position:10%,start line:77% size:80%
HARI SREENIVASAN: Online, we discuss what Facebook knows about you and how you can adjust

07:22.300 --> 07:23.500 align:left position:20%,start line:89% size:70%
your privacy settings.

07:23.500 --> 07:26.700 align:left position:10%,start line:89% size:80%
That's at Facebook.com/NewsHour.

07:26.700 --> 07:31.700 align:left position:10%,start line:77% size:80%
And you can watch more of "Frontline"'s Facebook insider story at PBS.org/Frontline.