HARI SREENIVASAN: Facebook founder Mark Zuckerberg broke his silence about what he acknowledged was a breach of trust with the public. It came after news investigations found Cambridge Analytica, a firm used by the Trump campaign, improperly obtained data on 50 million Facebook users.

In his statement on Facebook, Zuckerberg wrote: "We have a responsibility to protect your data, and if we can't, then we don't deserve to serve you." He said steps had been taken to prevent these problems before, but he said: "We also made mistakes. There's more to do."

Those changes will include auditing apps that use Facebook data and investigating apps that used large amounts of data before the company changed its policies in 2014. It will also try to restrict some access to future data.

Tim Wu of Columbia Law School joins us for reaction now. He writes extensively about the Web, privacy, and data collection. He's the author of "The Attention Merchants."

Thanks for joining us. First, your reaction to the statement.

TIM WU, Columbia Law School: Sure. You know, I think it was good that they took responsibility, but I still think that, you know, they're not coming fully clean about what happened and what they're going to do here.

One thing that's very notable is, they agreed to do all this stuff back in 2011, and it looks like they didn't live up to the promises then. So the question is, what makes us believe them now?

HARI SREENIVASAN: And this was when they were under a consent decree by the Federal Trade Commission.

TIM WU: Yes, that's exactly right. So, in 2011, the Federal Trade Commission -- I was working there at the time -- found that they had let the apps take all kinds of data from people and do whatever they like. And Facebook agreed, as you said, in the consent decree, that they'd no longer allow this to happen.

Now it turns out it has happened, and it's happened repeatedly. So I'm just not as reassured as you might think, given that they have already broken similar promises, that they will keep these promises in the future.

HARI SREENIVASAN: All right, we have a piece of video from "Frontline," an upcoming film that's going to come out, with one of the former employees. Let's take a listen to what he said.
SANDY PARAKILAS, Former Facebook Platform Operations Manager: I ended up in an interesting situation where, because I had been the main person who was working on privacy issues with respect to the Facebook platform, which had many, many, many privacy issues -- it was a real hornet's nest of problems, because they were giving access to all this Facebook data to developers with very few controls.

And because I had been one of the only people who was really focused on this issue, we ended up in a situation a few weeks before the IPO where the press had been calling out these issues over and over again, and they had been pointing out the ways in which Facebook had not been meeting its obligations.

And I ended up in a meeting with a bunch of the most senior executives of the company. And they sort of went around the room and said, well, you know who's in charge of fixing this huge problem which has been called out in the press as one of the two biggest problems for the company going into the biggest tech IPO in history? And the answer was me.

HARI SREENIVASAN: Tim, that was Sandy Parakilas. He was a platform operations manager there between 2011 and 2012. Obviously, the company is much bigger now and has far more resources, but, as you say, they have said before that they're going to clean up their act.

TIM WU: Yes, I mean, that's the problem, is that they keep saying this, but, you know, there's this recidivism problem. They keep not really doing anything.

And I think that the problem is that their model depends on accumulating data and giving it to advertisers. And anything that comes close to threatening that business model, they don't really seem that interested in doing something serious about it. You know, I understand that, but I think the time of "trust us" has got to be over.

HARI SREENIVASAN: Are any of the changes that they're proposing today going to fundamentally change the business model you're talking about?

TIM WU: No, I don't think so at all. You know, fundamentally, Facebook is a surveillance machine. They get as much data as they can, and they promise advertisers that they're able to manipulate us, and that is at the core.

And so, you know, they started this by saying, well, this wasn't really a data breach, this is our normal business model, which I think should tell you something, and then later said, well, it's not so great, and so forth.
But they're really showing an unwillingness to do something more serious about this problem. And it keeps happening over and over again. This time, it's the app platform. Another time, it's Russians buying ads. There is just something not right here with this company and their unwillingness to come clean.

And I think the idea that, just because Zuckerberg wrote a message on Facebook, everything is going to be fine is really something government investigators cannot trust.

HARI SREENIVASAN: This is after the fact, but they're saying now that they're willing to have app developers be audited, or to require that kind of layer of verification or authentication. But in the case of Cambridge Analytica, or the particular app developer, that person was supposed to certify that the data was gone.

TIM WU: Yes. No, I will add to that. In the 2011 settlement, they agreed that they'd set up a verification system for apps to make sure apps never did the kinds of things they were doing before. That was in 2011. And now we're talking about stuff happening afterwards.

And so whatever verification systems are going on, I guess they're like, well, it's something like whatever -- they're accepting promises from the app developers. They're not really taking measures.

And once again, I think the concern in Facebook's heart is that, at some point, this will hurt their advertising revenue and the promises they have made to investors. And so they're unwilling to take serious steps.

HARI SREENIVASAN: So, Tim, at scale, what can actually be done, if we sort of abstract larger to Facebook, to Google, to Twitter, a lot of the tech platforms that have so much information about us?

TIM WU: You know, it is a great question. And I think the fundamental problem is, they're all dependent on this pure advertising model, you know, nothing but trying to get as much data out of us and sell as much as they can of our time and attention to other people. And that just leads in very dark directions.

I think we need to start patronizing subscription-based services, and that they need to start rethinking these business models, because they have really reached an intolerable level for American society. And it's starting to threaten American democracy and other values we hold dear.
HARI SREENIVASAN: This is also prompting government to take a look and say, perhaps we need to take a more active role in regulating the space. Does government even have the capacity and the tools to try to figure out how to monitor or set up the rules of the road on how these companies can operate?

TIM WU: I mean, we thought we did at the FTC when we put in that consent decree, but obviously it didn't really do anything. So, yes, I think there's a serious problem here.

And I think part of the problem is, we haven't wanted, like Europe, to sort of get serious, because we're worried about hurting these businesses, which are, after all, American darlings.

But, you know, when the costs become this serious, where it starts to be about the viability of our republic and about, you know, the manipulation of people, I think that we need to take a much more serious look and understand and, for example, look at what the Europeans are doing and see if there's something to learn.

HARI SREENIVASAN: Yes. All right, Tim Wu of Columbia Law School, thanks so much.

TIM WU: Yes. It's a pleasure.

HARI SREENIVASAN: Online, we discuss what Facebook knows about you and how you can adjust your privacy settings. That's at Facebook.com/NewsHour. And you can watch more of "Frontline"'s Facebook insider story at PBS.org/Frontline.