JUDY WOODRUFF: A Senate committee is widening its investigation into the impact social media platforms have on children, teens and young adults, with more apps facing congressional scrutiny. William Brangham has our coverage, beginning with this report. And a warning: This story contains sensitive subject matter, including discussion of suicide.

SEN. AMY KLOBUCHAR (D-MN): I don't think parents are going to stand by while our kids and democracy become collateral damage to a profit game.

WILLIAM BRANGHAM: On Capitol Hill, executives from YouTube, Snapchat, and TikTok were grilled by lawmakers on what these wildly popular platforms are doing to protect children online, and exactly what kinds of material kids are able to access.

SEN. MARSHA BLACKBURN (R-TN): Kids as young as 9 have died doing viral challenges on TikTok.

WILLIAM BRANGHAM: Today marks the first time representatives from TikTok and Snapchat have appeared before Congress. Among the many issues lawmakers asked about: how to prevent dealers from selling counterfeit pills and illegal substances to young people.

SEN. AMY KLOBUCHAR: If a kid had just walked into, say, a pharmacy, he wouldn't be able to buy that or get that.

JENNIFER STOUT, Vice President of Global Public Policy, Snapchat: Senator, it's not just happening on our platform. It's happening on others.
So, therefore, we need to work collectively.

SEN. AMY KLOBUCHAR: I think there's other ways to do this, too, such as creating liability when this happens, so maybe that will make you work even faster, so we don't lose another kid.

WILLIAM BRANGHAM: For much of the hearing, lawmakers pushed the executives to further limit certain features available to kids, such as autoplay of videos, targeted ad content, and the like and dislike buttons, which can keep children online longer and potentially expose them to bullying.

Executives stressed they have systems in place to flag harmful content and illegal activity, and that efforts to combat misinformation have been expanded. The executives also pledged to share more data and research on how their platforms impact teens and young adults, but they often fell short of pledging their full support for a number of bills already introduced. And lawmakers continued their calls for more transparency.

SEN. JOHN THUNE (R-SD): What's your response to the Wall Street Journal article that describes in detail how TikTok's algorithm serves up sex and drug videos to minors?

MICHAEL BECKERMAN, Head of Public Policy For the Americas, TikTok: We disagree with that being an authentic experience that an actual user would have.

WILLIAM BRANGHAM: Another point of contention was how these platforms can ensure that children only see content that's appropriate for their age.
JENNIFER STOUT: The content that appears on Snapchat is appropriate for the age group of 13 and above.

SEN. MIKE LEE (R-UT): I beg to differ. I had my staff create a Snapchat account for a 13-year-old -- for a 15-year-old child. They were immediately bombarded with content that I can most politely describe as wildly inappropriate for a child, including recommendations for, among other things, an invite to play an online sexualized video game and articles about porn stars.

JENNIFER STOUT: Any online sexual video game should be age-gated to 18 and above, so I'm unclear why that content would've shown up.

WILLIAM BRANGHAM: Lawmakers sought clarity on how these companies police content that poses serious risks to users.

LESLIE MILLER, YouTube: We heavily invest in making sure that all of our users, but particularly kids on the platform, have a safe experience.

SEN. MARSHA BLACKBURN: I'm quoting from searches that we have done: Songs to slit your wrists by, vertical slit wrist. Do the self-harm and suicide videos violate YouTube's content guidelines?

LESLIE MILLER: Senator, I would certainly welcome following up with you on the video you may be referencing.

WILLIAM BRANGHAM: Legislators also wanted to know what data was being collected about children by these platforms.
MICHAEL BECKERMAN: TikTok actually collects less in many categories than many of our peers.

SEN. CYNTHIA LUMMIS (R-WY): Which of your competitors or other companies that you're aware of collect more information?

MICHAEL BECKERMAN: Facebook and Instagram, for example.

SEN. RICHARD BLUMENTHAL (D-CT): Being different from Facebook is not a defense. That bar is in the gutter.

WILLIAM BRANGHAM: While the companies tried to separate themselves from each other, lawmakers from both sides agreed more action is needed to ensure kids are safe online.

For more on how these platforms are affecting kids' mental health, we turn to Jean Twenge. She is a professor of psychology and the author of "iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy and Completely Unprepared for Adulthood."

Jean Twenge, great to have you back on the "NewsHour." So, as we heard today, a lot of concern was expressed on Capitol Hill about the potential for these platforms to be causing harm to young people. What do we know about the actual research as to whether or not these things do cause harm?

JEAN TWENGE, Author, "iGen": Yes, so, generally speaking, the more time a kid or a teen spends in front of a screen, the more likely they are to be depressed, anxious, to harm themselves. There's gradations to this.
Watching videos isn't as strongly linked to depression as, say, being on social media. But especially when kids and teens spend a lot of time online, it leaves less time for sleep, it leaves less time for interacting with people face to face, and it leaves less time for running around outside and exercising. And so, perhaps as a result, what we have seen is a huge increase in teen depression right at the time that these platforms became very popular.

WILLIAM BRANGHAM: So, do you feel that that -- is this causal or is this a correlation? I mean, do you feel confident that it's these platforms themselves or simply, as you're describing, sort of opportunity cost, that if you have got a screen in front of your face, you're not doing all these other things that we know are healthier for kids?

JEAN TWENGE: Yes. So, yes, this is complex. There's many, many issues at stake here. So, one is that time spent, that, especially when it gets excessive to four, five, six, seven, eight hours a day, then it crowds out time for things that are more beneficial. Then there's the question of content, which was discussed a lot today, that there's a lot of negative content that kids get exposed to on these platforms.

And as to whether it's causal, that's been a really hard question to answer. There have been some studies that have, say, had college students cut back on their social media use, and they found, after three weeks, the ones who cut back on their social media use were more mentally healthy than those who continued their usual high level of use. So that really points in the direction of at least some of the causation going from using these platforms, especially many hours a day, toward depression and other mental health issues.

WILLIAM BRANGHAM: So how does this body of research translate? If I'm a parent debating what to do with my child and devices and social media, what is the current state of best advice for parents?

JEAN TWENGE: Parents are in a tough position. This is one reason we need more policy and regulation in this area, because you have the fear that, if your kid doesn't use social media, then they will be left out, and, if they do use social media, then there's these mental health issues, negative content, and so on.

So I think there's two important things. First, put off having your kid get social media for as long as you can. Ten is too young. It's actually the law. You need to be 13. Even 13 is pretty young to start with social media. So, try to put it off to 15 or 16 or even later.
And then the second aspect is just to make sure that they're using social media and video platforms in moderation, that it's not taking over their life, crowding out time that could be spent on other things. If they want to spend an hour or two a day outside school on these platforms, not a big deal. It's not really linked to depression. It's when the use gets to four, five, six hours and beyond that it's much more concerning for mental health and other issues.

WILLIAM BRANGHAM: It really is a remarkable social experiment we're conducting right now. Jean Twenge of San Diego State University, always good to see you. Thanks for being here.

JEAN TWENGE: Thank you.