WEBVTT

00:00.000 --> 00:03.600
JUDY WOODRUFF: A Senate committee is widening its investigation into the

00:03.600 --> 00:08.600
impact social media platforms have on children, teens and young adults,

00:08.960 --> 00:11.920
with more apps facing congressional scrutiny.

00:11.920 --> 00:15.520
William Brangham has our coverage, beginning with this report.

00:15.520 --> 00:20.520
And a warning: This story contains sensitive subject matter, including discussion of suicide.

00:21.600 --> 00:24.800
SEN. AMY KLOBUCHAR (D-MN): I don't think parents are going to

00:25.680 --> 00:30.680
stand by while our kids and democracy become collateral damage to a profit game.

00:34.160 --> 00:39.160
WILLIAM BRANGHAM: On Capitol Hill, executives from YouTube, Snapchat, and TikTok were grilled

00:39.920 --> 00:44.920
by lawmakers on what these wildly popular platforms are doing to protect children online,

00:46.320 --> 00:50.000
and exactly what kinds of material kids are able to access.

00:50.000 --> 00:55.000
SEN. MARSHA BLACKBURN (R-TN): Kids as young as 9 have died doing viral challenges on TikTok.

00:57.360 --> 01:01.920
WILLIAM BRANGHAM: Today marks the first time representatives from TikTok and Snapchat have

01:01.920 --> 01:06.320
appeared before Congress. Among the many issues lawmakers asked about,

01:06.320 --> 01:11.320
how to prevent dealers selling counterfeit pills and illegal substances to young people.

01:11.820 --> 01:14.560
SEN. AMY KLOBUCHAR: If a kid had just walked into,

01:14.560 --> 01:17.760
say, a pharmacy, he wouldn't be able to buy that or get that.

01:17.760 --> 01:19.280
JENNIFER STOUT, Vice President of Global Public Policy, Snapchat: Senator, it's not just happening

01:19.280 --> 01:23.360
on our platform. It's happening on others. So, therefore, we need to work collectively

01:23.360 --> 01:27.200
SEN. AMY KLOBUCHAR: I think there's other ways to do this too, as creating liability

01:27.200 --> 01:32.200
when this happens, so maybe that will make you work even faster, so we don't lose another kid.

01:32.480 --> 01:33.920
WILLIAM BRANGHAM: For much of the hearing,

01:33.920 --> 01:38.920
lawmakers pushed the executives to further limit certain features available to kids,

01:39.040 --> 01:44.040
such as autoplay of videos, targeted ad content, and the like and dislike buttons,

01:44.880 --> 01:49.520
which can keep children online longer, and potentially expose them to bullying.

01:50.240 --> 01:55.240
Executives stressed they have systems in place to flag harmful content and illegal activity,

01:56.000 --> 02:01.000
and that efforts to combat misinformation have been expanded. The executives also pledged to

02:01.520 --> 02:06.520
share more data and research on how their platforms impact teens and young adults,

02:06.640 --> 02:11.640
but they often fell short of pledging their full support for a number of bills already introduced.

02:12.640 --> 02:16.160
And lawmakers continued their calls for more transparency.

02:16.160 --> 02:19.040
SEN. JOHN THUNE (R-SD): What's your response to the Wall Street Journal article that

02:19.040 --> 02:23.760
describes in detail how TikTok's algorithm serves up sex and drug videos to minors?

02:23.760 --> 02:25.040
MICHAEL BECKERMAN, Head of Public Policy For the Americas, TikTok: We disagree with

02:25.040 --> 02:27.920
that being an authentic experience that an actual user would have.

02:27.920 --> 02:31.200
WILLIAM BRANGHAM: Another point of contention was how these platforms can

02:31.200 --> 02:35.360
ensure that children only see content that's appropriate for their age.

02:35.360 --> 02:38.480
JENNIFER STOUT: The content that appears on Snapchat

02:38.480 --> 02:42.400
is appropriate for the age group of 13 and above.

02:42.400 --> 02:43.760
SEN. MIKE LEE (R-UT): I beg to differ.
02:43.760 --> 02:48.760
I had my staff create a Snapchat account for a 13-year-old -- for a 15-year-old child.

02:49.840 --> 02:54.840
They were immediately bombarded with content that I can most politely describe as wildly

02:58.800 --> 03:03.800
inappropriate for a child, including recommendations for, among other things,

03:06.320 --> 03:11.320
an invite to play an online sexualized video game and articles about porn stars.

03:12.080 --> 03:16.080
JENNIFER STOUT: Any online sexual video game should be age-gated to 18 and above,

03:16.080 --> 03:18.720
so I'm unclear why that content would've shown up.

03:18.720 --> 03:22.240
WILLIAM BRANGHAM: Lawmakers sought clarity on how these companies police

03:22.240 --> 03:25.120
content that poses serious risks to users.

03:25.120 --> 03:30.120
LESLIE MILLER, YouTube: We heavily invest in making sure that all of our users,

03:30.560 --> 03:35.200
but particularly kids on the platform, have a safe experience.

03:35.200 --> 03:36.480
SEN. MARSHA BLACKBURN: I'm quoting from

03:37.200 --> 03:42.200
searches that we have done: Songs to slit your wrists by, vertical slit wrist.

03:43.360 --> 03:48.360
Do the self-harm and suicide videos violate YouTube's content guidelines?

03:49.760 --> 03:52.160
LESLIE MILLER: Senator, I would certainly welcome

03:52.160 --> 03:54.800
following up with you on the video you may be referencing.

03:54.800 --> 03:56.480
WILLIAM BRANGHAM: Legislators also wanted to

03:56.480 --> 04:00.640
know what data was being collected about children by these platforms.

04:00.640 --> 04:04.720
MICHAEL BECKERMAN: TikTok actually collects less in many categories than many of our peers.

04:04.720 --> 04:06.720
SEN. CYNTHIA LUMMIS (R-WY): Which of your competitors

04:06.720 --> 04:10.480
or other companies that you're aware of collect more information?

04:10.480 --> 04:12.240
MICHAEL BECKERMAN: Facebook and Instagram, for example.

04:12.240 --> 04:15.280
SEN. RICHARD BLUMENTHAL (D-CT): Being different from Facebook is not a defense.

04:16.000 --> 04:18.080
That bar is in the gutter.

04:18.080 --> 04:21.760
WILLIAM BRANGHAM: While the companies tried to separate themselves from each other,

04:21.760 --> 04:26.760
lawmakers from both sides agreed more action is needed to ensure kids are safe online.

04:29.200 --> 04:34.200
For more on how these platforms are affecting kids' mental health, we turn to Jean Twenge.

04:34.560 --> 04:39.560
She is a professor of psychology and the author of "iGen: Why Today's Super-Connected Kids Are

04:42.560 --> 04:47.560
Growing Up Less Rebellious, More Tolerant, Less Happy and Completely Unprepared for Adulthood."

04:49.920 --> 04:52.560
Jean Twenge, great to have you back on the "NewsHour."

04:52.560 --> 04:57.040
So, as we heard today, a lot of concern on Capitol Hill

04:57.040 --> 05:02.040
expressed about the potential for these platforms to be causing harm to young people.

05:03.200 --> 05:07.440
What do we know about the actual research as to whether or not these things do cause harm?

05:08.240 --> 05:12.400
JEAN TWENGE, Author, "iGen": Yes, so, generally speaking, the more time a kid or a teen

05:13.520 --> 05:18.520
spends in front of a screen, the more likely they are to be depressed, anxious, to harm themselves.

05:23.440 --> 05:28.440
There's gradations to this. Watching videos isn't as strongly linked to depression as,

05:28.800 --> 05:33.800
say, being on social media. But especially when kids and teens spend a lot of time

05:37.040 --> 05:42.000
online, it leaves less time for sleep, it leaves less time for interacting

05:42.000 --> 05:46.960
with people face to face, leaves less time for running around outside and exercising.

05:47.760 --> 05:51.040
And so, perhaps, as a result, what we have seen

05:51.760 --> 05:56.760
is a huge increase in teen depression right at the time that these platforms became very popular.
06:00.560 --> 06:05.560
WILLIAM BRANGHAM: So, do you feel that that -- is this causal or is this a correlation? I mean,

06:06.880 --> 06:11.440
do you feel confident that it's these platforms themselves or simply,

06:12.320 --> 06:15.840
as you're describing, sort of opportunity cost, that if you have got a screen in front

06:15.840 --> 06:19.840
of your face, you're not doing all these other things that we know are healthier for kids?

06:21.200 --> 06:26.200
JEAN TWENGE: Yes. So, yes, this is complex. There's many, many issues at stake here.

06:26.240 --> 06:31.240
So, one is that time spent, that, especially when it gets excessive to

06:32.000 --> 06:37.000
four, five, six, seven, eight hours a day, then it crowds out time for things that are more

06:37.360 --> 06:42.360
beneficial. Then there's the question of content, which was discussed a lot today,

06:42.560 --> 06:47.560
that there's a lot of negative content that kids get exposed to on these platforms.

06:48.560 --> 06:53.560
And as to whether it's causal, that's been a really hard question to answer. There have

06:53.760 --> 06:58.760
been some studies that have, say, had college students cut back on their social media use,

07:00.720 --> 07:04.720
and they found, after three weeks, the ones who cut back on their social media

07:04.720 --> 07:09.720
use were more mentally healthy than those who continued their usual high level of use.

07:10.800 --> 07:15.800
So that really points in the direction of at least some of that causation is going from using these

07:16.880 --> 07:21.680
platforms, especially many hours a day, toward depression and other mental health issues.

07:21.680 --> 07:23.120
WILLIAM BRANGHAM:

07:23.120 --> 07:28.120
So how does this body of research translate? If I'm a parent debating what to do with my child

07:29.280 --> 07:34.040
and devices and social media, what is the current state of best advice for parents?

07:34.040 --> 07:36.560
JEAN TWENGE: Parents are in a tough position.
07:36.560 --> 07:41.560
This is one reason we need more policy and regulation in this area, because

07:42.400 --> 07:47.400
you have the fear that, if your kid doesn't use social media, then they will be left out, and

07:47.600 --> 07:52.600
if they do use social media, then there's these mental health issues, negative content, and so on.

07:52.960 --> 07:57.960
So I think there's two important things. First, put off having your kid get social media for

07:58.720 --> 08:03.720
as long as you can. Ten is too young. It's actually the law. You need to be 13. Even

08:04.880 --> 08:09.880
13 is pretty young to start with social media. So, try to put it off to 15 or 16 or even later.

08:11.920 --> 08:16.920
And then the second aspect is just to make sure that they're using

08:18.160 --> 08:21.920
social media and video platforms in moderation,

08:21.920 --> 08:26.720
that it's not taking over their life, crowding out time that could be spent on other things.

08:26.720 --> 08:30.720
If they want to spend an hour or two a day outside school on these platforms,

08:30.720 --> 08:35.280
not a big deal. It's not really linked to depression. It's when the use gets to four,

08:35.280 --> 08:40.280
five, six hours and beyond that it's much more concerning for mental health and other issues.

08:41.280 --> 08:46.280
WILLIAM BRANGHAM: Really is a remarkable social experiment we're conducting right now.

08:46.320 --> 08:50.880
Jean Twenge of San Diego State University, always good to see you. Thanks for being here.

08:50.880 --> 08:53.840
JEAN TWENGE: Thank you.