JUDY WOODRUFF: A Senate committee is widening its investigation into the impact social media platforms have on children, teens and young adults, with more apps facing congressional scrutiny.

William Brangham has our coverage,
beginning with this report.

And a warning: This story contains sensitive subject matter, including discussion of suicide.

SEN. AMY KLOBUCHAR (D-MN): I don't think parents are going to stand by while our kids and democracy become collateral damage to a profit game.

WILLIAM BRANGHAM: On Capitol Hill, executives from YouTube, Snapchat, and TikTok were grilled by lawmakers on what these wildly popular platforms are doing to protect children online, and exactly what kinds of material kids are able to access.

SEN. MARSHA BLACKBURN (R-TN): Kids as young as 9 have died doing viral challenges on TikTok.

WILLIAM BRANGHAM: Today marks the first time representatives from TikTok and Snapchat have appeared before Congress. Among the many issues lawmakers asked about: how to prevent dealers from selling counterfeit pills and illegal substances to young people.

SEN. AMY KLOBUCHAR: If a kid had just walked into, say, a pharmacy, he wouldn't be able to buy that or get that.

JENNIFER STOUT, Vice President of Global Public Policy, Snapchat: Senator, it's not just happening on our platform. It's happening on others. So, therefore, we need to work collectively --

SEN. AMY KLOBUCHAR: I think there are other ways to do this too, such as creating liability when this happens, so maybe that will make you work even faster, so we don't lose another kid.

WILLIAM BRANGHAM: For much of the hearing, lawmakers pushed the executives to further limit certain features available to kids, such as autoplay of videos, targeted ad content, and the like and dislike buttons, which can keep children online longer and potentially expose them to bullying.

Executives stressed they have systems in place to flag harmful content and illegal activity, and that efforts to combat misinformation have been expanded. The executives also pledged to share more data and research on how their platforms impact teens and young adults, but they often fell short of pledging their full support for a number of bills already introduced.

And lawmakers continued their
calls for more transparency.

SEN. JOHN THUNE (R-SD): What's your response to the Wall Street Journal article that describes in detail how TikTok's algorithm serves up sex and drug videos to minors?

MICHAEL BECKERMAN, Head of Public Policy for the Americas, TikTok: We disagree with that being an authentic experience that an actual user would have.

WILLIAM BRANGHAM: Another point of contention was how these platforms can ensure that children only see content that's appropriate for their age.

JENNIFER STOUT: The content that appears on Snapchat is appropriate for the age group of 13 and above.

SEN. MIKE LEE (R-UT): I beg to differ.

I had my staff create a Snapchat account for a 13-year-old -- for a 15-year-old child. They were immediately bombarded with content that I can most politely describe as wildly inappropriate for a child, including recommendations for, among other things, an invite to play an online sexualized video game and articles about porn stars.

JENNIFER STOUT: Any online sexual video game should be age-gated to 18 and above, so I'm unclear why that content would've shown up.

WILLIAM BRANGHAM: Lawmakers sought clarity on how these companies police content that poses serious risks to users.

LESLIE MILLER, YouTube: We heavily invest in making sure that all of our users, but particularly kids on the platform, have a safe experience.

SEN. MARSHA BLACKBURN: I'm quoting from searches that we have done: "songs to slit your wrists by," "vertical slit wrist." Do the self-harm and suicide videos violate YouTube's content guidelines?

LESLIE MILLER: Senator, I would certainly welcome following up with you on the video you may be referencing.

WILLIAM BRANGHAM: Legislators also wanted to know what data was being collected about children by these platforms.

MICHAEL BECKERMAN: TikTok
actually collects less in many
categories than many of our peers.

SEN. CYNTHIA LUMMIS (R-WY): Which of your competitors or other companies that you're aware of collect more information?

MICHAEL BECKERMAN: Facebook
and Instagram, for example.

SEN. RICHARD BLUMENTHAL (D-CT): Being different from Facebook is not a defense. That bar is in the gutter.

WILLIAM BRANGHAM: While the companies tried to separate themselves from each other, lawmakers from both sides agreed more action is needed to ensure kids are safe online.

For more on how these platforms are affecting kids' mental health, we turn to Jean Twenge. She is a professor of psychology and the author of "iGen: Why Today's Super-Connected Kids Are Growing Up Less Rebellious, More Tolerant, Less Happy and Completely Unprepared for Adulthood."

Jean Twenge, great to have
you back on the "NewsHour."

So, as we heard today, there was a lot of concern expressed on Capitol Hill about the potential for these platforms to be causing harm to young people. What do we know from the actual research about whether or not these things do cause harm?

JEAN TWENGE, Author, "iGen": Yes, so, generally speaking, the more time a kid or a teen spends in front of a screen, the more likely they are to be depressed, anxious, to harm themselves. There are gradations to this. Watching videos isn't as strongly linked to depression as, say, being on social media. But especially when kids and teens spend a lot of time online, it leaves less time for sleep, it leaves less time for interacting with people face to face, leaves less time for running around outside and exercising.

And so, perhaps as a result, what we have seen is a huge increase in teen depression right at the time that these platforms became very popular.

WILLIAM BRANGHAM: So, do you feel that that -- is this causal or is this a correlation? I mean, do you feel confident that it's these platforms themselves or simply, as you're describing, sort of opportunity cost, that if you have got a screen in front of your face, you're not doing all these other things that we know are healthier for kids?

JEAN TWENGE: Yes. So, yes, this is complex. There are many, many issues at stake here.

So, one is time spent: Especially when it gets excessive, to four, five, six, seven, eight hours a day, then it crowds out time for things that are more beneficial. Then there's the question of content, which was discussed a lot today, that there's a lot of negative content that kids get exposed to on these platforms.

And as to whether it's causal, that's been a really hard question to answer. There have been some studies that have, say, had college students cut back on their social media use, and they found, after three weeks, the ones who cut back on their social media use were more mentally healthy than those who continued their usual high level of use.

So that really points in the direction that at least some of the causation is going from using these platforms, especially many hours a day, toward depression and other mental health issues.

WILLIAM BRANGHAM: So how does this body of research translate? If I'm a parent debating what to do with my child and devices and social media, what is the current state of best advice for parents?

JEAN TWENGE: Parents
are in a tough position.

This is one reason we need more policy and regulation in this area, because you have the fear that, if your kid doesn't use social media, then they will be left out, and, if they do use social media, then there's these mental health issues, negative content, and so on.

So I think there are two important things. First, put off having your kid get social media for as long as you can. Ten is too young. It's actually the law. You need to be 13. Even 13 is pretty young to start with social media. So, try to put it off to 15 or 16 or even later.

And then the second aspect is just to make sure that they're using social media and video platforms in moderation, that it's not taking over their life, crowding out time that could be spent on other things.

If they want to spend an hour or two a day outside school on these platforms, not a big deal. It's not really linked to depression. It's when the use gets to four, five, six hours and beyond that it's much more concerning for mental health and other issues.

WILLIAM BRANGHAM: It really is a remarkable social experiment we're conducting right now.

Jean Twenge of San Diego State
University, always good to see
you. Thanks for being here.

JEAN TWENGE: Thank you.