Senator Tom Cotton (02:57:45):

Okay. So what we have here, we have a company that’s a tool of the Chinese Communist Party that is poisoning the minds of America’s children, in some cases, driving them to suicide, and that at best, the Biden administration is taking a pass on, at worst, may be in collaboration with. Thank you, Mr. Chew.

Senator Dick Durbin (02:58:02):

Thank you, Senator Cotton. So we’re going to take a break now. We’re on the second roll call. Members can take advantage as they wish. The break will last about 10 minutes. Please do your best to return.

Senator Alex Padilla (03:17:29):

Thank you, Mr. Chair. Colleagues, as we reconvene, I’m proud once again to share that I am one of the few senators with younger children.

(03:17:44)
And I lead with that because as we are having this conversation today, it’s not lost on me that between my children, who are all now in the teen and preteen category, and their friends, I see this issue very up close and personal. And in that spirit, I want to take a second to just acknowledge and thank all the parents who are in the audience today, many of whom have shared their stories with our offices, and I credit them for finding strength through their suffering, through their struggle, and channeling that into the advocacy that is making a difference. I thank all of you. Now, I appreciate, again, personally the challenges that parents and caretakers, school personnel, and others face in helping our young people navigate this world of social media and technology in general.

(03:18:47)
Now, the services our children are growing up with provide them unrivaled access to information. This is beyond what previous generations have experienced, and that includes learning opportunities, socialization, and much, much more. But we also clearly have a lot of work to do to better protect our children from the predators and predatory behavior that these technologies have enabled.

(03:19:17)
And yes, Mr. Zuckerberg, that includes exacerbating the mental health crisis in America. Nearly all teens we know have access to smartphones and the internet and use the internet daily.

(03:19:35)
And while guardians do have primary responsibility for caring for our children, the old adage says it takes a village. And so society as a whole, including leaders in the tech industry, must prioritize the health and safety of our children.

(03:19:56)
Now, I’ll dive into my questions now and be specific, platform by platform, witness by witness, on the topic of some of the parental tools you have each made reference to.

(03:20:06)
Mr. Citron, how many minors are on Discord and how many of them have caretakers that have adopted your Family Center tool? And if you don’t have the numbers, just say that quickly and provide that to our office.

Mr. Jason Citron (03:20:18):

We can follow up with you on that.

Mr. Padilla (03:20:22):

How have you ensured that young people and their guardians are aware of the tools that you offer?

Mr. Jason Citron (03:20:27):

We make it very clear to teens on our platform what tools are available and our Teen Safety Assist is enabled by default.

Mr. Padilla (03:20:36):

What specifically do you do? What may be clear to you is not clear to the general public, so what do you do in your opinion to make it very clear?

Mr. Jason Citron (03:20:41):

So our Teen Safety Assist, which is a feature that helps teens keep themselves safe in addition to blocking and blurring images that may be sent to them, that is on by default for teen accounts and it cannot be turned off.

(03:20:53)
We market it to our teen users directly on our platform. We launched our Family Center, we created a promotional video and we put it directly on our product. So when every teen opened the app, in fact, every user opened the app, they got an alert like, “Hey, Discord has this and they want you to use it.”

Mr. Padilla (03:21:10):

Thank you. Look forward to the data that we’re requesting-

Mr. Jason Citron (03:21:12):

[Inaudible 03:21:13]

Mr. Padilla (03:21:13):

Mr. Zuckerberg, across all of Meta services from Instagram, Facebook, Messenger, and Horizon, how many minors use your applications, and of those minors, how many have a caretaker that has adopted the parental supervision tools that you offer?

Mark Zuckerberg (03:21:29):

Sorry, I can follow up with the specific stats on that.

Mr. Padilla (03:21:31):

Okay. It would be very helpful not just for us to know, but for you to know as a leader of your company. And same question, how are you ensuring that young people and their guardians are aware of the tools that you offer?

Mark Zuckerberg (03:21:44):

We run pretty extensive ad campaigns both on our platforms and outside. We work with creators and organizations like Girl Scouts to make sure that there’s broad awareness of the tools.

Mr. Padilla (03:21:57):

Okay. Mr. Spiegel, how many minors use Snapchat and of those minors, how many have caretakers that are registered with your Family Center?

Mr. Spiegel (03:22:04):

Senator, I believe in the United States, there are approximately 20 million teenage users of Snapchat. I believe approximately 200,000 parents use Family Center and about 400,000 teens have linked their account to their parents using Family Center.

Mr. Padilla (03:22:18):

So 200,000, 400,000 sounds like a big number, but it’s a small percentage of the minors using Snapchat. What are you doing to ensure that young people and their guardians are aware of the tools you offer?

Mr. Spiegel (03:22:29):

Senator, we create a banner for Family Center on the user’s profile, so that accounts we believe may be of the age that they could be parents can see the entry point into Family Center easily.

Mr. Padilla (03:22:40):

Okay. Mr. Shou, how many minors are on TikTok and how many of them have a caregiver that uses your family tools?

Mr. Shou Chew (03:22:46):

Senator, I’d need to get back to you on the specific numbers, but we were one of the first platforms to give what we call Family Pairing to parents. You go to Settings, you turn on the QR code, your teenager’s QR code and yours. You scan it, and what it allows you to do is you can set screen time limits, you can filter out some keywords, you can turn on a more restricted mode, and we are always talking to parents. I met a group of parents and teenagers and high school teachers last week to talk about what more we can provide in the Family Pairing mode.

Mr. Padilla (03:23:15):

Ms. Yaccarino, how many minors use X and are you planning to implement safety measures or guidance for caretakers like your peer companies have?

Ms. Linda Yaccarino (03:23:24):

Thank you, Senator. Less than 1% of all US users are between the ages of 13 and 17.

Mr. Padilla (03:23:32):

Less than 1% of how many?

Ms. Linda Yaccarino (03:23:34):

Of 90 million US users.

Mr. Padilla (03:23:37):

Okay, so still hundreds of thousands? Continue.

Ms. Linda Yaccarino (03:23:38):

Yes. Yes, and every single one is very important. Being a 14-month-old company, we have reprioritized child protection and safety measures, and we have just begun to talk about and discuss how we can enhance those with parental controls.

Mr. Padilla (03:23:56):

Let me continue with a follow-up question for Mr. Citron. In addition to keeping parents informed about the nature of various internet services, there’s a lot more we obviously need to do for today’s purposes.

(03:24:09)
While many companies offer a broad range of quote, unquote “user empowerment” tools, it’s helpful to understand whether young people even find these tools helpful.

(03:24:18)
So I appreciate you sharing your Teen Safety Assist tools and how you’re advertising them, but have you conducted any assessment of how these features are impacting minors’ use of your platform?

Mr. Jason Citron (03:24:32):

Our intention is to give teens tools and capabilities that they can use to keep themselves safe, and also so our teams can help keep teens safe. We launched Teen Safety Assist last year, and I do not have a study off the top of my head, but we’d be happy to follow up with you on that.

Mr. Padilla (03:24:47):

Okay. My time is up. I’ll have follow up questions for each of you either in the second round or through statements for the record on a similar assessment of the tools that you’ve proposed. Thank you, Mr. Chairman.

Mr. Chairman (03:24:58):

Thank you, Senator Padilla. Senator Kennedy.

Mr. Kennedy (03:25:03):

Thank you all for being here. Mr. Spiegel, I see you hiding down there. What does yada, yada, yada mean?

Mr. Spiegel (03:25:21):

I’m not familiar with the term, Senator.

Mr. Kennedy (03:25:24):

Very uncool. Can we agree that what you do, not what you say, is what you believe, and everything else is just cottage cheese?

Mr. Spiegel (03:25:40):

Yes, Senator.

Mr. Kennedy (03:25:42):

You agree with that? Speak up. Don’t be shy. I’ve listened to you today. I’ve heard a lot of yada, yada, yading, and I’ve heard you talk about the reforms you’ve made and I appreciate them. And I’ve heard you talk about the reforms you’re going to make, but I don’t think you’re going to solve the problem. I think Congress is going to have to help you.

(03:26:15)
I think the reforms you’re talking about to some extent are going to be like putting paint on rotten wood, and I’m not sure you’re going to support this legislation. I’m not.

(03:26:29)
The fact is that you and some of your internet colleagues who are not here are no longer… You’re not companies, you’re countries. You’re very, very powerful and you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, everything from privacy to child exploitation.

(03:27:03)
And in fact, we have a new definition of recession. A recession is when… We know we’re in a recession when Google has to lay off 25 members of Congress. That’s what we’re down to. We’re also down to this fact that your platforms are hurting children.

(03:27:27)
I’m not saying they’re not doing some good things, but they’re hurting children. And I know how to count votes, and if this bill comes to the floor of the United States Senate, it will pass.

(03:27:43)
What we’re going to have to do, and I say this with all the respect that I can muster, is convince my good friend, Senator Schumer, to go to Amazon, buy a spine online, and bring this bill to the Senate floor, and the House will then pass it. Now, that’s one person’s opinion. I may be wrong, but I doubt it.

(03:28:09)
Mr. Zuckerberg, let me ask you a couple of questions. Might wax a little philosophical here. I have to hand it to you. You have convinced over 2 billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their high school friends had for dinner Saturday night. That’s pretty much your business model, isn’t it?

Mark Zuckerberg (03:28:48):

It’s not how I would characterize it, and we give people the ability to connect with the people they care about and to engage with the topics that they care about.

Mr. Kennedy (03:28:58):

And you take this information, this abundance of personal information, and then you develop algorithms to punch people’s hot buttons and steer to them information that punches their hot buttons again and again and again to keep them coming back and to keep them staying longer. And as a result, your users see only one side of an issue, and so to some extent, your platform has become a killing field for the truth, hasn’t it?

Mark Zuckerberg (03:29:40):

I mean, Senator, I disagree with that characterization. We build ranking and recommendations because people have a lot of friends and a lot of interests and they want to make sure that they see the content that’s relevant to them.

(03:29:53)
We’re trying to make a product that’s useful to people and make our services as helpful as possible for people to connect with the people they care about and the interests they care about. That’s what we-

Mr. Kennedy (03:30:02):

But you don’t show them both sides. You don’t give them balanced information. You just keep punching their hot buttons, punching their hot buttons. You don’t show them balanced information so people can discern the truth for themselves, and you rev them up so much that so often your platform and others become just cesspools of snark where nobody learns anything, don’t they?

Mark Zuckerberg (03:30:29):

Well, Senator, I disagree with that. I think people can engage in the things that they’re interested in and learn quite a bit about those. We have done a handful of different experiments and things in the past around news and trying to show content from a diverse set of perspectives. I think that there’s more that needs to be explored there, but I don’t think that we can solve that by ourselves. One of the things that I saw-

Mr. Kennedy (03:30:54):

Do you think… I’m sorry to cut you off, Mr. President, but I’m going to run out of time. Do you think your users really understand what they’re giving to you, all their personal information and how you process it and how you monetize it? Do you think people really understand?

Mark Zuckerberg (03:31:14):

Senator, I think people understand the basic terms. I mean, I think that there’s-

Mr. Kennedy (03:31:20):

Let me put it-

Mark Zuckerberg (03:31:20):

I actually think that a lot of people overestimate how much information [inaudible 03:31:24]-

Mr. Kennedy (03:31:23):

… put it another way. It’s been a couple of years since we talked about this. Does your user agreement still suck?

Mark Zuckerberg (03:31:30):

I’m not sure how to answer that, Senator. I think basic-

Mr. Kennedy (03:31:33):

Can you still hide a dead body in all that legalese where nobody can find it?

Mark Zuckerberg (03:31:40):

Senator, I’m not quite sure what you’re referring to, but I think people get the basic deal of using these services. It’s a free service. You’re using it to connect with the people you care about.

(03:31:49)
If you share something with people, other people will be able to see your information. If you’re putting something out there to be shared publicly or with a private set of people, you’re inherently putting it out there, so I think people get that basic part of how the service works.

Mr. Kennedy (03:32:04):

But Mr. Zuckerberg, you’re in the foothills of creepy. You track people who aren’t even Facebook users. You track your own people, your own users who are your product even when they’re not on Facebook.

(03:32:24)
I’m going to land this plane pretty quickly, Mr. Chairman. I mean, it’s creepy. And I understand you make a lot of money doing it, but I just wonder if our technology is greater than our humanity. I mean, let me ask you this final question. Instagram is harmful to young people, isn’t it?

Mark Zuckerberg (03:32:52):

Senator, I disagree with that. That’s not what the research shows on balance. That doesn’t mean that individual people don’t have issues and that there aren’t things that we need to do to help provide the right tools for people, but across all of the research that we’ve done internally, I mean the survey that the Senator previously cited, there are 12 or 15 different categories of harm that we asked teens if they felt that Instagram made it worse or better. And across all of them, except for the one that Senator Hawley cited, more people said that using Instagram [inaudible 03:33:29]-

Mr. Kennedy (03:33:29):

I’ve got to land the plane.

Mark Zuckerberg (03:33:30):

… issues they face.

Mr. Kennedy (03:33:30):

Mr. Zuckerberg-

Mark Zuckerberg (03:33:31):

… either positive or-

Mr. Kennedy (03:33:33):

… we just have to agree to disagree. I’m not saying it’s intentional, but if you think that Instagram is not hurting millions of our young people, particularly young teens, particularly young women, you shouldn’t be driving. It is. Thanks.

Mr. Chairman (03:33:56):

Senator Butler.

Ms. Butler (03:33:59):

Thank you, Mr. Chair, and thank you to our panelists who’ve come to have an important conversation with us. Most importantly, I want to appreciate the families who have shown up to continue to be remarkable champions of your children and your loved ones for being here, in particular two California families that I was able to just talk to on the break, the families of Sammy Chapman from Los Angeles and Daniel Puerta from Santa Clarita.

(03:34:32)
They are here today and are doing some incredible work to not just protect the memory and legacy of their boys, but the work that they’re doing is going to protect my 9-year-old, and that is indeed why we are here.

There are a couple of questions that I want to ask some individuals. Let me start with a question for each of you. Mr. Citron, have you ever sat with a family and talked about their experience and what they need from your product? Yes or no?

Mr. Jason Citron (03:35:06):

Yes. I have spoken with parents about how we can build tools to help them.

Ms. Butler (03:35:10):

Mr. Spiegel, have you sat with families and young people to talk about your products and what they need from your product?

Mr. Spiegel (03:35:16):

Yes, Senator.

Ms. Butler (03:35:18):

Mr. Shou?

Mr. Shou Chew (03:35:19):

Yes. I just did it two weeks ago, for example, I did-

Ms. Butler (03:35:22):

I don’t want to know what you did for the hearing prep, Mr. Chew. I just wanted to know if-

Mr. Shou Chew (03:35:27):

No, it’s an example.

Ms. Butler (03:35:27):

… anything-

Mr. Shou Chew (03:35:28):

Senator, it’s an example.

Ms. Butler (03:35:29):

… in terms of designing the product that you are creating. Mr. Zuckerberg, have you sat with parents and young people to talk about how you design product for your consumers?

Mark Zuckerberg (03:35:47):

Yes. Over the years, I’ve had a lot of conversations with parents.

Ms. Butler (03:35:50):

You know that’s interesting, Mr. Zuckerberg, because we talked about this last night and you gave me a very different answer. I asked you this very question.

Mark Zuckerberg (03:36:01):

Well, I told you that I didn’t know what specific processes our company had for answer-

Ms. Butler (03:36:08):

No, Mr. Zuckerberg, you said to me that you had not.

Mark Zuckerberg (03:36:13):

I must have misspoke.

Ms. Butler (03:36:14):

I want to give you the room to misspeak, Mr. Zuckerberg, but I asked you this very question. I asked all of you this question and you told me a very different answer when we spoke, but I won’t belabor it. A number of you have talked about the… I’m sorry, X, Ms. Yaccarino, have you talked to parents directly, young people, about designing your product?

Ms. Linda Yaccarino (03:36:40):

As a new leader of X, the answer is yes. I’ve spoken to them about the behavioral patterns because less than 1% of our users are in that age group, but yes, I have spoken to them.

Ms. Butler (03:36:54):

Thank you, ma’am. Mr. Spiegel, there are a number of parents whose children have been able to access illegal drugs on your platform. What do you say to those parents?

Mr. Spiegel (03:37:10):

Well, Senator, we are devastated that we cannot-

Ms. Butler (03:37:13):

To the parents. What do you say to those parents, Mr. Spiegel?

Mr. Spiegel (03:37:17):

I’m so sorry that we have not been able to prevent these tragedies. We work very hard to block all search terms related to drugs from our platform. We proactively look for and detect drug related content. We remove it from our platform, preserve it as evidence, and then we refer it to law enforcement for action.

(03:37:35)
We’ve worked together with nonprofits and with families on education campaigns because the scale of the fentanyl epidemic is extraordinary. Over 100,000 people lost their lives last year, and we believe people need to know that one pill can kill. That campaign was viewed more than 260 million times on Snapchat. We also-

Ms. Butler (03:37:52):

Mr. Spiegel, there are two fathers in this room who lost their sons. They’re 16 years old. Their children were able to get those pills from Snapchat.

(03:38:07)
I know that there are statistics and I know that there are good efforts. None of those efforts are keeping our kids from getting access to those drugs on your platform. As a California company, all of you, I’ve talked with you about what it means to be a good neighbor and what California families and American families should be expecting from you. You owe them more than just a set of statistics, and I look forward to all of you showing up on all pieces of this legislation to keep our children safe.

(03:38:40)
Mr. Zuckerberg, I want to come back to you. I talked with you about being a parent to a young child who doesn’t have a phone and is not on social media at all, and one of the things that I am deeply concerned with as a parent to a young Black girl is the utilization of filters on your platform that would suggest to young girls utilizing your platform that they are not good enough as they are.

(03:39:24)
I want to ask more specifically and refer to some unredacted court documents that revealed that your own researchers concluded that these face filters that mimic plastic surgery indeed negatively impact youth mental health and wellbeing.

(03:39:47)
Why should we believe, why should we believe that you are going to do more to protect young women and young girls when you give them the tools to affirm the self-hate that is spewed across your platforms? Why is it that we should believe that you are committed to doing anything more to keep our children safe?

Mark Zuckerberg (03:40:12):

Sorry, there’s a lot to unpack there.

Ms. Butler (03:40:14):

There is a lot.

Mark Zuckerberg (03:40:15):

We give people tools to express themselves in different ways and people use face filters and different tools to make media and photos and videos that are fun or interesting across a lot of the different products that are [inaudible 03:40:30]-

Ms. Butler (03:40:30):

Plastic surgery filters are good tools to express creativity?

Mark Zuckerberg (03:40:36):

Senator, I’m not speaking to that specific-

Ms. Butler (03:40:38):

Skin lightening tools are tools to express creativity?

Mark Zuckerberg (03:40:41):

I’m not defending-

Ms. Butler (03:40:43):

This is the direct thing that I’m asking about.

Mark Zuckerberg (03:40:44):

I’m not defending any specific one of those. I think that the ability to filter and edit images is generally a useful tool for expression. For that specifically, I’m not familiar with the study that you’re referring to, but we did make it so that we’re not recommending this type of content to teens [inaudible 03:41:06]-

Ms. Butler (03:41:06):

I made no reference to a study. I referred to court documents that revealed your knowledge of the impact of these types of filters on young people, young girls in particular.

Mark Zuckerberg (03:41:20):

Senator, I disagree with that characterization. I think that there’s… There have been hypotheses-

Ms. Butler (03:41:22):

With court documents?

Mark Zuckerberg (03:41:25):

I haven’t seen any document that says… but-

Ms. Butler (03:41:26):

Okay, Mr. Zuckerberg, my time is up. I hope that you hear what is being offered to you and are prepared to step up and do better. I know this Senate committee is going to do our work to hold you to greater account. Thank you, Mr. Chair.

Mr. Chairman (03:41:44):

Senator Tillis.

Mr. Tillis (03:41:47):

Thank you Mr. Chair. Thank you all for being here. I don’t feel like I’m going to have an opportunity to ask a lot of questions, so I’m going to reserve the right to submit some for the record, but I have heard… We’ve had hearings like this before.

(03:42:03)
I’ve been in the Senate for nine years. I’ve heard hearings like this before. I’ve heard horrible stories about people who have died, committed suicide. I’ve been embarrassed. Every year, we have an annual flogging. Every year. And what materially has occurred over the last nine years?

(03:42:29)
Just yes or no question, do any of y’all participate in an industry consortium trying to make this fundamentally safe across platforms? Yes or no? Mr. Zuckerberg.

Mark Zuckerberg (03:42:38):

Yes.

Mr. Tillis (03:42:38):

[inaudible 03:42:39]

Ms. Linda Yaccarino (03:42:40):

There’s a variety of organizations that we work-

Mr. Tillis (03:42:41):

Do you participate?

Ms. Linda Yaccarino (03:42:43):

… which organizations-

Mr. Tillis (03:42:44):

I should say, does anyone here not participate in an industry consortium? I actually think it would be immoral for you all to consider it a strategic advantage to keep secret, or to keep private, something that would secure all these platforms to avoid this sort of problem. Do you all agree with that?

(03:43:02)
That anybody that would be saying, “You want ours because ours is the safest and these haven’t figured out the secret sauce,” that you as an industry realize this is an existential threat to you all if we don’t get it right. Right?

(03:43:12)
I mean, you’ve got to secure your platforms. You got to deal with this. Do you not have an inherent mandate to do this? Because it would seem to me if you don’t, you’re going to cease to exist.

(03:43:23)
I mean, we could regulate you out of business if we wanted to, and the reason I’m saying… It may sound like a criticism, it’s not a criticism. I think we have to understand that there should be an inherent motivation for you to get this right.

(03:43:37)
Our Congress will make a decision that could potentially put you out of business. Here’s the reason I have a concern with that though. I just went on the internet while I was listening intently to all the other members speaking, and I found a dozen different platforms outside the United States, 10 of which are in China, two of which are in Russia.

(03:43:59)
Their daily average active membership numbers in the billions. Well, people say you can’t get on China’s version of TikTok. It took me one quick search on my favorite search engine to find out exactly how I could get an account on this platform today, and so the other thing that we have to keep in mind…

(03:44:23)
I come from technology. Ladies and gentlemen, I could figure out how to influence your kid without them ever being on a social media platform. I can randomly send texts and get a bite and then find out an email address and get compromising information.

(03:44:41)
It is horrible to hear some of these stories and I’ve had these stories occur in my hometown down in North Carolina, but if we only come here and make a point today and don’t start focusing on making a difference, which requires people to stop shouting and start listening and start passing language here, the bad actors are just going to be off our shores.

(03:45:06)
I have another question for you all. How many people, roughly… If you don’t know the exact number it’s okay, roughly, how many people do you have looking 24 hours a day at these horrible images and filtering them out? Just go real quick with an answer down the line.

Mark Zuckerberg (03:45:20):

It’s most of the roughly 40,000 people who work on safety and-

Mr. Tillis (03:45:24):

And again?

Ms. Linda Yaccarino (03:45:25):

We have 2300 people all over the world.

Mr. Tillis (03:45:27):

Okay.

Mr. Shou Chew (03:45:27):

We have 40,000 trust and safety professionals around the world.

Mr. Jason Citron (03:45:33):

We have approximately 2000 people dedicated to trust and safety and content moderation.

Speaker X (03:45:39):

Our platform is much smaller than these folks’. We have hundreds of people looking at the content, and 15% of our workforce is focused on it.

Mr. Tillis (03:45:45):

These people have a horrible job. Many of them experience… They have to get counseling for all the things they see. We have evil people out there, and we’re not going to fix this by shouting past or talking past each other.

(03:46:00)
We’re going to fix this by every one of y’all being at the table and hopefully coming closer to what I heard one person say supporting a lot of the good bills, like one that I hope Senator Blackburn mentions when she gets a chance to talk.

(03:46:12)
But guys, if you’re not at the table and securing these platforms, you’re going to be on it. And the reason why I’m not okay with that is that if we ultimately destroy your ability to create value and drive you out of business, the evil people will find another way to get to these children. And I do have to admit, I don’t think my mom’s watching this one, but there is good.

(03:46:36)
We can’t look past the good that is occurring. My mom, who lives in Nashville, Tennessee, and I talked yesterday, and we talked about a Facebook post that she made a couple of days ago. We don’t let her talk to anybody else. That connects my 92-year-old mother with her grandchildren and great-grandchildren. That lets a kid who may feel awkward in school get into a group of people and relate to people. Let’s not throw out the good because we are altogether focused on rooting out the bad.

(03:47:09)
Now, I guarantee you, I could go through some of your governance documents and find a reason to flog every single one of you because you didn’t place the emphasis on it that I think you should. But at the end of the day, I find it hard to believe that any of you people started this business, some of you in your college dorm rooms for the purposes of creating the evil that is being perpetrated on your platforms, but I hope that every single waking hour, you’re doing everything you can to reduce it.

(03:47:38)
You’re not going to be able to eliminate it, and I hope that there are some enterprising young tech people out there today that are going to go to parents and say, “Ladies and gentlemen, your children have a deadly weapon. They have a potentially deadly weapon, whether it’s a phone or a tablet. You have to secure it. You can’t assume that they’re going to be honest and say that they’re 16 when they’re 12.”

(03:48:05)
We all have to recognize that we have a role to play, and you guys are at the tip of the spear, so I hope that we can get to a point to where we are moving these bills.

(03:48:17)
If you got a problem with them, state your problem. Let’s fix it. No is not an answer, and know that I want the United States to be the beacon for innovation, to be the beacon for safety, and to prevent people from using other options that have existed since the internet has existed to exploit people. And count me in as somebody that will try and help out. Thank you, Mr. Chair.

Mr. Chairman (03:48:43):

Thank you. Senator Tillis. Next is Senator Ossoff.

Mr. Ossoff (03:48:46):

Thank you, Mr. Chairman, and thank you to our witnesses today. Mr. Zuckerberg, I want to begin by just asking a simple question, which is, do you want kids to use your platform more or less?

Mark Zuckerberg (03:49:01):

Well, we don’t want people under the age of 13 using-

Mr. Ossoff (03:49:03):

Do you want teenagers 13 and up to use your platform more or less?

Mark Zuckerberg (03:49:09):

Well, we would like to build a product that is useful and that people want to use more.

Mr. Ossoff (03:49:13):

My time is going to be limited, so it’s just… Do you want them to use it more or less? Teenagers, 13 to 17 years old, do you want them using Meta products more or less?

Mark Zuckerberg (03:49:23):

I’d like them to be useful enough that they want to use them more.

Mr. Ossoff (03:49:26):

You want them to use it more. I think herein we have one of the fundamental challenges. In fact, you have a fiduciary obligation, do you not, to try to get kids to use your platform more?

Mark Zuckerberg (03:49:43):

It depends on how you define that. We obviously are a business, but-

Senator Ossoff (03:49:49):

I’m… Mr. Zuckerberg, our time, it’s not… It’s self-evident that you have a fiduciary obligation to get your users, including users under 18, to use and engage with your platform more rather than less. Correct?

Mark Zuckerberg (03:50:04):

Over the long term, but in the near term, we often take a lot of steps, including a change we made to show fewer videos on the platform, which reduced the amount of time spent by more than 50 million hours.

Senator Ossoff (03:50:16):

Okay, but if your shareholders ask you, “Mark…” I wouldn’t, Mr. Zuckerberg here, but your shareholders might be on a first name basis with you, “Mark, are you trying to get kids to use Meta products more or less?” You’d say more, right?

Mark Zuckerberg (03:50:29):

Well, I would say that over the long term, we’re trying to create the most value-

Senator Ossoff (03:50:33):

I mean, let’s look… So the 10-K you file with the SEC. A few things I want to note. Here are some quotes, and this is a filing that you signed, correct?

Mark Zuckerberg (03:50:40):

Yes.

Senator Ossoff (03:50:40):

Yeah. “Our financial performance has been and will continue to be significantly determined by our success in adding, retaining, and engaging active users.”

(03:50:49)
Here’s another quote: “If our users decrease their level of engagement with our products, our revenue, financial results, and business may be significantly harmed.”

(03:50:57)
Here’s another quote: “We believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours.”

(03:51:00)
The quote continues: “In the event that users increasingly engage with other products and services, we may experience a decline in use and engagement in key demographics or more broadly, in which case our business would likely be harmed.”

(03:51:16)
You have an obligation as the chief executive to encourage your team to get kids to use your platform more.

Mark Zuckerberg (03:51:29):

Senator, I think this-

Senator Ossoff (03:51:31):

Fundamental, is that not self-evident? You have a fiduciary-

Mark Zuckerberg (03:51:33):

Senator, I think it’s not.

Senator Ossoff (03:51:34):

… obligation to your shareholders to get kids to use your platform more.

Mark Zuckerberg (03:51:36):

I think the thing that’s not intuitive is that the direction is to make the products more useful so that people want to use them more. We don’t give the teams running the Instagram feed or the Facebook feed a goal to increase the amount of time that people spend.

Senator Ossoff (03:51:52):

Yeah, but you don’t dispute, and your 10-K makes clear, that you want your users engaging more and using the platform more. And I think this gets to the root of the challenge, because it is the overwhelming view of the public, certainly in my home state of Georgia, and we’ve had some discussions about the underlying science, that this platform is harmful for children.

(03:52:16)
I mean, you are familiar with, and this is not just about your platform, by the way, but social media in general, the 2023 report from the Surgeon General about the impact of social media on kids’ mental health, which cited evidence that kids who spend more than three hours a day on social media have double the risk of poor mental health outcomes, including depression and anxiety. You’re familiar with that Surgeon General report and the underlying study?

Mark Zuckerberg (03:52:37):

I read the report, yes.

Senator Ossoff (03:52:39):

Do you dispute it?

Mark Zuckerberg (03:52:40):

No, but I think it’s important to characterize it correctly. I think what he was flagging in the report is that there seems to be a correlation, and obviously the mental health issue is very important, so it’s something that needs to be studied further.

Senator Ossoff (03:52:52):

The thing is everyone knows there’s a correlation. Everyone knows that kids who spend a lot of time, too much time on your platforms are at risk and it’s not just the mental health issues. I mean, let me ask you another question. Is your platform safe for kids?

Mark Zuckerberg (03:53:08):

I believe it is, but there’s important-

Senator Ossoff (03:53:09):

Hold on a second.

Mark Zuckerberg (03:53:10):

… difference between correlation and causation.

Senator Ossoff (03:53:12):

Because we’re not going to be able to get anywhere. We want to work in a productive, open, honest, and collaborative way with the private sector to pass legislation that will protect Americans, that will protect American children above all, and that will allow businesses to thrive in this country. If we don’t start with an open, honest, candid, realistic assessment of the issues, we can’t do that.

(03:53:35)
The first point is you want kids to use the platform more. In fact, you have an obligation to, but you’re not willing to acknowledge that it’s a dangerous place for children. The internet is a dangerous place for children, not just your platform. Isn’t it? Isn’t the internet a dangerous place for children?

Mark Zuckerberg (03:53:50):

I think it can be, yeah. There are both great things that people can do and there are harms that we need to work on today.

Senator Ossoff (03:53:54):

Yeah, it’s a dangerous place for children. There are families here who have lost their children. There are families across the country whose children have engaged in self-harm, who have experienced low self-esteem, who have been sold deadly pills on the internet. The internet’s a dangerous place for children, and your platforms are dangerous places for children. Do you agree?

Mark Zuckerberg (03:54:13):

I think that there are harms that we need to work to mitigate. I mean, I’m not going to… I think overall the-

Senator Ossoff (03:54:17):

Why not? Why not? Why not just acknowledge it? Why do we have to do the very careful code?

Mark Zuckerberg (03:54:23):

I disagree with the characterization that you have.

Senator Ossoff (03:54:25):

Which characterization? That the internet’s a dangerous place for children?

Mark Zuckerberg (03:54:28):

I think you’re trying to characterize our products as inherently dangerous, and I think that-

Senator Ossoff (03:54:32):

Inherently or not, your products are places where children can experience harm. They can experience harm to their mental health, they can be sold drugs, they can be preyed upon by predators. They’re dangerous places, and yet you have an obligation to promote the use of these platforms by children. All I’m trying to suggest to you, Mr. Zuckerberg, and my time is running short, is that in order for you to succeed, you and your colleagues here, we have to acknowledge these basic truths.

(03:55:07)
We have to be able to come before the American people, the American public, the people in my state of Georgia and acknowledge the internet is dangerous, including your platforms. There are predators lurking. There are drugs being sold. There are harms to mental health that are taking a huge toll on kids’ quality of life.

(03:55:26)
And yet you have this incentive, not just you, Mr. Zuckerberg, all of you have an incentive to boost and maximize use, utilization, and engagement, and that is where public policy has to step in to make sure that these platforms are safe for kids, so kids are not dying, so kids are not overdosing, so kids are not cutting themselves or killing themselves because they’re spending all day scrolling instead of playing outside. And I appreciate all of you for your testimony. We will continue to engage as we develop this legislation. Thank you.