Dick Durbin (01:13:24):

Thank you very much, Ms. Yaccarino. Now we’ll go into rounds of questions, seven minutes each for the members as well. I would like to make note of your testimony, Ms. Yaccarino. I believe you are the first social media company to publicly endorse the STOP CSAM Act.

Linda Yaccarino (01:13:40):

It is our honor, Chairman.

Dick Durbin (01:13:42):

That is progress, my friends. Thank you for doing that. I’m still going to be asking some probing questions, but let me get down to the bottom line here. I’m going to focus on my legislation on CSAM.

(01:13:56)
What it says is there is civil liability if you intentionally or knowingly host or store child sexual abuse materials or make child sexual abuse materials available. Secondly, if you intentionally or knowingly promote or aid and abet a violation of child sexual exploitation laws.

(01:14:21)
Is there anyone here who believes you should not be held civilly liable for that type of conduct? Mr. Citron.

Jason Citron (01:14:32):

Good morning, Chair. We very much believe that this content is disgusting and that there are many things about the STOP CSAM bill that I think are very encouraging and we very much support adding more resources for the cyber tip line and modernizing that along with giving more resources to NCMEC. And I’d be very open to having conversations with you and your team to talk through the details of the bill some more.

Dick Durbin (01:15:04):

I sure would like to do that, because if you intentionally or knowingly host or store CSAM, I think you ought to at least be civilly liable. I can’t imagine anyone who would disagree with that.

Jason Citron (01:15:15):

Yeah, it’s disgusting content.

Dick Durbin (01:15:17):

It certainly is. That’s why we need you supporting this legislation.

(01:15:21)
Mr. Spiegel, I want to tell you, I listened closely to your testimony here and it’s never been a secret that Snapchat is used to send sexually explicit images. In 2013, early in your company’s history, you admitted this in an interview. Do you remember that interview?

Evan Spiegel (01:15:45):

Senator, I don’t recall the specific interview.

Dick Durbin (01:15:50):

You said that when you were first trying to get people on the app, you would, quote, “Go up to the people and be like, ‘Hey, you should try this application. You can send disappearing photos.’ And they would say, ‘Oh, for sexting?'” Do you remember that interview?

Evan Spiegel (01:16:04):

Senator, when we first created the application, it was actually called Picaboo, and the idea was around disappearing images. The feedback we received from people using the app is that they were actually using it to communicate, so we changed the name of the application to Snapchat and we found that people were using it to talk visually.

Dick Durbin (01:16:20):

As early as 2017, law enforcement identified Snapchat as the pedophiles’ go-to sexual exploitation tool. The case of a twelve-year-old girl identified in court only as LW shows the danger. Over two and a half years, a predator sexually groomed her, sending her sexually explicit images and videos over Snapchat. The man admitted that he only used Snapchat with LW and not any other platforms because he, quote, “knew the chats would go away.”

(01:16:50)
Did you and everyone else at Snap really fail to see that the platform was the perfect tool for sexual predators?

Evan Spiegel (01:16:59):

Senator, that behavior is disgusting and reprehensible. We provide in-app reporting tools so that people who are being harassed, or with whom inappropriate sexual content has been shared, can report it. In the case of harassment or sexual content, we typically respond to those reports within 15 minutes so that we can provide help.

Dick Durbin (01:17:14):

When LW, the victim, sued Snapchat, her case was dismissed under Section 230 of the Communications Decency Act. Do you have any doubt that had Snap faced the prospect of civil liability for facilitating sexual exploitation, the company would’ve implemented even better safeguards?

Evan Spiegel (01:17:34):

Senator, we already work extensively to proactively detect this type of behavior. We make it very difficult for predators to find teens on Snapchat. There are no public friends lists, no public profile photos. When we recommend friends for teens, we make sure that they have several mutual friends in common before making that recommendation. We believe those safeguards are important to preventing predators from misusing our platform.

Dick Durbin (01:17:59):

Mr. Citron, according to Discord’s website, it takes a, quote, “proactive and automated approach to safety only on servers with more than 200 members. Smaller servers rely on server owners and community moderators to define and enforce behavior.”

(01:18:16)
So how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, trading of CSAM, or sextortion?

Jason Citron (01:18:28):

Chair, our goal is to get all of that content off of our platform and ideally prevent it from showing up in the first place, or prevent people from engaging in these kinds of horrific activities. We deploy a wide array of techniques that work across every surface on Discord. I mentioned we recently launched something called Teen Safety Assist, which works everywhere and is on by default for teen users. It acts like a buddy that lets them know if they’re in a situation or talking with someone that may be inappropriate, so they can report that to us and block that user.

Dick Durbin (01:19:05):

Mr. Citron, if that were working, we wouldn’t be here today.

Jason Citron (01:19:09):

Chair, this is an ongoing challenge for all of us. That is why we’re here today. But we do have 15% of our company focused on trust and safety, and this is one of our top issues. That’s more people than we have working on marketing and promoting the company. So we take these issues very seriously, but we know it’s an ongoing challenge, and I look forward to working with you and collaborating with our tech peers and the non-profits to improve our approach.

Dick Durbin (01:19:35):

I certainly hope so.

(01:19:37)
Mr. Chew, your business is one of the more popular ones among children. Can you explain to us what you are doing particularly, and whether you’ve seen any evidence of CSAM in your business?

Shou Chew (01:19:54):

Yes, Senator. We have a strong commitment to invest in trust and safety, and as I said in my opening statement, I intend to invest more than $2 billion in trust and safety this year alone. We have 40,000 safety professionals working on this topic. We have built a specialized child safety team to help us identify specialized issues, horrific issues like the material you have mentioned. If we identify any on our platform, and we do proactive detection, we will remove it and we will report it to NCMEC and other authorities.

Dick Durbin (01:20:30):

Why is TikTok allowing children to be exploited into performing commercialized sex acts?

Shou Chew (01:20:37):

Senator, I respectfully disagree with that characterization. Our live-streaming product is not for anyone below the age of 18. We have taken action to identify anyone who violates that and we remove them from using that service.

Dick Durbin (01:20:54):

At this point, I’m going to turn to my ranking member, Senator Graham.

Senator Graham (01:20:59):

Thank you. Mr. Citron, you said we need to start a discussion. To be honest with you, we’ve been having this discussion for a very long time. We need to get a result, not a discussion. Do you agree with that?

Jason Citron (01:21:16):

Ranking Member, I agree this is an issue that we’ve also been very focused on since we started our company in 2015, but this is the first time we’ve-

Senator Graham (01:21:23):

Are you familiar with the EARN IT Act by myself and Senator Blumenthal?

Jason Citron (01:21:28):

A little bit, yes.

Senator Graham (01:21:29):

Okay. Do you support that?

Jason Citron (01:21:33):

We-

Senator Graham (01:21:35):

Yes or no?

Jason Citron (01:21:36):

We are not prepared to support it today, but we believe that section-

Senator Graham (01:21:38):

Okay. Do you support the CSAM Act?

Jason Citron (01:21:41):

The STOP CSAM Act, we are not prepared to support today, but we-

Senator Graham (01:21:47):

Okay. Do you support the SHIELD Act?

Jason Citron (01:21:48):

We believe that the cyber tip line-

Senator Graham (01:21:50):

Do you support it, yes or no?

Jason Citron (01:21:53):

We believe that the cyber tip line and NCMEC-

Senator Graham (01:21:54):

I’ll take that to be no. The Project Safe Childhood Act, do you support it?

Jason Citron (01:22:00):

We believe that-

Senator Graham (01:22:02):

I’ll take that to be no. The REPORT Act, do you support it?

Jason Citron (01:22:06):

Ranking Member Graham, we very much look forward to having conversations with you and your team.

Senator Graham (01:22:09):

Thank you.

Jason Citron (01:22:10):

We want to be part of the solution-

Senator Graham (01:22:11):

I look forward to passing a bill that will solve the problem. Do you support removing Section 230 liability protections for social media companies?

Jason Citron (01:22:18):

I believe that Section 230 needs to be updated. It’s a very old law.

Senator Graham (01:22:23):

Do you support repealing it so people can sue if they believe they’re harmed?

Jason Citron (01:22:28):

I think that Section 230, as written, while it has many downsides, has enabled innovation on the internet, which I think has largely been-

Senator Graham (01:22:35):

Thank you very much. So here you are. If you’re waiting on these guys to solve the problem, we’re going to die waiting.

(01:22:43)
Mr. Zuckerberg, Mr… Trying to be respectful here. The representative from South Carolina, Mr. Guffey’s son, got caught up in a sextortion ring in Nigeria using Instagram. He was shaken down, paid money. That wasn’t enough, and he killed himself using Instagram. What would you like to say to him?

Mark Zuckerberg (01:23:19):

That’s terrible. No one should have to go through something like that.

Senator Graham (01:23:24):

You think he should be allowed to sue you?

Mark Zuckerberg (01:23:31):

I think that they can sue us.

Senator Graham (01:23:33):

Well, I think he should, and he can’t. So the bottom line here, folks, is that this committee is done with talking. We passed five bills unanimously, in their different ways, and look at who did this: Graham-Blumenthal, Durbin-Hawley, Klobuchar-Cornyn, Cornyn-Klobuchar, Blackburn and Ossoff. We’ve found common ground here that just is astonishing, and we’ve had hearing after hearing, Mr. Chairman, and the bottom line is I’ve come to conclude, gentlemen, that you’re not going to support any of this.

(01:24:14)
Linda, how do you say your last name?

Linda Yaccarino (01:24:19):

Yaccarino.

Senator Graham (01:24:21):

Do you support the EARN IT Act?

Linda Yaccarino (01:24:26):

We strongly support the collaboration to raise industry practices to-

Senator Graham (01:24:33):

No, no, no, no. Do you support the EARN IT Act?

Linda Yaccarino (01:24:34):

… prevent CSAM.

Senator Graham (01:24:36):

Do you support the… In English, do you support the EARN IT Act, yes or no? We don’t need double-speak here.

Linda Yaccarino (01:24:40):

We look forward to supporting and continue our conversations. As you can see-

Senator Graham (01:24:43):

Okay, so I’ll take that as no. The reason the EARN IT Act is important, you can actually lose your liability protections when children are exploited and you didn’t use best business practices. See, the EARN IT Act means you have to earn liability protection. You’re given it no matter what you do. So to the members of this committee, it is now time to make sure that the people holding up the signs can sue on behalf of their loved ones. Nothing will change until the courtroom door is open to victims of social media. $2 billion, Mr. Chew, how much… What percentage is that of what you made last year?

Shou Chew (01:25:26):

Senator, it’s a significant and increasing investment. As a private company, we’re not sharing our financials.

Senator Graham (01:25:31):

You pay taxes. $2 billion is what percent of your revenue?

Shou Chew (01:25:36):

Senator, we’re not ready to share our financials in public.

Senator Graham (01:25:38):

Well, I just think $2 billion sounds like a lot unless you make a hundred billion. So the point is, when you tell us you’re going to spend $2 billion, great, but how much do you make? It’s all about eyeballs. Well, our goal is to get eyeballs on you, and it’s not just about children. The damage being done, do you realize, Mr. Chew, that your TikTok representative in Israel resigned yesterday?

Shou Chew (01:26:06):

Yes, I’m aware.

Senator Graham (01:26:07):

Okay, and he said, “I resigned from TikTok. We’re living in a time in which our existence as Jews in Israel and Israel is under attack and in danger.” Multiple screenshots taken from TikTok’s internal employee chat platform, known as Lark, show how TikTok’s trust and safety officers celebrate the barbaric acts of Hamas and other Iranian-backed terror groups including the Houthis in Yemen.

Shou Chew (01:26:36):

Senator, I need to make it very clear that pro-Hamas content and hate speech is not allowed on our platform or within our company.

Senator Graham (01:26:42):

Why did he resign? Why did he resign? Why did he quit?

Shou Chew (01:26:45):

Senator, we also do not allow any hateful behavior at work-

Senator Graham (01:26:48):

Do you know why he quit? Do you know why he quit?

Shou Chew (01:26:49):

We do not allow this. We will investigate such crimes.

Senator Graham (01:26:51):

My question is he quit. I’m sure he had a good job. He gave up a good job because he thinks your platform is being used to help people who want to destroy the Jewish state. And I’m not saying you want that. Mr. Zuckerberg, I’m not saying you want, as an individual, any of the harms. I am saying that the products you have created, with all the upside, have a dark side.

(01:27:14)
Mr. Citron, I am tired of talking. I’m tired of having discussions. We all know the answer here and here’s the ultimate answer: Stand behind your product. Go to the American courtroom and defend your practices. Open up the courthouse door. Until you do that, nothing will change. Until these people can be sued for the damage they’re doing, it is all talk. I’m a Republican who believes in free enterprise, but I also believe that every American who’s been wronged has to have somewhere to go to complain.

(01:27:46)
There’s no commission to go to that can punish you. There’s not one law in the book because you oppose everything we do, and you can’t be sued. That has to stop, folks. How do you expect the people in the audience to believe that we’re going to help their families if we don’t have some system or a combination of systems to hold these people accountable? Because for all the upside, the dark side is too great to live with. We do not need to live this way as Americans.

Dick Durbin (01:28:20):

Thank you, Senator Graham. Senator Klobuchar is next. She’s been quite a leader on this subject for a long time, on the SHIELD Act and, with Senator Cornyn, on the revenge porn legislation. Senator Klobuchar.

Amy Klobuchar (01:28:34):

Thank you very much, Chairman Durbin, and thank you, Ranking Member Graham, for those words. I couldn’t agree more. For too long we have been seeing the social media companies turn a blind eye as kids have joined these platforms in record numbers. They have used algorithms that push harmful content because that content got popular. They provided a venue, maybe not knowingly at first, for dealers to sell deadly drugs like fentanyl. The head of our own Drug Enforcement Administration has said they have basically been captured by the cartels in Mexico and in China.

(01:29:24)
So I strongly support first of all, the STOP CSAM bill. I agree with Senator Graham that nothing is going to change unless we open up the courtroom doors. I think the time for all of this immunity is done because I think money talks even stronger than we talk up here.

(01:29:43)
Two of the five bills as noted are my bills with Senator Cornyn. One has actually passed through the Senate but is awaiting action in the House. The other one is the SHIELD Act, and I do appreciate those supportive of that bill. This is about revenge porn. The FBI director testified before this committee that there have been over 20 suicides of kids attributed to online revenge porn in just the last year.

(01:30:14)
But for those parents out there and those families, this is for them about their own child, but it’s also about making sure this doesn’t happen to other children. I know because I’ve talked to these parents, parents like Bridgette Norring from Hastings, Minnesota, who is out there today. Bridgette lost her teenage son after he took a fentanyl-laced pill that he purchased on the internet. Amy Neville is also here. Her son Alexander was only 14 when he died after taking a pill he didn’t know was actually fentanyl.

(01:30:58)
We’re starting a law enforcement campaign, One Pill Kills, in Minnesota, going to the schools with the sheriffs and law enforcement. But the way to stop it is, yes, at the border and at the points of entry, but we know that some 30% of the people that are getting the fentanyl are getting it off the platforms.

(01:31:17)
Meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers, including nearly $2 billion in ad profits derived from users age 12 and under. When a Boeing plane lost a door in mid-flight several weeks ago, nobody questioned the decision to ground a fleet of over 700 planes. So why aren’t we taking the same type of decisive action on the danger of these platforms when we know these kids are dying?

(01:31:59)
We have bills that have passed through this incredibly diverse committee when it comes to our political views, that have passed through this committee and they should go to the floor. We should do something finally about liability, and then we should turn to some of the other issues that a number of us have worked on when it comes to the charges for app stores and when it comes to some of the monopoly behavior and the self-preferencing, but I’m going to stick with this today.

(01:32:32)
Facts: One-third of fentanyl cases investigated over five months had direct ties to social media. That’s from the DEA. Facts: Between 2012 and 2022, cyber tip line reports of online child sexual exploitation increased from 415,000 to more than 32 million. And as I noted, at least 20 victims committed suicide in sextortion cases.

(01:33:01)
So I’m going to start with that, with you, Mr. Citron. My bill with Senator Cornyn, the SHIELD Act, includes a threat provision that would help provide protection and accountability for those that are threatened by these predators. Young kids get a picture, send it in, think they’ve got a new girlfriend or a new boyfriend; it ruins their life, or they think it’s going to be ruined, and they kill themselves. So could you tell me why you’re not supporting the SHIELD Act?

Jason Citron (01:33:31):

Senator, we think it’s very important that teens have a safe experience on our platforms. I think that the portion to strengthen law enforcement’s ability to investigate crimes against children and hold bad actors accountable is incredible.

Amy Klobuchar (01:33:46):

Are you holding open that you may support it?

Jason Citron (01:33:48):

We very much would like to have conversations with you. We’re open to discussing further. And we do welcome legislation and regulation. This is a very important issue for our country, and we’ve been prioritizing safety for teens-

Amy Klobuchar (01:34:01):

Thank you. I’m much more interested in whether you support it, because there’s been so much talk at these hearings and popcorn throwing and the like, and I just want to get this stuff done. I’m so tired of this. It’s been, what, 28 years since the internet, and we haven’t passed any of these bills because of everyone’s double-talk, double-talk. It’s time to actually pass them. And the reason they haven’t passed is because of the power of your company. So let’s be really, really clear about that. So what you say matters. Your words matter.

(01:34:29)
Mr. Chew, I’m a co-sponsor of Chair Durbin’s STOP CSAM Act of 2023 along with Senator Hawley, who’s the lead Republican, I believe, which, among other things, empowers victims by making it easier for them to ask tech companies to remove the material and related imagery from their platforms. Why would you not support this bill?

Shou Chew (01:34:52):

Senator, we largely support it. I think the spirit of it is very aligned with what we want to do. There are questions about implementation that I think companies like us and some other groups have, and we look forward to asking those. And of course, if this legislation is law, we will comply.

Amy Klobuchar (01:35:08):

Mr. Spiegel, I know we talked ahead of time. I do appreciate your company’s support for the Cooper Davis Act, which will finally… It’s a bill with Senators Shaheen and Marshall which will allow law enforcement to do more when it comes to fentanyl. I think you know what a problem this is. Devin Norring, a teenager from Hastings, whose mom I mentioned is here, suffered dental pain and migraines, so he bought what he thought was a Percocet over Snap, but instead he bought a counterfeit drug laced with a lethal dose of fentanyl. As his mom, who’s here with us today, said, “All of the hopes and dreams we as parents had for Devin were erased in the blink of an eye, and no mom should have to bury their kid.” Talk about why you support the Cooper Davis Act.

Evan Spiegel (01:35:58):

Senator, thank you. We strongly support the Cooper Davis Act, and we believe it will help DEA go after the cartels and get more dealers off the streets to save more lives.

Amy Klobuchar (01:36:06):

Okay. Are there others that support that bill? No? Okay.

(01:36:12)
Last, Mr. Zuckerberg, in 2021, The Wall Street Journal reported on internal Meta research documents asking, “Why do we care about tweens?” These were internal documents; I’m quoting the documents. And answering its own question by citing internal Meta emails: “They are a valuable but untapped audience.”

(01:36:35)
At a Commerce hearing, I’m also on that committee, I asked Meta’s Head of Global Safety why children ages 10 to 12 are so valuable to Meta. She responded, “We do not knowingly attempt to recruit people who aren’t old enough to use our apps.” Well, when the 42 state attorneys general, Democrat and Republican, brought their case, they said this statement was inaccurate.

(01:36:59)
A few examples. In 2021, she, Ms. Davis, received an email from Instagram’s research director saying that Instagram is investing in experiences targeting young ages, roughly 10 to 12. In a February 2021 instant message, one of your employees wrote that Meta is working to recruit Gen Alpha before they reach teenage years. A 2018 email that circulated inside Meta says that you were briefed that children under 13 will be critical for increasing the rate of acquisition when users turn 13.

(01:37:39)
Explain how that squares with what I heard in that testimony at the Commerce hearing, that they weren’t being targeted. And I just ask again, as the other witnesses were asked, why your company does not support the STOP CSAM Act or the SHIELD Act?

Mark Zuckerberg (01:37:55):

Sure, Senator. I’m happy to talk to both of those. We had discussions internally about whether we should build a kids’ version of Instagram, like the kids versions of YouTube and other services.

Amy Klobuchar (01:38:09):

I remember that.

Mark Zuckerberg (01:38:09):

We haven’t actually moved forward with that and we currently have no plans to do so. So I can’t speak directly to the exact emails that you cited, but it sounds to me like there were deliberations around a project that people internally thought was important and we didn’t end up moving forward with.

Amy Klobuchar (01:38:27):

Okay. And the bills, what are you going to say about the two bills?

Mark Zuckerberg (01:38:30):

Sure. Overall, my position on the bills is I agree with the goal of all of them. There are most things that I agree with within them. There are specific things that I would probably do differently. We also have our own legislative proposal for what we think would be most effective in terms of helping the internet and the various companies give parents control over the experience. So I’m happy to go into the detail on any one of them, but ultimately I think that this is-

Amy Klobuchar (01:38:59):

Again, I think these parents will tell you that this stuff hasn’t worked to just give parents control. They don’t know what to do. It’s very, very hard and that’s why we are coming up with other solutions that we think are much more helpful to law enforcement, but also this idea of finally getting something going on liability, because I just believe with all the resources you have that you actually would be able to do more than you’re doing or these parents wouldn’t be sitting behind you right now in this Senate hearing room.

Dick Durbin (01:39:28):

Thank you, Senator Klobuchar-

Mark Zuckerberg (01:39:29):

Senator, can I speak to that or do you want me to come back later?

Amy Klobuchar (01:39:32):

Yeah, yeah.

Dick Durbin (01:39:32):

Please, go ahead.

Mark Zuckerberg (01:39:35):

I don’t think that parents should have to upload an ID or prove that they’re the parent of a child in every single app that their children use. I think the right place to do this, and a place where it would actually be very easy for it to work, is within the app stores themselves, where my understanding is Apple and Google, or at least Apple, already requires parental consent when a child makes a payment within an app. So it should be pretty trivial to pass a law that requires them to make it so that parents have control anytime a child downloads an app, and offer consent for that. And the research that we’ve done shows that the vast majority of parents want that, and I think that that’s the type of legislation, in addition to some of the other ideas that you all have, that would make this a lot easier for parents.

Amy Klobuchar (01:40:22):

Just to be clear, I remember one mom telling me, with all these things she could maybe do that she can’t figure out, it’s like a faucet overflowing in a sink, and she’s out there with a mop while her kids are getting addicted to more and more different apps and being exposed to material. We’ve got to make this simpler for parents so they can protect their kids, and I just don’t think this is going to be the way to do it. I think the answer is what Senator Graham has been talking about, which is opening up the halls of the courtroom, so that puts it on you guys to protect these parents and protect these kids, and then also to pass some of these laws that make it easier for law enforcement.

Dick Durbin (01:40:58):

Thank you, Senator Klobuchar. We’re going to try to stick to the seven-minute rule. Didn’t work very well, but I’ll try to give additional time on the other side as well. Senator Cornyn.

Senator Cornyn (01:41:11):

There’s no question that your platforms are very popular, but we know that while here in the United States we have an open society and free exchange of information, there are authoritarian governments and there are criminals who will use your platforms for the sale of drugs, for sex, for extortion and the like. And Mr. Chew, I think your company is unique among the ones represented here today, because of its ownership by ByteDance, a Chinese company. And I know there have been some steps that you’ve taken to wall off the data collected here in the United States. But the fact of the matter is that under Chinese law and Chinese national intelligence laws, all information accumulated by companies in the People’s Republic of China is required to be shared with the Chinese intelligence services. ByteDance’s initial release of TikTok, I understand, was in 2016. These efforts that you made with Oracle under the so-called Project Texas to wall off the US data began in 2021, and the data was apparently, allegedly, fully walled off in March of ’23. What happened to all of the data that TikTok collected before that?

Shou Chew (01:42:47):

Senator, thank you.

Senator Cornyn (01:42:49):

From American users.

Shou Chew (01:42:50):

Understood. TikTok is owned by ByteDance, which is majority owned by global investors, and we have three Americans on the board out of five. You are right in pointing out that over the last three years we have spent billions of dollars building out Project Texas, which is a plan that is unprecedented in our industry. It walls off, firewalls off, protected US data from the rest of our staff. We also-

Senator Cornyn (01:43:14):

I’m asking about all of the data that you collected prior to that event.

Shou Chew (01:43:18):

Yes, Senator. We have started a data deletion plan. I talked about this a year ago. We have finished the first phase of data deletion from our data centers outside of the Oracle cloud infrastructure. We’re beginning phase two, where we will not only delete from the data centers, we will hire a third party to verify that work, and then we will go into, for example, employees’ working laptops to delete that as well.

Senator Cornyn (01:43:41):

Was all of the data collected by TikTok prior to Project Texas shared with the Chinese government pursuant to the national intelligence laws of that country?

Shou Chew (01:43:52):

Senator, we have not been asked for any data by the Chinese government and we have never provided it.

Senator Cornyn (01:44:03):

Your company is unique again among the ones represented here today, because you’re currently undergoing review by the Committee on Foreign Investment in the United States. Is that correct?

Shou Chew (01:44:14):

Senator, yes. There are ongoing discussions and a lot of our Project Texas work is informed by the discussions with many agencies under the CFIUS umbrella.

Senator Cornyn (01:44:25):

Well, CFIUS is designed specifically to review foreign investments in the United States for national security risks, correct?

Shou Chew (01:44:33):

Yes, I believe so.

Senator Cornyn (01:44:34):

And your company is currently being reviewed by this interagency committee at the Treasury Department for potential national security risks.

Shou Chew (01:44:46):

Senator, this review is on an acquisition of Musical.ly, which was done many years ago.

Senator Cornyn (01:44:54):

I mean, is this a casual conversation or are you actually providing information to the Treasury Department about how your platform operates for evaluating a potential national security risk?

Shou Chew (01:45:09):

Senator, it’s been many years, across two administrations, with a lot of discussions around what our plans are and how our systems work. We have a lot of robust discussions about a lot of detail.

Senator Cornyn (01:45:24):

63% of teens, I understand, use TikTok. Does that sound about right?

Shou Chew (01:45:31):

Senator, I cannot verify that. We know we are popular amongst many age groups. The average age in the US today for our user base is over 30, but we are aware we are popular.

Senator Cornyn (01:45:42):

And you reside in Singapore with your family, correct?

Shou Chew (01:45:46):

Yes, I reside in Singapore and I work here in the United States as well.

Senator Cornyn (01:45:50):

Do your children have access to TikTok in Singapore?

Shou Chew (01:45:54):

Senator, if they lived in the United States, I would give them access to our under-13 experience. My children are below the age of 13.

Senator Cornyn (01:46:01):

My question is in Singapore, do they have access to TikTok or is that restricted by domestic law?

Shou Chew (01:46:09):

We do not have an under-13 experience in Singapore. We have that in the United States because we were deemed a mixed-audience app, and we created an under-13 experience in response to that.

Senator Cornyn (01:46:21):

A Wall Street Journal article published yesterday directly contradicts what your company has stated publicly. According to the Journal, employees working under Project Texas say that US user data, including user emails, birthdates, and IP addresses, continues to be shared with ByteDance staff, again owned by a Chinese company. Do you dispute that?

Shou Chew (01:46:48):

Yes, Senator. There are many things about that article that are inaccurate. What it gets right is that this is a voluntary project that we built. We spent billions of dollars. There are thousands of employees involved, and it’s very difficult because it’s unprecedented.

Senator Cornyn (01:47:06):

Why is it important that the data collected from US users be stored in the United States?

Shou Chew (01:47:15):

Senator, this was a project we built in response to some of the concerns that were raised by members of this committee and others.

Senator Cornyn (01:47:21):

And that was because of concerns that the data that was stored in China could be accessed by the Chinese Communist Party according to the national intelligence laws, correct?

Shou Chew (01:47:34):

Senator, we are not the only company that does business that has Chinese employees, for example. We’re not even the only company in this room that hires Chinese nationals. But in order to address some of these concerns, we have moved the data into the Oracle cloud infrastructure. We built a 2,000-person team, based here, to oversee the management of that data. We walled it off from the rest of the organization, and then we opened it up to third parties like Oracle, and we will onboard others to provide third-party validation. This is unprecedented access. I think we are unique in taking even more steps to protect user data in the United States.

Senator Cornyn (01:48:09):

Well, you’ve disputed the Wall Street Journal story published yesterday. Are you going to conduct any sort of investigation to see whether there’s any truth to the allegations made in the article or are you just going to dismiss them outright?

Shou Chew (01:48:24):

We’re not going to dismiss them. So we have ongoing security inspections not only by our own personnel, but also by third parties to ensure that the system is rigorous and robust. No system that any one of us can build is perfect, but what we need to do is to make sure that we are always improving it and testing it against bad people who may try to bypass it. And if anyone breaks our policies within our organization, we will take disciplinary action against them.

Dick Durbin (01:48:52):

Thanks, Senator Cornyn. Senator Coons.

Senator Coons (01:48:55):

Thank you, Chairman Durbin. First I’d like to start by thanking all the families that are here today. All the parents who are here because of a child they have lost. All the families that are here, because you want us to see you and to know your concern. You have contacted each of us in our offices expressing your grief, your loss, your passion and your concern. And the audience that is watching can’t see this. They can see you, the witnesses from the companies, but this room is packed as far as the eye can see. And when this hearing began, many of you picked up and held pictures of your beloved and lost children. I benefit from and participate in social media as do many members of the committee and our nation and our world. There are now a majority of people on earth participating in and in many ways benefiting from one of the platforms you have launched or you lead or you represent.

(01:49:54)
And we have to recognize there are some real positives to social media. It has transformed modern life, but it has also had huge impacts on families, on children, on nations. And there’s a whole series of bills championed by members of this committee that tries to deal with the trafficking in illicit drugs, the trafficking in illicit child sexual material, the things that are facilitated on your platforms that may lead to self-harm or suicide. So we’ve heard from several of the leaders on this committee, the chair and ranking member and very talented and experienced senators, that the frame through which we are looking at this is consumer protection. When there is some new technology, we put in place regulations to make sure that it is not overly harmful. As my friend Senator Klobuchar pointed out, one door flew off of one plane, no one was hurt, and yet the entire Boeing fleet of that type of plane was grounded and a federal fit-for-purpose agency did an immediate safety review. I’m going to point not to the other pieces of legislation that I think are urgent that we take up and pass, but to the core question of transparency. If you are a company manufacturing a product that is allegedly addictive and harmful, one of the first things we look to is safety information. We try to give our constituents, our consumers, warnings, labels that help them understand what the consequences of this product are and how to use it safely or not. As you’ve heard pointedly from some of my colleagues, if you sell an addictive, defective, harmful product in this country in violation of regulations and warnings, you get sued. And what is distinct about platforms as an industry is most of the families who are here are here because there were not sufficient warnings and they cannot effectively sue you. So let me dig in for a moment if I can, because each of your companies voluntarily discloses information about the content and the safety investments you make and the actions you take.

(01:52:09)
There was a question pressed, I think it was by Senator Graham earlier, about TikTok. I believe, Mr. Chew, you said you invest $2 billion in safety. My background memo said your global revenue is $85 billion. Mr. Zuckerberg, my background memo says you’re investing $5 billion in safety at Meta, and your annual revenue is on the order of $116 billion. You can hear some expressions from the parents in the audience. What matters is the relative numbers and the absolute numbers. You’re data folks; if there’s anybody in this world who understands data, it’s you guys. So I want to walk through whether or not these voluntary measures of disclosure, of content and harm, are sufficient, because I would argue we’re here because they’re not. Without better information, how can policymakers know whether the protections you’ve testified about, the new initiatives, the starting programs, the monitoring and the take-downs are actually working? How can we understand meaningfully how big these problems are without measuring and reporting data?

(01:53:17)
Mr. Zuckerberg, your testimony referenced a National Academy of Sciences study that said, “At the population level there is no proof about harm for mental health.” Well, it may not be at the population level, but I’m looking at a room full of hundreds of parents who have lost children, and our challenge is to take the data and to make good decisions about protecting families and children from harm. So let me ask about what your companies do or don’t report, and I’m going to particularly focus on your content policies around self-harm and suicide. And I’m just going to ask a series of yes or no questions, and what I’m getting at is, do you disclose enough? Mr. Zuckerberg, for your policies prohibiting content about suicide or self-harm, do you report an estimate of the total amount of content, not a percentage of the overall, not a prevalence number, but the total amount of content on your platform that violates this policy, and do you report the total number of views that self-harm or suicide-promoting content that violates this policy gets on your platform?

Mark Zuckerberg (01:54:30):

Yes, Senator, we pioneered quarterly reporting on our community standards enforcement across all these different categories of harmful content. We focus on prevalence, which you mentioned, because what we’re focused on is what percent of the content that we take down our systems proactively identify.

Senator Coons (01:54:48):

Mr. Zuckerberg, I’m going to interrupt you and you’re very talented. I have very little time left. I’m trying to get an answer to a question, not as a percentage of the total, because remember it’s a huge number, so the percentage is small. But do you report the actual amount of content and the amount of views self-harm content received?

Mark Zuckerberg (01:55:09):

No. I believe we focus on prevalence.

Senator Coons (01:55:10):

Correct, you don’t. Ms. Yaccarino, yes or no, you report it or you don’t?

Linda Yaccarino (01:55:16):

Senator, as a reminder, we have less than 1% of our users that are between the ages of 13 and 17 and-

Senator Coons (01:55:24):

Do you report the absolute number of how many images and how often-

Linda Yaccarino (01:55:27):

We report the posts and accounts that we’ve taken down. In 2023, we’ve taken down almost a million posts in regards to mental health and self-harm.

Senator Coons (01:55:38):

Mr. Chew, do you disclose the number of appearances of these types of content and how many are viewed before they’re taken down?

Shou Chew (01:55:46):

Senator, we disclose the number we take down based on each category of violation, and how many of those were taken down proactively before being reported.

Senator Coons (01:55:55):

Mr. Spiegel?

Evan Spiegel (01:55:57):

Yes, Senator, we do disclose.

Senator Coons (01:55:59):

Mr. Citron?

Jason Citron (01:56:00):

Yes, we do.

Senator Coons (01:56:01):

So I’ve got three more questions I’d love to walk through if I had unlimited time. I will submit them for the record. The larger point is that platforms need to hand over more information about how the algorithms work, what the content does, and what the consequences are. Not at the aggregate, not at the population level, but the actual numbers of cases, so we can understand the content.

(01:56:27)
In closing, Mr. Chairman, I have a bipartisan bill, the Platform Accountability and Transparency Act, co-sponsored by Senators Cornyn, Klobuchar, Blumenthal on this committee and Senator Cassidy and others. It’s in front of the Commerce Committee, not this committee, but it would set reasonable standards for disclosure and transparency to make sure that we’re doing our jobs based on data. Yes, there’s a lot of emotion in this field, understandably, but if we’re going to legislate responsibly about the management of the content on your platforms, we need to have better data. Is there any one of you willing to say now that you support this bill? Mr. Chairman, let the record reflect a yawning silence from the leaders of the social media platforms. Thank you.

Dick Durbin (01:57:17):

Thanks, Senator Coons. We’re on the first of two roll calls and so please understand if some of the members leave and come back. It’s no disrespect. They’re doing their job. Senator Lee?

Senator Lee (01:57:29):

Thank you, Mr. Chairman. Tragically, survivors of sexual abuse are often repeatedly victimized and revictimized over and over and over again by having non-consensual images of themselves on social media platforms. There’s an NCMEC study that pointed out there was one instance of CSAM that reappeared more than 490,000 times after it had been reported. After it had been reported. So we need tools in order to deal with this. We need, frankly, laws in order to mandate standards so that this doesn’t happen, so that we have a systematic way of getting rid of this stuff, because there is literally no plausible justification, no way of defending this.

(01:58:28)
One tool, one that I think would be particularly effective, is a bill that I’ll be introducing later today, and I invite all my committee members to join me. It’s called the Protect Act. The Protect Act would, in pertinent part, require websites to verify age and verify that they’ve received the consent of any and all individuals appearing on their site in pornographic images. And it would also require platforms to have meaningful processes for an individual seeking to have images of himself or herself removed in a timely manner. Ms. Yaccarino, based on your understanding of existing law, what might it take for a person to have those images removed, say from X?

Linda Yaccarino (01:59:15):

Senator Lee, thank you. What you’re going to introduce into law, in terms of ecosystem-wide user consent, sounds exactly like part of the philosophy of why we’re supporting the SHIELD Act, and no one should have to endure non-consensual images being shared online.

Senator Lee (01:59:38):

And without those laws in place, it is fantastic anytime a company, as you’ve described with yours, wants to take those steps; it’s very helpful. But it can take a lot longer than it should, and sometimes it does, to the point where somebody had images shared 490,000 times after it was reported to the authorities, and that’s deeply concerning. But yes, the Protect Act would work in tandem with, it’s a good complement to, the SHIELD Act.

(02:00:15)
Mr. Zuckerberg, let’s turn to you next. As you know, I feel strongly about privacy and believe that one of the best protections for an individual’s privacy online involves end-to-end encryption. We also know that a great deal of grooming and sharing of CSAM occurs on end-to-end encrypted systems. Does Meta allow juvenile accounts on its platforms to use encrypted messaging services within those apps?

Mark Zuckerberg (02:00:48):

Sorry, Senator. What do you mean juvenile?

Senator Lee (02:00:50):

Underage, people under 18.

Mark Zuckerberg (02:00:52):

Under 18. We allow people under the age of 18 to use WhatsApp and we do allow that to be encrypted, yes.

Senator Lee (02:00:59):

Do you have a bottom level age at which they’re not allowed to use it? Child of any age?

Mark Zuckerberg (02:01:04):

Yeah, I don’t think we allow people under the age of 13.

Senator Lee (02:01:09):

What about you, Mr. Citron? On Discord, do you allow kids to have accounts to access encrypted messaging?

Jason Citron (02:01:18):

Discord is not allowed to be used by children under the age of 13, and we do not use end-to-end encryption for text messages. We believe that it’s very important to be able to respond to well-formed law enforcement requests, and we’re also working on proactively building technology. We’re working with a nonprofit called Thorn to build a grooming classifier so that our Teen Safety Assist feature can actually identify these conversations if they might be happening, so we can intervene and give those teens tools to get out of that situation or potentially even report those conversations and those people to law enforcement.

Senator Lee (02:01:51):

And then encryption, as much as it can prove useful elsewhere, can be harmful, especially on a site where you know children are being groomed and exploited. If you allow children onto an app with end-to-end encryption enabled, that can prove problematic.

(02:02:09)
Now, let’s go back to you for a moment, Mr. Zuckerberg. Instagram recently announced that it’s going to restrict all teenagers from access to eating disorder material, suicidal ideation-themed material, and self-harm content, and that’s fantastic. That’s great. What’s odd, what I’m trying to understand, is why Instagram is restricting access to sexually explicit content only for teens ages 13 to 15. Why not restrict it for 16 and 17-year-olds as well?

Mark Zuckerberg (02:02:56):

Senator, my understanding is that we don’t allow sexually explicit content on the service for people of any age.

Senator Lee (02:03:04):

How is that going?

Mark Zuckerberg (02:03:09):

Our prevalence metrics suggest that I think it’s 99% or so of the content that we remove, we’re able to identify automatically using AI systems. So I think that our efforts in this, while they’re not perfect, are industry leading. The other thing that you asked about was self-harm content, which is what we recently restricted. And we made that shift because I think the state of the science is shifting a bit. Previously we believed that when people were thinking about self-harm, it was important for them to be able to express that and get support, and now more of the thinking in the field is that it’s just better to not show that content at all, which is why we recently moved to restrict it from showing up for those teens at all.

Senator Lee (02:03:57):

Is there a way for parents to make a request on what their kid can see or not see on your sites?

Mark Zuckerberg (02:04:07):

There are a lot of parental controls. I don’t think that we currently have a control around topics, but we do allow parents to control the time that the children are on the site and also a lot of it is based on monitoring and understanding what the teen’s experience is, what they’re interacting with et cetera.

Senator Lee (02:04:28):

Mr. Citron, Discord allows pornography on its site. Now, reportedly 17% of minors who use Discord have had online sexual interactions on your platform, 17%. And 10% have had those interactions with someone the minor believed to be an adult. Do you restrict minors from accessing Discord servers that host pornographic material?

Jason Citron (02:04:57):

Senator, yes, we do restrict minors from accessing content that is marked for adults. Discord also does not recommend content to people. Discord is a chat app. We do not have a feed or an algorithm that boosts content. So we allow adults to share content with other adults in adult-labeled spaces, and we do not allow teens to access that content.

Senator Lee (02:05:16):

Okay, I see my time’s expired. Thank you.

Speaker 15 (02:05:22):

Welcome, everyone. We are here in this hearing because, as a collective, your platforms really suck at policing themselves. We hear about it here in Congress with fentanyl and other drug dealing facilitated across platforms. We see it and hear about it here in Congress with harassment and bullying that takes place across your platforms. We see it and hear about it here in Congress with respect to child pornography, sexual exploitation, and blackmail, and we are sick of it.

(02:06:13)
It seems to me that there is a problem with accountability, because these conditions continue to persist. In my view, Section 230, which provides immunity from lawsuits, is a very significant part of that problem. Look at where bullies have been brought to heel recently. Whether it’s Dominion finally getting justice against Fox News after a long campaign to try to discredit the election equipment manufacturer. Or whether it’s the moms and dads of the Sandy Hook victims finally getting justice against Infowars and its campaign of trying to get people to believe that the massacre of their children was a fake put on by them. Or even now more recently, a writer getting a very significant judgment against Donald Trump after years of bullying and defamation.

(02:07:38)
An honest courtroom has proven to be the place where these things get sorted out. And I’ll just describe one case, if I may. It’s called Doe v. Twitter. The plaintiff in that case was blackmailed in 2017 for sexually explicit photos and videos of himself then aged 13 to 14. A compilation video of multiple CSAM videos surfaced on Twitter in 2019. A concerned citizen reported that video on December 25th, 2019, Christmas Day. Twitter took no action. The plaintiff, then a minor in high school in 2019, became aware of this video from his classmates in January of 2020. You’re a high school kid and suddenly there’s that. That’s a day that’s hard to recover from. Ultimately, he became suicidal. He and his parents contacted law enforcement and Twitter to have these videos removed on January 21st and again on January 22nd, 2020. And Twitter ultimately took down the video on January 30th, 2020 once federal law enforcement got involved. That’s a pretty foul set of facts.

(02:09:44)
When the family sued Twitter for all those months of refusing to take down the explicit video of this child, Twitter invoked Section 230, and the district court ruled that the claim was barred. There is nothing about that set of facts that tells me that Section 230 performed any public service in that regard. I would like to see very substantial adjustments to Section 230 so that the honest courtroom, which brought relief and justice to E. Jean Carroll after months of defamation, which brought silence, peace, and justice to the parents of the Sandy Hook children after months of defamation and bullying by Infowars and Alex Jones, and which brought significant justice and an end to the campaign of defamation by Fox News against a little company that was busy just making election machines, is open to victims like these.

(02:11:18)
My time is running out. I’ll turn to, I guess, Senator Cruz is next. But I would like to have each of your companies put in writing what exemptions from the protection of Section 230 you would be willing to accept, bearing in mind the fact situation in Doe v. Twitter. Bearing in mind the enormous harm that was done to that young person and that family by the non-responsiveness of this enormous