Dick Durbin (44:15):

Thank you, Senator Graham. Today we welcome five witnesses whom I’ll introduce. Jason Citron, the CEO of Discord Incorporated, Mark Zuckerberg, the founder and CEO of Meta, Evan Spiegel, the co-founder and CEO of Snap Incorporated, Shou Chew, the CEO of TikTok, and Linda Yaccarino, the CEO of X Corporation, formerly known as Twitter. I will note for the record that Mr. Zuckerberg and Mr. Chew are appearing voluntarily. I’m disappointed that our other witnesses did not offer that same degree of cooperation. Mr. Citron, Mr. Spiegel and Ms. Yaccarino are here pursuant to subpoenas, and Mr. Citron only accepted service of his subpoena after US Marshals were sent to Discord’s headquarters at taxpayer expense. I hope this is not a sign of your commitment or lack of commitment to addressing the serious issue before us. After I swear in the witnesses, each witness will have five minutes to make an opening statement.

(45:20)
Then senators will ask questions in an opening round of seven minutes each. I expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. If anyone is in need of a break at any point, please let my staff know. Before I turn to the witnesses, I’d also like to take a moment to acknowledge that this hearing has gathered a lot of attention, as we expected. We have a large audience, the largest I’ve seen in this room. I want to make clear, as with other Judiciary Committee hearings, we ask people to behave appropriately. I know there is high emotion in this room, for justifiable reasons, but I ask you to please follow the traditions of the committee. That means no standing, shouting, chanting, or applauding witnesses. Disruptions will not be tolerated. Anyone who does disrupt the hearing will be asked to leave. The witnesses are here today to address a serious topic. We want to hear what they have to say. I thank you for your cooperation.

(46:18)
Could all of the witnesses please stand to be sworn in? Do you affirm the testimony you’re about to give before the committee will be the truth, the whole truth, and nothing but the truth, so help you God? Let the record reflect that all the witnesses have answered in the affirmative. Mr. Citron, please proceed with your opening statement.

Jason Citron (46:43):

Good morning.

Dick Durbin (46:44):

Good morning.

Jason Citron (46:45):

My name is Jason Citron and I am the co-founder and CEO of Discord. We are an American company with about 800 employees living and working in 33 states. Today, Discord has grown to more than 150 million monthly active users. Discord is a communications platform where friends hang out and talk online about shared interests, from fantasy sports to writing music to video games. I’ve been playing video games since I was five years old, and as a kid, it’s how I had fun and found friendship. Many of my fondest memories are of playing video games with friends. We built Discord so that anyone could build friendships playing video games, from Minecraft to Wordle and everything in between. Games have always brought us together, and Discord makes that happen today. Discord is one of the many services that have revolutionized how we communicate with each other in the different moments of our lives: iMessage, Zoom, Gmail, and on and on. They enrich our lives, create communities, accelerate commerce, healthcare, and education.

(48:04)
Just like with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes. All of us here on the panel today and throughout the tech industry have a solemn and urgent responsibility to ensure that everyone who uses our platforms is protected from these criminals, both online and off. Discord has a special responsibility to do that because a lot of our users are young people. More than 60% of our active users are between the ages of 13 and 24. It’s why safety is built into everything we do. It’s essential to our mission and our business, and most of all, this is deeply personal. I’m a dad with two kids. I want Discord to be a product that they use and love, and I want them to be safe on Discord. I want them to be proud of me for helping to bring this product to the world. That’s why I’m pleased to be here today to discuss the important topic of the online safety of minors.

(49:13)
My written testimony provides a comprehensive overview of our safety programs. Here are a few examples of how we protect and empower young people. First, we’ve put our money into safety. The tech sector has a reputation of larger companies buying smaller ones to increase user numbers and boost financial results, but the largest acquisition we’ve ever made at Discord was a company called Sentropy. It didn’t help us expand our market share or improve our bottom line. In fact, because it uses AI to help us identify, ban, and report criminals and bad behavior, it has actually lowered our user count by getting rid of bad actors. Second, you’ve heard of end-to-end encryption that blocks anyone, including the platform itself, from seeing users’ communications. It’s a feature on dozens of platforms, but not on Discord. That’s a choice we’ve made. We don’t believe we can fulfill our safety obligations if the text messages of teens are fully encrypted, because encryption would block our ability to investigate a serious situation and, when appropriate, report to law enforcement. Third, we have a zero-tolerance policy on child sexual abuse material, or CSAM. We scan images uploaded to Discord to detect and block the sharing of this abhorrent material. We’ve also built an innovative tool, Teen Safety Assist, that blocks explicit images and helps young people easily report unwelcome conversations. We’ve also developed a new semantic hashing technology for detecting novel forms of CSAM, called CLIP, and we’re sharing this technology with other platforms through the Tech Coalition. Finally, we recognize that improving online safety requires all of us to work together, so we partner with nonprofits, law enforcement, and our tech colleagues to stay ahead of the curve in protecting young people online. We want to be the platform that empowers our users to have better online experiences, to build true connections, genuine friendships, and to have fun. Senators, I sincerely hope today is the beginning of an ongoing dialogue that results in real improvements in online safety. I look forward to your questions and to helping the committee learn more about Discord.

Dick Durbin (51:43):

Thank you, Mr. Citron. Mr. Zuckerberg.

Mark Zuckerberg (51:49):

Chairman Durbin, Ranking Member Graham and members of the committee, every day, teens and young people do amazing things on our services. They use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. Overall, teens tell us that this is a positive part of their lives, but some face challenges online, so we work hard to give parents and teens support and controls to reduce potential harms. Being a parent is one of the hardest jobs in the world. Technology gives us new ways to communicate with our kids and feel connected to their lives, but it can also make parenting more complicated, and it’s important to me that our services are positive for everyone who uses them. We are on the side of parents everywhere working hard to raise their kids. Over the last eight years, we’ve built more than 30 different tools, resources, and features that parents can use to set time limits for their teens on our apps, see who they’re following, or know if they report someone for bullying.

(52:46)
For teens, we’ve added nudges that remind them when they’ve been using Instagram for a while or if it’s getting late and they should go to sleep, as well as ways to hide words or people without those people finding out. We put special restrictions on teen accounts on Instagram. By default, accounts for under-16s are set to private, have the most restrictive content settings, and can’t be messaged by adults that they don’t follow or people they aren’t connected to. With so much of our lives spent on mobile devices and social media, it’s important to look into the effects on teen mental health and wellbeing. I take this very seriously. Mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. A recent National Academies of Sciences report evaluated over 300 studies and found that research, quote, “Did not support the conclusion that social media causes changes in adolescent mental health at the population level,” end quote. It also suggested that social media can provide significant positive benefits when young people use it to express themselves, explore and connect with others. Still, we’re going to continue to monitor the research and use it to inform our roadmap. Keeping young people safe online has been a challenge since the internet began, and as criminals evolve their tactics, we have to evolve our defenses too. We work closely with law enforcement to find bad actors and help bring them to justice, but the difficult reality is that no matter how much we invest or how effective our tools are, there’s always more to learn and more improvements to make. We remain ready to work with members of this committee, industry, and parents to make the internet safer for everyone.

(54:28)
I’m proud of the work that our teams do to improve online child safety on our services and across the entire internet. We have around 40,000 people overall working on safety and security, and we’ve invested more than $20 billion in this since 2016, including around $5 billion in the last year alone. We have many teams dedicated to child safety and teen wellbeing, and we lead the industry in a lot of the areas that we’re discussing today. We build technology to tackle the worst online risks and share it to help our whole industry get better, like Project Lantern, which helps companies share data about people who break child safety rules. We’re also founding members of Take It Down, a platform that helps young people prevent their nude images from being spread online. We also go beyond legal requirements and use sophisticated technology to proactively discover abusive material. As a result, we find and report more inappropriate content than anyone else in the industry.

(55:22)
As the National Center for Missing and Exploited Children put it this week, Meta goes, quote, “Above and beyond to make sure that there are no portions of their network where this type of activity occurs,” end quote. I hope we can have a substantive discussion today that drives improvements across the industry, including legislation that delivers what parents say they want: a clear system for age verification and control over what apps their kids are using. Three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. We support this. Parents should have the final say on what apps are appropriate for their children and shouldn’t have to upload their ID every time. That’s what app stores are for.

(56:08)
We also support setting industry standards on age-appropriate content and limiting the signals used for advertising to teens to age and location, not behavior. At the end of the day, we want everyone who uses our services to have safe and positive experiences. Before I wrap up, I want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure. These issues are important for every parent and every platform. I’m committed to continuing to work in these areas, and I hope we can make progress today.

Dick Durbin (56:46):

Thank you. Mr. Spiegel.

Evan Spiegel (56:58):

Chairman Durbin, Ranking Member Graham and members of the committee, thank you for convening this hearing and for moving forward important legislation to protect children online. I’m Evan Spiegel, the co-founder and CEO of Snap. We created Snapchat, an online service that is used by more than 800 million people worldwide to communicate with their friends and family. I know that many of you have been working to protect children online since before Snapchat was created, and we are grateful for your long-term dedication to this cause and your willingness to work together to help keep our community safe. I want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. Words cannot begin to express the profound sorrow I feel that a service we designed to bring people happiness and joy has been abused to cause harm. I want to be clear that we understand our responsibility to keep our community safe.

(57:52)
I also want to recognize the many families who have worked to raise awareness on these issues, push for change and collaborate with lawmakers on important legislation like the Cooper Davis Act, which can help save lives. I started building Snapchat with my co-founder, Bobby Murphy, when I was 20 years old. We designed Snapchat to solve some of the problems that we experienced online when we were teenagers. We didn’t have an alternative to social media. That meant pictures shared online were permanent, public and subject to popularity metrics. It didn’t feel very good. We built Snapchat differently because we wanted a new way to communicate with our friends that was fast, fun and private. A picture is worth a thousand words, so people communicate with images and videos on Snapchat. We don’t have public likes or comments when you share your story with friends. Snapchat is private by default, meaning that people need to opt in to add friends and choose who can contact them.

(58:44)
When we built Snapchat, we chose to have the images and videos sent through our service delete by default. Like prior generations who’ve enjoyed the privacy afforded by phone calls which aren’t recorded, our generation has benefited from the ability to share moments through Snapchat that may not be picture perfect, but instead, convey emotion without permanence. Even though Snapchat messages are deleted by default, we let everyone know that images and videos can be saved by the recipient. When we take action on illegal or potentially harmful content, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable.

(59:21)
To help prevent the spread of harmful content on Snapchat, we approve the content that is recommended on our service using a combination of automated processes and human review. We apply our content rules consistently and fairly across all accounts. We run samples of our enforcement actions through quality assurance to verify that we’re getting it right. We also proactively scan for known child sexual abuse material, drug-related content, and other types of harmful content, remove that content, deactivate and device-block offending accounts, preserve the evidence for law enforcement, and report certain content to the relevant authorities for further action. Last year, we made 690,000 reports to the National Center for Missing and Exploited Children, leading to more than 1,000 arrests. We also removed 2.2 million pieces of drug-related content and blocked 705,000 associated accounts. Even with our strict privacy settings, content moderation efforts, proactive detection, and law enforcement collaboration, bad things can still happen when people use online services. That’s why we believe that people under the age of 13 are not ready to communicate on Snapchat.

(01:00:25)
We strongly encourage parents to use the device-level parental controls on iPhone and Android. We use them in our own household and my wife approves every app that our 13-year-old downloads. For parents who want more visibility and control, we built Family Center in Snapchat, where you can view who your teen is talking to, review privacy settings and set content limits. We have worked for years with members of the committee on legislation like the Kids Online Safety Act and the Cooper Davis Act, which we are proud to support. I want to encourage broader industry support for legislation protecting children online. No legislation is perfect, but some rules of the road are better than none.

(01:01:03)
Much of the work that we do to protect people that use our service would not be possible without the support of our partners across the industry, government, nonprofit organizations, NGOs, and in particular, law enforcement and the first responders who have committed their lives to helping keep people safe. I’m profoundly grateful for the extraordinary efforts across our country and around the world to prevent criminals from using online services to perpetrate their crimes. I feel an overwhelming sense of gratitude for the opportunities that this country has afforded me and my family. I feel a deep obligation to give back and to make a positive difference, and I’m grateful to be here today as part of this vitally important democratic process. Members of the committee, I give you my commitment that we’ll be part of the solution for online safety. We’ll be honest about our shortcomings and we’ll work continuously to improve. Thank you and I look forward to answering your questions.

Dick Durbin (01:01:52):

Thank you, Mr. Spiegel. Mr. Chew.

Shou Chew (01:01:56):

Chair Durbin, Ranking Member Graham, and members of the committee, I appreciate the opportunity to appear before you today. My name is Shou Chew and I’m the CEO of TikTok, an online community of more than 1 billion people worldwide, including well over 170 million Americans who use our app every month to create, to share, and to discover. Now, although the average age on TikTok in the US is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of CSAM. As a father of three young children myself, I know that the issues that we’re discussing today are horrific and the nightmare of every parent. I am proud of our efforts to address the threats to young people online, from a commitment to protecting them to our industry-leading policies, use of innovative technology, and significant ongoing investments in trust and safety to achieve this goal.

(01:02:59)
TikTok is vigilant about enforcing its 13-and-up age policy and offers an experience for teens that is much more restrictive than you and I would have as adults. We make careful product design choices to help make our app inhospitable to those seeking to harm teens. Let me give you a few examples of longstanding policies that are unique to TikTok. We didn’t adopt them last week. First, direct messaging is not available to any users under the age of 16. Second, accounts for people under 16 are automatically set to private, along with their content. Furthermore, the content cannot be downloaded and will not be recommended to people they do not know. Third, every teen under 18 has a screen time limit automatically set to 60 minutes. And fourth, only people 18 and above are allowed to use our livestream feature.

(01:04:02)
I’m proud to say that TikTok was among the first to empower parents to supervise their teens on our app with our family pairing tools. These include setting screen time limits and filtering out content from their teen’s feed, among other features. We made these choices after consulting with doctors and safety experts who understand the unique stages of teenage development to ensure that we have the appropriate safeguards to prevent harm and minimize risk. Now, safety is one of the core priorities that defines TikTok under my leadership. We currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, with a significant part of that in our US operations. Our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or other harm, and we vigorously enforce them.

(01:05:11)
Our technology moderates all content uploaded to our app to help quickly identify potential CSAM and other material that breaks our rules. It automatically removes the content or elevates it to our safety professionals for further review. We also moderate direct messages for CSAM and related material, and use third-party tools like PhotoDNA and Take It Down to combat CSAM and prevent such content from being uploaded to our platform. We continually meet with parents, teachers, and teens. In fact, I sat down with a group just a few days ago. We use their insight to strengthen the protections on our platform, and we also work with leading groups like the Technology Coalition. The steps that we’re taking to protect teens are a critical part of our larger trust and safety work as we continue our voluntary and unprecedented efforts to build a safe and secure data environment for US users, ensuring that our platform remains free from outside manipulation and implementing safeguards on our content recommendation and moderation tools.

(01:06:22)
Keeping teens safe online requires a collaborative effort as well as collective action. We share the community’s concern and commitment to protect young people online and we welcome the opportunity to work with you on legislation to achieve this goal. Our commitment is ongoing and unwavering because there is no finish line when it comes to protecting teens.

(01:06:43)
Thank you for your time and consideration today. I’m happy to answer your questions.

Dick Durbin (01:06:48):

Thanks, Mr. Chew. Ms. Yaccarino.

Linda Yaccarino (01:06:53):

Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X’s work in protecting-

Dick Durbin (01:07:05):

Ms. Yaccarino. Could you check if your microphone is on?

Linda Yaccarino (01:07:08):

My talk button is on.

Dick Durbin (01:07:10):

And you might-

Linda Yaccarino (01:07:10):

How is that?

Dick Durbin (01:07:10):

Better. Thank you very much.

Linda Yaccarino (01:07:12):

Maybe if I adjust my chair. Apologies. I’ll start over.

(01:07:18)
Chairman Durbin, Ranking Member Graham, and esteemed members of the committee, thank you for the opportunity to discuss X’s work to protect the safety of minors online.

(01:07:34)
Today’s hearing is titled A Crisis which Calls for Immediate Action. As a mother, this is personal and I share the sense of urgency. X is an entirely new company, an indispensable platform for the world and for democracy. You have my personal commitment that X will be active and a part of this solution.

(01:08:07)
While I joined X only in June of 2023, I bring a history of working together with governments, advocates, and NGOs to harness the power of media to protect people. Before I joined, I was struck by the leadership steps this new company was taking to protect children. X is not the platform of choice for children and teens. We do not have a line of business dedicated to children. Children under the age of 13 are not allowed to open an account. Less than 1% of the US users on X are between the ages of 13 and 17, and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve.

(01:09:12)
In the last 14 months, X has made material changes to protect minors. Our policy is clear: X has zero tolerance towards any material that features or promotes child sexual exploitation. My written testimony details X’s extensive policies on content or actions that are prohibited and include grooming, blackmail, and identifying alleged victims of CSE.

(01:09:49)
We’ve also strengthened our enforcement with more tools and technology to prevent those bad actors from distributing, searching for and engaging with CSE content. If CSE content is posted on X, we remove it. And now we also remove any account that engages with CSE content, whether it is real or computer-generated.

(01:10:19)
Last year, X suspended 12.4 million accounts for violating our CSE policies. This is up from 2.3 million accounts that were removed by Twitter in 2022. In 2023, 850,000 reports were sent to NCMEC, including our first-ever auto-generated report. This is eight times more than was reported by Twitter in 2022. We’ve changed our priorities. We’ve restructured our trust and safety teams to remain strong and agile. We are building a trust and safety center of excellence in Austin, Texas to bring more agents in-house to accelerate our impact. We’re applying to the Technology Coalition’s Project Lantern to make further industry-wide progress and impact. We’ve also opened up our algorithms for increased transparency.

(01:11:36)
We want America to lead in this solution. X commends the Senate for passing the REPORT Act, and we support the SHIELD Act. It is time for a federal standard to criminalize the sharing of non-consensual intimate material. We need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up. X supports the STOP CSAM Act. The Kids Online Safety Act should continue to progress, and we will continue to engage with it to ensure it protects freedom of speech.

(01:12:33)
There are two additional areas that require everyone’s attention. First, as the daughter of a police officer, I know that law enforcement must have the critical resources to bring these bad offenders to justice. Second, with artificial intelligence, offenders’ tactics will continue to grow more sophisticated and evolve. Industry collaboration is imperative here. X believes that freedom of speech and platform safety can and must coexist. We agree that now is the time to act with urgency.

(01:13:21)
Thank you. I look forward to answering your questions.