Senate hearing highlights: Lawmakers grill CEOs from TikTok, X and Meta about online child safety

The hearing, “Big Tech and the Online Child Sexual Exploitation Crisis,” lasted roughly four hours.

The Senate Judiciary Committee gave a group of prominent social media CEOs a bipartisan thrashing on Wednesday, pressing them on alleged shortcomings related to the safety of young people on their platforms.

Meta CEO Mark Zuckerberg faced the toughest questioning, at one point turning around to apologize to parents who filled the chamber. Those parents carried pictures of their children whose deaths have been tied to social media through drugs, harassment and other threats.

But unlike in past hearings that primarily focused on the platforms themselves, senators consistently pressed the tech leaders on whether they supported various pieces of legislation meant to address online safety.

The tech executives reaffirmed their commitment to child safety. Many of the social media platforms pointed to various tools they offer as examples of how they are proactive about preventing exploitation online.


The CEOs exited the room pretty quickly after the hearing concluded, leaving through a side door and taking no questions.

That's about as bipartisan a hearing as you're going to see these days.

But the question remains: What legislation will make it past both houses of Congress, the president's desk and a Supreme Court review?

Washington has been talking about these issues for years, and the momentum does seem palpable. But the political reality is what it is.

That hearing was definitely not as painful as similar tech and social media-related hearings in the past. But the "yada yada yada" moment from Kennedy was a real head-scratcher.

Here are the most notable remarks from senators

Kaetlyn Liddy

  • “If you’re waiting on these guys to solve the problem, we’re gonna die waiting.” — Graham, on the tech platform CEOs taking action.
  • “It’s been 28 years since the internet. We haven’t passed any of these bills … The reason they haven’t passed is because of the power of your companies, so let’s be really, really clear about that. What you say matters. Your words matter.” — Klobuchar, on the lack of legislation.
  • “Your platforms really suck at policing themselves.” — Whitehouse.

Hearing closes with one final lash for Zuckerberg

Sen. Durbin wrapped the hearing with a final criticism for Zuckerberg and his opening statement.

"Mr. Zuckerberg, just a little advice," Durbin said. "I think your opening statement on mental health needs to be explained because I don’t think it makes any sense."

Durbin went on to say that a parent in the room had a child go through "an emotional experience" and, afterward, the parent said their child "changed right in front of my eyes."

"They holed themselves up in their room. They no longer reached out to their friends. They lost all interest in school. These are mental health consequences that" come from technology like Meta platforms, Durbin said.

Sen. Welch expresses optimism over the future of child safety

Sen. Peter Welch, D-Vt., said there's been progress in addressing child safety, but not enough.

"There is a consensus today that didn't exist, say 10 years ago, that there is a profound threat to children to mental health, to safety," he said.

Welch identified solutions for better ensuring child safety online, including setting industry standards, enacting more legislation, reforming Section 230 and potentially adding a government agency to regulate tech companies.

He questioned the CEOs on layoffs in their companies' trust and safety departments.

"That's alarming because it looks like there is a reduction in emphasis on protecting things," he said.

Jonathan Vanian, CNBC

Welch questioned the executives about their companies' massive cost-cutting efforts last year, which resulted in a significant number of layoffs, including employees who worked in their respective trust and safety programs.

CNBC previously reported in May that Meta laid off at least 16 employees who were part of Instagram’s well-being group and over 100 positions related to trust, integrity and responsibility, according to documents filed with the U.S. Department of Labor.

Zuckerberg acknowledged that while Meta conducted layoffs “across the board,” the job cuts were “not really focused on that area.”

“I think our investment is relatively consistent over the last couple of years,” Zuckerberg said. "We invested almost $5 billion last year, and I think this year will be on the same order of magnitude.”

In a heated exchange with Sen. Blackburn, Zuckerberg pushed back on the senator's comment that Meta's social media sites are trying to be "the premiere sex trafficking site in the country."

Zuckerberg replied: "That’s ridiculous."

The crowd in the hearing room broke into applause after the heated exchange between Zuckerberg and Sen. Blackburn over corporate lobbying.

Sen. Blackburn focused on lobbying efforts by the tech companies present, citing a laundry list of interest groups she said have been paid by the tech industry to push back against proposed legislation.

The senator ended her questioning by asking the tech companies to come to the table in good faith to help figure out regulatory solutions to child safety issues.

Sen. Marsha Blackburn, R-Tenn., pushed back on reported plans of a permanent TikTok campus in Tennessee.

In her questioning, she addressed TikTok's CEO, saying, "Nashville will not be rolling out the welcome mat for you."

Blackburn has been a consistent and intense critic of TikTok and other tech companies.

Blackburn previously opposed TikTok building a campus in Nashville. She renewed concerns over TikTok’s ties to China and potential national security risks that TikTok poses.

“We do not want TikTok operating anywhere in the United States, especially not in our state,” her office previously said in a statement.

Jonathan Vanian, CNBC

Sen. Jon Ossoff, D-Ga., leveled a criticism of Meta that critics have made for years: The fundamental business model of driving and optimizing user engagement and growth trumps any safety-related measure the company might consider.

Zuckerberg disagrees, of course, but for years he’s found himself arguing against this core belief about the way Facebook operates. He also disagreed with the characterization that Meta’s apps present “dangerous places” for children.

Sen. Thom Tillis, R-N.C., said he believes that the CEOs testifying didn't get into the tech industry or create their platforms "for the purposes of creating evil."

"At the end of the day, I find it hard to believe that any of you people started this business, some of you in your college dorm rooms, for the purposes of creating the evil that is being perpetrated on your platforms," Tillis said. "But I hope that every single waking hour, you’re doing everything you can to reduce it."

Some concrete answers to a notable question: How many people does each company have working in its trust and safety division?

Zuckerberg said Meta has 40,000 people.

Yaccarino said X has 2,300 people.

Chew said TikTok has 40,000 people.

Spiegel said Snap has 2,000 people.

Citron said Discord has "hundreds of people" but pointed out it has a smaller platform.

Sen. Butler says Zuckerberg answered differently in private

Butler asked the tech CEOs testifying whether they had met with parents about design changes that the platforms could implement to protect children.

All the platform leaders said they had met with parents.

When Zuckerberg answered, though, Butler stopped him and claimed that in a conversation the two had on Tuesday, Zuckerberg answered differently. She said Zuckerberg had told her he'd not spoken with parents about potential design changes.

Zuckerberg responded by saying he must have misspoken.

There have been a few times in this hearing when senators posed tough, interesting questions but gave the CEOs little time to answer.

I suppose that's how these things go, but it's leaving some big things unaddressed and sometimes letting the execs off the hook.

Spiegel offers apology to families who lost children after they bought drugs through Snapchat

While speaking to Sen. Laphonza Butler, D-Calif., Spiegel apologized to families who have had children purchase drugs on Snapchat and later die from taking those drugs.

"I’m so sorry that we have not been able to prevent these tragedies," Spiegel said. "We worked very hard to block all search terms related to drugs from our platform. We proactively look for and detect drug-related content."

He added that the DEA's "One Pill Can Kill" campaign was shared on Snapchat and was viewed more than 260 million times.

Kennedy presses Zuckerberg on whether users understand platform

Lora Kolodny, CNBC and Ben Goggin

Sen. Kennedy mixed serious critique in his questioning with light moments, bringing some laughter to the hearing room.

Jokingly addressing Zuckerberg as "Mr. President," Kennedy posed an interesting question to Zuckerberg: Do users really understand what they're getting into?

“Do you think your users really understand what they’re giving to you, all their personal information, and how you process it, how you monetize it? Do you think people really understand?” the senator asked.

Zuckerberg initially replied, “I think people understand the basic terms.”

Kaetlyn Liddy

Throughout this hearing, Yaccarino has repeatedly characterized X as a distinct entity from Twitter, calling X a "14-month-old company."

During the hearing, the CEO voiced her support for the SHIELD Act but also distanced the current leadership from the company's actions while it was known as Twitter.

It took only three hours into this hearing for a senator to bring up slang.

When addressing Spiegel of Snap, Sen. John Kennedy, R-La., asked, "What does 'yada yada yada' mean?"

The CEO said he is not familiar with the term, to which Kennedy replied: "Very uncool."

Kalhan Rosenblatt and Lora Kolodny, CNBC

Sen. Alex Padilla, D-Calif., asked some of the CEOs how many minors were on their platform.

Spiegel said Snapchat has 20 million teenage users.

Chew didn't have an answer for TikTok.

Yaccarino repeated a statistic that less than 1% of users on X were minors and said X currently has 90 million U.S. users total.

“Being a 14-month-old company, we have reprioritized child protection and safety measures and we have just begun to talk about and discuss how we can enhance those with parental controls," she added.

Hearing resumes

We're back. Nine senators to go.

During the break I talked to a Capitol Police officer in the hearing room. He said this is a really big crowd for a hearing... but the craziest he's seen was the crowd that gathered when Elon Musk came to testify in a closed hearing.

Christina Wilkie, CNBC

I have logged hundreds of hours in those empty hearings. Upside? Somewhere to plug in your computer, and reporters get to sit. Downside? An empty dais, boredom and no news.

Here's a link to the emails that Blumenthal referenced.

While Zuckerberg's apology was the big moment, I thought Blumenthal's questioning of these messages was the most probing. Meta has put in place mechanisms and rules meant to keep younger users safe, but these emails do call into question just how committed the company's leader is to those efforts.

We're on a break

A quick break now in the hearing. We've still got quite a few senators to go. Stay with us!

Cotton grills Chew on his background

During a line of questioning aimed at the background of TikTok's CEO, Sen. Tom Cotton, R-Ark., repeatedly asked if Chew was Chinese or had ever applied for any other citizenship.

"Have you ever been a member of the Chinese Communist Party?” Cotton asked.

Chew replied, “Senator, I’m Singaporean.”

Chew lives in Singapore with his wife and children, who are American.

Reminiscent of Zuckerberg's "Senator, we run ads," response from a 2018 hearing.

Zuckerberg has expressed support for age verification throughout the hearing as a means of managing minors' experiences online. He said he believes the onus should be on app stores to streamline the process.

Parents booed Zuckerberg after he endorsed the content made by teen influencers on Instagram and other Meta platforms.

Sen. Mazie Hirono questioned Zuckerberg on why Meta didn't default all teens to the app's most restrictive settings.

Zuckerberg responded by saying, "A lot of teens make amazing content."

In response, parents in the audience laughed and booed.

Jonathan Vanian, CNBC

Sen. Mazie Hirono, D-Hawaii, asked Zuckerberg whether he would “commit to reporting measurable child safety data on your quarterly earnings reports and calls.”

Zuckerberg said that the company already has a quarterly report on the data, and said: “I’m not sure it would make as much sense to include [the data] in the SEC filing.”

Senators have pushed the tech CEOs to back the Kids Online Safety Act. X CEO Linda Yaccarino and Snap CEO Evan Spiegel have already given their support for the bill.

Zuckerberg turns to speak directly to parents

Hawley compelled Zuckerberg to turn to the audience to apologize and make safety commitments to parents during his questioning.

"There’s families of victims here today. Have you apologized?" Hawley asked. "Would you like to do so now? They’re here, you’re on national television, would you like now to apologize to the victims?"

Zuckerberg stood, turned and committed to ensuring that the company would act on child safety issues.

Mark Zuckerberg, CEO of Meta, speaks to victims and their family members as he testifies during the US Senate Judiciary Committee hearing "Big Tech and the Online Child Sexual Exploitation Crisis" on January 31, 2024.
Andrew Caballero-Reynolds / AFP - Getty Images

That's quite a moment, and one I didn't really expect. Parents held up the pictures of their deceased kids while Zuckerberg spoke.

That will be remembered for some time.

Jonathan Vanian, CNBC

Hawley asked Zuckerberg if he would "commit to compensate" the families who appeared at the hearing. He asked whether Zuckerberg would contribute to a "victims compensation fund."

Zuckerberg, clearly rattled, said, "Senator, my job is to make good tools."

Jonathan Vanian, CNBC

Hawley brought up Meta's internal research, which was revealed by The Wall Street Journal, detailing Instagram's negative effects on the mental well-being of teenagers. Hawley described the findings as "facts," which Zuckerberg countered "aren't facts," adding that Hawley was cherry-picking the research.

In his questioning of TikTok's CEO, Sen. Ted Cruz, R-Texas, accused the platform of exposing American kids to "self-harm videos and anti-Israel propaganda."

TikTok has denied it has an anti-Israel bias. An NBC News report from last November found that there has been a shift in support for Palestinians among young people in recent years. An October 2023 poll by Echelon Insights, a Republican-leaning polling firm, found that registered voters under 40 used Instagram or X to express support for Palestinians even more than TikTok.

Cruz also accused TikTok of pushing more positive and educational content to children in China compared with kids in America. TikTok is not available in China.

Blumenthal shares emails between Nick Clegg and Zuckerberg

While addressing Zuckerberg, Sen. Richard Blumenthal, D-Conn., shared emails between Zuckerberg and Nick Clegg, global affairs director for Meta.

In those emails, Clegg wrote: "We are not on track to succeed for our core well-being topics: problematic use, bullying and harassment connections and SSI."

SSI stands for "suicidal self-injury."

In a subsequent email, Clegg said the company's efforts on safety were being held back by a lack of investment.

"Nick Clegg was asking you, pleading with you, for resources to back up the narrative to fulfill the commitments," Blumenthal said. He added that Zuckerberg declined to hire a number of engineers Clegg had requested to help with the problem.

"We’ve done a calculation that those potentially 84 engineers would have cost Meta about $50 million in a quarter when it earned $9.2 billion," Blumenthal said.

Blumenthal did not allow Zuckerberg to reply.

Jonathan Vanian, CNBC

In a heated exchange, Cruz blasted Zuckerberg over Meta’s “warning” screens that alert users of the possibility that content they are about to view could contain child sexual abuse images.

Showing a screenshot of the warning screen, Cruz asked why Meta would let users make the choice to see the problematic content.

“Senator, we take down anything that we think is sexual abuse material on the service,” Zuckerberg replied.

But that didn’t satisfy Cruz, who followed up by asking whether Meta has any data about the number of people who have seen the warning screen and then viewed CSAM.

“I’m not sure that we store that,” Zuckerberg said, pledging that he would personally follow up on the matter after the hearing.

Of all the senators so far, Cruz was by far the most combative, but I also had trouble following his points.

What is the National Center for Missing and Exploited Children?

Several of the tech CEOs have mentioned working with the National Center for Missing and Exploited Children to strengthen child safety on their platforms.

The nonprofit center seeks to find "missing children, reduce child sexual exploitation and prevent child victimization," according to its website. It has condemned tech companies for failing to protect children on their apps.

In a blog post published Wednesday, it said that “even those who have engaged in voluntary initiatives, including some companies testifying today, have fallen far short of implementing solutions that significantly protect children from harm.”

The center called today's hearing a "crucial moment in child safety" in the blog post.

The organization also posted its praise for the Senate Judiciary Committee's hearing on X.

"Holding social media companies accountable is a vital step in ensuring a safer online environment for our children," the post read.

Lora Kolodny, CNBC

X's CEO endorsed the SHIELD Act and said, "No one should have to endure nonconsensual images being shared online."

Yaccarino's remarks follow a firestorm for X (formerly Twitter) after nonconsensual sexually explicit deepfakes of the pop star Taylor Swift went viral on the platform.

Congress members are sometimes criticized for not being the most tech savvy people, and that can show up in hearings.

So far today, the senators are coming with solid and substantive questions. We haven't really had any groan-worthy moments just yet, and I think that's meant the CEOs have had little chance to grandstand.

Sen. Mike Lee, R-Utah, questioned Zuckerberg about sexually explicit content on the platform.

Meta has strict policies against most sexually explicit content, and it appeared that Lee misunderstood Meta's policies.

When Zuckerberg said, "My understanding is that we don’t allow sexually explicit content,” Lee responded, "How is that going?"

The audience clapped and cheered in response to the question.

Meta frequently reports some of the highest rates of child sexual abuse material on its platforms annually to the National Center for Missing and Exploited Children.

In what might be the most direct statement of today's hearing so far, Sen. Sheldon Whitehouse, D-R.I., told the social media CEOs: "Your platforms really suck at policing themselves.”

Coons asks for support on bill, says CEOs give 'yawning silence'

As he closed his line of questioning, Coons mentioned his bill, the Platform Accountability and Transparency Act. He wrote the bill alongside Sens. Cassidy, Klobuchar, Cornyn, Blumenthal and Romney.

"Now that it’s in front of the Commerce Committee, not this committee, but it would set reasonable standards for disclosure and transparency to make sure that we’re doing our jobs based on data," Coons said.

He then asked if any of the CEOs would be "willing to say now that you support this bill."

After no one responded, Coons said: "Mr. Chairman, let the record reflect a yawning silence from the leaders of the social media platforms."

Jonathan Vanian, CNBC

Here comes the encryption debate.

When Zuckerberg was asked about encryption and children, he brought up WhatsApp, where people over the age of 13 can have accounts.

“And we do allow that to be encrypted,” he said.

Sen. Chris Coons, D-Conn., pushed for more transparency, asking the CEOs about whether their platforms disclose the exact number of posts that violate their child safety policies, particularly posts that depict or encourage suicide and self-harm.

Coons said platforms need to share more about "how the algorithms work, what the content does, and what the consequences are, not at the aggregate, not at the population level, but the actual numbers of cases so we can understand the content." 

Jonathan Vanian, CNBC

It's interesting that Coons is really pushing back on the classic tech company deflection that vile content or bad actors represent only tiny percentages of their overall platforms.

“Not at the aggregate, not at the population level, but at the actual cases,” Coons said, explaining that lawmakers want more concrete numbers on the kinds of harms and abuses that occur on the platforms.

Zuckerberg says there are 'no plans' for a kids' version of Instagram

Jonathan Vanian, CNBC

Zuckerberg brought up Meta’s earlier efforts to “build a kids' version of Instagram,” which would be similar to the children's versions of YouTube and other child-oriented services.

He said that while the company had “discussions internally” about the project, Meta hasn’t “actually moved forward with that, and we currently have no plans to do so.”

Lawmakers have previously criticized Meta for any efforts related to developing a kids' version of Instagram.

TikTok's CEO is getting hit with some familiar questions that are not particularly on topic.

Sen. John Cornyn, R-Texas, asked Chew about the company's Chinese ownership, data security issues and national security checks.

Cornyn asked about a Wall Street Journal article published yesterday that said the company has struggled to wall off U.S. user data from China. Chew said it contains numerous inaccuracies.

Zuckerberg has repeatedly referenced Meta's own proposed legislation, which has not been picked up by any lawmakers.

In a 2023 blog post, Meta proposed that lawmakers create rules that would compel app stores, such as those run by Apple and Google, to be the central authority responsible for age verification.

We've seen tech hearings before, and one thing stands out so far — politicians are focusing on legislation. They're pressing the CEOs on whether they support some of the bills working their way through Congress.

That's notable, since in the past it's been more about pressing the companies about their actions. Now, the senators appear motivated to figure out how to get laws passed that will force the companies to change.

Will anything actually get passed? That's a separate issue...

Jonathan Vanian, CNBC

There was a stark contrast between the way TikTok CEO Chew answered a question about support for the SHIELD Act and the way Discord CEO Citron did.

Citron stumbled a bit, drawing the ire of Klobuchar. He eventually said he was open to discussing the bill and got a lecture from Klobuchar. Contrast that with Chew, who said directly that while he supports the spirit of the bill, he has concerns about its mechanics. Klobuchar then pivoted to the next executive.

Audience members clapped for Sen. Amy Klobuchar, D-Minn., who delivered an impassioned opener urging action on proposed social media regulation.

Klobuchar has been a longtime leader in tech regulation and introduced the SHIELD Act, which would criminalize the transmission of nonconsensual intimate images and sexualized depictions of children.

The claps for Klobuchar were the first positive expressions of support from the audience for anyone speaking at the hearing.

Kaetlyn Liddy

Klobuchar emphasized that now is the time to pass legislation.

“It’s been 28 years since the internet," she said, addressing the tech executives. "We haven’t passed any of these bills … the reason they haven’t passed is because of the power of your companies, so let’s be really, really clear about that. What you say matters. Your words matter.”

What are the bills lawmakers keep bringing up?

  • The Kids Online Safety Act, or KOSA, seeks to create liability, or a “duty of care,” for apps and online platforms that recommend content to minors that can negatively affect their mental health.
  • The STOP CSAM Act. CSAM refers to child sexual abuse material. The bill would allow victims of online sexual exploitation to sue social media platforms that promoted or facilitated the abuse and make it easier for victims to ask tech companies to remove CSAM.
  • The EARN IT Act. EARN IT stands for Eliminating Abusive and Rampant Neglect of Interactive Technologies. The bill would establish a National Commission on Online Child Sexual Exploitation Prevention.
  • The SHIELD Act. SHIELD stands for Stopping Harmful Image Exploitation and Limiting Distribution. The bill "ensures that federal prosecutors have appropriate and effective tools to address the nonconsensual distribution of sexual imagery," the Senate Judiciary Committee wrote on its landing page for protecting children online.
  • The Project Safe Childhood Act "modernizes the investigation and prosecution of online child exploitation crimes," the Senate Judiciary Committee wrote.

We've heard a lot about how this issue is urgent across the aisle, and so far the senators seem aligned.

"We've found common ground here that just is astonishing," Graham said.

A strong moment with Durbin pressing Discord's CEO about sexual content on its platform. 

Discord's Citron said his goal was to get explicit content that violates its policies off the platform. 

“Mr. Citron, if that were working we wouldn’t be here today,” Durbin said to audible ooohs from the crowd.

Diana Paulsen

Graham pushed TikTok CEO Chew on the resignation of a TikTok employee in Israel who had alleged that the company was platforming antisemitism.

Chew said he was aware of the resignation.

“Pro-Hamas content and hate speech [are] not allowed on our platform," the CEO added.

Graham grilled the Discord and X CEOs about their companies' backing of various pieces of child safety legislation. Despite Citron's own nods to some legislation in previous statements, when asked directly if he supported the passage of legislation, he answered "no."

Graham also asked the same of Yaccarino, who made similar nods to legislation that seemed supportive. When asked about the EARN IT Act, though, Yaccarino struggled to give a straightforward answer. Graham concluded that he understood Yaccarino's answer as a "no."

Graham was pretty effective there. Made a strong point that until tech companies are taken to court, little will change.

Jonathan Vanian, CNBC

Discord CEO Citron said that 15% of Discord's workforce is focused on trust and safety, which is more than the company has "working on marketing and promoting the company."

With opening statements done, Durbin starts his questioning. He's focused on Discord and TikTok — two of the newer entrants to the child safety issue.

Apple is the missing elephant in the room

Linda Yaccarino referred to companies that weren't in the hearing room in her opening statement.

For many advocates and parents, the first missing company that comes to mind is Apple.

In ads ahead of the hearing, a group called the Heat Initiative called out Apple's role in child exploitation issues.

In interviews ahead of the hearing, several parents and audience members said they wished they had seen Apple executives in the hearing room.

Yaccarino says X safer than its predecessor Twitter

X CEO Linda Yaccarino said that the company is new and that it is "not the platform of choice for children and teens."

"As a mother, this is personal and I share this sense of urgency," she said

Yaccarino said X removes child sexual content and the accounts that post it. She said that X suspended 12.4 million accounts that violated these policies last year, compared to 2.3 million removed by its predecessor Twitter in 2022.

Yaccarino shared her support for the REPORT Act, SHIELD Act and Stop CSAM Act. Durbin commended Yaccarino and X for being the first social media company to endorse the Stop CSAM Act.

TikTok plans to invest $2 billion in trust and safety in 2024

TikTok plans to “invest more than $2 billion in trust and safety efforts” in 2024, CEO Shou Zi Chew said. A significant part of that investment will be in the platform's U.S. operations.

"TikTok is vigilant about enforcing its 13-and-up age policy and offers an experience for teens that is much more restrictive than you and I would have as adults," Chew said, noting that safety is one of the platform’s core priorities.

When outlining the safeguards available on TikTok, Chew said: “We didn’t do them last week” — a possible jab at other platforms testifying today.

Snap CEO Evan Spiegel says platform works with law enforcement

Even though pictures and videos expire on Snapchat, Spiegel said that doesn't mean the platform is not paying attention to what is being shared.

In 2023, Snap made 690,000 reports to the National Center for Missing and Exploited Children, which led to more than 1,000 arrests, according to Spiegel.

"When we take action on illegal or potentially harmful content, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable," he said.

Snap wants to be part of the solution, Spiegel said, adding that the platform is committed to acknowledging its shortcomings and working with lawmakers.

Zuckerberg emphasizes recent Meta features on child safety

In his opening statement, Zuckerberg told lawmakers that teens have reported positive experiences on Meta apps.

He emphasized new features that have been rolled out on Facebook and Instagram that restrict teens' experiences on the platforms and encourage them to log off at night. Meta has invested $5 billion in child safety over the past year, Zuckerberg said.

Zuckerberg said he supports age verification and parental controls for minors. He also advocated for industry standards for age-appropriate content.

He wrapped up by directly addressing the families of children who lost their lives because of social media.

"These issues are important for every parent and every platform," he said. "I'm committed to continuing to work in these areas and I hope we can make progress today."

Audience members heckle and groan during Zuckerberg's opener

Some audience members in the Senate hearing room groaned and shouted through Zuckerberg's opening statements.

When the Meta CEO said that "the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health," murmurs spread through the room.

As Zuckerberg acknowledged the parents of dead children in the audience, one person shouted "NO THANKS."

'A lot of attention' on hearing

Christina Wilkie, CNBC

The packed hearing drew the biggest audience Durbin has ever seen in the committee room in his 22 years on the panel.

“I’d also like to take a moment to acknowledge that this hearing has gathered a lot of attention," Durbin said. "As we expected, we have a large audience, the largest I’ve seen in this room, today.”

This is Zuckerberg's eighth time testifying before Congress

Diana Paulsen

Next up is Meta CEO Mark Zuckerberg.

Zuckerberg is a frequent guest on Capitol Hill, having testified on issues like censorship, data privacy, and election integrity.

Discord CEO Jason Citron said in his opening statement that encryption on his platform would disrupt its child safety efforts.

The statement touches on a hot-button issue in the tech community: balancing privacy, via technologies like end-to-end encryption, against the ability to assist law enforcement and do its own proactive scanning.

In 2023, Meta rolled out end-to-end encryption on Messenger, causing controversy among child safety advocates.

Snap CEO Evan Spiegel plans to announce that Snap will not further roll out encryption on its platform in ways that would disrupt scanning for child sexual abuse material.

Discord CEO says platform is about having fun with friends

In his opening remarks, Discord CEO Jason Citron shared how video games enriched his life as a kid, and how his platform aims to do that for other gamers.

"I’ve been playing video games since I was 5 years old. And as a kid, it’s how I had fun and found friendship," he said. "We built Discord so that anyone could build friendships playing video games from Minecraft to Wordle and everything in between."

As a father of two, he said he wants Discord to be a platform his own kids "use and love, and I want them to be safe."

He emphasized that the platform has a "zero tolerance policy on child sexual abuse material."

CEOs are sworn in. Here come the opening statements.

From left, Jason Citron, CEO of Discord, Evan Spiegel, CEO of Snap, Shou Zi Chew, CEO of TikTok, Linda Yaccarino, CEO of X, and Mark Zuckerberg, CEO of Meta, are sworn-in as they testify before the Senate Judiciary Committee on Jan. 31, 2024.
Anna Moneymaker / Getty Images

AI gets its first mention from Graham

Graham says "AI is just starting." It's the first of what will likely be many mentions of AI today.

AI has been a huge topic of concern lately, from AI-generated news spreading misinformation among young people on YouTube to sexually explicit deepfakes of celebrities like Taylor Swift gaining traction on X.

Sexually explicit AI has gotten so serious that a bipartisan effort is underway in the Senate to give victims the ability to sue makers of such AI-generated images.

Yesterday, a group of senators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits Act (also known as the DEFIANCE Act), which aims to give those victims a form of recourse.

What is Section 230?

The lawmakers have already brought up Section 230 a lot this morning. So what is it?

Section 230 of the 1996 Communications Decency Act shields tech companies from liability for the content posted on their platforms by third parties. It has come under scrutiny from lawmakers in recent years.

Meta CEO Mark Zuckerberg called for changes to Section 230 in March 2021.

“Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it,” Zuckerberg said in his opening remarks, according to written testimony released on the House Committee website at the time.

In his opening remarks, Sen. Graham emphasized that "it is now time to repeal Section 230."

Graham: 'Mr. Zuckerberg ... you have blood on your hands'

In his opening statement, Graham called out several CEOs by name.

"Mr. Zuckerberg, you and the companies before us, I know you don’t mean it to be so but you have blood on your hands," Graham said to applause in the room. "You have a product that’s killing people."

Graham acknowledged that he uses Meta products, adding that social media companies need to deal with the issues they've unleashed.

Diana Paulsen

In his opening statement, Ranking Member Sen. Lindsey Graham, R-S.C., said that "while Washington is certainly broken, there is a ray of hope" in bipartisan support for increased child safety regulation for social media sites.

Graham later joked that even he and Sen. Elizabeth Warren, D-Mass., agree.

"Now, Elizabeth Warren and Lindsey Graham have almost nothing in common. I promised her I would say that publicly," he said.

But the two "see an abuse here that needs to be dealt with."

Durbin jabbed at tech platforms' last-minute changes around child safety ahead of Wednesday's hearing, jokingly calling the changes coincidental.

The line in Durbin's opening remarks yielded chuckles from audience members.

Ahead of the hearing, numerous platforms expressed new interest in pieces of legislation and regulation after previously being slow to embrace such proposals.

Durbin said in 2013, the National Center for Missing & Exploited Children received 1,380 cyber tips per day about child sexual abuse material.

A decade later, those tips have skyrocketed to 100,000 reports per day, Durbin said.

Lawmakers began the hearing promptly at 10 a.m. by showing the room a video about young people sharing how they have been impacted by social media exploitation.

As Meta's CEO Mark Zuckerberg entered the hearing room, parents holding photos of their dead children audibly hissed.

Zuckerberg and Meta have faced intense criticism over the years around child safety issues. In the audience are some parents who say Instagram contributed to their children's suicides or exploitation.

Mark Zuckerberg, CEO of Meta, arrives to testify at the Senate Judiciary Committee hearing "Big Tech and the Online Child Sexual Exploitation Crisis" on Jan. 31, 2024.
Andrew Caballero-Reynolds / AFP - Getty Images

Hearing room awaiting tech CEOs is full of parents and advocates

The hearing room where senators will grill the CEOs is full of child safety advocates and parents who say their children were killed or harmed in part by social media platforms.

Many parents brought photos of their children to hold as the senators question the CEOs, and many are wearing blue ribbons saying "STOP Online Harms! Pass KOSA!" KOSA is the Kids Online Safety Act, which would create a duty of care for social media companies.

Relatives hold pictures of children before the start of the "Big Tech and the Online Child Sexual Exploitation Crisis" hearing on Jan. 31, 2024.
Andrew Caballero-Reynolds / AFP - Getty Images

Companies have been reluctant to endorse such legislation. This month, Snap was the first platform to suggest that it was open to the passage of KOSA. In her opening remarks, X CEO Linda Yaccarino will offer support for the bill, among others.

According to a source close to the Senate Judiciary Committee, a large number of seats allocated to Senate officers were given to parents.

Other seats not reserved for members of the public were reserved for child safety advocates, who have worked for years to address child safety issues at social media companies.

Where’s YouTube? It’s a huge part of where kids spend time online

The video platform is noticeably absent from today’s hearing, although it is a popular destination for kids online. 

YouTube, which is owned by Alphabet (which also owns Google), previously came under fire after users found disturbing videos featuring children and comments from child predators under minors’ posts. Advertisers pulled out of the platform after they found their ads alongside inappropriate content. As a result, the company disabled comments on videos with kids and launched a “classifier” to monitor comments for predatory behavior in 2019. 

YouTube hasn’t had any high-profile issues with child safety since it cracked down on these commenters. The company's dedicated family-friendly site, YouTube Kids, has been deemed "mostly safe" by the children's media nonprofit Common Sense.

YouTube's policy prohibits content that could potentially endanger children, including videos that sexualize minors, encourage cyberbullying or promote dangerous activities. It also age-restricts videos that feature sexual themes, profanity or harmful acts that kids might imitate.

I think the most talked about “who is not here” from my POV has been Apple.

Jonathan Vanian, CNBC

It will be interesting if Meta and others try to shift more of the focus to “platform players” like Apple iOS and Google Android regarding age-verification.

Another thought on who else could have been here: Amazon’s Twitch. The platform has been called out for child grooming and exploitation problems in the past. And it’s a huge platform that’s very popular with kids.

The CEOs of Discord, Snap, X and TikTok have made their way into the hearing room.

Reporters lob questions, but none are answered.

Installation mocks "Big Tech"

NBC News

An installation criticizing "Big Tech" depicts Mark Zuckerberg, CEO of Meta, and Shou Zi Chew, CEO of TikTok, outside the Capitol on Jan. 31, 2024.
Julia Nikhinson / AFP - Getty Images

Even Elmo is feeling the social media strain

Thousands of people have been unloading their life problems on Elmo this week after the red Muppet posed a casual question on X: “How is everybody doing?”

Not well, it seems.

In fact, the question, which was posted to X on Monday, opened the floodgates to a deluge of internet users eager to vent to the children’s show character that had somehow signed himself up to be the internet’s newest therapist.

“Elmo I’m suffering from existential dread over here,” a user replied.

Read the full story here.

Kids’ online safety takes center stage at Senate tech hearing

NBC News

There are many proposed bills in the Senate aimed at protecting children on social media sites. Durbin says he sees bipartisan support for a lot of the ideas.

“To think this diverse Senate Judiciary Committee would have a unanimous vote — every Democrat and every Republican supporting these five or six bills — tells you that we can come together on something that is so compelling,” he said.

He also says this is personal for him. He knows there will be many families in the room today who’ve lost children after being harmed on social media.

“Every time I see these families that have gone through this, I put myself in their shoes and say 'OK, as a father, as a grandfather what would you think if your grandson or granddaughter just gave up their life because of the irresponsibility and danger of these media platforms?' I mean it’s personal.”

What to expect at today's hearing

Senators are expected to grill executives of TikTok, Meta (which owns Instagram and Facebook), Discord, X (formerly Twitter) and Snap about what efforts they have made to help stop the exploitation of kids online.

Efforts to regulate social media continue to ramp up across the U.S. amid concerns from some parents that the platforms don’t do enough to keep their kids safe online. 

Many of the platforms have said they don’t tolerate child sexual exploitation on their platforms, and they point to various tools they already offer as examples of their proactive methods.  

“The bottom line is that we will never have what we want in this lifetime: our daughter back. So we’re here advocating for change,” said Tony Roberts, whose daughter died by suicide after, her parents say, she viewed a simulated hanging video on social media.

Read the full story here.

Emily Wilkins, CNBC

Internal Meta emails that Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., acquired will be used in questioning to drive home their argument that Meta has not done all it can to help keep kids safe on its platforms.

In November, a Meta whistleblower alleged the company had failed to protect teens.

Durbin says hearing is 'long overdue'

Senate Judiciary Chairman Dick Durbin, D-Ill., tells us this hearing, featuring five CEOs of the most popular social media apps, is “long overdue.”

The subject of the hearing today is child sexual exploitation, though he expects other issues will certainly come up.

“Terrible things are happening,” Durbin told us. “The numbers that come back to this tell us the exploitation of children are growing by leaps and bounds. What are we doing about it? We’re clinging to old law that basically exempts this industry from liability.”

The committee had to issue subpoenas to get three of the CEOs to attend today (Snapchat, Discord and X). Durbin says he was amazed they had to send U.S. Marshals to Discord and X because they refused to cooperate.

“They must think they’re so far above the law it doesn’t matter,” he said. 

His biggest question today? One suggested by his daughter, who has 12-year-old twins.

“She said ‘Dad, ask these executives how they protect their own kids?’”

It's not even 9 a.m. yet and the room is already starting to fill with journalists. Nameplates for the five CEOs were just placed in front of the seats facing the senators.

Senate Judiciary Hearing into social media child sexual exploitation in Washington on Jan. 31, 2024.
Kate Snow / NBC News

There is a line outside the room for those hoping to get a seat inside. We don’t know exactly how the CEOs will enter this large hearing room in the Dirksen Senate Office Building but hope they may pass our cameras. We’re aware that there are other back routes for them to enter as well. 

Parents of kids harmed through social media will be in attendance

Sitting in the front row of the hearing room today we expect to see 20 parents wearing black and holding photos of their children. More parents will be behind them.

All of them lost kids after something happened on a social media site — whether harassment, sexual exploitation, drug sales leading to fentanyl overdoses or other issues. 

Sam Chapman lost his son Sammy in 2021. I first spoke with him and his wife, Laura Berman, just days afterward, and their grief was palpable and heartbreaking. 

A dealer connected with Sammy on Snapchat and gave him a pill containing a deadly dose of fentanyl. He died in his bedroom.

Sam Chapman reached out to many of the parents who will attend today to make sure they’d be present in the room. They all want to send a message to these CEOs. 

“We’ve been asked to give questions to the senators. So what we’re hoping is that there’s some very pointed questions about why they’re letting so many children die on their platforms, why they’re letting so many children be abused on their platforms, without changing,” Chapman told me last night.

I asked him what he personally wants to hear today.

“I want to know how they can sleep at night,” Chapman said, “knowing that they’re accessory to murder, over and over again.”

Snap issues support for KOSA

Ahead of the hearing, Snap issued its support for the Kids Online Safety Act, or KOSA.

“Many of the provisions in KOSA are consistent with our existing safeguards: we set teens’ accounts to the strictest privacy settings by default, provide additional privacy and safety protections for teens, offer in-app parental tools and reporting tools, and limit the collection and storage of personal information,” a spokesperson for the platform said in an email statement.

Snap CEO Evan Spiegel is among the five CEOs who will testify.

TikTok's CEO Shou Zi Chew returns to Congress

Last March, when a potential TikTok ban was being floated by lawmakers and the Biden administration, the platform's CEO, Shou Zi Chew, testified before lawmakers in a hearing that lasted roughly five hours.

Chew spoke before a House Energy and Commerce Committee hearing titled “TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms.” 

Members grilled the CEO about the Chinese-owned platform, citing concerns about privacy for Americans’ data, protections for children online and TikTok’s connection to the Chinese Communist Party.

In his opening statement, Chew emphasized TikTok is safe and secure and that it shouldn’t be banned.

Chew will appear before Congress again today, this time to specifically address child exploitation and safety concerns alongside other tech CEOs.

Read NBC News' investigation on Discord

Discord, which launched in 2015, quickly emerged as a hub for online gamers, growing through the pandemic. It has since become a destination for communities devoted to topics as varied as crypto trading, YouTube gossip and K-pop.

In a 2023 review of international, national and local criminal complaints, news articles and law enforcement communications published since Discord was founded, NBC News identified 35 cases over the past six years in which adults were prosecuted on charges of kidnapping, grooming or sexual assault that allegedly involved communications on the platform.

Experts have suggested that Discord’s young user base, decentralized structure and multimedia communication tools, along with its recent growth in popularity, have made it a particularly attractive location for people looking to exploit children.  

Ahead of the hearing, a spokesperson for Discord said that it has a zero-tolerance policy for child sexual abuse and that it uses a mix of proactive and reactive tools to moderate the platform. 

“Over 15% of our workforce is dedicated to trust and safety full time. We prioritize issues that present the highest real-world harm to our users and the platform, including child sexual abuse material,” the spokesperson said.

Read more of NBC News' reporting on Discord here.

X says it is 'not the platform of choice for children and minors'

Ahead of Wednesday's hearing, X published a blog post saying the platform has “zero tolerance for Child Sexual Exploitation (CSE), and we are determined to make X inhospitable for actors who seek to exploit minors.”

X also said that the platform, formerly known as Twitter, is "not the platform of choice for children and minors."

"Users between 13-17 account for less than 1% of our U.S daily users," the blog post states.

X CEO Linda Yaccarino, a former NBCUniversal executive who was announced as the platform's new CEO in May, will testify before lawmakers for the first time.

Meta CEO Mark Zuckerberg has been grilled by lawmakers before

Mark Zuckerberg has been in the hot seat before.

Meta, which owns Instagram and Facebook, has faced criticism surrounding how it handles problematic content targeting younger users.

In 2021, a Wall Street Journal investigation, citing internal documents, revealed that Meta knew Instagram created significant mental health issues for its teenage users.

In October, a bipartisan group of 42 attorneys general sued Meta, alleging features on Facebook and Instagram are addictive and are aimed at kids and teens.

In a blog post published Thursday, Meta said it wants teens to have “age-appropriate experiences on our apps.” 

The company said it has developed more than 30 tools to help teens and their parents cultivate safe experiences on its platforms and that it spent “over a decade developing policies and technology to address content and behavior that breaks our rules.”

Here are the tech CEOs that are testifying

  • Jason Citron, CEO of Discord
  • Mark Zuckerberg, CEO of Meta
  • Evan Spiegel, CEO of Snap
  • Shou Zi Chew, CEO of TikTok
  • Linda Yaccarino, CEO of X