
CEOs of TikTok, Meta, Snap, X and Discord to testify at Senate online child safety hearing. Here's what to know.

Efforts to regulate social media continue to ramp up across the U.S.

The heads of five major social media platforms will be in Washington on Wednesday to testify before the Senate Judiciary Committee about protecting children online.

Senators are expected to grill executives of TikTok, Meta (which owns Instagram and Facebook), Discord, X (formerly Twitter) and Snap about what efforts they have made to help stop the exploitation of kids online.

Mark Zuckerberg of Meta, Linda Yaccarino of X (formerly known as Twitter) and Shou Zi Chew of TikTok. (Getty Images; AP file)

X CEO Linda Yaccarino, Snap CEO Evan Spiegel and Discord CEO Jason Citron will testify before Congress for the first time. TikTok CEO Shou Zi Chew and Meta CEO Mark Zuckerberg, who have previously addressed lawmakers, will also speak.


Efforts to regulate social media continue to ramp up across the U.S. amid concerns from some parents that the platforms don’t do enough to keep their kids safe online. 

Many of the companies have said they don’t tolerate child sexual exploitation on their platforms, pointing to tools they already offer as examples of their proactive efforts.

Several parents who say their children’s deaths were connected in various ways to social media told NBC News they will be watching from the hearing room. Some are suing social media companies whose CEOs will appear Wednesday.

“The bottom line is that we will never have what we want in this lifetime: our daughter back. So we’re here advocating for change,” said Tony Roberts, whose daughter died by suicide after, her parents say, she viewed a simulated hanging video on social media.

In recent years, a handful of bills have been put forward in an attempt to ensure that both minors who appear in content on social media platforms and those who simply use them are as safe as possible. 

In December, more than 200 organizations sent a letter urging Senate Majority Leader Chuck Schumer, D-N.Y., to schedule a vote on the Kids Online Safety Act, or KOSA, which seeks to create liability, or a “duty of care,” for apps and online platforms that recommend content to minors that can negatively affect their mental health.

“The onus must be on the companies to make their products safe and not designed to fill kids’ feeds with posts that promote eating disorders, suicide, drug use, or sexual abuse,” the letter said.

Sens. Marsha Blackburn, R-Tenn., and Richard Blumenthal, D-Conn., KOSA’s co-authors, said the responsibility falls on lawmakers to hold tech companies accountable.

“Without real and enforceable reforms, social media companies will only continue publicly pretending to care about young people’s safety while privately prioritizing profits,” they said in a joint statement ahead of Wednesday’s hearing. “As urged by young people, parents, and experts who are fed up with this tired playbook, we are redoubling our efforts to pass the Kids Online Safety Act to make real change for kids online.” 

Other bipartisan bills have also been introduced, such as the Stop CSAM Act, which seeks to make it easier for victims of child sexual abuse material, or CSAM, to ask tech companies to remove such material. It would also allow victims to sue social media platforms that “otherwise knowingly promote or facilitate” the abuse.

Lawmakers in favor of enacting legislation argue that social media companies have failed to police themselves at the expense of kids.

But some experts have warned that legislating social media can be precarious. 

While KOSA seems sound on paper, in practice it could lead platforms to over-censor young users, said Aliya Bhatia, a policy analyst for the Free Expression Project at the Center for Democracy and Technology.

She said she and other experts are concerned about the “broad language in the bill and the sort of unclear liability standard.”

In its current form, the bill would also be enforced at the state level, which means the law would be interpreted differently depending on what state a child lives in.

Bhatia said she hopes lawmakers will focus Wednesday on empowering the social media platforms to invest in more tools that will help young people navigate the internet safely.

“I think it will be a constructive hearing if we focus on the what,” she said. “What companies can do more of, rather than sort of dangle the specter of legal action to pressure them to censor speech.”

The tech executives testifying Wednesday are expected to reaffirm their commitment to child safety. 

Ahead of the hearing, Snap voiced its support for KOSA.

“Many of the provisions in KOSA are consistent with our existing safeguards: we set teens’ accounts to the strictest privacy settings by default, provide additional privacy and safety protections for teens, offer in-app parental tools and reporting tools, and limit the collection and storage of personal information,” a spokesperson for the platform told NBC News in an emailed statement.

In a blog post published Friday, X said it has “zero tolerance for Child Sexual Exploitation (CSE), and we are determined to make X inhospitable for actors who seek to exploit minors.” It said users ages 13 to 17 account for less than 1% of its U.S. daily users.

Discord also said that it has a zero-tolerance policy for child sexual abuse and that it uses a mix of proactive and reactive tools to moderate the platform. 

“Over 15% of our workforce is dedicated to trust and safety full time. We prioritize issues that present the highest real-world harm to our users and the platform, including child sexual abuse material,” a spokesperson for Discord said in a statement. 

The spokesperson added that Discord has a team “specifically dedicated to preventing minor safety issues on our platform and take action when we become aware of this content, including removing it, banning users, shutting down servers, and engaging with the proper authorities.”

TikTok has also been adamant that it doesn’t harm kids, a point Chew was questioned about last March when he testified at a House Energy and Commerce Committee hearing titled “TikTok: How Congress Can Safeguard American Data Privacy and Protect Children from Online Harms.”

To better ensure safety for minors, Chew said at the hearing, TikTok restricted livestreams to accounts registered to users who are 18 or older. 

“We spend a lot of time adopting measures to protect teenagers,” Chew said in his opening remarks. “Many of those measures are firsts for the social media industry.”

In a blog post published Thursday, Meta said it wants teens to have “age-appropriate experiences on our apps.” 

Meta has faced intense scrutiny since a 2021 Wall Street Journal investigation, citing internal documents, revealed that the company knew Instagram created significant mental health issues for its teenage users.

Meta said in its blog post that it has developed more than 30 tools to help teens and their parents cultivate safe experiences on its platforms and that it spent “over a decade developing policies and technology to address content and behavior that breaks our rules.”