WASHINGTON — A bipartisan group of senators on Wednesday introduced legislation that aims to protect children from the harmful effects of using social media.
The Protecting Kids on Social Media Act would set a minimum age of 13 to use social media apps, such as Instagram, Facebook and TikTok, and would require parental consent for 13- to 17-year-olds.
U.S. Surgeon General Vivek Murthy said earlier this year that 13 is too young to join social media.
The bill would ban social media companies from using algorithms to recommend content to users under the age of 18. It would also require the companies to employ age verification measures and would instruct them to create a pilot project for a government-provided age verification system that platforms could use.
Under the measure, the Federal Trade Commission and state attorneys general would be given authority to enforce the bill's provisions.
"Big tech has exposed our kids to dangerous content and disturbed people," one of the bill's sponsors, Sen. Tom Cotton, R-Ark., said at a news conference. "Moms and dads have felt helpless while their kids suffer, sometimes leading to devastating tragedies."
Sen. Brian Schatz, D-Hawaii, another lead sponsor, said the bill is a "commonsense and bipartisan approach to help stop this suffering" that has resulted from teens using social media.
"By instituting these simple, straightforward guidelines, we’ll be able to give the next generation of children what every parent wants for their child, which is a chance to grow up happy and healthy," he said.
The other two main sponsors are Sens. Chris Murphy, D-Conn., and Katie Britt, R-Ala., who said her family "constantly" has conversations about social media.
"We have to take a step back and as parents say, how can we protect our children, teach them how to use this tool, to use it for good, and to be intentional in doing that?" she said.
Some of the most popular social media apps such as Facebook and Instagram require that users who create accounts be at least 13 years old. While TikTok requires that users who post content be at least 13, it also offers "a curated, view-only experience for those under age 13 that includes additional safeguards and privacy protections." The company says it partners with Common Sense Networks, a media company whose mission is to create media that's safe for kids and families, to ensure that content for that age group is appropriate and safe.
Lawmakers recently grilled the CEO of TikTok at a congressional hearing about the company's age requirements and mechanisms put in place to protect children from dangerous content.
In a press release Wednesday, the group of senators pointed to mental health data among young people, contending that there's a clear link to social media. The Centers for Disease Control and Prevention's Youth Risk Behavior Survey found that 57% of high school girls and 29% of high school boys felt persistently sad or hopeless in 2021, with 22% of all high school students reporting they had seriously considered attempting suicide in the preceding year, the lawmakers noted.
Other studies in recent years have suggested that social media has been linked to a rise in mental health disorders in teens and depression in adults.
In February, Sen. Josh Hawley, R-Mo., introduced a bill that would set the minimum age to use social media at 16. And recently, Utah Gov. Spencer Cox, a Republican, signed two pieces of sweeping social media regulation into law that require social media companies to get parental consent for minors using their services.