Senate Judiciary Committee Chair Dick Durbin urged Attorney General Merrick Garland in a letter Tuesday to review Twitter's handling of child exploitation material, calling the Justice Department's failure to address the issue "unacceptable."
"Sadly, Twitter has provided little confidence that it is adequately policing its platform to prevent the online sexual exploitation of children," Durbin, D-Ill., wrote. "This puts children at serious risk."
The letter cites reporting from NBC News that found dozens of Twitter accounts and hundreds of tweets using numerous hashtags to promote the sale of child sexual abuse material (CSAM). Some of the tweets were brazen in how they marketed the material, using common terms and abbreviations for CSAM. After the article was published, Twitter said that it was blocking access to several hashtags associated with the posts.
Durbin urged Garland to review public reports about child exploitation on Twitter and consider whether an investigation was warranted.
"I further urge you to consider whether an online platform can be held liable under federal criminal law for failing to take reasonable steps to prevent the foreseeable proliferation of CSAM on its platform, and, if not, to inform the Senate Judiciary Committee whether the Department needs any additional legislative authority to address such criminally negligent behavior," Durbin wrote.
In December, Durbin sent a letter to Twitter CEO Elon Musk raising concerns about child safety on the platform following layoffs and the dissolution of its Trust and Safety Council.
Neither Musk nor Twitter responded publicly to that letter.
Musk has said he is prioritizing the elimination of child exploitation material from the platform and has criticized Twitter's previous leadership for not doing enough on the issue.
But some child advocates, in addition to Durbin, have raised concerns that Musk's actions at the company could make addressing CSAM on the platform more difficult.
NBC News has reported that as of early January, around 20 people worked in Twitter's Trust and Safety organization, less than half of the group's previous workforce.
“If you lay off most of the trust and safety staff, the humans that understand this stuff, and you trust entirely to algorithms and automated detection and reporting means, you’re only going to be scratching the surface of the CSAM phenomenon on Twitter,” Victoria Baines, an expert on child exploitation crimes who has worked with the U.K.’s National Crime Agency, Europol, the European Cybercrime Centre and Facebook, previously told NBC News.
Twitter has pushed back on reporting that it isn't doing enough to promote child safety.
Ella Irwin, Twitter’s vice president of product overseeing trust and safety, told NBC News this month that the company has "roughly 25% more staffing on this issue/problem space now than the company had at its peak last January" and that it is "improving rapidly and detecting far more than Twitter has detected in a long time, but we are deploying a number of things to continue to improve.”