AT&T, Nestlé and Epic Games have withdrawn ads from YouTube after a viral video claimed the platform hosted videos of young children whose comment sections contained messages from pedophiles.
"Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube," an AT&T spokesperson said in an email.
YouTube, which is owned by Google, said in a statement that any content that endangers minors is "abhorrent" and prohibited on its platform. It said comments that could be considered dangerous to minors are also banned.
"We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling violative comments. There's more to be done, and we continue to work to improve and catch abuse more quickly,” the YouTube statement said.
Nestlé said the food and drink company was pausing its advertisements while YouTube investigated the inappropriate content.
"An extremely low volume of some of our advertisements were shown on videos on YouTube where inappropriate comments were being made," Nestlé said in a statement. "While investigations are ongoing directly with YouTube and our partners, we have decided to pause advertising on YouTube globally, already effective in North America and several other markets."
"We will revise our decision upon completion of current measures being taken by Google to ensure Nestlé advertising standards are met,” it continued.
Epic Games, which makes the wildly popular video game Fortnite, said it had paused its pre-roll advertising, or the promotional video messages that play before content, with YouTube.
"Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service,” an Epic Games spokesperson said.
Disney also pulled its ads from YouTube, according to Bloomberg.
YouTube issued a memo to its advertising partners explaining what it is doing about the situation. The company said it is working on a new version of a technology that helps classify predatory comments and is also looking to hold creators to a higher standard for moderating their content. The memo was first reported by Adweek.
The viral video, which prompted other companies to review their advertising on YouTube, was posted by user Matt Watson, who goes by "MattWhatItIs," on Sunday, Feb. 17. It showed how YouTube was suggesting content that featured underage girls after viewers watched videos of adult women.
Watson also showed that comments on these videos were inappropriate or included timestamps pointing to sexually suggestive moments in the footage. The suggested videos included children licking Popsicles or doing gymnastics.
Tech-focused media outlet TechCrunch said it was able to reproduce the suggestions featuring young girls on YouTube, but it couldn't reproduce all of Watson's allegations.
Last month, YouTube said it was working to remove content from its suggestions that it considers close to violating its policies, including conspiracy theories and medically inaccurate material.
YouTube has been working since 2017 to regain advertisers' trust after it was discovered that ads were appearing on content that included hate speech. Brands including HSBC, U.K. retailer Marks and Spencer and L'Oreal pulled advertising from YouTube amid the fallout.
"There have been stories over the past few days about brands appearing against content that they wouldn't like to appear against and particularly on YouTube, and so for me it is a good opportunity for me to say, first and foremost, to say sorry this should not happen and we need to do better," Matt Brittin, Google's president of business and operations in Europe, the Mideast and Africa, said in March 2017.