
The Momo challenge is a hoax. But the online culture and financial rewards that made it seem plausible are scary.

Nasty YouTube videos aren't solely the work of child-hating malcontents. They're a result of capitalist contempt for the vulnerable.
"Mother Bird" by artist Keisuke Aisawa on display at the Vanilla Gallery in Tokyo. (Courtesy of Vanilla Gallery)

Everyone worries about their kids.

My kid is two; from the moment he was born (and probably during the few months before), each time I heard about something bad happening to a child, I went into a panicky, sickened research mode, frantically searching for confirmation that the absolute worst had happened and could happen again. Kidnappings! Monstrous priests! Car crashes! Obscure diseases! Keisuke Aisawa’s surrealist sculpture Mother Bird!

Wait, what?

Aisawa, who works for a company that makes puppets and props for horror movies, made an especially scary sculpture of a bug-eyed bird-lady monster that was displayed at a commercial art gallery in 2016. It made its way onto Instagram, then to the bowels of Reddit (which at that point was mostly bowel), then to Kim Kardashian’s 129-million-follower Instagram, and today you probably know the sculpture as Momo, the horrible creature who tells teenagers to kill themselves on YouTube (originally WhatsApp).

Please note: Momo does not tell teenagers to kill themselves on YouTube (or on WhatsApp). Lots of bad things happen to kids, and some of them happen on the internet, but Momo is not one of them. The reports of a “Momo challenge,” in which Momo — who, I want to stress, cannot talk and is not a computer-generated image but a static picture of a sculpture — "tells" young daredevils to do progressively more dangerous things, ending in an appeal to suicide, appear to be exclusively the product of adult minds that can’t tell fact from fiction.


Like most frustrating moral panics, the Momo fiction contains various grains of truth, which add up to problems that are genuinely disturbing, dangerous to children and indicative of a kind of social rot. But they don’t fit into a headline in 48-point font.

First, YouTube’s product for children — in front of which tired parents plunk their kids in exactly the same way the last generation did with Nickelodeon — is a cesspool. Underpoliced, saturated with ads, overflowing with horribly disturbing auto-generated content and constantly monitoring its tiny users for monetizable data, YouTube Kids has been the subject of multiple complaints from consumer groups and (what seems like) a new journalistic exposé every month. Most recently, there was a Wired story about pedophiles lurking in the comments of children's YouTube videos that was so serious — and, notably, true — that advertisers pulled up stakes and ran for the hills.

Second, adults and older kids who enjoy being cruel are not fictional. Cruelty is epidemic on YouTube, but it isn’t unique to the platform: Nest web-enabled video cameras and other devices are so notoriously insecure that stories of hackers breaching baby monitors and saying mean things to infants have become a crime-blotter staple.

So, while the “Momo challenge” videos are fictional, it's unsurprising that some YouTube users have uploaded clips of the cartoon character Peppa Pig intercut with clips of another YouTuber, who calls himself Filthy Frank, giving instructions on how to slit one's wrists. There's just no evidence that videos like that — even the real ones — have been widely seen, let alone been particularly effective.

Still, I know these things because I’m a parent. When my son is slightly older, I’m sure I’ll find other things to worry about — I’m only up to baby monitors and hacked-together Peppa Pig videos — but there’s clearly much worse in store for me. (And, for the record: Our baby monitor uses Bluetooth, not Wi-Fi, and the only shows that my son watches are reruns of Pee-wee’s Playhouse on Blu-ray. I try not to even play the ubiquitous Billboard Hot 100 hit “Baby Shark” for him, because he cries when I get too sick of it to play it over again, but sometimes I have to.)

But I don’t think it’s correct to consider nasty YouTube videos and baby-taunting hackers solely the work of child-hating malcontents. They’re a result of capitalist contempt for vulnerable people — in this case small children — at its most obvious and indefensible.

The people who work for Viacom aren’t somehow more moral than the people who work for Google, but the culture of Nickelodeon’s kids’ programming is, at its worst, kind of fart-centric, whereas the culture of YouTube’s programming for kids and teens includes Logan Paul making fun of a suicide victim’s corpse and PewDiePie screaming ethnic slurs.

This is partly because, when it first went on the air, Nickelodeon re-aired kids’ programming that had originally been broadcast on public airwaves and was subject to a whole host of regulations that carried heavy fines. When the network began to produce its own shows, those shows needed to fit in alongside its earlier acquisitions, and the network needed to please advertisers who were comfortable with the highly policed nature of children’s broadcast programming.

YouTube has had none of these constraints — only a mandate to encourage users to upload, at a tiny fraction of the cost, the kind of material that children and young teens would prefer to the material on Nick, Cartoon Network and the Disney Channel. That material, of course, is vulgar, narcissistic and hateful, partly because people who spend the necessary years mastering the creative pursuits through which one gets a job making children's programming for major networks generally don’t (and can’t, and possibly think it beneath their station to) tirelessly self-promote in the ways required to succeed on YouTube. Meanwhile, the company's content policies, beyond a few very broad rules, are almost entirely ad hoc.

Because there are straightforward financial incentives to put shocking material aimed at children on YouTube and no human review process that vets that material before it is available to kids, YouTube employs scores of people to review content that its users and algorithms flag as objectionable — but only after the fact. It is miserable, traumatizing, even radicalizing work, and tech companies treat the people who do it the way they treat all their low-level employees: Work them hard, burn them out, hire the next person.

Again, in industries with some form of regulation, whether through regularly enforced law or through collective bargaining by the workers themselves, the burn-out-and-replace school of human resource management doesn’t dominate in the same way.

But we don’t live in a society willing to admit that parents aren’t omnipotent and omnipresent, and we haven't come to a consensus that protecting our children involves electing responsible public officials who will make sure that businesses won’t profit when their negligence brings harm to those children. We’d rather believe in a bird monster that whispers heartless suggestions in our kids' ears than in a money monster that whispers heartless suggestions in ours.