Advertisers Abandon YouTube Over Concerns That Pedophiles Lurk In Comments Section

caption: Earlier this week, Disney, Nestle and Epic Games — which makes Fortnite — pulled their ads from YouTube, which is owned by Google. (AP)

Big brands are pulling their ads off YouTube over concerns that potential sexual predators are gathering in the comment sections of videos featuring children. In response, YouTube has deleted more than 400 channels and suspended comments on tens of millions of videos as it tries to purge the system of pedophiles.

Editor's note: This story contains content that may be upsetting to some readers.

The controversy emerged after a former YouTube content creator described what he called a "soft-core pedophile ring" on the site. Pedophiles are communicating with each other in the comments and trading links to illegal pornography, Matt Watson said in a video posted this week that has been viewed millions of times.

"They're providing links to actual child porn in YouTube comments," he said. "They're trading unlisted videos in secret. And YouTube's algorithm, through some kind of glitch or error in its programming, is actually facilitating their ability to do this."

Earlier this week, Disney, Nestle and Epic Games — which makes Fortnite — pulled their ads from YouTube, which is owned by Google. AT&T and Hasbro followed suit.

"Until Google can protect our brand from offensive content of any kind, we are removing all advertising from YouTube," AT&T said in a statement, according to AdAge.

The controversy highlights the difficulty that major Internet content companies often have patrolling user-generated content, which can stream in at an incredible pace. A YouTube spokesman told TechCrunch that around 400 hours of video are uploaded each minute. The company has around 10,000 human reviewers who analyze content that's been flagged as inappropriate.

YouTube executives are scrambling to reassure companies that YouTube is doing everything it can to protect children. "Child safety has been and remains our #1 priority at YouTube," YouTube said in a memo sent to major brands, AdWeek reported. YouTube this week suspended comments on millions of videos that "are likely innocent but could be subject to predatory comments," the memo said.

Some of the children in the videos look to be as young as 5 years old, according to a Wired magazine report.

In his video critique, Watson describes how he says the pedophile ring works: YouTube visitors gather on videos of young girls doing innocuous things, such as putting on makeup, demonstrating gymnastics moves or playing Twister. In the comment sections, people post timestamps that link to moments in the videos that appear to sexualize the children.

YouTube's recommendation algorithms then surface other videos frequented by pedophiles. "Once you enter into this wormhole, now there is no other content available," Watson said.

"And of course, there is advertising on some of these videos," he said, showing examples of ads. His video was titled: "Youtube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized."

This is not the first time YouTube has come under scrutiny for this very issue. In 2017, The Times of London reported that "some of the world's biggest brands are advertising on YouTube videos showing scantily clad children that have attracted comments from hundreds of [pedophiles]."

Also in 2017, the BBC reported that YouTube's system for reporting sexualized comments had been broken for more than a year.

And NPR reported in 2017 that if you typed the words "how to have" into the site's search bar, YouTube's autocomplete function suggested the search phrase, "how to have s*x with kids." [Copyright 2019 NPR]