YouTube is facing yet another controversy after several high-profile companies, including Disney, Nestle, and Fortnite publisher Epic Games, pulled their advertising from the video platform. It all stems from a recent discovery by Matt Watson that users were exploiting the comments section of innocuous videos to participate in child exploitation and the sharing of underage pornography.
Earlier this week, Watson shared a video in which he presented evidence that YouTube’s algorithm is recommending videos involving minors, monetizing them, and allowing users to provide links to child pornography in its comments. Watson says that it only takes about five clicks to get caught in this “wormhole” and that YouTube is profiting off the exploitation of children.
On the surface, it may all seem innocent, but digging through these comments shows a disturbing pattern that starts with sharing inappropriate time stamps and extends out to unlisted child pornography. And, according to Watson, many of these videos do have support for monetization, which means that an advertiser’s pre-roll could play on the video.
Since Watson published the video, YouTube has removed more than 400 channels and deleted millions of videos, but the damage is seemingly done. Not long after Watson’s findings went viral, big-name companies like Disney and Epic Games started to pull their advertising.
YouTube content creators fear that this is yet another “adpocalypse” for the platform, a term for when advertisers pull their support from individual channels or from YouTube as a whole. Something similar happened when PewDiePie came under fire for anti-Semitic content in his videos.
Ultimately, things returned to normal after that adpocalypse, and they likely will after this one. YouTube is still the largest video platform on the planet, and advertisers are not going to abandon it wholesale. But at the very least, the hope is that this episode will expose to YouTube some fundamental problems with its algorithm and moderation processes.