Last month, YouTube user Matt Watson, @MattsWhatItIs, posted a video titled “YouTube is Facilitating the Sexual Exploitation of Children, and it’s Being Monetized (2019),” in which he delved into a hidden ring of pedophilia being facilitated through the YouTube comments section. Many of these comments are made on videos of little girls in particular, videos that also feature advertisements from corporations such as Disney, AT&T and Hasbro.
The comments include crude “compliments” focusing on various body parts of the little girls in the videos, while others consist of users sharing social media contacts with one another, as well as timestamps of very specific clips that show the girls in compromising and sexually suggestive positions.
Although the revelation of this issue is completely appalling, what is even more astounding is that YouTube has known about it since at least November 2017.
Watson provides a screenshot of a blog post made Nov. 22, 2017, in which YouTube stated, “We have historically used a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors… Starting this week we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”
However, one issue Watson points out is that even though the comments are being disabled on the videos, the accounts posting them are still up and running, not to mention that many of those accounts repost the original videos to their own channels. He also mentions how easy it is for pedophiles to find a multitude of videos featuring minors under the age of 10 once they find their way into the recommendation algorithm, a loop that takes only about five videos to reach.
“We know that YouTube has an algorithm in place that detects some kind of unusual predatory behavior on these kinds of videos, and yet all that’s happening is that the comments are being disabled?” Watson said. “Once you are in this loophole, there is nothing but more videos of little girls. How has YouTube not seen this?”
Watson points out that advertisements were still being shown as he clicked through this list of suggested videos, meaning the videos continue to be monetized even as they amass pedophilic attention.
Luckily, several corporations, including Disney, AT&T, Nestle and Epic Games, had pulled their advertising from YouTube as of last month.
As soon as major corporations began to make this decision, YouTube responded by disabling comments from videos that “are likely innocent but could be subject to predatory comments,” according to a memo sent out by the company that was reported by AdWeek. Instead of waiting for the comments to occur before taking action, YouTube has now decided to take it upon themselves to prevent the comments from happening at all.
The issue is that YouTube waited until after it lost ad revenue to make this decision. If the issue has been in public discussion since 2017, why is the company only now taking preventative steps, two months into 2019? Additionally, disabling comments on a video does nothing when the accounts posting those comments are still up and running.
As Watson pointed out, these accounts still have the ability to repost the videos to their own channels. Once these videos are reposted, potentially countless times, the entire concept of protecting children by disabling comments becomes irrelevant—the children are still being sexualized by the same accounts, only through reposting instead of through comments.
“This is working in pedophiles’ favor; they are flaunting the fact that YouTube is doing nothing about this… I’m shocked that when I report channels that are amalgamating this stuff, and when I report guys that are time stamping this stuff, that YouTube hasn’t deleted these channels,” Watson said. “How is this not a violation of their terms of service and how are they not doing more? How does this exist?”
YouTube owes more to the children using its platform. The response should never be “YouTube has age limits on who can make accounts,” as some have decided to point out. The response should be to remove any and all pedophilic accounts from the platform. This goes further than an issue of free speech on the internet; this is about protecting the most vulnerable from exploitation.
It is absolutely appalling that the safety of our children is taken seriously only when monetary revenue is at stake. A dollar sign should never be the deciding factor as to when preventative action should be taken.
The sexualization of children seems to be excused whenever it brings in some form of revenue. Just look at the copious corporations that sexualize the innocence and vulnerability of the young to sell products or procure views, such as American Apparel advertisements or TLC’s “Toddlers and Tiaras.”
We need to stop failing our children and hold these platforms accountable for their shoddy responses to such crucial issues. YouTube needs to do more than just disable comments; it needs to start removing these accounts, something that should have been done back in 2017 when the issue was first raised.