According to Bloomberg, brands such as Nestle, Walt Disney and Epic Games have halted their spend on YouTube after a child video exposé. In a statement to Bloomberg, a spokesperson for Nestle US confirmed the news and said the company has paused all pre-roll advertising on the platform.
“Through our advertising agency, we have reached out to Google/YouTube to determine actions they’ll take to eliminate this type of content from their service,” the spokesperson said.
This is not a new problem for YouTube, and to the Google-owned platform's credit, it has been battling the issue consistently over the past year. In a statement to Marketing, a YouTube spokesperson said, “Any content – including comments – that endangers minors is abhorrent and we have clear policies prohibiting this on YouTube.”
“We took immediate action by deleting accounts and channels, reporting illegal activity to authorities and disabling comments on tens of millions of videos that include minors. There’s more to be done, and we continue to work to improve and catch abuse more quickly,” she added.
This time around, the news broke following an investigation by YouTuber Matt Watson, which showed video evidence that suggestive videos of children were being monetised by YouTube with ads from brands such as McDonald’s, Lysol, Disney and Reese’s. In his video description, Watson cautioned about a “wormhole into a soft-core pedophilia ring on Youtube”.
He said that the platform’s recommendation algorithm is “facilitating pedophiles’ ability to connect with each-other, trade contact info, and link to actual child porn in the comments.”
“I can consistently get access to it from vanilla, never-before-used YouTube accounts via innocuous videos in less than ten minutes, in sometimes less than five clicks.” He added that even when such videos are flagged and their comments turned off, they remain monetised and available for users to watch.
Meanwhile, research by Wired found that ads from major brands such as Alfa Romeo, Fiat, Fortnite, Grammarly, L’Oreal, Maybelline and Peloton appeared alongside disturbing videos of children doing activities such as Twister, yoga challenges and gymnastics.
What is YouTube doing about it?
The Google-owned video streaming platform said that in the last 48 hours it had taken “an aggressive approach, beyond its normal protections, and disabled comments on tens of millions of videos that include minors.”
It also claims to have reviewed and removed thousands of inappropriate comments on videos featuring young people, and to have terminated over 400 channels for the comments they left on videos, reporting illegal comments to the National Center for Missing & Exploited Children (NCMEC).
The platform has been struggling to combat child exploitation content despite investing in new technologies, such as its CSAI Match tool, and hiring more experts to tackle the issue. YouTube currently does not allow users under 13 to create or own accounts. Last year alone, YouTube terminated and reported 46,000 offender accounts to NCMEC, leading to arrests and convictions.
Earlier in January, YouTube announced that it would be “reducing recommendations of borderline content and content that could misinform users in harmful ways”. The move, however, only affects recommendations of which videos to watch, not whether a video remains available on YouTube.