YouTube has committed to a child safety “roadmap” after several major brands pulled their advertisements from the platform yesterday, including improved detection of predatory comments later this month.
This was communicated in a conference call with representatives from all major ad agency holding companies, as well as several unnamed advertisers. A memo was also issued after the call. Marketing has confirmed the move with YouTube; it was first revealed through Adweek‘s sources.
Beyond the recent actions by YouTube that Marketing reported yesterday, the memo detailed further improvements in the pipeline. To find violative comments faster, YouTube’s engineering teams are working on a new version of its predatory comment classifier this quarter, set to roll out later this month.
The platform is hoping that content creators can strengthen the fight against violative comments, and is looking into options such as auto-moderation tools. While it will take some time, YouTube will also be addressing feedback about reducing the discoverability of inappropriate videos, as well as fine-tuning how ads are placed on channels. Lastly, it is working on solutions to make it harder for predators to open new accounts once their existing accounts have been shut down.
The memo stated, “Over the last 18 months, we have taken significant steps to enforce tougher protections for minors on YouTube. We’ve been more rigorous with controversial content, raised the threshold for where ads appear, and improved tools and support for partners. These changes have resulted in the removal of 4.3 million videos and 3.7 million comments to date for child safety violations.”
The platform has been struggling to combat child exploitation content despite investing in new technologies, such as the CSAI Match tool, and hiring more experts. YouTube currently does not allow users under 13 to create or own accounts. Last year alone, YouTube terminated and reported 46 thousand offender accounts to the National Center for Missing & Exploited Children, leading to arrests and convictions.
Earlier in January, YouTube announced that it would be “reducing recommendations of borderline content and content that could misinform users in harmful ways”. The move, however, only affects which videos are recommended, not whether a video remains available on YouTube.