YouTube is looking to improve its recommendations to help users. In a statement, the Google-owned video streaming platform said it will be “reducing recommendations of borderline content and content that could misinform users in harmful ways”. This includes videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historical events like 9/11.
To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. Users can still access all videos that comply with its Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results.
In a blog post, YouTube said this shift will apply to less than 1% of the content on YouTube, but added that it believes “limiting the recommendation of these types of videos will mean a better experience for the YouTube community”. Currently, YouTube pulls in recommendations from a wider set of topics. On any given day, more than 200 million videos are recommended on the homepage alone.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users. This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations,” said the platform.
YouTube added that the evaluators are trained using public guidelines and provide critical input on the quality of a video. The platform expects the systems to become more accurate over time.
“When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That’s why we update our recommendations system all the time—we want to make sure we’re suggesting videos that people actually want to watch,” said YouTube.