YouTube has announced that it will no longer recommend videos that “come close to” violating its community guidelines, such as conspiracy or medically inaccurate videos.
On Saturday, a former engineer for Google, YouTube’s parent company, hailed the move as a “historic victory.”
The original blog post from YouTube, published on Jan. 25, said that the videos the site recommends, usually after a user has viewed one video, would no longer lead only to similar videos and instead would “pull in recommendations from a wider set of topics.”
For example, if a user watches a video showing a recipe for snickerdoodles, they may be bombarded with suggestions for other cookie recipe videos. Until the change, the same dynamic applied to conspiracy videos.
YouTube said in the post that the action is meant to “reduce the spread of content that comes close to — but doesn’t quite cross the line of — violating” its community policies. The examples the company cited include “promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.”
The change will not affect the videos’ availability. And if users are subscribed to a channel that, for instance, produces conspiracy content, or if they search for it, they will still see related recommendations, the company wrote.
Guillaume Chaslot, a former Google engineer, said that he helped to build the artificial intelligence used to curate recommended videos. In a thread of tweets posted on Saturday, he praised the change.
“It’s only the beginning of a more humane technology. Technology that empowers all of us, instead of deceiving the most vulnerable,” Chaslot wrote.