YouTube takes strict stance against conspiracy videos with new algorithm

YouTube published a blog post on January 25 revealing their new approach to curbing conspiracy content on the platform – by limiting recommendations.

In the blog post, YouTube took a hard stance against videos that deny historical events, promote “miracle cures” for serious illnesses, or make blatantly false claims about scientific evidence (a la flat-earthers).

YouTube admitted that, while such content doesn’t violate their community guidelines, it could spread harmful misinformation to users across the site, thus prompting direct action.

YouTube is taking a strict stance toward conspiracy content on its platform, introducing a new algorithm to prevent recommendations of such videos.
https://youtube.googleblog.com/2019/01/continuing-our-work-to-improve.html

“…we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways,” YouTube said of the issue. “…we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community.”

YouTube went on to claim that their algorithm will affect less than one percent of content across the site, and will not remove the offending videos completely.

YouTube’s new algorithm will reduce recommendations for videos that promote “conspiracy content” on the site.

Additionally, YouTube revealed that the upcoming system relies on a combination of machine learning and human evaluators, who are trained using the site’s guidelines and provide “critical input.”
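
To illustrate the general idea at a very high level, here is a minimal sketch (in Python) of how a recommendation pipeline might down-weight videos flagged as borderline rather than remove them. The classifier score, threshold, and penalty factor below are assumptions for illustration only and do not reflect YouTube’s actual system.

```python
# Illustrative sketch only – not YouTube's actual implementation.
# Assumes a "borderline content" likelihood in [0, 1] produced by combining a
# machine-learning model with ratings from human evaluators; the field names,
# threshold, and demotion factor are hypothetical.

from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    relevance: float         # base recommendation score from the ranker
    borderline_score: float  # hypothetical likelihood the video is "borderline"

BORDERLINE_THRESHOLD = 0.8   # assumed cut-off for flagging a video
DEMOTION_FACTOR = 0.1        # assumed penalty applied to flagged videos

def rank_recommendations(candidates: list[Video]) -> list[Video]:
    """Order candidates by relevance, demoting (not removing) flagged videos."""
    def adjusted_score(v: Video) -> float:
        if v.borderline_score >= BORDERLINE_THRESHOLD:
            # The video stays on the platform and remains searchable,
            # but is far less likely to surface in recommendations.
            return v.relevance * DEMOTION_FACTOR
        return v.relevance
    return sorted(candidates, key=adjusted_score, reverse=True)

if __name__ == "__main__":
    feed = rank_recommendations([
        Video("abc123", relevance=0.9, borderline_score=0.95),
        Video("def456", relevance=0.7, borderline_score=0.05),
    ])
    print([v.video_id for v in feed])  # the flagged video drops below the other
```

The key design point, as described in YouTube’s post, is that flagged videos are demoted in recommendations rather than deleted from the site.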

YouTube is set to roll out this change “gradually,” with the algorithm being introduced to countries outside the US as the system becomes more accurate over time.

YouTube’s latest response to conspiracy videos follows a string of scams across the platform, in which major content creators were impersonated by scammers who messaged fans with promises of exclusive giveaways.

YouTube has since responded to concerns over the issue, saying that they are ramping up security measures to prevent such impersonations.