YouTube is tweaking its recommendation algorithm to prevent users from seeing potentially “harmful” content including “blatantly false” 9/11 conspiracy theories.
The Google-owned site has come under increasing pressure over the past year to crack down on “problematic” videos containing offensive content or potentially fake news.
YouTube, which has 17 million monthly Australian users, serves more than one billion hours of viewing every day. The recommendation algorithm determines which video automatically plays next and drives a large share of traffic on the site.
In a blog post, YouTube said it was “taking a closer look” at how it could reduce the spread of content that “comes close to, but doesn’t quite cross the line” of violating its guidelines.
“To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways — such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11,” the company said.
Polls have found more than half of Americans doubt the official US government explanation of the September 11, 2001 terror attacks, and conspiracy theory videos related to the “9/11 Truth Movement” were once YouTube’s bread and butter.
In a 2017 interview, Dylan Avery, director of the viral 2005 “truther” film Loose Change, estimated at least 100 million people had watched it across various platforms including the now-defunct Google Video.
YouTube said the change would affect “less than 1 per cent” of the content on the platform and only related to recommendations of what videos to watch, not whether a video was available.
“As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results,” the blog post said.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.”
Despite this, 9/11 conspiracy theory videos now appear to be much harder to find. Searching “9/11” now yields virtually no results containing the word “conspiracy”, a stark contrast to a decade ago.
YouTube said it would implement the change using a combination of machine learning and “human evaluators” who are “trained using public guidelines” and “provide critical input on the quality of a video”.
“This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States,” it said.
“Over time, as our systems become more accurate, we’ll roll this change out to more countries. It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube.”
In August last year, YouTube permanently banned Infowars founder and conspiracy theorist Alex Jones, who had 2.4 million subscribers on the platform, citing “hate speech and harassment”.
Jones had been widely condemned for suggesting the 2012 Sandy Hook school shooting in Newtown, Connecticut, was a hoax. He is currently being sued for defamation by a number of the victims’ families.
In February last year, YouTube was forced to apologise after a video falsely claiming Parkland school shooting survivor David Hogg was a “crisis actor” appeared in the number-one trending spot for several hours.
Meanwhile, YouTube and other digital publishers including Facebook and Twitter have been accused of censorship by many on the right. Last year, US President Donald Trump accused Google of rigging search results to make him look bad, suggesting it was “illegal”.