WHAT TOOK THEM SO LONG?
YouTube Tweaks Algorithm to Fight 9/11 Truthers, Flat Earthers, Miracle Cures
Videos about impossible conspiracy theories show up high in YouTube’s recommendations. That’s going to stop, the company says.
YouTube will tweak its algorithm to recommend less content that “comes close” to violating site rules, the company announced Friday.
YouTube’s recommendation algorithm suggests new videos for viewers, but some genres of videos, especially conspiracy videos, have been criticized for their frequent appearance in those recommendations. The company’s new policy will cut back on recommendations of “content that could misinform users in harmful ways,” YouTube said in a Friday memo. It cited three specific conspiracy theories affected by the new policy: Flat Earth theories, miracle cures, and 9/11 trutherism.
The new policy will apply to content on the “borderline” of YouTube’s community guidelines, the company said, adding that it applies to less than one percent of all YouTube videos. YouTube declined to specify which videos would be affected.
Those videos will still be available to view on YouTube, and will still appear in search results. But they’ll find their way into fewer viewers’ recommendation feeds.
Currently, conspiracy theories like Flat Earth count YouTube as one of their largest recruitment tools. At the second annual Flat Earth International Conference in November, most participants told The Daily Beast they’d converted to Flat Earth belief after watching YouTube videos on the topic. Some said they’d started watching videos on conspiracies like 9/11, and eventually saw Flat Earth videos recommended in their YouTube feeds; others said they went looking for Flat Earth videos, and were recommended a stream of new Flat Earth videos.
Guillaume Chaslot, a former YouTube employee who worked on the site’s recommendation algorithm in 2010, previously told The Daily Beast that the algorithm can push people down conspiratorial rabbit holes.
“I realized really fast that YouTube’s recommendation was putting people into filter bubbles,” Chaslot said last year. “There was no way out. If a person was into Flat Earth conspiracies, it was bad for watch-time to recommend anti-Flat Earth videos, so it won’t even recommend them.”

YouTube has also faced criticism for the prevalence of far-right videos in its recommendations. A BuzzFeed investigation on Thursday found that, over the course of nine recommendations, YouTube took a viewer from a non-partisan clip about Congress to an anti-immigrant video uploaded by a hate group. (When The Daily Beast tried a similar experiment in a cookie-free Incognito browser last month, it took four clicks to travel from a recommended video on YouTube’s homepage to a video on the far-right “red pill” theory.)
YouTube has previously addressed criticism over far-right videos by demonetizing some of them or disabling their comments. Still, far-right content remains easy to find on the site, and currently appears in video recommendations. The company will begin rolling out its new policy gradually, and initially only in the United States, YouTube said.
“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” YouTube said in its statement.
The Flat Earth Society condemned YouTube's decision.
"While it's unfortunate that this will no doubt affect some of the most prominent Flat Earth content creators, the Flat Earth Society has been prepared for years. Any social network can pull the rug from under your feet if it decides that your content is no longer welcome - which is why we've never relied on these businesses too much," the group told The Daily Beast.
"Who knows - perhaps it's time to start looking into a video sharing service of our own."