Far-right trolls are using a network of fake accounts to rig YouTube’s algorithm in their favor, according to chats leaked by an anti-alt-right group.
Conversations uploaded by the Twitter account Alt Right Leaks appear to show members of European alt-right group Reconquista Germanica discussing ways to manipulate YouTube with fake accounts in order to boost visibility of their preferred videos and bury videos they don’t like.
Unlike Twitter and Facebook, where posts appear chronologically and can’t be downvoted, YouTube videos are at the mercy of the site’s “dislike” button. Although YouTube has not disclosed the exact algorithm that determines which videos appear in its searches and “up next” lists, social media experts have long suggested that a video’s like-to-dislike ratio factors into its visibility on the site.
That makes the dislike button a weapon for people operating fake YouTube accounts who want to push a video out of sight.
“Can you tell me how many sock [puppet] accounts one can have?” a person asked in German in the leaked chat, using a term for a fake account that poses as a real person. “I already have 12.”
Reconquista Germanica’s leaders then advised him to make more in advance of a “raid,” when the group’s members planned to use dozens of accounts each to downvote an opponent’s video into obscurity.
The chats took place on Discord, a voice and messaging platform originally intended for gamers, which has seen a sharp uptick in popularity among the alt-right. The conversations were spearheaded by German alt-right figures on a Discord server that made headlines in September for its leader’s efforts to upend Germany’s elections by spamming Twitter with far-right hashtags. The Twitter campaign bombed, BuzzFeed reported at the time.
“We had several people in the server,” Alt Right Leaks told The Daily Beast, adding that they wanted to raise awareness of far-right intimidation tactics on YouTube. “I see many people on social media who are afraid to speak out against hateful ideologies, because they fear the shitstorm that would follow. Some YouTubers depend on the income from their channels, so they cannot afford to become unpopular due to trolls and hate comments.”
The far-right has found a home on YouTube, which has come under recent scrutiny for hosting conspiracy videos, including those accusing survivors of the Parkland shooting of being crisis actors. Some of YouTube’s biggest stars have faced backlash for racial insensitivity or outright racism, from star vlogger Logan Paul’s recent controversy over filming a dead man in Japan, to YouTuber PewDiePie’s use of Nazi and anti-Semitic jokes in his videos. YouTube did not return The Daily Beast’s request for comment.
The Reconquista Germanica Discord group had hundreds of members, according to screenshots shared by Alt Right Leaks. Screenshots of the group’s text logs and ripped footage of their audio conversations are consistent with images and video previously obtained from the server and published by BuzzFeed. A document on Twitter trolling that was circulated in the server was also on the website of an alleged Reconquista Germanica member.
Shortly after its failed Twitter campaign to troll Germany’s elections, the Reconquista Germanica chat started discussing ways to rig YouTube instead.
In a chat around Sept. 1, Reconquista Germanica’s leaders hosted a YouTube trolling tutorial, aimed at burying political opponents’ videos in advance of the German elections.
The server appeared to include members from outside the country, including a well-known Scottish white nationalist and Martin Sellner, an Austrian far-right figure known for his involvement in a campaign to rent a boat and harass humanitarian ships that rescue migrants at sea. (Sellner did not respond to a request for comment.) Sellner’s anti-migrant campaign got a boost from the American right when conservative troll Charles Johnson used his fundraising website to help raise hundreds of thousands of dollars for anti-migrant boats.
Targeted downvoting “can be an extremely powerful tool,” one member said in the chat in German. He suggested the reverse could also be true. “We can push our own videos with likes and comments through the organization we’ve created, so that they are rated as more relevant by YouTube’s search algorithm.”
It’s part of what observers of the far-right see as a tussle between trolls and social media platforms.
“What we’ve seen is the alt-right is incredibly adept at navigating and otherwise exploiting loopholes in social media platforms to create these sock puppets and using them to amplify their colleagues,” Ryan Lenz, a senior investigative reporter with the Southern Poverty Law Center, told The Daily Beast.
YouTube recently implemented a “three strikes” policy for banning abusive accounts, but it’s not always enforced consistently.
“It all depends on the discretionary reality of someone looking at that content and making a discretionary decision about whether it violates Terms of Service,” Lenz said. “Across the board, there’s been an inconsistent application of that.”
Through a campaign of downvotes and negative comments, the far-right Discord group hoped to encourage a larger pile-on against videos it targeted, potentially bumping them from searches.
“We will observe the like/dislike ratio, and after an hour check how much it changed,” a leader said.
“When you go to a YouTube video and see it has 50 percent or more dislikes and a lot of critical comments, then people won’t even watch the video.”
Despite its members’ anti-immigrant views, the Discord group coached participants to pose as foreigners on the German videos.
“I’ll impersonate some Albanians in there,” one member said in advance of a comment “raid” on a video. “Because they fear nothing more than being critiqued by foreigners.”
“Get yourself accounts with Turkish names,” a member advised, “so that they think the German ‘patriots’ have foreign support. That is horrible for them.”
The person also recommended using a Syrian name and commenting that they were a Syrian migrant who thought Germany had too many migrants.
The trick to burying a YouTube video they didn’t like would be to swarm its comments with sock puppet accounts simultaneously, as if they were in conversation with each other. The tactic is similar to the one employed at the Internet Research Agency, the Russian troll farm accused of interfering in U.S. elections; former IRA employees told the Associated Press that they would fake narratives by staging arguments in comments sections.
And YouTube made it easy for a single troll to pose as an army. When one chat member said he only had 12 sock puppet accounts, another person recommended he verify his account. Unlike Facebook and Twitter, which sometimes require users to submit identifying information to verify their accounts, YouTube verifies an account with nothing more than a phone number. Once verified, that person can create multiple usernames under the same account.
With a verified account, trolls can switch easily among as many as 99 sock puppet accounts, the chat’s leaders advised.
“Holy shit,” another member responded. “Does the original account get banned eventually?”
“No.”
“Even when I abuse it?”
“Well… just don’t let them connect the dots.”
YouTube doesn’t always connect those dots. The website has previously come under fire for allowing the far-right to dodge anti-hate speech and anti-harassment rules. On Monday, The Daily Beast reported that openly neo-Nazi groups like the Traditionalist Worker Party and the Atomwaffen Division are still using the site for recruitment videos. Despite both groups engaging in harassment of minorities and Atomwaffen calling for the mass murder of Jews, YouTube defended keeping the accounts live under the mantle of “free expression.”
After two days of backlash and a takedown request from the Anti-Defamation League, YouTube banned Atomwaffen on Wednesday.