Chase Ross is embroiled in a dispute with YouTube that seems to show the platform’s algorithm discriminates against transgender users like him.
On May 30, Ross uploaded a video comparing his emotional wellbeing now to five years ago, when he underwent surgery and transitioned from female to male.
The video, originally titled “FIVE YEARS POST-OP EMOTIONAL COMPARISON”, was approved for monetization by YouTube, meaning adverts would be served alongside it. But then he amended the title, appending “(FTM TRANSGENDER)”.
“The second I added 'trans' in the title the second time, it was demonetized,” he explains. Although the demonetization has affected Ross’s income – one other video that fell foul of YouTube’s algorithm earned him just $1.15 for more than 200,000 views, he revealed on Twitter – the broader issue of censoring trans videos concerned him more.
“It makes me feel depressed and not want to make videos anymore. I feel like a second-class citizen. I feel like trans people and other people under the LGBT umbrella are getting their videos demonetized because they don’t follow the straight, white way of life.”
This isn’t the first time the video-sharing platform has been accused of bias against its minority creators. In March 2017, Tyler Oakley, one of YouTube’s biggest personalities, tweeted that one of his videos highlighting LGBTQ+ trailblazers was blocked, allegedly because YouTube did not show videos containing phrases related to LGBTQ+ identity.
The same month, British LGBTQ+ creator Rowan Ellis, who has 31,000 subscribers, uploaded a video claiming that at least 40 of her videos had been siloed into YouTube’s Restricted Mode, which the platform says helps filter out “potentially objectionable content”.
At one point, even a video produced by YouTube itself celebrating marriage equality was hidden in Restricted Mode. The site apologized to its LGBTQ+ creators at the time, saying: “Our system sometimes makes mistakes in understanding context and nuances when it assesses which videos to make available in Restricted Mode.”
The 2017 anti-LGBTQ+ restriction affected Ross, too. At the time, he had around 650 videos posted on his YouTube channel. But had you visited his channel with Restricted Mode enabled, you’d have seen just four videos.
“It is filtering out a hell of a lot of LGBT content,” Ellis said. “This is something which goes far beyond just a mistake YouTube might have made.”
After a backlash – including YouTube elder statesman Hank Green tweeting “‘YouTube Restricted’ is a way for parents to block potentially offensive content. Apparently that includes the existence of gay people?” – YouTube CEO Susan Wojcicki announced a slew of changes meant to placate the LGBTQ+ community, including partnering with the LGBTQ+ charity The Trevor Project and hosting feedback sessions and roundtable discussions with minority creators.
A year on from those initiatives, YouTube’s algorithm still seems, by default, to flag videos containing the word “transgender” as not suitable for all audiences.
“The hard part as an outsider is figuring out if it is happening internally, meaning that explicit rules were set, or if it has to do with behavior from clicks from the outside world,” says Karrie G. Karahalios, professor of computer science at the University of Illinois.
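To make that distinction concrete, here is a deliberately crude, hypothetical sketch in Python of the first scenario Karahalios describes – explicit rules set internally – in which a hard-coded word list decides monetization. YouTube says it maintains no such list; the flagged terms below are assumptions purely for illustration, chosen to mirror Ross’s two titles.

```python
# Hypothetical illustration only: an explicit, hard-coded rule of the kind
# Karahalios distinguishes from behavior learned by a machine-learning system.
# YouTube says it has no list of LGBTQ+ related words that trigger
# demonetization; these terms are assumed here purely for illustration.
FLAGGED_TERMS = {"transgender", "ftm"}

def is_demonetized(title: str) -> bool:
    """Return True if any flagged term appears as a word in the title."""
    # Normalize punctuation so "(FTM TRANSGENDER)" splits into clean words.
    for ch in "()[],.!?":
        title = title.replace(ch, " ")
    words = set(title.lower().split())
    return bool(words & FLAGGED_TERMS)

# The two titles from Ross's experiment:
print(is_demonetized("FIVE YEARS POST-OP EMOTIONAL COMPARISON"))                    # False
print(is_demonetized("FIVE YEARS POST-OP EMOTIONAL COMPARISON (FTM TRANSGENDER)"))  # True
```

The alternative Karahalios raises – a system trained on click and flagging behavior from the outside world – could produce the same observable outcome without any such list existing anywhere in the code, which is what makes the two cases so hard to tell apart from the outside.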
YouTube insists that advertisers on its site, regardless of their views, cannot specifically target LGBTQ+ videos or opt out of having their advertising appear alongside them, though creators on the site can block specific advertiser URLs or certain categories of ads, should they choose to.
“We do not have a list of LGBTQ+ related words that trigger demonetization, and we are constantly evaluating our systems to ensure they are enforcing our policies without bias,” a YouTube spokesperson told The Daily Beast. “We use machine learning to evaluate content against our advertiser guidelines” – indicating that the site’s algorithms may have incorrectly flagged the video as breaching the advertiser-friendly content rules it checks every upload against.
The video Ross uploaded to test whether “transgender” was seen by YouTube as a troublesome term – and which showed the seeming discrimination in YouTube’s algorithm – has since been manually reviewed and monetized like any other video.
“Sometimes our systems get it wrong, which is why we’ve encouraged creators to appeal,” the YouTube spokesperson added. “Successful appeals ensure that our systems keep getting better.”
“Many algorithms unfortunately do reflect society,” says Karahalios. “There are many cases of results on Google being racist because of what’s been typed in. What’s even more alarming is when these algorithms amplify certain bigoted behaviors.”
Ross’s latest experience has made him disillusioned with YouTube. “It makes me feel like I don’t belong on this platform that we helped build,” he says. “There are specific videos I want to make but I’m thinking can I even make this because will my video get restricted or demonetized right away?”
He’s also struggling, like many LGBTQ+ creators, to fight against an army of trolls who flag his videos as unsuitable regardless of their content, simply because of the trolls’ opinions of trans people.
“Weaponized” flagging of videos for nonexistent copyright violations has been a tool used by professional creators looking to take money away from their peers, and by trolls trying to disrupt the livelihood of creators they dislike. “It’s important for YouTube to realize there are people out there flagging videos unnecessarily,” explains Ross.
Whether those weaponized flags have shaped the algorithm’s treatment of Ross’s uploads and those of other trans creators is unknown: YouTube’s algorithm is an opaque black box.
But regardless, Ross believes YouTube needs to treat its marginalized communities better. “The queer community on YouTube is gigantic,” he says. “I know that YouTube knows that. I know they care about YouTubers who are LGBT; I know that. But there’s something wrong in your algorithm or with your programmers or robots doing this automatically, where this ‘trans’ word is triggering the algorithm.”
“It’s important we don’t punish marginalized populations,” says Karahalios. “That is something companies are struggling to deal with right now.”