Just a few months after announcing a new focus on groups, Facebook is introducing tools to crack down on the unsavory content that plagues its members-only pages.
The company was bound to have seen this coming. Facebook’s group pages allow users to build communities around niche interests. But they’ve also been hotbeds of hate and disinformation, with groups promoting dangerous medical hoaxes and encouraging Border Patrol agents to joke about migrant deaths.
Facebook also recommends new groups based on an algorithm, potentially drawing users into more fringe groups. When Facebook pivoted to promote group pages in late April, tech journalists warned the move could worsen the problem. In a Wednesday blog post, the social media giant announced new moderating measures for the groups.
In a post titled “How Do We Help Keep Private Groups Safe?” Facebook addressed the difficulty of moderating groups that are closed to the outside. Some of these groups are home to noxious content, including bigotry that violates Facebook’s terms of service. In the post, Facebook acknowledged rule-breaking in private groups.
“Being in a private group doesn’t mean that your actions should go unchecked. We have a responsibility to keep Facebook safe, which is why our Community Standards apply across Facebook, including in private groups,” Facebook’s blog post read.
The company said it was trying to use more artificial intelligence to flag posts that violate its rules. It also described new tools that let group moderators review content Facebook has removed, and increased transparency for prospective members about a group’s moderators and the past names a group has used.
Facebook also touched on calls to ban certain groups outright, which it described as a “nuanced” process of determining whether a group has repeatedly violated its rules.
Facebook does not always follow through on pledges to uphold its own rules. After a ban on white nationalism, the site declined to ban a well-known racist for a white nationalist video. (It later banned her after public outcry.) Additionally, an investigation by The Daily Beast last year found hundreds of public posts calling for the murder of immigrants and minorities.
The company’s blog post comes as Facebook shells out for a major ad campaign that promotes groups as safe and wholesome. The “More Together” campaign includes a television ad that shows a father taking his daughter to a baseball game and posting about it in a private Facebook group called “Dads with Daughters.”
The ad shows a supportive community for families. But the reality of several popular “Dads with Daughters” Facebook groups highlights the challenges of keeping Facebook groups positive. Rather than photos from baseball games, the groups contain shocking misogyny, including occasional sexualization of what appear to be underage girls, a Mel Magazine investigation found.
Some of Facebook’s new tools to promote groups actually made the problem worse. After Facebook’s TV ad, the “Dads with Daughters” groups saw a surge in membership, some of it from bigots and trolls. A new Facebook feature promotes groups’ most active members, meaning the pages’ loudest trolls often got the most attention.
One group administrator, who tried to keep his group positive after the TV ad, warned members that he would ban people who made disparaging comments.
A group member posted a rebuttal. “Instead of telling us how something on our page offended you, block us, unfollow us, and more importantly call your parents and ask them why you’re an oversensitive little bitch,” the person’s post read.
Because the post was uploaded as an image rather than as text, Facebook’s AI likely would not flag the message as hateful.