FB Execs Ignored Research Showing Instagram’s Algorithm Banned Black Users Much More Than White Ones: NBC
The company’s own researchers found that proposed changes to one of the social network’s moderation algorithms banned Black users 50 percent more often than white users.
Internal research from Instagram found that a mid-2019 test of proposed rules for one of the social network’s moderation algorithms automatically banned Black users 50 percent more often than white users, NBC News reports. The algorithm is one of several steps in evaluating and responding to reports of harassment and bullying on the platform. Rather than scrapping the changes, Facebook executives reportedly told the researchers to conceal the findings from colleagues and to halt their research. Instagram eventually moved forward with a modified version of the rules that its own researchers were not allowed to test. The company said the original methodology was flawed and that this was why the work was put on hold. An audit commissioned by Facebook and released earlier this month found that the company had made “vexing and heartbreaking decisions” with regard to civil rights.