
Facebook and Instagram Studying Whether Racial Bias Exists in Their Algorithms After Years of Ignoring the Issue

Facebook announced it will create teams to study whether racial bias in Facebook's and Instagram's algorithms negatively affects the experience of minority users on the platforms. The Equity and Inclusion Team at Instagram and the Inclusivity Product Team at Facebook will tackle a large issue that Facebook has largely ignored in the past. Facebook is under intense scrutiny. Since July 2020, Facebook has faced a massive advertising boycott, called Stop Hate for Profit, by over five hundred companies, including Coca-Cola, Disney, and Unilever. Facebook has been criticized for its lack of initiative in handling hate speech and attempts to sow racial discord on its platforms, including attempts to suppress Black voters. An independent audit by civil rights experts found the prevalence of hate speech targeting Blacks, Jews, and Muslims on Facebook "especially acute." “The racial justice movement is a moment of real significance for our company,” Vishal Shah, Instagram’s product director, told the Wall Street Journal. “Any bias in our systems and policies runs counter to providing a platform for everyone to express themselves.”

The new research teams will cover what has been a blind spot for Facebook. In 2019, Facebook employees found that an automated moderation algorithm on Instagram was 50 percent more likely to suspend the accounts of Black users than those of white users, according to the Wall Street Journal. This finding was supported by user complaints to the company. After employees reported these findings, they were sworn to secrecy, and Facebook conducted no further research on the algorithm. The algorithm was ultimately changed, but it was never tested further for racial bias. Facebook officially stated that the research was stopped because an improper methodology was being used at the time. As reported by NBC News, Facebook employees leaked that the automated moderation algorithm detects and deletes hate speech against white users more effectively than it moderates hate speech against Black users.

Facebook's effort to study racial bias on its platforms is still in its infancy. Instagram's Equity and Inclusion Team does not yet have an announced leader. The Inclusivity Product Team will supposedly work closely with a group of Black users and cultural experts to make effective changes. However, Facebook employees who previously worked on this issue have stated anonymously that they were ignored and discouraged from continuing their work. Facebook's corporate culture and its past inaction on racial issues have raised skepticism about these recent initiatives. Time will tell if Facebook is serious about the problem.

--written by Sean Liu

 
