
Meeting Between Facebook, Zuckerberg and Stop Hate for Profit Boycott Group Turns into a Big Fail

Facebook has come under scrutiny for its handling of hate speech and disinformation posted on the platform. Through the Stop Hate for Profit movement, corporations have begun taking steps to hold Facebook accountable for the disinformation spread there. So far, more than 400 advertisers, from Coca-Cola to Ford and Lego, have pledged to stop advertising on the social media platform, according to NPR. Facebook has faced intense backlash, particularly since the 2016 election, for allowing disinformation and propaganda to be posted freely. This disinformation and hate, or "fake news" as many call it, aims to misinform voters and spread hateful propaganda, potentially dampening voter participation.

A broad coalition of groups, including Color of Change, the Anti-Defamation League, and the NAACP, started the Stop Hate for Profit campaign. (For more on its origin, read Politico.) The goal of the campaign is to push Facebook to make much-needed changes to its policy guidelines as well as changes among the company's executives. The boycott targets the advertising dollars on which the social media juggernaut relies. The campaign has picked up steam, with new companies announcing an end to Facebook ads every day. With this momentum, the group behind the boycott has released a list of 10 first steps Facebook can take.

Stop Hate for Profit is asking that Facebook take accountability, show decency, and provide support to the groups most affected by the hate spread on the platform. The civil rights leaders behind the movement are focused on changes at the executive level as well as on holding Facebook accountable for its lackluster terms of service. The top executives currently at Facebook may have a conflict of interest: critics contend that Facebook has a duty to make sure misinformation and hate are not spread, but that it does not exercise that duty to the fullest because of its relationships with politicians. Rashad Robinson, president of Color of Change, contends that there needs to be a separation between the people in charge of the content allowed on Facebook and those aligned with political figures. The group is asking Facebook to hire an executive with a civil rights background who can evaluate discriminatory policies and products. Additionally, the group is asking Facebook to expand what it considers hate speech. Facebook's current terms of service are criticized as ineffective and problematic.

Facebook's policies and algorithms are among the things the group asks to be changed. Current Facebook policies allow public and private hate groups to exist and even recommend them to many users. The campaign asks that Facebook remove far-right groups that spread conspiracies, such as QAnon, from the platform. It also requests the labeling of inauthentic information that spreads hate and disinformation. By contrast, Twitter has taken small steps to label hateful content itself. While many criticize Twitter's actions as not going far enough, it has taken steps Facebook has yet to take. Throughout this process, Facebook should make all of these steps transparent to the public, including the number of ads rejected for hate or disinformation and a third-party audit of hate spread on the site.

The group also drew a connection between the hate on the Facebook platform and race issues within the company. Stop Hate for Profit cited a staggering statistic: 42% of Facebook users experience harassment on the platform. That figure, along with the EEOC complaints filed by a former Black employee and two job candidates, points to a culture at Facebook that goes beyond allowing far-right propaganda and misinformation on the site and highlights a lack of support for users and employees of color. All of this backs up why it is essential that Facebook go beyond making simple statements and actually take steps to create change.

Facebook CEO and cofounder Mark Zuckerberg agreed to meet with the civil rights groups behind the boycott amid the growing number of companies getting behind Stop Hate for Profit. Many had voiced concerns that Facebook and Zuckerberg are more concerned about messaging than about legitimately fixing the underlying problems. After meeting with Zuckerberg on July 7, Stop Hate for Profit released a statement about what it felt was a disappointing and uneventful meeting. The group asserted that Facebook did what it had previously feared, offering only surface-level rhetoric with no real interest in committing to change. Of the ten recommendations, Zuckerberg was open to addressing only one, hiring a person with a civil rights background, and even then he declined to commit to making it a C-suite executive position if it is created. Rashad Robinson tweeted a direct statement saying that Facebook was not ready to make any changes despite knowing the group's demands. That view appears consistent with a July 2, 2020 report of a remark Zuckerberg made to employees at a virtual town hall: "We're not gonna change our policies or approach on anything because of a threat to a small percent of our revenue, or to any percent of our revenue."

For now, it remains to be seen whether the increased pressure from companies pulling advertisements will eventually cause Facebook and Zuckerberg to institute the changes that progressive groups have been pushing for years. So far, it appears not.

--written by Bisola Oni

Over 130 Companies Remove Ads from Facebook in #StopHateforProfit Boycott, Forcing Mark Zuckerberg to Change Lax Facebook Policy on Misinformation and Hate Content

In the aftermath of the Cambridge Analytica scandal, in which the company exploited Facebook to target and manipulate swing voters in the 2016 U.S. election, Facebook conducted an internal review to examine its role in spreading misinformation and fake news that may have affected the election, as CEO Mark Zuckerberg announced. In 2018, Zuckerberg announced that Facebook was making changes to be better prepared to stop misinformation in the 2020 election. Critics dismissed the changes as modest, however. As WSJ reporters Jeff Horwitz and Deepa Seetharaman detailed, Facebook executives largely rejected the internal study's recommendations to reduce polarization on Facebook, reasoning that doing so might be "paternalistic" and might open Facebook up to criticisms of being biased against conservatives.

Despite the concerns about fake news and misinformation affecting the 2020 election, Facebook took the position that fact-checking for misinformation did not apply to posts and ads by politicians in the same way it applied to everyone else. Facebook's policy was even more permissive toward political ads and politicians. As shown below, Facebook justified this hands-off position as advancing political speech: "Our approach is grounded in Facebook's fundamental belief in free expression, respect for the democratic process, and the belief that, especially in mature democracies with a free press, political speech is the most scrutinized speech there is. Just as critically, by limiting political speech we would leave people less informed about what their elected officials are saying and leave politicians less accountable for their words."

Facebook's Fact-Checking Exception for Politicians and Political Ads

By contrast, Twitter CEO Jack Dorsey decided to ban political ads in 2019 and to monitor the content of politicians, just as Twitter does with all other users, for misinformation and other violations of Twitter's policy. Yet Zuckerberg persisted in his "hands off" approach: "I just believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online." Zuckerberg even said Twitter was wrong to add warnings to two of President Trump's tweets as misleading (regarding mail-in ballots) and as glorifying violence (Trump said, "When the looting starts, the shooting starts," regarding the protests over the killing of George Floyd by Minneapolis police officer Derek Chauvin). Back in October 2019, Zuckerberg defended his approach in the face of withering questioning by Rep. Alexandria Ocasio-Cortez.

 

In May and June 2020, Zuckerberg maintained his "hands off" approach. Some Facebook employees quit in protest, while others staged a walkout. Yet Zuckerberg still did not budge.

On June 17, 2020, Color of Change, which is "the nation's largest online racial justice organization," joined with the NAACP, Anti-Defamation League, Sleeping Giants, Free Press, and Common Sense Media to organize a boycott of advertising on Facebook for the month of July. The boycott was labeled #StopHateforProfit. Within just 10 days, over 130 companies joined the ad boycott of Facebook, including many large companies such as Ben and Jerry's, Coca-Cola, Dockers, Eddie Bauer, Levi's, The North Face, REI, Unilever, and Verizon.

On June 26, 2020, Zuckerberg finally announced some changes to Facebook's policy.  The biggest changes:

(1) Moderating hateful content in ads. As Zuckerberg explained on his Facebook page, "We already restrict certain types of content in ads that we allow in regular posts, but we want to do more to prohibit the kind of divisive and inflammatory language that has been used to sow discord. So today we're prohibiting a wider category of hateful content in ads. Specifically, we're expanding our ads policy to prohibit claims that people from a specific race, ethnicity, national origin, religious affiliation, caste, sexual orientation, gender identity or immigration status are a threat to the physical safety, health or survival of others. We're also expanding our policies to better protect immigrants, migrants, refugees and asylum seekers from ads suggesting these groups are inferior or expressing contempt, dismissal or disgust directed at them."

(2) Adding labels to posts, including from candidates, that may violate Facebook's policy. As Zuckerberg explained, "Often, seeing speech from politicians is in the public interest, and in the same way that news outlets will report what a politician says, we think people should generally be able to see it for themselves on our platforms.

"We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case. We'll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what's acceptable in our society -- but we'll add a prompt to tell people that the content they're sharing may violate our policies.

"To clarify one point: there is no newsworthiness exemption to content that incites violence or suppresses voting. Even if a politician or government official says it, if we determine that content may lead to violence or deprive people of their right to vote, we will take that content down. Similarly, there are no exceptions for politicians in any of the policies I'm announcing here today." 

Facebook's new labeling of candidates' content sounds very similar to the Twitter practice Zuckerberg had criticized as wrong. And Facebook's new policy of moderating hateful ad content claiming that "people from a specific race, ethnicity, national origin, religious affiliation, caste, sexual orientation, gender identity or immigration status" are "a threat to the physical safety, health or survival of others" seems a positive step toward preventing Facebook from being used as a platform to sow racial discord, which U.S. intelligence says is a goal of Russian operatives.

Facebook's new policy on the moderation of political ads and posts by politicians and others

The organizers of the boycott, however, were not impressed with Facebook's changes. They issued a statement quoted by NPR: "None of this will be vetted or verified — or make a dent in the problem on the largest social media platform on the planet. We have been down this road before with Facebook. They have made apologies in the past. They have taken meager steps after each catastrophe where their platform played a part. But this has to end now."

 
