The Free Internet Project


Mark Zuckerberg: Facebook to suspend new political ads the week before the US election, add labels to premature claims of election victory

On September 3, 2020, Mark Zuckerberg published a lengthy post on his personal Facebook profile detailing dramatic new measures Facebook is undertaking to safeguard the integrity of the 2020 U.S. election. Zuckerberg wrote [we've added topical descriptions in brackets]:

Today, we're announcing additional steps we're taking at Facebook to encourage voting, connect people with authoritative information, and fight misinformation. These changes reflect what we've learned from our elections work over the past four years and the conversations we've had with voting rights experts and our civil rights auditors:


[Reliable information at the top of the page] We will put authoritative information from our Voting Information Center at the top of Facebook and Instagram almost every day until the election. This will include video tutorials on how to vote by mail, and information on deadlines for registering and voting in your state.

[No political ads starting the week before the election] We're going to block new political and issue ads during the final week of the campaign. It's important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech, but in the final days of an election there may not be enough time to contest new claims. So in the week before the election, we won't accept new political or issue ads. Advertisers will be able to continue running ads they started running before the final week and adjust the targeting for those ads, but those ads will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them.

[Partnering with state election authorities to identify election misinformation] We're going to extend our work with election officials to remove misinformation about voting. We already committed to partnering with state election authorities to identify and remove false claims about polling conditions in the last 72 hours of the campaign, but given that this election will include large amounts of early voting, we're extending that period to begin now and continue through the election until we have a clear result. We've already consulted with state election officials on whether certain voting claims are accurate.

[Limit the number of chats you can forward on Messenger] We're reducing the risk of misinformation and harmful content going viral by limiting forwarding on Messenger. You'll still be able to share information about the election, but we'll limit the number of chats you can forward a message to at one time. We've already implemented this in WhatsApp during sensitive periods and have found it to be an effective method of preventing misinformation from spreading in many countries.

[Remove both explicit and implicit voting misinformation] We're expanding our voter suppression policies. We already remove explicit misrepresentations about how or when to vote that could cause someone to lose their opportunity to vote -- for example, saying things like "you can send in your mail ballot up to 3 days after election day", which is obviously not true. (In most states, mail-in ballots have to be *received* by election day, not just mailed, in order to be counted.) We're now expanding this policy to include implicit misrepresentations about voting too, like "I hear anybody with a driver's license gets a ballot this year", because it might mislead you about what you need to do to get a ballot, even if that wouldn't necessarily invalidate your vote by itself.

[Remove COVID-19 misinformation used to scare voters away from voting] We're putting in place rules against using threats related to Covid-19 to discourage voting. We will remove posts with claims that people will get Covid-19 if they take part in voting. We'll attach a link to authoritative information about Covid-19 to posts that might use the virus to discourage voting, and we're not going to allow this kind of content in ads. Given the unique circumstances of this election, it's especially important that people have accurate information about the many ways to vote safely, and that Covid-19 isn't used to scare people into not exercising their right to vote.

Measures to stop false or premature claims of election results

Since the pandemic means that many of us will be voting by mail, and since some states may still be counting valid ballots after election day, many experts are predicting that we may not have a final result on election night. It's important that we prepare for this possibility in advance and understand that there could be a period of intense claims and counter-claims as the final results are counted. This could be a very heated period, so we're preparing the following policies to help in the days and weeks after voting ends:

[Facebook Voting Information Center provides information on the time it takes to count votes] We'll use the Voting Information Center to prepare people for the possibility that it may take a while to get official results. This information will help people understand that there is nothing illegitimate about not having a result on election night.

[Partner with Reuters and the National Election Pool for authoritative information on election results] We're partnering with Reuters and the National Election Pool to provide authoritative information about election results. We'll show this in the Voting Information Center so it's easily accessible, and we'll notify people proactively as results become available. Importantly, if any candidate or campaign tries to declare victory before the results are in, we'll add a label to their post educating that official results are not yet in and directing people to the official results.

[Label posts that attempt to delegitimize the election results] We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud. This label will provide basic authoritative information about the integrity of the election and voting methods.

[Expand Facebook policy against content with violence and harm directed at election officials] We'll enforce our violence and harm policies more broadly by expanding our definition of high-risk people to include election officials in order to help prevent any attempts to pressure or harm them, especially while they're fulfilling their critical obligations to oversee the vote counting.

[Expand Facebook policy against militia and conspiracy groups organizing or supporting violence] We've already strengthened our enforcement against militias, conspiracy networks like QAnon, and other groups that could be used to organize violence or civil unrest in the period after the elections. We have already removed thousands of these groups and removed even more from being included in our recommendations and search results. We will continue to ramp up enforcement against these groups over the coming weeks.

It's important to recognize that there may be legitimate concerns about the electoral process over the coming months. We want to make sure people can speak up if they encounter problems at the polls or have been prevented from voting, but that doesn't extend to spreading misinformation. We'll enforce the policies I outlined above as well as all our existing policies around voter suppression and voting misinformation, but to ensure there are clear and consistent rules, we are not planning to make further changes to our election-related policies between now and the official declaration of the result.

In addition to all of this, four years ago we encountered a new threat: coordinated online efforts by foreign governments and individuals to interfere in our elections. This threat hasn't gone away. Just this week, we took down a network of 13 accounts and 2 pages that were trying to mislead Americans and amplify division. We've invested heavily in our security systems and now have some of the most sophisticated teams and systems in the world to prevent these attacks. We've removed more than 100 networks worldwide engaging in coordinated inauthentic behavior over the past couple of years, including ahead of major democratic elections. However, we're increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders.

I believe our democracy is strong enough to withstand this challenge and deliver a free and fair election -- even if it takes time for every vote to be counted. We've voted during global pandemics before. We can do this. But it's going to take a concerted effort by all of us -- political parties and candidates, election authorities, the media and social networks, and ultimately voters as well -- to live up to our responsibilities. We all have a part to play in making sure that the democratic process works, and that every voter can make their voice heard where it matters most -- at the ballot box.

 

Summary: Mounting Allegations That Facebook and Zuckerberg Show Political Bias and Favoritism Toward Trump and Conservatives in Content Moderation

In the past week, more allegations surfaced that Facebook executives have intervened in questionable ways in the company's content moderation procedures, showing favoritism toward Donald Trump, Breitbart, and other conservatives. These news reports cut against the narrative that Facebook has an "anti-conservative bias." For example, according to some allegations, Facebook executives didn't want to enforce the existing community standards, or change them in a way that would flag conservatives for violations, even when content moderators found violations by conservatives. Below is a summary of the main allegations that Facebook has been politically biased in favor of Trump and conservatives. This page will be updated if more allegations are reported.

Ben Smith, How Pro-Trump Forces Work the Refs in Silicon Valley, N.Y. Times (Aug. 9, 2020): "Since then, Facebook has sought to ingratiate itself to the Trump administration, while taking a harder line on Covid-19 misinformation. As the president’s backers post wild claims on the social network, the company offers the equivalent of wrist slaps — a complex fact-checking system that avoids drawing the company directly into the political fray. It hasn’t worked: The fact-checking subcontractors are harried umpires, an easy target for Trump supporters’ ire....In fact, two people close to the Facebook fact-checking process told me, the vast bulk of the posts getting tagged for being fully or partly false come from the right. That’s not bias. It’s because sites like The Gateway Pundit are full of falsehoods, and because the president says false things a lot."

Olivia Solon, Sensitive to claims of bias, Facebook relaxed misinformation rules for conservative pages, NBC News (Aug. 7, 2020, 2:31 PM): "The list and descriptions of the escalations, leaked to NBC News, showed that Facebook employees in the misinformation escalations team, with direct oversight from company leadership, deleted strikes during the review process that were issued to some conservative partners for posting misinformation over the last six months. The discussions of the reviews showed that Facebook employees were worried that complaints about Facebook's fact-checking could go public and fuel allegations that the social network was biased against conservatives. The removal of the strikes has furthered concerns from some current and former employees that the company routinely relaxes its rules for conservative pages over fears about accusations of bias."

Craig Silverman, Facebook Fired an Employee Who Collected Evidence of Right-Wing Pages Getting Preferential Treatment, Buzzfeed (Aug. 6, 2020, 4:13 PM): "[S]ome of Facebook’s own employees gathered evidence they say shows Breitbart — along with other right-wing outlets and figures including Turning Point USA founder Charlie Kirk, Trump supporters Diamond and Silk, and conservative video production nonprofit Prager University — has received special treatment that helped it avoid running afoul of company policy. They see it as part of a pattern of preferential treatment for right-wing publishers and pages, many of which have alleged that the social network is biased against conservatives." Further: "Individuals that spoke out about the apparent special treatment of right-wing pages have also faced consequences. In one case, a senior Facebook engineer collected multiple instances of conservative figures receiving unique help from Facebook employees, including those on the policy team, to remove fact-checks on their content. His July post was removed because it violated the company’s 'respectful communication policy.'"

Ryan Mac, Instagram Displayed Negative Related Hashtags for Biden, but Hid Them for Trump, Buzzfeed (Aug. 5, 2020, 12:17 PM): "For at least the last two months, a key Instagram feature, which algorithmically pushes users toward supposedly related content, has been treating hashtags associated with President Donald Trump and presumptive Democratic presidential nominee Joe Biden in very different ways. Searches for Biden also return a variety of pro-Trump messages, while searches for Trump-related topics only returned the specific hashtags, like #MAGA or #Trump — which means searches for Biden-related hashtags also return counter-messaging, while those for Trump do not."

Ryan Mac & Craig Silverman, "Hurting People at Scale": Facebook's Employees Reckon with the Social Network They've Built, Buzzfeed (July 23, 2020, 12:59 PM): Yaël Eisenstat, Facebook's former election ads integrity lead "said the company’s policy team in Washington, DC, led by Joel Kaplan, sought to unduly influence decisions made by her team, and the company’s recent failure to take appropriate action on posts from President Trump shows employees are right to be upset and concerned."

Elizabeth Dwoskin, Craig Timberg, & Tony Romm, Zuckerberg once wanted to sanction Trump. Then Facebook wrote rules that accommodated him., Wash. Post (June 28, 2020, 6:25 PM): "But that started to change in 2015, as Trump’s candidacy picked up speed. In December of that year, he posted a video in which he said he wanted to ban all Muslims from entering the United States. The video went viral on Facebook and was an early indication of the tone of his candidacy....Ultimately, Zuckerberg was talked out of his desire to remove the post in part by Kaplan, according to the people. Instead, the executives created an allowance that newsworthy political discourse would be taken into account when making decisions about whether posts violated community guidelines....In spring of 2016, Zuckerberg was also talked out of his desire to write a post specifically condemning Trump for his calls to build a wall between the United States and Mexico, after advisers in Washington warned it could look like choosing sides, according to Dex Torricke-Barton, one of Zuckerberg’s former speechwriters."  

Regarding election interference: "Facebook’s security engineers in December 2016 presented findings from a broad internal investigation, known as Project P, to senior leadership on how false and misleading news reports spread so virally during the election. When Facebook’s security team highlighted dozens of pages that had peddled false news reports, senior leaders in Washington, including Kaplan, opposed shutting them down immediately, arguing that doing so would disproportionately impact conservatives, according to people familiar with the company’s thinking. Ultimately, the company shut down far fewer pages than were originally proposed while it began developing a policy to handle these issues."

Craig Timberg, How conservatives learned to wield power inside Facebook, Wash. Post (Feb. 20, 2020, 1:20 PM): "In a world of perfect neutrality, which Facebook espouses as its goal, the political tilt of the pages shouldn’t have mattered. But in a videoconference between Facebook’s Washington office and its Silicon Valley headquarters in December 2016, the company’s most senior Republican, Joel Kaplan, voiced concerns that would become familiar to those within the company. 'We can’t remove all of it because it will disproportionately affect conservatives,' said Kaplan, a former George W. Bush White House official and now the head of Facebook’s Washington office, according to people familiar with the meeting who spoke on the condition of anonymity to protect professional relationships."

Related articles about Facebook

Ben Smith, What's Facebook's Deal with Donald Trump?, N.Y. Times (June 21, 2020): "Mr. Trump’s son-in-law, Jared Kushner, pulled together the dinner on Oct. 22 on short notice after he learned that Mr. Zuckerberg, the Facebook founder, and his wife, Priscilla Chan, would be in Washington for a cryptocurrency hearing on Capitol Hill, a person familiar with the planning said. The dinner, the person said, took place in the Blue Room on the first floor of the White House. The guest list included Mr. Thiel, a Trump supporter, and his husband, Matt Danzeisen; Melania Trump; Mr. Kushner; and Ivanka Trump. The president, a person who has spoken to Mr. Zuckerberg said, did most of the talking. The atmosphere was convivial, another person who got an account of the dinner said. Mr. Trump likes billionaires and likes people who are useful to him, and Mr. Zuckerberg right now is both."

Deepa Seetharaman, How a Facebook Employee Helped Trump Win--But Switched Sides for 2020, Wall St. J. (Nov. 24, 2019, 3:18 PM): "One of the first things Mr. Barnes and his team advised campaign officials to do was to start running fundraising ads targeting Facebook users who liked or commented on Mr. Trump’s posts over the past month, using a product now called 'engagement custom audiences.' The product, which Mr. Barnes hand-coded, was available to a small group, including Republican and Democratic political clients. (The ad tool was rolled out widely around Election Day.) Within the first few days, every dollar that the Trump campaign spent on these ads yielded $2 to $3 in contributions, said Mr. Barnes, who added that the campaign raised millions of dollars in those first few days. Mr. Barnes frequently flew to Texas, sometimes staying for four days at a time and logging 12-hour days. By July, he says, he was solely focused on the Trump campaign. When on-site in the building that served as the Trump campaign’s digital headquarters in San Antonio, he sometimes sat a few feet from Mr. Parscale. The intense pace reflected Trump officials’ full embrace of Facebook’s platform, in the absence of a more traditional campaign structure including donor files and massive email databases."

Over 130 Companies Remove Ads from Facebook in #StopHateforProfit Boycott, forcing Mark Zuckerberg to change lax Facebook policy on misinformation and hate content

In the aftermath of the Cambridge Analytica scandal, in which that company exploited Facebook user data to target and manipulate swing voters in the 2016 U.S. election, Facebook conducted an internal review to examine its role in spreading misinformation and fake news that may have affected the election, as CEO Mark Zuckerberg announced. In 2018, Zuckerberg announced that Facebook was making changes to be better prepared to stop misinformation in the 2020 election. Critics dismissed the changes as modest, however. As WSJ reporters Jeff Horwitz and Deepa Seetharaman detailed, Facebook executives largely rejected the internal study's recommendations to reduce polarization on Facebook, reasoning that doing so might be "paternalistic" and might open Facebook up to criticisms of being biased against conservatives.

Despite the concerns about fake news and misinformation affecting the 2020 election, Facebook took the position that fact-checking for misinformation did not apply to posts and ads by politicians in the same way it applied to everyone else. Facebook's policy was even more permissive toward political ads and politicians. As shown below, Facebook justified this hands-off position as advancing political speech: "Our approach is grounded in Facebook's fundamental belief in free expression, respect for the democratic process, and the belief that, especially in mature democracies with a free press, political speech is the most scrutinized speech there is. Just as critically, by limiting political speech we would leave people less informed about what their elected officials are saying and leave politicians less accountable for their words."

Facebook's Fact-Checking Exception for Politicians and Political Ads

By contrast, Twitter CEO Jack Dorsey decided to ban political ads in 2019 and to monitor the content of politicians just as Twitter does with all other users for misinformation and other violations of Twitter's policies. Yet Zuckerberg persisted in his "hands off" approach: "I just believe strongly that Facebook shouldn't be the arbiter of truth of everything that people say online." Zuckerberg even said Twitter was wrong to add warnings to two of President Trump's tweets, one as misleading (regarding mail-in ballots) and one as glorifying violence (Trump said, "When the looting starts, the shooting starts" regarding the protests over the killing of George Floyd by Minneapolis police officer Derek Chauvin). Back in October 2019, Zuckerberg defended his approach in the face of withering questioning by Rep. Alexandria Ocasio-Cortez.

 

In May and June 2020, Zuckerberg maintained his "hands off" approach. Some Facebook employees quit in protest, while others staged a walkout. Still, Zuckerberg did not budge.

On June 17, 2020, Color of Change, "the nation’s largest online racial justice organization," organized with the NAACP, the Anti-Defamation League, Sleeping Giants, Free Press, and Common Sense Media a boycott of advertising on Facebook for the month of July. The boycott was labeled #StopHateforProfit. Within just 10 days, over 130 companies joined the ad boycott of Facebook, including many large companies such as Ben & Jerry's, Coca-Cola, Dockers, Eddie Bauer, Levi's, The North Face, REI, Unilever, and Verizon.

On June 26, 2020, Zuckerberg finally announced some changes to Facebook's policy.  The biggest changes:

(1) Moderating hateful content in ads. As Zuckerberg explained on his Facebook page, "We already restrict certain types of content in ads that we allow in regular posts, but we want to do more to prohibit the kind of divisive and inflammatory language that has been used to sow discord. So today we're prohibiting a wider category of hateful content in ads. Specifically, we're expanding our ads policy to prohibit claims that people from a specific race, ethnicity, national origin, religious affiliation, caste, sexual orientation, gender identity or immigration status are a threat to the physical safety, health or survival of others. We're also expanding our policies to better protect immigrants, migrants, refugees and asylum seekers from ads suggesting these groups are inferior or expressing contempt, dismissal or disgust directed at them."

(2) Adding labels to posts, including from candidates, that may violate Facebook's policy. As Zuckerberg explained, "Often, seeing speech from politicians is in the public interest, and in the same way that news outlets will report what a politician says, we think people should generally be able to see it for themselves on our platforms.

"We will soon start labeling some of the content we leave up because it is deemed newsworthy, so people can know when this is the case. We'll allow people to share this content to condemn it, just like we do with other problematic content, because this is an important part of how we discuss what's acceptable in our society -- but we'll add a prompt to tell people that the content they're sharing may violate our policies.

"To clarify one point: there is no newsworthiness exemption to content that incites violence or suppresses voting. Even if a politician or government official says it, if we determine that content may lead to violence or deprive people of their right to vote, we will take that content down. Similarly, there are no exceptions for politicians in any of the policies I'm announcing here today." 

Facebook's new labeling of candidates' content sounds very similar to the Twitter labels that Zuckerberg had criticized as wrong. And Facebook's new policy on moderating hateful content in ads that "are a threat to the physical safety, health or survival of others," including "people from a specific race, ethnicity, national origin, religious affiliation, caste, sexual orientation, gender identity or immigration status," seems a positive step to prevent Facebook from being used as a platform to sow racial discord, which U.S. intelligence says is a goal of Russian operatives.

Facebook's new policy on moderation of political ads and posts by politicians and others

The organizers of the boycott, however, were not impressed with Facebook's changes. They issued a statement quoted by NPR: "None of this will be vetted or verified — or make a dent in the problem on the largest social media platform on the planet. We have been down this road before with Facebook. They have made apologies in the past. They have taken meager steps after each catastrophe where their platform played a part. But this has to end now."

 

Mark Zuckerberg appeals to India before key decision on Internet.org platform, amid protests that it violates net neutrality

Ahead of a key decision by India's telecommunications regulator, the Telecom Regulatory Authority of India (TRAI), Mark Zuckerberg wrote a blog post in the Times of India to defend his nonprofit Internet.org, which provides free (but limited) Internet access to under-served areas. The service, called "Free Basics," enables users to access the Internet but only through a limited number of apps, such as weather, Wikipedia, and, yes, Facebook. Other app developers can apply to Internet.org to be included in Free Basics.

Mark Zuckerberg defends nonprofit Internet.org's limited Internet service against criticisms that it conflicts with net neutrality

In April 2015, Mark Zuckerberg visited India to help promote his nonprofit Internet.org's efforts to provide Internet access to the billions of people around the world (two-thirds of the world's population) who still lack it. The nonprofit provides a smartphone app with free Internet access that is limited to several services, including Facebook (minus the ads): "The Internet.org app provides free basic services in markets where internet access may be less affordable."

Zuckerberg visits Colombia to launch free Internet.org app, the first country in Latin America to get it

Mark Zuckerberg visited Colombia and met with President Juan Manuel Santos to launch a free Internet.org app for smartphones that enables subscribers of the local carrier Tigo to get free Internet access to a limited number of free services, including Facebook and several government sites such as "Instituto Colombiano para la Evaluación de la Educación, an education assessment service and Agronet, a service that provides information on agriculture and rural development." The list of free services includes:

1doc3
24 Symbols
AccuWeather
Agronet
BabyCenter & MAMA
Facebook
Girl Effect
Instituto Colombiano para la Evaluación de la Educación
Messenger
Mitula
Para la Vida
Su Dinero
Tambero.com
UNICEF
Wikipedia
YoAprendo

 

Mark Zuckerberg, Facebook-led effort to provide free Internet access in Zambia

Mark Zuckerberg helped to start a cooperative effort among Facebook and several telecoms to provide free Internet access in countries that lack it. This initiative--called Internet.org--just launched an app for people in Zambia that provides limited free access to Facebook, Facebook Messenger, Google, AccuWeather, UNICEF, job search sites in Zambia, and women's health and rights organizations.
