Facebook announces content moderation policy change to clamp down on QAnon and movements tied to violence

On August 19, 2020, Facebook announced a change to its community standards for moderating content for safety reasons. Facebook's community standards already require the removal of content that calls for or advocates violence, as well as the removal of individuals and groups promoting violence. Facebook now will restrict content that does not necessarily advocate violence but is "tied to offline anarchist groups that support violent acts amidst protests, US-based militia organizations and QAnon." Facebook explained: "we have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior." U.S.-based militia organizations and the far-right conspiracy theory QAnon have been growing on the social network. As we reported, in July 2020 Twitter suspended 7,000 users who supported QAnon conspiracy theories. Facebook followed suit in August 2020, removing 790 QAnon groups on Facebook (including one group that had 200,000 members) and 10,000 Instagram accounts.

Facebook listed seven actions it planned to take against movements and organizations tied to violence:

  1. Remove From Facebook: Facebook Pages, Groups, and Instagram accounts that are part of harmful movements and organizations will be removed from the platform when they discuss potential violence. To help identify such discussions, Facebook plans to study the technology and symbolism these groups use.
  2. Limit Recommendations: Pages, Groups, and Instagram accounts that are associated with harmful organizations but not removed will not be recommended to people as Pages, Groups, or accounts they might want to follow.
  3. Reduce Ranking in News Feed: Going forward, content from these Pages and Groups will be ranked lower in News Feed, reducing the number of people who see it on Facebook.
  4. Reduce in Search: Hashtags and titles for related content will be ranked lower in search suggestions and will not be suggested in Search Typeahead.
  5. Reviewing Related Hashtags on Instagram: On Instagram specifically, the Related Hashtags feature has been removed. This feature allowed people to view hashtags similar to the ones they use. Facebook says the feature could return once better safety measures are in place to protect people using it.
  6. Prohibit Use of Ads, Commerce Surfaces and Monetization Tools: Facebook has planned a two-step prohibition on Ads and commerce in relation to these movements. Currently, it has stopped Facebook Pages related to these movements from running Ads or selling products through the Marketplace and Shop. In the future, Facebook plans the stronger step of stopping anyone from running Ads that praise or support these movements.
  7. Prohibit Fundraising: Finally, fundraising associated with these movements will be prohibited. Nonprofits that identify with these movements will be barred from using Facebook's fundraising tools.

With the new policy, Facebook expands its existing policy against violence to include the removal of groups and individuals that pose a risk to public safety. According to Facebook, these groups previously could not be removed because they did not meet the rigorous criteria for being deemed dangerous organizations. Facebook is not banning QAnon content from the site entirely; rather, it is restricting the ability of individuals who follow these groups to organize on the platform. QAnon believers can still post these conspiracy theories individually.

With the expansion of its policy, Facebook takes an important step toward stopping the spread of harmful information on its platform. Under the expanded policy, Facebook has already taken down hundreds of groups and ads tied to QAnon and militia organizations, and thousands of Instagram accounts tied to these movements. Whether these changes are enough to keep Facebook from being used as a tool to organize violence, however, remains to be seen.

--written by Bisola Oni

Cleaning house: Twitter suspends 7,000 accounts of QAnon conspiracy theory supporters

On July 21, 2020, Twitter suspended 7,000 accounts spreading QAnon conspiracy theories. In a tweet about the banning of these QAnon accounts, Twitter reiterated its commitment to taking "strong enforcement actions on behavior that has the potential to lead to offline harm." Twitter identified the QAnon accounts' violations of its community standards against "multi-account[s]," "coordinating abuse around individual victims," and "evad[ing] a previous suspension." In addition to the permanent suspensions, Twitter also felt it necessary to ban content and accounts "associated with Qanon" from Trends and recommendations on Twitter, and to avoid "highlighting this activity in search and conversations." Further, Twitter will block "URLs associated with QAnon from being shared on Twitter."

These actions by Twitter are a bold step in what has been a highly contentious area concerning the role of social media platforms in moderating hateful or harmful content. Some critics suggested that Twitter's QAnon decision lacked notice and transparency. Other critics contended that Twitter's actions were too little to stop the "omniconspiracy theory" that QAnon has become across multiple platforms.

So what exactly is QAnon? CNN describes the origins of QAnon, which began as a single conspiracy theory: QAnon followers "claim dozens of politicians and A-list celebrities work in tandem with governments around the globe to engage in child sex abuse. Followers also believe there is a 'deep state' effort to annihilate President Donald Trump." Forbes similarly describes: "Followers of the far-right QAnon conspiracy believe a 'deep state' of federal bureaucrats, Democratic politicians and Hollywood celebrities are plotting against President Trump and his supporters while also running an international sex-trafficking ring." In 2019, an internal FBI memo reportedly identified QAnon as a domestic terrorism threat.

Followers of QAnon are also active on Facebook, Reddit, and YouTube. The New York Times reported that Facebook was considering taking steps to limit the reach of QAnon content on its platform. Facebook is coordinating with Twitter and other platforms in considering its decision; an announcement is expected in the next month. Facebook has long been criticized for its response, or lack of response, to disinformation spread on its platform. Facebook is now the subject of a boycott, Stop Hate for Profit, which calls for a halt to advertising on the social media juggernaut until it takes steps to stop the spread of disinformation. Facebook continues to allow political ads invoking these conspiracy theories on its site. Forbes reports that although Facebook has seemingly tried to remove pages containing conspiracy theories, a number of pages still remain. Since 2019, Facebook has allowed 144 ads promoting QAnon on its platform, according to Media Matters. Facebook has continuously provided a platform for extremist content; it even allowed white nationalist content until officially banning it in March 2019.

Twitter's crackdown on QAnon is a step in the right direction, but it also signals how little companies like Twitter and Facebook have done in the past to stop disinformation and pernicious conspiracy theories. As conspiracy theories can undermine effective public health campaigns to stop the spread of the coronavirus, and foreign interference can undermine elections, social media companies appear to be playing a game of catch-up. They would be well served by devoting even greater resources to the problem, with more staff and clearer articulation of their policies and enforcement procedures. In an era of holding platforms and individuals accountable for actions that spread hate, social media companies now appear to realize that they bear greater responsibility for what happens on their platforms.

--written by Bisola Oni
