The Free Internet Project


Revisiting Reddit's Attempt to Stop "Secondary Infektion" Misinformation Campaign from Russia

 

Last year, Reddit announced that it had banned 61 accounts connected to a disinformation campaign dubbed “Secondary Infektion,” run by a Russian group. The campaign was first exposed by Facebook in June 2019 for creating fake news in multiple languages involving multiple nations, aiming to “divide, discredit, and distract Western countries” through the dissemination of false information such as assassination plans, attacks on Ukraine and its pro-Western government, and supposed disputes between Germany and the U.S. This time, the operation created fake accounts and uploaded “leaked UK documents” on Reddit. Graphika Labs, a research firm, inspected the associated accounts and concluded they were linked to Secondary Infektion based on the same grammatical errors and language patterns.

Reddit’s investigation into suspicious accounts started with its users’ reports on questionable posts. Reddit then worked with Graphika and soon found a “pattern of coordination” among the reported accounts similar to those already linked to Secondary Infektion, which allowed it to “use these accounts to identify additional suspect accounts that were part of the campaign on Reddit.”

As Reddit’s statement put it, the company “encourage[s] users, moderators, and 3rd parties to report things to us as soon as they see them.” This statement reflects how much Reddit depends on its community to help moderate the site. Reddit is a heavily community-based platform: a collection of forums where users share content and comment on just about anything. To the left of every post are two buttons, the upvote and the downvote, which allow users themselves to rate content. The total score of a post is essentially the number of upvotes minus the number of downvotes, and a post's position on a page is determined by its score rank. Basically, a higher score means more visibility.
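The scoring scheme described above can be sketched in a few lines. This is a minimal illustration only; the field names are hypothetical, and Reddit's real ranking also factors in things like post age and vote fuzzing.

```python
# Minimal sketch of score-based ranking: score = upvotes - downvotes,
# and posts are ordered by score. Field names are hypothetical;
# Reddit's actual algorithm also weighs post age and fuzzes counts.

def score(post):
    return post["upvotes"] - post["downvotes"]

def rank(posts):
    # Higher score -> higher placement on the page, i.e. more visibility.
    return sorted(posts, key=score, reverse=True)

posts = [
    {"title": "A", "upvotes": 10, "downvotes": 3},
    {"title": "B", "upvotes": 50, "downvotes": 5},
    {"title": "C", "upvotes": 7, "downvotes": 7},
]

for p in rank(posts):
    print(p["title"], score(p))  # → B 45, A 7, C 0
```

The simplicity of this scheme is exactly what makes it gameable: anyone who controls extra accounts controls extra votes.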

The voting system is liked by many users because, unlike Facebook or Twitter, Reddit is a community curated largely by its users themselves. However, the voting system has its drawbacks: it can be gamed and manipulated. First, because everyone has some moderating power, personal beliefs and agendas may get in the way. For example, a person may create several accounts just to downvote a post with which he does not agree. As a result, information may be buried by gaming the system. Second, there is a risk of content manipulation by coordinated attacks. As its June security report stated, Reddit has been heavily focused on combating content manipulation around the 2020 elections and on ensuring that minority voices are heard. To that end, Reddit has invested in bot detection and defenses against malicious automation. Admins have vast powers, including flagging fake accounts, and can try to ensure diversity of viewpoints and participation.
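One signal that could expose the coordinated attacks described above is near-identical voting behavior across accounts. The sketch below is purely illustrative and is not Reddit's actual detection method; all names and the similarity threshold are assumptions.

```python
# Hypothetical sketch of flagging coordinated voting: accounts whose
# vote histories overlap almost entirely (high Jaccard similarity)
# are treated as suspicious. Illustrative only -- not Reddit's method.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def flag_coordinated(vote_histories, threshold=0.9):
    # vote_histories maps account -> list of (post_id, direction) votes.
    accounts = list(vote_histories)
    flagged = set()
    for i, u in enumerate(accounts):
        for v in accounts[i + 1:]:
            if jaccard(vote_histories[u], vote_histories[v]) >= threshold:
                flagged.update({u, v})
    return flagged

votes = {
    "user1": [("p1", -1), ("p2", -1), ("p3", -1)],
    "sock1": [("p1", -1), ("p2", -1), ("p3", -1)],  # identical history
    "norm":  [("p1", +1), ("p4", -1)],
}
print(sorted(flag_coordinated(votes)))  # → ['sock1', 'user1']
```

Real detection systems would combine many more signals (timing, IP ranges, language patterns), but the pairwise-similarity idea mirrors how Graphika used one set of suspect accounts to find others.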

Reddit could consider changing some of its platform features. As some redditors have pointed out, Reddit's “gilding” feature, which is akin to a “super-upvote,” may enable manipulation. Users can gild posts through a Reddit Gold subscription or simply by buying Reddit coins. Together with the voting system, gilding may make content manipulation easier: a malicious operation can buy coins to promote content as it wishes, without even creating fake accounts. Offering subscriptions is apparently Reddit’s way to cover its costs and turn a profit, and subscriptions do offer other privileges, such as an ad-free experience. Nonetheless, if Reddit wants to stop content manipulation, the company may need to rethink the gilding power.

--written by Candice Wang

 

Why Voters Should Beware: Lessons from Russian Interference in 2016 Election, Political and Racial Polarization on Social Media

Overview of the Russian Interference Issue

The United States prides itself on having an open democracy, with free and fair elections decided by American voters. If Americans want a policy change, then the remedy most commonly called upon is political participation--and the vote. If Americans want change, then they should vote out the problematic politicians and choose public officials to carry out the right policies. However, what if the U.S. voting system is skewed by foreign interference? 

American officials are nearly unanimous in concluding, based on U.S. intelligence, that Russia interfered with the 2016 presidential election [see, e.g., here; here; and the Senate Intelligence Report]. “[U]ndermining confidence in America’s democratic institutions” is what Russia seeks. In 2016, few in the U.S. were even thinking about this type of interference; the country's guard was down. Russia interfered with the election in various ways, including fake campaign advertisements, bots on Twitter and Facebook that pumped out emotionally and politically charged content, and the spread of disinformation or “fake news.” Social media hacking, as opposed to physical polling-center hacking, is at the forefront of discussion because it can not only change who is in office but also shift American voters’ political beliefs and understanding of political topics, or depress voters from voting.

And, if you think Russia is taking a break this election cycle, you'd be wrong. According to a March 10, 2020 New York Times article, David Porter of the FBI Foreign Influence Task Force says: "We see Russia is willing to conduct more brazen and disruptive influence operations because of how it perceives its conflict with the West."

What Interference Has to Do with Political Polarization

Facebook and Twitter have been criticized countless times by various organizations, politicians, and the media for facilitating political polarization. The U.S. political system, with its two dominant parties, is especially susceptible to polarization. Individuals belonging to either party become so invested in their party’s beliefs that they see the other party’s members not just as different but as wrong and detrimental to the future of the country. In the past twenty years, the share of people who consistently hold conservative or liberal views has grown from 10% to 20%, showing the increasing division, according to an article in Greater Good Magazine.

Political polarization is facilitated by platforms like Facebook and Twitter because of their content algorithms, which are designed to make the website experience more enjoyable. The Facebook News Feed “ranks stories based on a variety of factors including their history of clicking on links for particular websites,” as described by a Brookings article. Under the algorithm, if a liberal user frequently clicks on liberally skewed content, that is what they are going to see the most. Research shows this algorithm reduced cross-cutting political “content by 5 percent for conservatives and 8 percent for liberals.” Thus, the algorithm limits users' exposure to other opinions.
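A toy model makes the mechanism concrete. Everything here is an assumption for illustration (field names, sources, and the click-count weighting); it is not Facebook's actual News Feed algorithm, which uses many more signals.

```python
# Toy illustration of click-history-based ranking: a user who has
# mostly clicked links from one kind of source sees that source's
# stories ranked first -- the "filter bubble" effect.
# All names and weights here are hypothetical.

from collections import Counter

def rank_feed(stories, click_history):
    # Weight each story by how often the user clicked its source before.
    source_clicks = Counter(click_history)
    return sorted(stories,
                  key=lambda s: source_clicks[s["source"]],
                  reverse=True)

stories = [
    {"headline": "Tax cuts work", "source": "conservative-site.example"},
    {"headline": "Climate bill advances", "source": "liberal-site.example"},
]
# A user with a mostly liberal click history...
history = ["liberal-site.example"] * 8 + ["conservative-site.example"] * 2

for s in rank_feed(stories, history):
    print(s["headline"])  # liberal-leaning story prints first
```

Even in this two-line ranking rule, the feedback loop is visible: every click on in-bubble content pushes similar content higher next time.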

So, you might ask, “Why is that bad? I want to see content more aligned with my beliefs.” Democracy is built on the exchange of varying political views and dissenting opinions. The U.S. has long stood by its reputation for freedom of speech and for encouraging a free flow of ideas. Algorithmic grouping of like-minded people can be useful when it comes to hobbies and interests; however, when it comes to consistently grouping individuals by political beliefs, it can have a negative impact on democracy. This grouping causes American users to live in “filter bubbles” that only expose them to content aligned with their viewpoints. Users tend to find this grouping enjoyable because of confirmation bias, the psychological tendency to consume content that aligns with one's pre-existing beliefs. So, all the articles about Trump successfully leading the country will be ranked first on a conservative user’s Facebook News Feed and will also be the most enjoyable for that user. This filter bubble is dangerous to a democratic system because the lack of diverse perspectives in news consumption encourages close-mindedness and increases distrust of anyone who disagrees.

During the 2016 presidential election, the Russian operatives put out various types of fake articles, campaign advertisements, and politically charged social media posts on both the liberal and conservative sides. Because the Facebook algorithm shows more conservative content to conservatives and more liberal content to liberals, the operatives had no problem reaching their desired audience quickly and effectively. On Facebook they created thousands of bot programs that would enter various interest groups and engage with their target audience. For example, in 2016, a Russian soldier successfully entered a U.S. Facebook group pretending to be a 42-year-old housewife, as reported by Time. He responded to political issues discussed in that group, using emotional and political buzzwords when bringing up political issues and stories. On Twitter, thousands of fake accounts run by Russians and bots were used to spread disinformation about Hillary Clinton by continuously mentioning her email scandal from her time as Secretary of State and a fake Democratic pedophile ring dubbed “Pizzagate.” These bots spewed hashtags like “#MAGA” and “#CrookedHillary,” accounting for more than a quarter of the content within those hashtags.

Facebook and Twitter’s Response to the 2016 Russian Interference

According to a Wall Street Journal article on May 26, 2020 and a Washington Post article on June 28, 2020, Facebook conducted an internal review of how it could reduce polarization on its platform following the 2016 election, but CEO Mark Zuckerberg and other executives decided against the recommended changes because they were seen as "paternalistic" and would potentially affect conservatives on Facebook more.

After coming under increasing fire from critics for allowing misinformation and hate speech to go unchecked, Facebook announced some changes to "fight polarization" on May 27, 2020. This initiative included a recalibration of each user’s Facebook News Feed to prioritize family and friends’ content over divisive news content. Facebook's reasoning was that data shows people are more likely to have meaningful discourse with people they know, which would foster healthy debate rather than ineffective, one-off conversations. The company also announced a policy directly targeting the spread of disinformation on the platform: an independent fact-checking program that will automatically check content in over 50 languages around the world for false information. Disinformation that could contribute to “imminent violence, physical harm, and voter suppression” will be removed.

But those modest changes weren't enough to mollify Facebook's critics. Amidst the mass nationwide protests of Minneapolis police officer Derek Chauvin's brutal killing of George Floyd, nonprofit organizations including Color of Change organized an ad boycott against Facebook. Over 130 companies agreed to remove their ads from Facebook during July or longer. That led Zuckerberg to change his position on exempting politicians from fact-checking and from the company's general policy on misinformation. Zuckerberg said that politicians would now be subject to the same policy as every other Facebook user and would be flagged if they disseminated misinformation (or hate speech) that violates Facebook's general policy.

Twitter’s CEO Jack Dorsey not only implemented a fact-checking policy similar to Facebook's, but also admitted that the company needed to be more transparent in its policy making. The fact-checking policy “attached fact-checking notices” at the bottom of various tweets, alerting users that those tweets could contain false claims. Twitter also decided to forbid all political advertising on its platform. In response to Twitter's flagging of his content, President Trump issued an executive order seeking to increase regulation of social media platforms and to stop them from deleting users’ content and censoring their speech.

With the 2020 U.S. election only four months away, Internet companies are still figuring out how to stop Russian interference and the spread of misinformation, hate speech, and political polarization intended to interfere with the election. Whether Internet companies succeed remains to be seen. But there have been more policy changes and decisions by Facebook, Twitter, Reddit, Snapchat, Twitch, and other platforms in the last month than in all of last year.

-by Mariam Tabrez
