The Free Internet Project

Blog

Visit our new project The Law of Internet Platforms at LawofInternetplatforms.org

Before 2022 ends, we are excited to announce the launch of a new project called The Law of Internet Platforms. We will be analyzing two main issues: (1) national and regional laws regulating Internet platforms, and (2) Internet platforms' own policies regulating their users. We are launching a dedicated satellite site for reader convenience.

Twitter controversy highlights EU's new Digital Services Act (DSA) and its vast oversight over Internet platforms, including content moderation (Part I)

The European Union continues to be a leading agenda setter in online governance. In 2019, the EU enacted the Directive on Copyright in the Digital Single Market (DSM), which includes the controversial filtering and "notice and staydown" requirements for large online content-sharing platforms, such as Facebook and YouTube. This year, the Digital Services Act (DSA) went into effect, which imposes sweeping requirements on a variety of Internet platforms. The DSA runs 113 pages long.

EU commissioner Thierry Breton warns Elon Musk that Twitter must follow content moderation rules or risk a ban in the EU

According to The Financial Times, EU commissioner Thierry Breton warned Elon Musk that Twitter must follow content moderation rules or risk a ban in the EU. The Financial Times reported that Twitter cannot take an "arbitrary" approach, subject to no rules, when reinstating banned users; Twitter must also moderate disinformation and submit to an independent audit. The new EU Digital Services Act (DSA) just went into effect on Nov. 16, 2022.

After Musk acquisition, Twitter announces that it no longer removes COVID misinformation

Twitter announced: “Effective November 23, 2022, Twitter is no longer enforcing the COVID-19 misleading information policy.” According to The Verge, Twitter had "suspended a total of 11,230 accounts and removed nearly 100,000 pieces of content since January 2020." The move wasn't unexpected given Elon Musk's "soft" position on content moderation.

Protests against China's COVID policy mark a challenge to censorship

The New York Times reports on how protests in China against the country's restrictive "zero COVID" policy have managed, to some degree, to evade China's censorship. The NYT article states: "videos of the marches and rallies have continued to surface on Chinese sites such as WeChat, a chat app, and the short video sharing app, Douyin. Experts say the sheer volume of video clips has likely overwhelmed the automated software and armies of censors China has tasked with policing the internet. 'This is a decisive breach of the big silence,' said Xiao Qiang, a researcher on internet freedom at the University of California, Berkeley."

Public Service Announcements: Tips to Avoid Election Misinformation and Fake Election Results

The Free Internet Project makes the following public service announcements to provide the public with information and tips to avoid election misinformation, including the potential for fake election results and fake news of voting irregularities to delegitimize the election. We plan to share these public service announcements on Facebook, Instagram, Twitter, and other social media. We invite you to share them as well. You can download all the Public Service Announcements by visiting this page.

Reason for The Free Internet Project's Public Service Announcements on Election Misinformation

Social media companies including Facebook, Instagram, Twitter, YouTube, TikTok, Pinterest, Reddit, and other companies have content moderation policies that prohibit election misinformation and voter suppression on their platforms in anticipation of the 2020 U.S. elections. These companies fear a repeat of the Russian interference in the 2016 U.S. election. As the bipartisan U.S. Senate Select Committee on Intelligence reported in thousands of pages of its report on "Russian Active Measures Campaigns and Interference in the 2016 U.S. Election," the Russian operative group Internet Research Agency used American-based social media platforms to interfere with the 2016 election. "Masquerading as Americans, these operatives used targeted advertisements, intentionally falsified news articles, self-generated content, and social media platform tools to interact with and attempt to deceive tens of millions of social media users in the United States. This campaign sought to polarize Americans on the basis of societal, ideological, and racial differences, provoked real world events, and was part of a foreign government's covert support of Russia's favored candidate in the U.S. presidential election." (Vol. 2, p. 3).

Facebook, Instagram, Twitter, YouTube, and other social media companies are now working hard to protect American voters from the same kind of fake news and fake content exploited by the Russian operatives in the 2016 election. But no social media platform is entirely immune from such foreign interference or attacks. On September 22, 2020, the FBI and the Cybersecurity and Infrastructure Security Agency (CISA) issued a public service announcement to warn Americans of a new kind of worry for the 2020 election: "the potential threat posed by attempts to spread disinformation regarding the results of the 2020 elections." 

Our goal at The Free Internet Project is to help Americans avoid election misinformation and fake election results. We provide basic information explaining what election misinformation is, give several examples of fake ads and fake accounts used by Russian operatives in the 2016 election, and offer several tips for people to protect themselves on social media. 

Explanation of The Free Internet Project's Public Service Announcements on How to Avoid Election Misinformation

1.  Definition of election misinformation

We first define "election misinformation": "Election misinformation is false, deceptive, or inaccurate information related to the election." It can come in the form of fake news, fake accounts, fake election results, or other misleading content intended to suppress or change your vote, or to discredit the election process. Americans should understand that some election misinformation may expressly include false or misleading claims about the candidates, the voting process, mail-in ballots, voter fraud or purported irregularities in voting, or the election results (who won). But some election misinformation doesn't even mention the election itself; instead, it comes in the form of fake content posted from fake accounts on social media intended to make you believe they share your views. These fake accounts pose as Americans supporting particular and often popular causes, such as racial justice, Second Amendment gun rights, support for the police, and LGBTQ rights. In the 2016 U.S. election, Russian operatives used these kinds of fake American accounts to sow discord in the U.S. and polarize voters. Volume 2 of the bipartisan Report of the Select Committee on Intelligence United States Senate on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election: Russia's Use of Social Media with Additional Views, especially pp. 32-71, is the best source to read for examples of the Russian operatives' extensive election misinformation.

This year, the FBI and CISA have expressed special concern about foreign interference that may seek to spread fake election results and fake news about voter fraud or other purported irregularities in voting, in order to cast doubt on the verified election results from the states. We discuss this concern in No. 4 below. The key point to remember is that election misinformation comes in many forms--possibly even new forms that won't be easy to recognize or detect. Beware.

2.  Stop: Don't rely on or share unverified sources on social media. They can be fake accounts, manipulated videos, and other misleading content intended to suppress or change your vote, or to delegitimize the election.

        

In their Sept. 22, 2020 public service announcement, "the FBI and CISA urge the American public to critically evaluate the sources of the information they consume and to seek out reliable and verified information from trusted sources, such as state and local election officials." Social media companies, including Facebook, Instagram, and Twitter, have set up voting or election information centers for their users that contain trusted sources identified by those companies. One of the safest practices for social media is to rely only on sources that you know are verified trusted sources. Some companies verify users with a check mark inside a blue circle, but just because users have that symbol does not mean that they are a trusted source of news or election information.  Another safe practice is to rely on multiple trusted sources before you reach any conclusions about a news report.

3.  Caution: Be cautious with "friends" or "followers" on social media you don't know. They might be fake accounts trying to make you believe they are on your side. They aren't.

Many Americans may not realize that, in the 2016 election, Russian operatives posed as Americans on social media and pretended to share Americans' political views on both sides of the political spectrum. The bipartisan Senate Intelligence Report described in Volume 2, p. 3:  "Masquerading as Americans, these operatives used targeted advertisements, intentionally falsified news articles, self-generated content, and social media platform tools to interact with and attempt to deceive tens of millions of social media users in the United States. This campaign sought to polarize Americans on the basis of societal, ideological, and racial differences, provoked real world events, and was part of a foreign government's covert support of Russia's favored candidate in the U.S. presidential election." We have used examples of posts from the fake accounts run by Russian operatives in the 2016 election, which were analyzed by the U.S. House of Representatives Permanent Select Committee on Intelligence as HPSCI Minority Open Hearing Exhibits. These examples show that the Russian operatives tried to trick Americans into believing the operatives were on their side, so they could manipulate Americans. As the Senate Intelligence Committee explained in Volume 2 at p. 32-33: "In practice, the IRA's influence operatives dedicated the balance of their effort to establishing the credibility of their online personas, such as by posting innocuous content designed to appeal to like-minded users. This innocuous content allowed IRA influence operatives to build character details for their fake personas, such as a conservative Southerner or a liberal activist, until the opportune moment arrived when the account was used to deliver tailored 'payload content' designed to influence the targeted user. By this concept of operations, the volume and content of posts can obscure the actual objective behind the influence operation. 
'If you're running a propaganda outfit, most of what you publish is factual so that you're taken seriously,' Graphika CEO and TAG researcher John Kelly described to the Committee, '[T]hen you can slip in the wrong thing at exactly the right time.'" In other words, your "friends" on social media may be your enemies trying to trick you.

4.  Caution: The FBI warns don't fall for fake election results or fake news about voter fraud or irregularities in voting.

In their Sept. 22, 2020 public service announcement, the FBI and CISA warn that foreign actors and cybercriminals may attempt to spread disinformation about the 2020 election results or other false information "to discredit the electoral process and undermine confidence in U.S. democratic institutions." Given the pandemic, this year's election may see increased use of mail-in ballots, which may require greater time to tabulate. "Foreign actors and cybercriminals could exploit the time required to certify and announce elections’ results by disseminating disinformation that includes reports of voter suppression, cyberattacks targeting election infrastructure, voter or ballot fraud, and other problems intended to convince the public of the elections’ illegitimacy." We offer several tips drawn from the FBI and CISA public service announcement: 

How to avoid fake election results or fake news of voting problems. 

  • “For information about final election results, rely on state and local government election officials."

  • "Verify through multiple reliable sources any reports about problems in voting or election results, and consider searching for other reliable sources before sharing such information via social media or other avenues.”

  • When you are waiting for election results: “Seek out information from trustworthy sources, such as state and local election officials; verify who produced the content; and consider their intent.”

How to report election misinformation to social media companies or election crimes to the FBI.

  • “If appropriate, make use of in-platform tools offered by social media companies for reporting suspicious posts that appear to be spreading false or inconsistent information about election-related problems or results.” (See the next section below.)

  • “Report potential election crimes—such as disinformation about the manner, time, or place of voting—to the FBI. The FBI encourages victims to report information concerning suspicious or criminal activity to their local field office: www.fbi.gov/contact-us/field-offices.”

5.  Go: These safe practices protect everyone on social media.

6.  Go: Vote!

Election Day is Tuesday, November 3. Many states have early voting and mail-in ballots. The U.S. Election Assistance Commission provides a collection of state voting resources at https://www.eac.gov/.

Sources for the Public Service Announcements

We have studied the available information from U.S. government sources on the Russian interference in the 2016 U.S. election and on the current threat of foreign interference in the 2020 U.S. election. We have also reviewed the community standards of the major social media companies and their announced efforts to combat election misinformation and foreign interference. The primary sources of our public service announcements come from the following: 

  1. FBI and CISA, Public Service Announcement, "Foreign Actors and Cybercriminals Likely to Spread Disinformation Regarding 2020 Election Results," Sept. 22, 2020. 
  2. Report of the Select Committee on Intelligence United States Senate on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, Vol. 2: Russia's Use of Social Media with Additional Views 
  3. U.S. House of Representatives Permanent Select Committee on Intelligence, HPSCI Minority Open Hearing Exhibits 
  4. We have also reviewed the community standards of all the major social media companies, as well as news reports of their ongoing efforts to stop election misinformation. 
  5. We also reviewed the Department of Homeland Security's "Homeland Threat Assessment October 2020," which warns: "Threats to our election have been another rapidly evolving issue. Nation-states like China, Russia, and Iran will try to use cyber capabilities or foreign influence to compromise or disrupt infrastructure related to the 2020 U.S. Presidential election, aggravate social and racial tensions, undermine trust in U.S. authorities, and criticize our elected officials." (at 5)

Where you can report election crimes

Where you can report election misinformation on social media

Our Commitment to Nonpartisanship

The Free Internet Project is a Section 501(c)(3) organization. We steadfastly abide by the requirement to refrain from participating in any political campaign on behalf of any candidate for public office. We believe that, in our democracy, every citizen's right to vote their preference should be respected. We have relied on U.S. government sources, including the FBI and CISA and the bipartisan Report of the U.S. Senate's Select Committee on Intelligence, to provide verified information. Our public service announcements are intended as a voter education guide, similar to the Sept. 22, 2020 public service announcement by the FBI and CISA. Nothing in our public service announcements should be interpreted as an endorsement of or opposition to any candidate for any public office.

 

Mark Zuckerberg: Facebook to suspend political ads week before US election, add a label to premature election claims of victory

On September 3, 2020, Mark Zuckerberg published a lengthy post on his personal Facebook profile, detailing dramatic new measures Facebook is undertaking to safeguard the integrity of the U.S. elections. Zuckerberg wrote [we've added topical descriptions in brackets]:

Today, we're announcing additional steps we're taking at Facebook to encourage voting, connect people with authoritative information, and fight misinformation. These changes reflect what we've learned from our elections work over the past four years and the conversations we've had with voting rights experts and our civil rights auditors:


[Reliable Information at the top of page] We will put authoritative information from our Voting Information Center at the top of Facebook and Instagram almost every day until the election. This will include video tutorials on how to vote by mail, and information on deadlines for registering and voting in your state.

[No political ads starting the week before the election] We're going to block new political and issue ads during the final week of the campaign. It's important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech, but in the final days of an election there may not be enough time to contest new claims. So in the week before the election, we won't accept new political or issue ads. Advertisers will be able to continue running ads they started running before the final week and adjust the targeting for those ads, but those ads will already be published transparently in our Ads Library so anyone, including fact-checkers and journalists, can scrutinize them.

[Partnering with state election authorities to identify election misinformation] We're going to extend our work with election officials to remove misinformation about voting. We already committed to partnering with state election authorities to identify and remove false claims about polling conditions in the last 72 hours of the campaign, but given that this election will include large amounts of early voting, we're extending that period to begin now and continue through the election until we have a clear result. We've already consulted with state election officials on whether certain voting claims are accurate.

[Limit the number of chats you can forward on Messenger] We're reducing the risk of misinformation and harmful content going viral by limiting forwarding on Messenger. You'll still be able to share information about the election, but we'll limit the number of chats you can forward a message to at one time. We've already implemented this in WhatsApp during sensitive periods and have found it to be an effective method of preventing misinformation from spreading in many countries.

[Remove both explicit and implicit voting misinformation] We're expanding our voter suppression policies. We already remove explicit misrepresentations about how or when to vote that could cause someone to lose their opportunity to vote -- for example, saying things like "you can send in your mail ballot up to 3 days after election day", which is obviously not true. (In most states, mail-in ballots have to be *received* by election day, not just mailed, in order to be counted.) We're now expanding this policy to include implicit misrepresentations about voting too, like "I hear anybody with a driver's license gets a ballot this year", because it might mislead you about what you need to do to get a ballot, even if that wouldn't necessarily invalidate your vote by itself.

[Remove COVID-misinformation to scare voters from voting] We're putting in place rules against using threats related to Covid-19 to discourage voting. We will remove posts with claims that people will get Covid-19 if they take part in voting. We'll attach a link to authoritative information about Covid-19 to posts that might use the virus to discourage voting, and we're not going to allow this kind of content in ads. Given the unique circumstances of this election, it's especially important that people have accurate information about the many ways to vote safely, and that Covid-19 isn't used to scare people into not exercising their right to vote.

Measures to stop false or premature election results

Since the pandemic means that many of us will be voting by mail, and since some states may still be counting valid ballots after election day, many experts are predicting that we may not have a final result on election night. It's important that we prepare for this possibility in advance and understand that there could be a period of intense claims and counter-claims as the final results are counted. This could be a very heated period, so we're preparing the following policies to help in the days and weeks after voting ends:

[Facebook Voting Information Center provides information on the time it takes to count votes] We'll use the Voting Information Center to prepare people for the possibility that it may take a while to get official results. This information will help people understand that there is nothing illegitimate about not having a result on election night.

[Partner with Reuters and National Election Pool for authoritative information on election results] We're partnering with Reuters and the National Election Pool to provide authoritative information about election results. We'll show this in the Voting Information Center so it's easily accessible, and we'll notify people proactively as results become available. Importantly, if any candidate or campaign tries to declare victory before the results are in, we'll add a label to their post educating that official results are not yet in and directing people to the official results.

[Label posts that attempt to delegitimize the election results] We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud. This label will provide basic authoritative information about the integrity of the election and voting methods.

[Expand Facebook policy against content with violence and harm directed at election officials] We'll enforce our violence and harm policies more broadly by expanding our definition of high-risk people to include election officials in order to help prevent any attempts to pressure or harm them, especially while they're fulfilling their critical obligations to oversee the vote counting.

[Expand Facebook policy against militia and conspiracy groups organizing or supporting violence] We've already strengthened our enforcement against militias, conspiracy networks like QAnon, and other groups that could be used to organize violence or civil unrest in the period after the elections. We have already removed thousands of these groups and removed even more from being included in our recommendations and search results. We will continue to ramp up enforcement against these groups over the coming weeks.

It's important to recognize that there may be legitimate concerns about the electoral process over the coming months. We want to make sure people can speak up if they encounter problems at the polls or have been prevented from voting, but that doesn't extend to spreading misinformation. We'll enforce the policies I outlined above as well as all our existing policies around voter suppression and voting misinformation, but to ensure there are clear and consistent rules, we are not planning to make further changes to our election-related policies between now and the official declaration of the result.

In addition to all of this, four years ago we encountered a new threat: coordinated online efforts by foreign governments and individuals to interfere in our elections. This threat hasn't gone away. Just this week, we took down a network of 13 accounts and 2 pages that were trying to mislead Americans and amplify division. We've invested heavily in our security systems and now have some of the most sophisticated teams and systems in the world to prevent these attacks. We've removed more than 100 networks worldwide engaging in coordinated inauthentic behavior over the past couple of years, including ahead of major democratic elections. However, we're increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders.

I believe our democracy is strong enough to withstand this challenge and deliver a free and fair election -- even if it takes time for every vote to be counted. We've voted during global pandemics before. We can do this. But it's going to take a concerted effort by all of us -- political parties and candidates, election authorities, the media and social networks, and ultimately voters as well -- to live up to our responsibilities. We all have a part to play in making sure that the democratic process works, and that every voter can make their voice heard where it matters most -- at the ballot box.

 

Facebook enlists independent researchers and Social Science One to study how Facebook, Instagram affect 2020 US elections

On Aug. 31, 2020, Facebook announced a new research initiative it started with Social Science One committee chairs, Professors Talia Stroud of the University of Texas at Austin and Joshua Tucker of New York University. The researchers will "examine the impact of how people interact with our products, including content shared in News Feed and across Instagram, and the role of features like content ranking systems." The research projects conducted on Facebook or via data from Facebook will start soon and end in December, after the November 2020 election. Facebook "expect[s] between 200,000 and 400,000 US adults may choose to participate in the study, which could include things like taking part in surveys or agreeing to see a different product experience. We will also study trends across Facebook and Instagram – but only in aggregate."

Interestingly, Facebook believes that the research projects will not affect the outcome of the U.S. elections: "With billions of dollars spent on ads, direct mail, canvassing, organizing and get out the vote efforts, it is statistically implausible that one research initiative could impact the outcome of an election. The research has been carefully designed to not impact the outcome of the election or harm participants. The sample of participants represents approximately 0.1% of the entire US eligible voting population spread across the US. By better understanding how people use our platform during an election, we can continually enhance the integrity of the platform moving forward." 

Facebook seems to gloss over the fact that a few swing voters in key swing states or precincts could ultimately determine the outcome of some of the elections. Without knowing the details of the various research projects, it's hard to evaluate the potential effect they may have on voters. 

The independent researchers are: 

  • Hunt Allcott, New York University 
  • Deen Freelon, University of North Carolina at Chapel Hill
  • Matthew Gentzkow, Stanford University
  • Sandra Gonzalez-Bailon, University of Pennsylvania
  • Andrew Guess, Princeton University
  • Shanto Iyengar, Stanford University
  • Young Mie Kim, University of Wisconsin-Madison
  • David Lazer, Northeastern University 
  • Neil Malhotra, Stanford University
  • Brendan Nyhan, Dartmouth College
  • Jennifer Pan, Stanford University
  • Jaime Settle, William & Mary
  • Talia Stroud, The University of Texas at Austin
  • Emily Thorson, Syracuse University
  • Rebekah Tromble, The George Washington University
  • Joshua A. Tucker, New York University
  • Magdalena Wojcieszak, University of California, Davis; University of Amsterdam

Facebook describes the scope of research projects as follows:

The independent academics are collaborating with Facebook researchers to design a diverse set of studies to analyze the role of Facebook and Instagram in the US 2020 election. To collect the information for the study, we are partnering with NORC at the University of Chicago, an objective, non-partisan research institution that has been studying public opinion since 1941. NORC possesses deep expertise in survey research, policy evaluation, data collection, advanced analytics and data science. The study was approved by NORC’s Institutional Review Board.

For people who have explicitly opted in to the study, we plan to combine multiple research methods, including surveys and behavioral data analysis, along with targeted changes to some participants’ experiences with Facebook and Instagram. For example, participants could see more or fewer ads in specific categories such as retail, entertainment or politics, or see more or fewer posts in News Feed related to specific topics. Other participants may be asked to stop using Facebook or Instagram for a period of time. A subset of participants may be asked to install an app on their devices – with their permission – that will log other digital media that they consume. This will allow researchers to understand more comprehensively the information environment that people experience. 

Facebook announces content moderation policy change in clamp down on QAnon and movements tied to violence

On August 19, 2020, Facebook announced a change to its community standards for moderating content on Facebook for safety reasons. Facebook's community standards already require the removal of content that calls for and advocates violence and the removal of individuals and groups promoting violence. Facebook now will restrict content that doesn't necessarily advocate violence but is "tied to offline anarchist groups that support violent acts amidst protests, US-based militia organizations and QAnon." Facebook explained, "we have seen growing movements that, while not directly organizing violence, have celebrated violent acts, shown that they have weapons and suggest they will use them, or have individual followers with patterns of violent behavior." US-based militia organizations and the far-right conspiracy movement QAnon have begun to grow on the social site. As we reported, in July 2020 Twitter suspended 7,000 users who supported QAnon conspiracy theories. Facebook followed suit by removing 790 QAnon groups on Facebook (including one group that had 200,000 members) and 10,000 Instagram accounts in August 2020.

Facebook listed seven actions they planned to take against movements and organizations tied to violence:

  1. Remove From Facebook: Facebook Pages, Groups, and Instagram accounts that are a part of harmful movements and organizations will be removed from the platform when they discuss potential violence. To help identify when violence is taking place, Facebook plans to study the technology and symbolism these groups use.
  2. Limit Recommendations: Pages, Groups, and Instagram accounts associated with harmful organizations that are not removed will not be recommended to people as Pages, Groups, or accounts they might want to follow.
  3. Reduce Ranking in News Feed: Going forward, content from these Pages and Groups will be ranked lower in News Feed. This will reduce the number of people who see these pages in their news feed on Facebook.
  4. Reduce in Search: Hashtags and titles for related content will be ranked lower in search suggestions and will not be suggested in Search Typeahead.
  5. Reviewing Related Hashtags on Instagram: On Instagram specifically, the Related Hashtags feature has been removed. This feature allowed people to view hashtags similar to the ones they use. Facebook says the feature could return once it has introduced better safety measures to protect people using it.
  6. Prohibit Use of Ads, Commerce Surfaces and Monetization Tools: Facebook has planned a two-step approach to prohibiting ads and commerce related to these movements. Currently, it has stopped Facebook Pages tied to these movements from running ads or selling products through the Marketplace and Shop. In the future, Facebook plans to take the stronger step of stopping anyone from running ads that praise or support these movements.
  7. Prohibit Fundraising: Finally, fundraising associated with these groups will be prohibited. Nonprofits that identify with these groups will be barred from using Facebook's fundraising tools.

With the new policy, Facebook expands its existing policy against violence to include the removal of groups and individuals that pose a risk to public safety. Previously, according to Facebook, these groups could not be removed because they did not meet the rigorous criteria to be deemed dangerous. Facebook is not banning QAnon content from the site entirely; rather, it is restricting the ability of the individuals who follow these groups to organize on the platform. QAnon believers can still post these conspiracy theories on an individual basis.

With the expansion of its policy, Facebook takes an important step in stopping the spread of harmful information on its platform. Under the expanded policy, Facebook has already taken down hundreds of groups and ads tied to QAnon and militia organizations, and thousands of accounts tied to these movements on Instagram. Whether these changes will be effective enough to keep Facebook from being used as a tool to organize violence remains to be seen, however.

--written by Bisola Oni

Key Takeaways from Senate Intelligence Committee Report, Vol. 5: Russian Active Measures Campaigns and Interference in the 2016 U.S. Election

With the 2020 U.S. presidential election less than 80 days away and the COVID-19 pandemic affecting the country’s daily operations, the political world has been center stage. Last week, the Senate Intelligence Committee released its lengthy, final report, Volume 5, on the Russian Active Measures Campaigns and Interference in the 2016 U.S. Election. Though the Intelligence Committee was Republican-led, the report is bipartisan, involving both Democratic and Republican input. The report is the culmination of three years of probing countless documents, interviews, and emails. The investigation resulted in what the committee considers “the most comprehensive description to date of Russia’s activities and the threat they posed.” The report concluded: "The Committee found that the Russian government engaged in an aggressive, multifaceted effort to influence, or attempt to influence, the outcome of the 2016 presidential election." The extensive, nearly 1,000-page report focused on various aspects of Russian interference in the 2016 election. Here are some key issues and takeaways of Volume 5. (For our prior summaries, visit here for Volume 4 and Volume 1.)

Paul Manafort's relationship with Russian individuals was a "grave counterintelligence threat"

Paul Manafort is a political consultant and former lobbyist known for having multiple ties to powerful politicians and businesspeople throughout the world, particularly in Russia and Ukraine. In March 2016, Manafort was hired by the Trump Campaign as convention manager, and by May 2016, he was promoted to chairman and chief strategist. The report stated the Trump campaign conducted “no known vetting of him, including his financial situation or vulnerability to foreign influence” (p. 27). The report further highlights that once Manafort was hired onto the Trump campaign, he quickly sought to use his position to resolve his various multi-million-dollar foreign disputes and acquire other work in Ukraine and elsewhere. Also, Manafort hired Russian national Konstantin Kilimnik, who served as a close liaison between Manafort and Russian oligarch Oleg Deripaska and other pro-Russia Ukrainian oligarchs. The committee found that “on numerous occasions, Manafort sought to pass sensitive internal polling data and campaign strategy to Kilimnik.” Though the committee was not able to determine what Manafort and Kilimnik did with this information, there was continuous engagement between Manafort, an American campaign official, and various Russian intelligence personnel. Moreover, "[a]fter the election, Manafort continued to coordinate with Russian persons, particularly Kilimnik and other individuals close to Deripaska, in an effort to undertake activities on their behalf. Manafort worked with Kilimnik starting in 2016 on narratives that sought to undermine evidence that Russia interfered in the 2016 U.S. election." (p. 6) The bipartisan committee described Manafort's relationship as “a grave counterintelligence threat” (p. vii). (As a part of the Mueller investigation, Manafort was convicted by a jury of various charges of tax and bank fraud.)

Michael Cohen negotiated with Russians to help secure approval for Trump Tower Moscow; Trump praised Putin

In September 2015, shortly after Donald Trump announced his candidacy, Trump and the Trump organization pursued a business deal in Russia. Michael Cohen, the Trump organization’s then-executive vice president and Trump’s personal attorney, received two offers from Russian businessmen to create a Trump Tower in Moscow. The negotiations for this deal ran from September to October 2015, until both parties agreed to a Letter of Intent (LOI) that laid out terms involving millions of dollars paid to the Trump organization. In November 2015, the LOI was signed. Cohen kept Trump informed on the status of the deal, and while these negotiations were taking place, Trump made positive public comments throughout his campaign about Vladimir Putin, the Russian president. The report states that Cohen, “at various times understood that both Putin’s and Trump’s comments could be useful in advancing the deal” (p. 407). Even dating back to 2007, Trump wrote a letter to Putin congratulating him on being named Time's "Man of the Year." However, the deal for Trump Tower Moscow fell through in the summer of 2016.

2016 Trump Tower Meeting among Donald Trump Jr., Jared Kushner, Paul Manafort, Russian lawyer Natalia Veselnitskaya

In June 2016, a meeting was held in Trump Tower in New York City. This meeting involved Donald Trump Jr., Jared Kushner, Paul Manafort, Natalia Veselnitskaya, a Russian lawyer, and her four associates. The Committee found it very difficult to establish the contents of this meeting and had to rely on conflicting witness testimony. However, the committee found that Donald Trump Jr. was expecting to receive “derogatory information” from persons connected to the Russian government; however, this information was never transmitted. Another alarming aspect of this meeting is the presence of Veselnitskaya. She is not only a Russian lawyer, but she has previously worked with, and keeps in contact with, various senior members of the Russian government. She has worked as defense counsel for the Russian government and a Russian business, Prevezon Holdings, in cases involving millions of dollars of fraud. The committee found her ties with powerful Russians “extensive and concerning” (p. ix).

Putin Ordered the Hack of DNC Computers; Wikileaks helped; Roger Stone was a liaison 

"The Committee found that Russian President Vladimir Putin ordered the Russian effort to hack computer networks and accounts affiliated with the Democratic Party and leak information damaging to Hillary Clinton and her campaign for president. Moscow's intent was to harm the Clinton Campaign, tarnish an expected Clinton presidential administration, help the Trump Campaign after Trump became the presumptive Republican nominee, and undermine the U.S. democratic process." (p. vii) Moreover, "Wikileaks actively sought, and played, a key role in the Russian influence campaign and very likely knew it was assisting Russian intelligence influence effort." (p. vii) In addition, "Trump and senior Campaign offici.als sought to obtain advance information about WikiLeaks's planned releases through Roger Stone." (p. vii)

The FBI’s Response to the DNC Hack

The committee also investigated an aspect of this issue that was not explored in Robert Mueller’s report from 2019: the FBI’s response to the hacking of the DNC’s computer network. From July 2015 to October 2016, the DNC was subject to Russian hacking. This was a novel issue, given that a nation-state had never before intentionally hacked a private American political party, taken information, and used it against that party by leaking it publicly. The DNC was hesitant to have the FBI address this issue due to the FBI's ongoing investigation of Hillary Clinton’s email servers. What makes this issue even more complicated is that, during this hacking, the FBI was already looking into the domestic impact of foreign threats during a U.S. presidential election, so why did it not help the DNC sooner? The committee found that both the DNC and the FBI were not as communicative as they needed to be. The FBI also “lacked an appropriate process to escalate their warnings of the DNC hack within the DNC” (p. 816).

FBI's Reliance on Steele Dossier 

The Committee also found that the FBI should have verified the controversial Steele dossier that formed a basis of its investigation of Trump's possible connection with Russia: "Regarding the Steele Dossier, FBI gave Steele's allegations unjustified credence, based on an incomplete understanding of Steele's past reporting record. FBI used the Dossier in a FISA application and renewals and advocated for it to be included in the ICA before taking the necessary steps to validate assumptions about Steele's credibility. Further, FBI did not effectively adjust its approach to Steele's reporting once one of Steele's subsources provided information that raised serious concerns about the source descriptions in the Steele Dossier. The Committee further found that Steele's reporting lacked rigor and transparency about the quality of the sourcing." (p. xix)

Committee Avoids Issue of "Collusion" with Russia

The one major issue that the report avoids specifically addressing is whether the Trump campaign colluded with Russia in the interference. Some Senators appended "Additional Views" on that issue. See pp. 941-952. 

--written by Mariam Tabrez
