Section 230 of the Communications Decency Act of 1996 has come under fire in the U.S. Congress. Republican lawmakers contend that Internet platforms, such as Facebook, Google, and Twitter, invoke Section 230 as an improper shield to censor content with a bias against conservative lawmakers and viewpoints. These lawmakers contend that Section 230 requires Internet sites to maintain "neutrality" or act as a "neutral public forum." However, some legal experts, including Jeff Kosseff, who wrote a book on the legislative history and subsequent interpretation of Section 230, contend this interpretation is a blatant misreading of the statute, which specifically creates immunity from civil liability for ISPs for "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." President Donald Trump issued an Executive Order that attempts to (re)interpret "good faith" to require political neutrality. The Department of Justice appeared to concede, however, that "good faith" is unclear and recommended that Congress provide a statutory definition of the term. Several Republican lawmakers in the House and the Senate have proposed new legislation that would reform or eliminate Section 230 and limit Internet platforms' ability to censor content that the platforms deem harmful, obscene, or misleading. This article summarizes the proposed bills to amend Section 230.
1. Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 (EARN IT Act, S.3398): no immunity for violation of law on child sexual abuse material unless ISP earns back via best practices
The EARN IT Act was introduced by Senator Lindsey Graham (R-SC) and co-sponsored by Senator Richard Blumenthal (D-CT). The EARN IT Act's main purpose is to carve out from the ISP immunity under Section 230(c)(2)(A) and thus to expose ISPs to potential civil liability pursuant to 18 U.S.C. section 2255 or state law based on activity that violates 18 U.S.C. section 2252 or 2252A (which cover distribution or receipt of child sexual abuse material (CSAM)). However, an ISP can "EARN" back its immunity if it follows the requirements of the Act's newly created safe harbor:
- "(i) an officer of the provider has elected to certify to the Attorney General under section 4(d) of the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020 that the provider has implemented, and is in compliance with, the child sexual exploitation prevention best practices contained in a law enacted under the expedited procedures under section 4(c) of such Act and such certification was in force at the time of any alleged acts or omissions that are the subject of a claim in a civil action or charge in a State criminal prosecution brought against such provider; or
- “(ii) the provider has implemented reasonable measures relating to the matters described in section 4(a)(3) of the Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2020, subject to the exceptions authorized under section 4(a)(1)(B)(ii) of that Act, to prevent the use of the interactive computer service for the exploitation of minors.”
To develop the "child sexual exploitation prevention best practices" required for the new safe harbor, the EARN IT Act would create a commission called the "National Commission on Online Child Sexual Exploitation Prevention." The Commission's duty would be to devise a list of "best practices" for combating child sexual abuse material (CSAM) and send the list to the Attorney General (currently William Barr), the Secretary of Homeland Security, and the Chairman of the Federal Trade Commission—all of whom would serve as members of the Commission—for review. These three members, dubbed the "Committee," would have the power to amend, deny, or approve the list of "best practices" created by the Commission. After the Committee approves a list of "best practices," the list is sent to Congress, which has ninety days to file a "disapproval motion" to prevent the list from going into effect.
Sponsors: Sens. Lindsey Graham (R-SC) and Richard Blumenthal (D-CT)
UPDATED July 4, 2020: The Senate Judiciary Committee unanimously approved the bill (22-0). It will now be considered by the full Senate.
2. Limiting Section 230 Immunity to Good Samaritans Act: creates civil action against edge providers for "intentionally selective enforcement" of content moderation
In June 2020, Sen. Josh Hawley (R-MO) introduced a bill titled the Limiting Section 230 Immunity to Good Samaritans Act. The bill defines a "good faith" requirement in Section 230 for content moderation by a newly defined category of "edge providers": Internet platforms with more than 30 million users in the U.S. or more than 300 million users worldwide, plus more than $1.5 billion in annual global revenue. The category excludes 501(c)(3) nonprofits. The bill defines "good faith" to exclude "intentionally selective enforcement of the terms of service," including enforcement by an algorithm that moderates content. The term is vague. Presumably, it is meant to cover politically biased moderation (see the Ending Support for Internet Censorship Act below), but it might also reach situations in which ISPs selectively enforce their policies simply because of the enormous amount of content (billions of posts) on their platforms, in a kind of triage. The bill also creates a cause of action allowing users to sue Internet platforms that engage in intentionally selective enforcement and to recover $5,000 in statutory damages or actual damages.
Sponsor: Sen. Josh Hawley (R-MO). Cosponsors: Sens. Marco Rubio (R-FL), Mike Braun (R-IN), Tom Cotton (R-AR), and Kelly Loeffler (R-GA)
3. Ending Support for Internet Censorship Act (“Hawley Bill," S.1914): ISPs must get "immunity certification" from FTC that ISP doesn't moderate content in "politically biased manner"
The Hawley bill, the Ending Support for Internet Censorship Act, introduced by Senator Josh Hawley (R-MO) and co-sponsored by Senators Marco Rubio (R-FL), Mike Braun (R-IN), and Tom Cotton (R-AR), seeks to require ISPs to obtain an "immunity certification" from the Federal Trade Commission; the certification requires the ISP "not [to] moderate information provided by other information content providers in a manner that is biased against a political party, political candidate, or political viewpoint." The ISP must "prove to the Commission by clear and convincing evidence that the provider does not (and, during the 2-year period preceding the date on which the provider submits the application for certification, did not) moderate information provided by other information content providers in a politically biased manner."
The bill defines "politically biased moderation" as:
POLITICALLY BIASED MODERATION.—The moderation practices of a provider of an interactive computer service are politically biased if—
- “(I) the provider moderates information provided by other information content providers in a manner that—
- “(aa) is designed to negatively affect a political party, political candidate, or political viewpoint; or
- “(bb) disproportionately restricts or promotes access to, or the availability of, information from a political party, political candidate, or political viewpoint; or
- “(II) an officer or employee of the provider makes a decision about moderating information provided by other information content providers that is motivated by an intent to negatively affect a political party, political candidate, or political viewpoint."
Sponsor: Senator Josh Hawley (R-MO)
4. Stop the Censorship Act (“Gosar Bill,” H.R.4027): removes "objectionable" from Good Samaritan provision for content moderation, limiting it to "unlawful material"
The Gosar bill, the Stop the Censorship Act, seeks to eliminate Section 230 immunity for Internet platforms like Facebook, Google, and Twitter when they censor content that the platforms deem "objectionable." U.S. Representative Paul Gosar (R-AZ), joined by fellow conservative Congressmen Mark Meadows (R-NC), Ralph Norman (R-SC), and Steve King (R-IA), believes the language of Section 230's Good Samaritan blocking provision is too broad. The Gosar bill would strike the language in Section 230 that allows Internet platforms to censor content deemed "objectionable"; the only content that should be censored, the sponsors argue, is "unlawful" content (e.g., CSAM). Further, the bill would give platform users the option to choose between a safe space on the platform (featuring content-moderated feeds controlled by the platform) and an unfettered platform (including all objectionable content). The bill would change Section 230(c)(2) as follows:
Current text: "(2) Civil liability. No provider or user of an interactive computer service shall be held liable on account of— (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or..."
Amended text: "(2) Civil liability. No provider or user of an interactive computer service shall be held liable on account of—(A) any action voluntarily taken in good faith to restrict access to or availability of unlawful material;
(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1); and
(C) any action taken to provide users with the option to restrict access to any other material, whether or not such material is constitutionally protected.”
Sponsor: Rep. Paul Gosar (R-AZ). Cosponsors: Rep. Mark Meadows (R-NC), Rep. Steve King (R-IA), Rep. Ralph Norman (R-SC), Rep. Ted Yoho (R-FL), Rep. Ron Wright (R-TX), Rep. Glenn Grothman (R-WI)
5. Stopping Big Tech's Censorship Act (Sen. Kelly Loeffler (R-GA)): adds conditions to both Section 230(c)(1) and (c)(2) immunities, including subjecting content moderation of Internet platforms to First Amendment-style limitations on government restrictions of speech
US Senator Kelly Loeffler (R-GA) recently introduced the “Stopping Big Tech’s Censorship Act,” which would amend language in Section 230 of the Communications Decency Act to “protect First Amendment Rights” of users on social media platforms.
The first change concerns the immunity in Section 230(c)(1). To qualify for the immunity from defamation and other claims based on the content posted by their users, Internet platforms would have to "take reasonable steps to prevent or address the unlawful use of the interactive computer service or unlawful publication of information on the interactive computer service."
The second change concerns the immunity in Section 230(c)(2). Internet platforms would enjoy Section 230(c)(2) immunity for their content moderation only if: "(I) the action is taken in a viewpoint-neutral manner; (II) the restriction limits only the time, place, or manner in which the material is available; and (III) there is a compelling reason for restricting that access or availability." This set of requirements is substantial and might be hard to reconcile with current community standards. For example, removing hate speech, white supremacist propaganda, neo-Nazi content, racist speech, and other offensive content might be viewed as viewpoint discrimination under this approach.
Disclosure of policies. Further, the bill requires Internet platforms to disclose their content moderation policy: “(A) a provider of an interactive computer service shall, in any terms of service or user agreement produced by the provider, clearly explain the practices and procedures used by the provider in restricting access to or availability of any material; and (B) a provider or user of an interactive computer services that decides to restrict access to or availability of any material shall provide a clear explanation of that decision to the information content provider that created or developed the material.”
-written by Adam Wolfe