Ray Mestad, MJLST Staffer
The practicality, ease of use, and sheer addictiveness of social media have led to its massive expansion around the world. Approximately 65% of the world's population uses the internet, and of that group, only 5% does not use social media. In other words, roughly 60% of the world, around 4.76 billion people, is on social media. For most, social media is one of the simplest ways to stay connected and communicate with friends, family, and others in their circle. But along with the growing use of social media, questions have been raised about the potential liability social media corporations may bear for the content posted on their platforms. Recently, lawsuits have been filed against companies like Google, Twitter, and Facebook for allegedly allowing groups accused of terrorism to spread their message or plan attacks on their platforms. The question we are left with is: to what extent are social media companies responsible for posts on their sites that lead to violence?
The family of Nohemi Gonzalez, an American student killed in the 2015 Islamic State attacks in Paris, is suing Google for platforming the Islamic State by allowing it to post videos on YouTube and then recommending those videos to users through Google's algorithm. And the family of Nawras Alassaf, a Jordanian citizen killed in a 2017 Islamic State attack in Istanbul, is suing Twitter, Google, and Facebook for not doing more to prevent the organization from using their platforms as communications and messaging tools. Gonzalez v. Google and Twitter v. Taamneh will both be argued before the Supreme Court this month, February 2023.
The legal issues in these cases are rooted in Section 230 of the Communications Decency Act, part of the Telecommunications Act of 1996. 47 U.S.C. § 230 protects freedom of expression by shielding intermediaries that publish information posted by users. Section 230(c)(1) states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” This protects web platforms from liability for the content their users post.
Further, Section 230(c)(2) states that “No provider or user of an interactive computer service shall be held liable on account of…any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected…” This is known as the “Good Samaritan” provision. Like Section 230(c)(1), Section 230(c)(2) gives internet providers liability protection, allowing them to moderate content in certain circumstances while safeguarding them from the free speech claims that might otherwise be brought against them.
The question is whether defendant social media platforms should be shielded from liability for platforming content that has allegedly led to or facilitated violent attacks. In Gonzalez, the Justice Department stated that although the company is protected against claims for hosting ISIS videos, a claim may proceed against Google for YouTube's targeted recommendations of those videos. And in Taamneh, the 9th Circuit agreed with the plaintiffs that the claim could go forward under the Anti-Terrorism Act because Twitter had generalized knowledge of the Islamic State's use of its services.
Section 230 has drawn an eclectic mix of critics and supporters. Although many conservatives and free speech advocates support the protections of Section 230, some conservatives oppose the statute due to the perceived suppression of conservative viewpoints on social media platforms. Senator Josh Hawley of Missouri, a prominent Republican, has come out against the statute, arguing that tech platforms ought to be treated as distributors and lose Section 230 protections. Hawley introduced legislation to that effect, the Federal Big Tech Tort Act, which would impose liability on tech platforms. On the left, Section 230 is supported by those who believe it protects the voices of the marginalized, who would otherwise be at the whim of tech companies, and opposed by those who fear that the statute enables political violence and hate speech.
The Supreme Court has now granted certiorari in both Gonzalez and Taamneh. In Gonzalez, the plaintiffs argue that Section 230 should not protect Google because the events occurred outside the United States, because the statute is preempted by the Justice Against Sponsors of Terrorism Act (JASTA), and because algorithmic recommendations transform Google/YouTube from an interactive computer service into an information content provider. Google argues that it should be protected by Section 230, particularly Section 230(c)(1). The 9th Circuit held that although Section 230 does apply abroad, JASTA does not supersede it; instead, the two statutes run parallel to each other. The 9th Circuit further held that the claims based on revenue sharing (rather than ad targeting) should be dismissed. It did not think Google was contributing to terrorism, because Google was motivated by financial enrichment rather than ideology, and it affirmed the dismissal, in part because there was not clear enough information about how much support Google had provided to ISIS. The Supreme Court's decision in this case will help determine whether Section 230 covers algorithmic recommendations.
In Taamneh, the defendants argued both that there was no proximate cause and that Section 230 did not apply. Unlike the Gonzalez plaintiffs, the Taamneh plaintiffs had adequately stated a claim for aiding and abetting, because the social media companies had more explicit knowledge of how their platforms were being used by these groups, and the 9th Circuit reversed the dismissal. The Supreme Court's review of this case will have implications for what it means to support or have a relationship with a group via a social media platform. In both cases, fears were expressed regarding the scope of Section 230, which could bear on its applicability going forward.
Gonzalez and Taamneh will reach the Supreme Court soon. If Section 230 is restricted, platforms may face liability for more of what users post and moderate more aggressively, which could curb harms like hate speech and violence but would also restrict the accessibility and openness that have made the internet what it is today. If Section 230 is preserved as is, that openness remains, but so does the risk of exposing more people to those harms. Whichever way the Court decides, there will be massive implications for what the internet looks like in the future.