The End of an Era: The Uncertain Future of Section 230 Immunity for Social Media Platforms
Lillian H. Rucker | 26 Vand. J. Ent. & Tech. L. 241 (2023)
Major social media platforms (SMPs), such as Facebook, Instagram, and TikTok, have become the primary means of communication for billions of people worldwide. They are the largest modern news distributors and the primary curators of online public discourse. However, the expanding influence of SMPs has led many to publicly scrutinize the content moderation decisions of such platforms, as SMPs regularly remove, block, censor, and ban user-generated content (UGC), including third-party written messages, photos, and videos, at their discretion. Because SMPs exercise immense power and are largely self-regulated, there has been growing public sentiment that SMP content moderation violates users’ free speech rights. Nevertheless, SMP content moderation decisions are protected by Section 230 of the Communications Decency Act of 1996 and by the First Amendment of the United States Constitution.
Congress enacted Section 230 to promote the development of the internet by granting “Good Samaritan” online services the authority to moderate UGC without potential liability. However, “Bad Samaritan” providers have also benefited from this immunity, bringing the law to the forefront of public debate over online free speech. Despite repeated congressional efforts to narrow Section 230’s protections, the future of SMP immunity and online speech now rests with the United States Supreme Court. How the Court resolves this question of statutory interpretation could have widespread, unintended consequences for the modern internet. Because policymakers, not courts, are tasked with redressing societal ills, this Note proposes that Congress articulate a specific liability standard applicable only to SMPs, drawing on the immunity framework in Section 230(c) and narrowly tailored to the unique issues arising from SMPs. This solution avoids constitutional concerns and is consistent with the congressional intent to safeguard the ability of SMPs to moderate content on their platforms, subject to moderately heightened standards for immunity.