Vanderbilt Law Review

First Page: 1265

Abstract

The modern internet is vast, with more than 2.5 quintillion bytes of data created every day. Content is created, uploaded, downloaded, and shared across an increasingly large number of platforms. Most of this content is legal; some, however, is illegal, including hate speech, child sexual abuse material, and content that infringes intellectual property rights. Section 230 of the Communications Decency Act ("CDA") provides that websites are not liable for content posted to their platforms by third parties. Instead, websites set their own content moderation policies, and the law assumes that they will moderate, since exposure to graphic or otherwise upsetting content deters the average user.

This approach has been largely successful, but there are growing concerns about the proliferation of child sexual abuse material and sex trafficking content, and about whether platforms are doing enough to prevent the spread of illegal content online. Adult content websites such as Pornhub and OnlyFans, which host legal pornography alongside illegal content, have been a primary target of this concern. Congress's attempts to legislate the issue have been ineffective: FOSTA-SESTA, passed in 2018, created an exception to § 230's blanket grant of immunity for sex trafficking content but has not been used in any prosecution to date. Instead, private companies, most frequently payment processors like Visa, Mastercard, and PayPal, are making the decisions regarding vice and illegal content. In practice, this has meant shutting down payments to a website until the platform agrees to comply with the payment processor's content moderation and verification policies. While technically effective for victims of illegal content, this approach entails the mass disenfranchisement of legal sex workers.

This Note proposes a reconsideration of § 230's blanket grant of immunity through a statutory revision modeled on the Digital Millennium Copyright Act ("DMCA"). The DMCA implements a notice-and-takedown model for copyright infringement, and a statutory revision to § 230 could do the same for illegal content. The notice-and-takedown model offers a content moderation strategy that curbs the rapid dissemination of illegal content. This Note argues that a notice-and-takedown model of liability for illegal content would respond to the needs of both victims and platforms without undermining the foundations of the free and open internet or disenfranchising legal sex workers.
