Section 230 back in the spotlight

TLDR: The Trump administration and policymakers are putting growing pressure on a bedrock Internet law that allows companies of all sizes to host and moderate user-generated content. Startups depend on this framework—known as Section 230 of the Communications Decency Act—to grow without the fear of being sued into bankruptcy over the user-generated content they host and moderate. Weakening this law would have a disastrous effect on the Internet ecosystem. 

What’s Happening This Week: The Trump administration petitioned the Federal Communications Commission to regulate the content moderation practices of Internet firms, the latest step in a push to weaken platforms’ ability to host and moderate their users’ content. At issue is Section 230 of the Communications Decency Act, the 1996 law that provides companies of all sizes with liability protections when it comes to hosting and moderating user-generated content.

The Commerce Department’s petition for rulemaking yesterday came after President Donald Trump issued an executive order on “preventing online censorship” that pushed federal agencies to clarify the meaning of “good faith” moderation under the law. The order requests that the FCC review when Internet firms qualify for Section 230 protections and determine what constitutes moderating in “good faith” under the statute.

This morning, the Senate Commerce Subcommittee on Communications, Technology, Innovation, and the Internet also held a hearing on the Platform Accountability and Consumer Transparency (PACT) Act, which has requirements around content moderation practices, appeals, and transparency, and would force companies to remove illegal content from their sites within 24 hours.

Why it Matters to Startups: Content moderation is an inherently difficult and time-consuming task for even the largest Internet firms, but efforts to roll back Section 230 will disproportionately harm digital startups that rely on the bedrock Internet law to host and moderate user content in ways that best serve their users. Startups are keenly aware of how expensive it is to fight off even frivolous lawsuits over users' speech. Section 230 allows these platforms to host and moderate user speech without the risk of crippling suits. 

Internet platforms currently face a lose-lose situation. On the one hand, policymakers are calling on the tech industry to crack down on problematic content—including hate speech, violence, and dangerous misinformation—especially in light of the ongoing pandemic. On the other hand, platforms’ decisions to remove problematic content are quickly derided as political bias. These competing demands are leading to an influx of ill-conceived, legally dubious, and politically motivated policies that could have serious repercussions for the broader Internet ecosystem.

Take President Trump’s executive order, which is ostensibly aimed at curbing online censorship but is widely viewed as a politically motivated response to Twitter’s efforts in recent months to add fact-checking and warning labels to several of the president’s tweets. As we noted in a statement when the order was released, the president’s move “will encourage bad faith lawsuits, and dismantle the fundamental and commonsense legal framework that startups depend on to compete in today’s Internet ecosystem and keep their platforms free of objectionable content.” And there are questions as to whether the FCC even has the legal authority to regulate this type of online speech.

The current legal framework provided by Section 230 gives Internet platforms the ability to host user-generated content without fear of ruinous litigation, and it provides for flexibility in how they moderate that content. While discussions of Section 230 often focus on some of the biggest players in the industry, Section 230 is what allows for every comment section, review website, photo sharing service, and more, on the Internet. Under the current framework, every Internet platform can create its own community, write rules that make sense for that community, and use the tools at its disposal to enforce those rules. Changing Section 230 with only the biggest companies and their seemingly endless resources in mind will ensure that only the biggest companies can afford to host users’ content.

On the Horizon.

  • The Senate Committee on Environment and Public Works is holding a hearing at 10 a.m. tomorrow to discuss the lessons learned from remote working during the pandemic, and whether the government can save money by maximizing the efficient use of leased space.

  • The House Small Business Subcommittee on Rural Development, Agriculture, Trade, and Entrepreneurship is holding a hearing at 10 a.m. tomorrow on “Kick Starting Entrepreneurship and Main Street Economic Recovery.”

  • The House Judiciary Antitrust Subcommittee is holding a hearing tomorrow at noon with the CEOs of Amazon, Apple, Google, and Facebook to discuss allegations of anti-competitive practices by the country’s largest tech firms.

  • 1863 Ventures is holding a virtual town hall this Friday at 2 p.m. to discuss the state of black entrepreneurship and supporting New Majority entrepreneurs.