Content Moderation Debate Needs Startup Voice

TLDR: A key House panel is holding a hearing this Thursday with the CEOs of Facebook, Google, and Twitter to examine the spread of extremism and misinformation online. It’s critical that policymakers concerned about the proliferation of harmful content across the Internet understand how potential legislative remedies would affect the ability of startups and other Internet companies to host and moderate user-generated content.

What’s Happening This Week: Two House Energy and Commerce subcommittees are holding a joint hearing this Thursday at noon with Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey to discuss “Social Media's Role in Promoting Extremism and Misinformation.” The high-profile hearing comes as policymakers continue to criticize the content moderation practices of large Internet companies, with many lawmakers framing their complaints around Section 230—a bedrock law that lets online companies of all sizes host and moderate user-generated content without being held liable for that content. 

While policymakers have spent the past several years expressing concerns about the proliferation of harmful content online, the spread of pandemic and election misinformation—particularly in the context of the January 6th insurrection attempt—has renewed discussions in Congress about Internet companies’ moderation efforts. The hearing comes after the panel’s Democrats and Republicans both sent letters to large Internet companies expressing concerns about their content moderation practices, albeit for different reasons. Democrats claim that online companies are not doing enough to moderate harmful content, while Republicans have attacked Section 230 on the grounds that it lets Internet companies censor conservative voices, despite reports to the contrary. 

Why it Matters to Startups: Any attempt to reform or revoke Section 230’s liability limitations would have an outsized impact on the U.S. startups that rely on the bedrock Internet law to host and moderate user-generated content.

Content moderation is an inherently difficult task for even the largest tech companies, and it’s impossible for any online company—particularly a nascent startup—to identify and remove all harmful user-generated content, let alone define “harmful” in a way that appeases lawmakers on both sides of the aisle. Startups, in particular, cannot afford to employ large teams of human moderators or to purchase expensive, often unreliable automated moderation tools to identify and remove offending content. Instead, Section 230’s legal framework encourages online companies to spend their limited time and resources moderating content in ways that make the most sense for their users.

Even with Section 230, a company can still spend tens of thousands of dollars to get a lawsuit over user-generated content dismissed. Revoking or changing the current framework could push those legal costs into the hundreds of thousands of dollars, making it financially ruinous for startups on bootstrap budgets to host user-generated content at all. Nor would changing Section 230’s liability limitations increase free speech on the Internet or stop the spread of harmful content. Because policymakers cannot force companies to host any particular speech, Section 230’s practical effect is to let Internet companies dismiss frivolous lawsuits at the earliest stage, before the legal costs become untenable for most nascent companies.

Policymakers are, understandably, concerned about the proliferation of health and election misinformation across the Internet. But if policymakers care about combating misinformation and other harmful content—while also promoting competition across the tech industry—they should work with the broader online community, especially startups, to address concerns about content moderation in ways that do not undermine Section 230’s critical liability limitations.

On the Horizon.

  • The Federal Communications Commission is holding a tech startup roundtable at 10 a.m. tomorrow featuring organizations that support minority, women, and small business tech entrepreneurs.

  • NetChoice is holding a virtual policy webinar at 1 p.m. ET tomorrow to discuss free speech, privacy, and innovation online.

  • The National Venture Capital Association is holding a webinar tomorrow at 1 p.m. to discuss the impact of antitrust scrutiny and proposed regulatory changes on VCs and startups.

  • Join Engine tomorrow at 4 p.m. ET for a virtual policy seminar on Section 512 of the DMCA. We’ll discuss how Section 512 impacts startups and how entrepreneurs can get involved in the policy debate. You can RSVP here.