Despite Big Tech Focus, Startups Stand to Lose in 230 Debate

TLDR: A Senate panel is holding a hearing this morning with the CEOs of Facebook and Twitter to discuss Section 230 and allegations of political bias in the context of the 2020 presidential election. Although policymakers continue to scrutinize Section 230 over alleged censorship by the largest tech companies, any changes to the law would have an outsized impact on U.S. startups that rely on the bedrock Internet law to host and moderate user content without fear of potentially crippling lawsuits.

What’s Happening This Week: The Senate Judiciary Committee is holding a hearing this morning with Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey to discuss “the platforms’ censorship and suppression of New York Post articles” and “review the companies’ handling of the 2020 election.” Republican members of the panel, led by Chairman Lindsey Graham (R-S.C.), pushed for the hearing—scheduled before President Donald Trump lost the presidential election earlier this month—after both platforms moderated controversial New York Post articles about President-elect Joe Biden’s son.

The hearing comes as the Trump administration and other Republican lawmakers continue to make unsubstantiated claims of anti-conservative bias by large Internet platforms. Twitter's and Facebook's decisions to moderate the New York Post articles led to renewed calls from Trump and other Republican lawmakers to repeal Section 230. Meanwhile, across the aisle, dozens of Senate Democrats sent a letter to Facebook CEO Mark Zuckerberg this week expressing concerns about anti-Muslim content spreading across the platform.

The Senate Commerce Committee held a hearing last month with the CEOs of Facebook, Google, and Twitter that was supposed to discuss Section 230 and content moderation through the lens of big tech and alleged online censorship, but policymakers used much of the hearing to make unfounded claims of political bias on the part of large tech companies without offering any substantive legislative proposals.

Earlier this year, Trump issued an executive order on “preventing online censorship” that directed federal agencies—including the Federal Communications Commission—to examine Section 230’s liability limitations. FCC Chairman Ajit Pai announced last month that the agency will move forward with the Trump administration’s petition. Several leading House Democrats have already asked the FCC to “stop work on all partisan, controversial items,” including the agency’s politically-motivated review of Section 230, ahead of the presidential transition.

Why it Matters to Startups: Although policymakers have framed their criticisms of Section 230 around the actions of the largest tech companies, repealing or revising the law would have an outsized impact on the U.S. startups that host and moderate user-generated content. As we saw during last month’s Senate Commerce hearing, lawmakers have largely ignored the impact that changes to Section 230 would have on the startup community in favor of leveling politically-motivated and baseless attacks against the largest tech companies.

Emerging startups rely on the liability framework created by Section 230 to attract funding, grow, and acquire users. Even with Section 230, it can cost tens of thousands of dollars to get lawsuits over user content dismissed. Changing the law would cause those legal costs to quickly climb into the hundreds of thousands of dollars, a financially ruinous sum for startups on bootstrap budgets. And no revision to Section 230 can force platforms to host content they would otherwise remove. Instead, Republican policymakers pushing to amend Section 230 to address claims of anti-conservative bias would effectively be using the threat of private lawsuits to strong-arm platforms into hosting legal—though still problematic—content.

And those costs loom large regardless of the content moderation efforts companies undertake to keep their platforms safe, appealing, and relevant. Content moderation is already a difficult task for the largest Internet companies, but it’s nearly impossible for startups and other early-stage Internet platforms to hire content moderators or purchase expensive and imprecise filtering tools as they scale. Despite policymakers’ focus on big tech platforms like Twitter and Facebook, Section 230 is what allows every online comment section, file-sharing service, digital messenger, and e-commerce platform to exist. Changing this law in response to politically-driven claims of censorship by large tech companies would have a devastating impact on the Internet ecosystem, making it more difficult for early-stage startups to compete and leaving only the largest and most financially secure companies able to survive.

Policymakers who truly care about promoting competition in the tech industry and combating problematic content online should work with the broader ecosystem—especially startups—to address real problems with digital content and content moderation without undermining Section 230’s liability framework.

On the Horizon.

  • Engine and Stand Together are hosting a virtual discussion today at noon to discuss how the tech sector is addressing society’s biggest challenges.

  • The Software and Information Industry Association is holding a virtual panel discussion at 1 p.m. this afternoon on how industry and government are working to combat deepfakes and other inauthentic content online.

  • The Senate Judiciary Committee is scheduled to consider the Online Content Policy Modernization Act this Thursday at 10 a.m. As we previously noted, this legislation would make it easier to sue startups out of existence and open the door to abusive copyright claims against everyday Internet users.

  • Engine is hosting a webinar tomorrow at 4 p.m. to discuss access to capital policy and how startups can work with policymakers to drive solutions to capital access barriers. RSVP here.

  • Join Engine and the Charles Koch Institute this Friday at noon for a virtual panel on the Nuts and Bolts of Content Moderation. We'll discuss some of the misperceptions about how content moderation actually works and separate the myths from the facts around content moderation and Section 230. RSVP here.