2017 Year in Review: Intermediary Liability

This post is one in a series of reports on significant issues for startups in 2017. In the past year, the startup community's voice helped drive notable debates in tech and entrepreneurship policy, but many of the startup world's policy goals in 2017, such as immigration reform and an open internet, remain unfulfilled. Check back here for more year-end updates and continue to watch this space in 2018 as we follow policy issues affecting the startup community.



In the span of a few short decades, the Internet has quickly become the greatest medium for public engagement in the history of the world. The participatory nature of Web 2.0 is a direct product of policy decisions made in the Internet’s infancy to refrain from holding websites legally responsible for the actions and speech of users that they cannot fully control. Laws like the Digital Millennium Copyright Act and Section 230 of the Communications Decency Act have allowed tens of thousands of startups to build online platforms where users can create, post, and share media of all kinds without facing ruinous legal liability if one rogue user posts something illegal.

Although these laws have been unquestionably successful in spurring the Internet’s growth and are narrowly tailored to ensure that platforms are held liable when they actually engage in wrongdoing, 2017 saw an increase in efforts to roll back these critical protections and subject online intermediaries to legal liability for their users’ conduct.

EU Copyright

Early in the year, as the U.S.'s new government was getting up to speed, most of the action around intermediary liability was taking place overseas, where the European Parliament was still considering changes to its copyright system. Most dangerously, EU policymakers were considering a proposal from the European Commission to require all websites hosting user-generated content to implement technological systems to identify and remove copyrighted material from their sites. In February, Engine published a white paper explaining why filtering technologies are inherently limited in their capacity to address online copyright infringement and are unduly expensive for the small platforms that don't already operate content moderation tools. Although the Commission failed to adequately explain why technologies that can identify only a small range of copyrightable content, and with significant error rates, should be legally required, there appears to be momentum in Europe behind forcing platforms to engage more directly in content moderation.


Section 230

Anti-platform sentiment was not confined to Europe, as U.S. policymakers similarly took aim at longstanding intermediary liability protections. Most prominently, Congress considered two bills nominally meant to help law enforcement prosecute websites that facilitate online sex trafficking but that carried serious unintended consequences for honest platforms. The Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) both sought to amend Section 230 of the Communications Decency Act (CDA 230), a 1996 law that shields websites from liability under civil and state criminal law for the actions of their users. Notably, CDA 230 has never given platforms immunity under federal criminal law for their users' speech, so it does nothing to stop the Department of Justice from prosecuting websites that facilitate online sex trafficking. Nevertheless, the sponsors of SESTA and FOSTA sought to amend CDA 230 to allow state attorneys general to prosecute websites under state laws relating to sex trafficking and to allow private citizens to sue websites whose users engaged in sex trafficking, even if the site itself was never found criminally liable. Engine opposed the original versions of these bills on the grounds that they were unnecessary to actually stop online sex trafficking, created a risk of meritless litigation against honest platforms, and threatened to penalize websites attempting to remediate illegal activity on their sites.

In October, Engine’s Executive Director, Evan Engstrom, testified before the House Judiciary Committee, highlighting these concerns and offering proposed changes to the legislation that would address startup concerns without weakening law enforcement’s capacity to stop bad actors. Although SESTA ended up passing through the Senate Commerce Committee without any significant fixes, the House Judiciary Committee took law enforcement and tech concerns into account, passing an amended version of FOSTA that gives law enforcement more tools to prosecute bad actors while preserving important protections for law-abiding startups. Neither bill has made it to the floor for final passage, but there will likely be movement on these proposals early in 2018.


Social Media Misinformation

In light of the misinformation campaign that plagued social media platforms during the 2016 election, Congress was eager to inquire into how foreign agents were allowed to spread false information through large web platforms, holding several hearings to press companies on their involvement. Although the inquiry has not yet produced any legislation that would hold websites liable for the activities occurring on their platforms, the Senate introduced a bill that would require platforms to disclose information about political ads on their sites.


NAFTA Update

Finally, with international trade a top theme in the 2016 campaign, the new Administration prioritized its efforts to renegotiate the North American Free Trade Agreement—a process that could have a significant impact on web intermediaries. Lobbyists for the large content industries have been attempting to use the NAFTA renegotiation to insert into the new agreement provisions that would weaken the Digital Millennium Copyright Act’s protections for online platforms. Because such international trade agreements are negotiated without much public input, it is easier for Hollywood lobbyists to insert copyright maximalist provisions into trade agreements than it is to pass them through Congress. Once these measures are passed through treaties, the content industries then push for implementing legislation so that domestic law is consistent with treaty obligations. Engine has rallied the startup community to protect meaningful limitations on platform liability in NAFTA and to promote sensible, balanced copyright rules as part of our national trade strategy.

In the end, while few anti-platform policies were actually enacted in 2017, there was significant movement in that direction. Next year will likely see the culmination of these trends, and startups will have to remain engaged to prevent the unwinding of the policy landscape that helped make the Internet what it is today.