The So-Called “SMART Copyright Act of 2022,” and What It Means for Startups

A new bill would make major changes to the way startups that host user content—from podcasts to reviews to direct messages—are expected to handle potential copyright infringement online. It would authorize new requirements for those startups to use filtering technologies. And while it is ostensibly aimed at large Internet platforms, it would tee up complex and expensive compliance frameworks that would sweep up the vast majority of startups, which see little (if any) user infringement.

Background: 

Last week, two senators introduced a bill that would amend the Digital Millennium Copyright Act (DMCA) in a number of ways. As many startups that host content know, § 512 of the DMCA is the law that currently lays out how online service providers (OSPs) are supposed to respond to complaints that a user uploaded or shared something that infringes someone else’s copyright. The DMCA applies to all types of online services—including startups that allow users to post creative content like images or podcasts, websites that allow users to write comments or reviews, services that enable direct messaging or cloud storage, and much more. 

Section 512 is set up as a safe harbor. This means that if an OSP is sued for infringement based on something a user did without the company’s knowledge or direct involvement, the company shouldn’t automatically be held liable if it followed the steps laid out in the statute. For example, the DMCA establishes the notice-and-takedown framework for removing allegedly infringing posts and requires companies to have repeat infringer policies.

This is a balanced framework that has worked well, especially for startups—most of whom do not see much (if any) infringement. As we’ve explained, startups tend to cater to niche communities and serve specific purposes; users don’t sign up for a platform for discussing workplace problems with plans to listen to top 40 music, or download the local review app in the hopes of watching box office hits.

In the rare cases where they do see infringement, these companies can remove the allegedly infringing content they know about, which alleviates a persistent fear of expensive litigation (where damages for even one infringing post can reach $150,000). That certainty has opened up all kinds of avenues for innovation and competition: the U.S. continues to lead the global Internet ecosystem—a sector that contributes trillions to the economy each year—and growth in copyright-intensive industries continues to outpace other sectors.

But there is a steady drumbeat of proposals to change that law and undermine the certainty startups need. The most recent of those proposals is the Strengthening Measures to Advance Rights Technologies (SMART) Copyright Act of 2022.

What’s in this bill:

  • Expanded definition of “standard technical measures” (STMs) 🡪 Currently, to qualify for the DMCA safe harbor, OSPs have to accommodate and not interfere with standard technical measures—which are defined as measures “used by copyright owners to identify or protect copyrighted works.” But the bill would expand that definition so that OSPs (including startups) could only qualify for the safe harbor if they used certain technology “to identify or manage copyrighted works on the service.” 

    • So what? This would require startups to affirmatively adopt copyright management technology deemed “standard,” or else incur the risk and substantial cost of litigation were a user to upload something infringing. The bill would shift to an expectation that startups purchase or build technology, e.g., to review user posts as they’re shared and find possible infringement on their own. That’s not cheap. For example, licenses to existing audio fingerprinting tools run upwards of $10,000 per month, and companies have to pay more (and spend engineering time) to integrate and maintain off-the-shelf tools. And building the technology in-house is, of course, even more expensive. For example, YouTube has spent more than $100 million on Content ID—a very sophisticated system that still draws substantial criticism from some rightsholders who say it doesn’t go far enough, and from creators and Internet users who have posts improperly removed or demonetized.

  • Creating a new class of “designated technical measures” (DTMs) 🡪 The bill would create “designated technical measures”—different copyright management technologies that OSPs would have to use, and which the Library of Congress (through the Copyright Office) would be authorized to select. If a company or website failed to use the required technologies, that in itself would be a legal violation, and it could be sued for statutory damages (regardless of whether the company saw any infringement). These DTMs are different from STMs and unrelated to the DMCA safe harbor.

    • So what? This would create another layer of technology OSPs have to adopt, multiplying the costs of development, licenses, integration, and maintenance. But it would also multiply opportunities for lawsuits against startup OSPs. A startup that never experiences a single (allegedly) infringing user post could still be sued if a plaintiff thinks it isn’t using the right DTMs. And copyright owners could seek statutory damages of $150,000 to $2.4 million in repeated lawsuits. This would in turn drive away investors reluctant to have their money go to legal fees. Finally, there are very credible questions about the Copyright Office’s competence to dig in on these highly technical issues.

  • Authorizing a web of different, potentially conflicting, technology mandates 🡪 The bill includes two different categories of technical measures—“standard” and “designated.” But it carves those categories up further, so that the required technical measures could vary depending on the type of content a company hosts, its size and resources, details about the relevant industry, and more. So there could be STMs that apply to all OSPs, different STMs that apply to some services and some types of works, other DTMs that apply to all OSPs, different DTMs that apply to a subset of copyrighted works, separate DTMs that apply to a subset of services, and so on.

    • So what? Because the bill contemplates a lot of different technical measures, it is teeing up a complex patchwork of required technologies—a maze startups would have to navigate when determining everything they have to build or license. For example, a platform that allows users to post photos and comments could quickly run into a handful of different required technologies. And a company that grows during the year may find itself subject to new technical mandates each month. The bill also acknowledges that some of these technical measures may conflict with each other. While it instructs the Copyright Office to examine conflicting requirements, it still allows those conflicts, opening up a lose-lose situation where OSPs would be unable to comply with the law because they could not use two required technologies simultaneously.

  • Adding new legal terms that would have to be litigated 🡪 The bill brings in new terms that would complicate or prolong copyright and technical measures lawsuits. For example, an STM would be developed by a group of “relevant” companies to apply to a “particular” industry. Service providers would be required to use “commercially reasonable efforts” to implement DTMs. And the Copyright Office would provide guidance on what it means for different companies to “accommodate” a given technical measure (because what one company must do to accommodate a technology can differ from what another company must do with the same technology).

    • So what? Getting sued is a big deal for startups: even just getting to summary judgment in an intermediary liability case can cost $500,000 in legal fees, and the cost of proving you qualify for the DMCA’s safe harbor can be high enough to put startups out of business. But new legal standards built on ambiguous terms and fact-bound interpretation would drive up the costs and duration of lawsuits. For example, understanding what is “relevant” to a given OSP, who fits in a “particular industry,” and what form of accommodation is “commercially reasonable” would require a lot of fact and expert discovery.

  • Creating a new rulemaking proceeding where startups might be at an inherent disadvantage 🡪 The bill would authorize the Library of Congress (through the Copyright Office) to conduct a rulemaking every three years to designate, revise, or withdraw technical measures. These sorts of rulemakings are relatively common in government—Congress passes a law, and then some other part of the government creates more specific rules to implement it. Here, a petitioner would ask the Copyright Office to pick a technology for DTM treatment. (So someone who’d created a new filtering tool could ask the Copyright Office to decide that every OSP has to purchase a license.) Then the Office would solicit written public comments and convene hearings to collect evidence and hear arguments about why a measure should or should not be designated for all (or some) OSPs to use. The Office would look at things like the total amount of alleged or actual infringement on relevant OSPs, their revenue and financial resources, how the technology would affect fair use, security risks, competition among OSPs, and so on. Then the Office would publish any designated measures on a website.

    • So what? People who know to participate in rulemakings, and who can regularly afford the time, money, and lawyers it takes, would be able to share their thoughts and ensure their circumstances were accommodated in any rule. But these processes are a big lift. Wealthy organizations like the Recording Industry Association of America and the Motion Picture Association (which spend millions a year on lobbying alone) are already repeat players in Copyright Office rulemakings, while it often falls to pro bono lawyers and public interest groups to bring opposing perspectives to the Office’s attention. This bill would expect startups to show up at these rulemakings, or else be subject to the same rules and technology requirements as much larger competitors (who can, and would, participate in rulemaking).

So would the bill create upload filter mandates? Upload filters are what content industry lobbyists most often point to when they discuss these types of technical measures, and they have long pushed for content filtering mandates—something this bill opens the door to. (Indeed, the bill’s co-sponsors tried to deflect this criticism by suggesting that Creative Commons licenses might have a role to play, which drew a swift objection from CC itself.)

Engine has written a lot about how error-prone those filtering technologies are. Over-reliance on technology and over-removal of non-infringing content are real problems with real consequences for OSPs and their users. Classical music concerts are pulled from the Internet, people are blocked from posting criticism and commentary, and large rightsholders can argue in court that something is not copyrightable (when they are accused of infringement) yet turn around and accuse others of infringement for using the same material. The Copyright Office itself heard scores of complaints about filtering technology last month.

The Strengthening Measures to Advance Rights Technologies Copyright Act raises very real concerns for startups and innovation, as well as for Internet users, free speech, and small businesses that depend on Internet platforms.

Disclaimer: This post provides general information related to the law. It does not, and is not intended to, provide legal advice and does not create an attorney-client relationship. If you need legal advice, please contact an attorney directly.