Implications of the House online safety package for startups

This week, a key House committee will consider a slate of online safety proposals, marking movement on one of the many intractable tech policy issues Congress perennially debates but rarely advances to the president’s desk. The proposed legislation could have a significant impact on how startups design customer experiences, deliver their services, and interact with their users, regardless of those users’ ages.

Background

Congress has debated expanding protections for young Internet users for years, responding to the growing role of technology in the everyday lives of individuals of all ages. Lawmakers have debated and voted on several proposals in relevant committees, and the Senate passed its version of the Kids Online Safety Act last Congress. The House declined to take up that version over concerns that the legislation would regulate content and therefore run afoul of the First Amendment.

States have stepped into the space as well by passing legislation of their own. More than a dozen states now require age verification for adult sites, and many others have adopted broader online safety laws covering social media age minimums, default privacy settings for minors, parental-consent requirements, and content-based restrictions. However, most of these laws have been challenged, and many have been blocked, on First Amendment grounds. The Supreme Court upheld age verification requirements for adult content sites in a ruling last year, but other court challenges have raised concerns about free expression, data-collection implications, and overly broad definitions of what counts as harmful content, resulting in several of the laws being enjoined.

In December, House lawmakers rekindled work on the issue. The House Energy and Commerce Subcommittee on Commerce, Manufacturing, and Trade held a legislative hearing and marked up a package of online safety proposals, advancing them to the full committee. The proposals include studies of kids’ technology use, awareness campaigns, age verification frameworks, bans on specific design features, and more. They would change how Internet companies interact with young users, create obligations around determining which users are minors, and could even require startups to build new features or services.

The subcommittee activity revealed partisan splits on key provisions of the bills, even though the underlying issue is a priority for members of both parties. Committee Democrats objected to the removal of provisions they championed, the inclusion of language preempting state law, and the absence of private rights of action. They voiced a preference for the Senate versions of some of the bills, even though those are unlikely to be taken up by House leadership over the same concerns that doomed the bills last Congress.

Issues at play:

Age verification.

Startups are at a dramatic disadvantage when it comes to many of the direct or implied age verification requirements; a company can only limit the features or content available to young users after it determines which users are young and which aren’t. Age verification and estimation tools are expensive and limited in their accuracy, and their use inherently carries tradeoffs for the privacy, security, and expression of all users. Most of the proposals recognize these dynamics by not explicitly requiring age verification and by generally avoiding liability standards that would require it in practice. In the handful of proposals where age verification is explicitly required for general purpose applications, it is to be performed by ostensibly larger entities (app stores and distributors), but even these approaches give rise to concerns that the subcommittee should work to reconcile.

Clarity about where, and to which entities, liability for the (in)accuracy of age signals attaches is paramount. Startups, i.e., developers, should not be liable for, nor be required to further investigate, age signals shared by distributors; they should merely be called on to follow the signals they receive. And lawmakers must be clear-eyed about the fact that requiring developers to build their products around signals from distributors risks cementing the central role that distributors play in the mobile ecosystem, a role that facilitates many startups’ go-to-market strategy but that critics, including some lawmakers, have said raises competition concerns.

Knowledge standard.

The knowledge standard, or the legal meaning of “know” in the legislation, is central to understanding the potential impact on startups because it informs how far a startup is expected to go to determine the age of its users. The bills considered at the subcommittee level had varying knowledge standards: some would create additional burdens for startups, while others helpfully recognized that larger platforms have both a greater impact on online safety and greater resources to comply with the more extensive regulatory requirements designed to improve it.

For example, H.R. 6291, the Children and Teens’ Online Privacy Protection Act, distinguishes “High-Impact” platforms and subjects only those platforms to a higher, more uncertain, more costly knowledge standard, while maintaining the clear and certain “actual knowledge” standard for startups. As it did last Congress, when it carried this approach across multiple bills, the committee should extend the approach to all of the bills that currently fail to recognize the differences between large and small platforms, especially the Kids Online Safety Act, the Reducing Exploitative Social Media Exposure for Teens (RESET) Act, H.R. 2657, Sammy’s Law, and H.R. 6253, the Algorithmic Choice and Transparency Act. Unifying thresholds across the legislation will help balance Congress’ desire to improve online safety with preserving the competitiveness of U.S. startups.

User speech and content moderation.

Ultimately, problematic user content underlies many of the online safety issues Congress seeks to address with the legislation. If an online service facilitates user speech or interaction between users, it cannot ensure, in real time and across all media, languages, cultures, and contexts, that users won’t encounter harmful content created and shared by other users. Startups especially contend with the pressures and challenges of content moderation. They do not have the resources to hire tens of thousands of content moderators or invest hundreds of millions of dollars in content moderation tools as their large competitors do; they are least equipped to handle ruinous legal costs arising from user content; and they do not have the market share or long-standing relationships with their users that would let them risk alienating their user base by either under- or over-moderating user content.

Varying standards.

The wide range of legislation being considered takes a variety of approaches to online safety, each bringing with it differing obligations and key definitions (for example, what qualifies as a “covered platform” and who qualifies as a “minor”). These incongruities alone threaten to increase costs for startups and undermine their competitiveness, leaving in the marketplace only the platforms policymakers are most concerned about.

Infeasible requirements for startups.

Recognizing that companies of different sizes and resources will have differing abilities to determine users’ ages (at the risk of legal liability if they get it wrong) is crucial to an online safety framework that works for startups. Lawmakers have more work to do to tailor the obligations under these proposals to small companies.

A few provisions found in the bills being considered are particularly unworkable for startups. For example, under the Kids Online Safety Act, covered platforms of all sizes, including startups, would be required to conduct annual independent third-party audits. These audit requirements would be prohibitively expensive, and it is unclear which independent third-party organizations would even be prepared to conduct such audits. Using SOC 2 as an illustrative point of comparison (an audit of cybersecurity and privacy controls performed by many software startups), audits from reputable firms can cost between $50,000 and $120,000, or one to two months of an early-stage startup’s resources. Asking startups to perform these audits annually is certain to discourage entry into the space, undermine competitiveness, and ensure that only large platforms remain.

Likewise, the premise of H.R. 6253, the Algorithmic Choice and Transparency Act, turns on requiring platforms to offer alternatives in how they display the content available on their services. This might sound reasonable on its face (indeed, some services differentiate themselves in the market this way), but requiring it as a matter of law would in effect force many startups to build a second, duplicate version of their service. Startups put all of their limited resources into developing their original service, and it is unworkable to ask them to build an entirely new version.

Preemption

State-by-state differences in rules on the same issue create frustration, practical challenges, and duplicative costs for startups in areas from data privacy to hiring. As states do more in the online safety space, an emerging patchwork of rules is taking shape there as well. The bills considered at the subcommittee level that create obligations for companies (as opposed to, say, directing a federal agency to commission a study) included state preemption provisions to ensure online safety rules are uniform nationwide. At the subcommittee markup, nearly all Democrats and even some Republicans voiced skepticism over federal preemption of state online safety rules. Creating a consistent, predictable, nationwide rulebook is essential for startup competitiveness, and startups will be paying attention to what happens with preemption during the full committee markup.

Enforcement

Most of the bills under consideration would be enforced primarily by the Federal Trade Commission (FTC). State attorneys general could also enforce them, provided they give the FTC advance notice. Outside of some Democratic-led proposals, the bills being considered do not provide for private rights of action. Exclusive enforcement by expert agencies ensures consistent, well-grounded application of the law and removes the potential for bad-faith lawsuits from competitors or opportunistic plaintiffs. For early-stage companies operating with small teams and tight budgets, the threat of private litigation can be fatal, even when the underlying claims are baseless.

What comes next:

The House Energy and Commerce Committee will hold a full committee markup of these bills later this week, where they will likely be advanced. The bills may be combined at markup (as some were last Congress), and, hopefully, the issues described above related to infeasible requirements and incongruous standards will be addressed.
