FTC's COPPA rule review should consider impact on startups

TLDR: The FTC is reviewing its rule implementing a children’s privacy law to determine whether updates are needed to account for “evolving business practices.” While protecting children’s privacy online is a goal the FTC and the tech community share, some potential changes to the rule could impact platforms across the Internet, especially startups.

What’s happening this week: The Federal Trade Commission held a public workshop this week to discuss potential updates to its rule under the Children's Online Privacy Protection Act (COPPA), a law that requires operators of websites and apps directed to children to take a number of steps to protect the privacy of users under the age of 13.

The FTC says it is examining the rule to determine whether it requires any additional updates to account for “evolving business practices.” The agency usually reviews its rules every ten years, but decided to take up another COPPA review after just six years “because of questions that have arisen about the Rule's application to the educational technology sector, to voice-enabled connected devices, and to general audience platforms that host third-party child-directed content.” The FTC’s public comment period for the COPPA review remains open until Oct. 23.

Why it matters to startups: The FTC’s COPPA review deals with a number of issues critical to the startup community, including how the rules should be applied to sites with general audiences that ostensibly include children. Currently, the FTC’s rules under COPPA apply only to websites and platforms that are directed at children under the age of 13, or that have actual knowledge they are collecting personal information from children, which creates a relatively clear line for Internet companies to follow as they think through their regulatory compliance.

The application of COPPA rules to general audience websites has increasingly become a question in recent FTC actions. Earlier this year, the FTC settled with TikTok (previously called Musical.ly) over charges that the app knew it had users who were children and was illegally collecting personal information about those users. Just last month, the FTC reached a record-breaking $170 million settlement with Google over YouTube’s collection of personal information about children despite knowing that there were child-directed channels on the platform. In addition to the $170 million, the settlement requires YouTube to create a system by which channel operators can identify their content as child-directed so that YouTube can handle data about those viewers in a COPPA-compliant way.

Much of the discussion at this week’s workshop—and many of the questions raised in the FTC’s notice—focused on how the agency should handle platforms that host user-generated content and serve a general audience but include a large number of users who are children. One idea raised during the workshop would be changing the standard under the rules from actual knowledge—such as when a user identifies as being under the age of 13—to constructive knowledge, which would require a platform to try to determine whether a user is under 13 based on factors including what the user is viewing on the platform. This shift away from the bright-line “actual knowledge” standard could pose a huge burden for small platforms, which won’t always have the best information to determine whether a user is under 13 but would face huge regulatory risks if they get it wrong.

Another idea raised both at the workshop and in the agency’s notice was finding ways to incentivize platforms that host user-generated content to take “affirmative steps” through the use of “reasonably calculated” measures to identify child-directed content. Platforms that host content created and uploaded by users around the world every day can never know for sure exactly what users are sharing, a reality that comes up frequently in debates over copyright infringement and general online content moderation. While critics of online platforms are quick to suggest technological solutions to identify infringing, dangerous—and potentially child-directed, in the conversation over updating COPPA rules—content, these solutions, such as expensive-to-build upload filters, have inherent limitations and only work with limited success in very specific contexts. 

In some copyright infringement detection systems, rightsholders upload their content to a database, and new user uploads are scanned against that database to find identical matches. The system only works when identifying identical copies of already-uploaded content, and it can’t account for things like fair use or whether the user has the rightsholder’s permission to use the content. It works even less well when trying to identify ever-changing and amorphous content that could be considered child-directed under the rules, which define “child-directed” through an imprecise, multi-factor test that includes subject matter, language, and the presence of children. Requiring platforms to proactively identify child-directed content created by their users would force companies to build costly, time-consuming, and ultimately ineffective tools.
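To make the limitation concrete, here is a minimal, hypothetical sketch of the simplest form of database scanning described above: exact matching by cryptographic hash. The function names and sample content are invented for illustration; real systems use more sophisticated perceptual fingerprints, but they share the same basic weakness of matching known content rather than judging context.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Return a SHA-256 digest identifying the exact bytes of the content."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical rightsholder database of registered fingerprints.
registered = {fingerprint(b"episode-1-video-bytes")}

def is_flagged(upload: bytes) -> bool:
    """Flag an upload only if it is byte-for-byte identical to registered content."""
    return fingerprint(upload) in registered

# A byte-for-byte copy is caught...
print(is_flagged(b"episode-1-video-bytes"))   # True
# ...but any change at all (re-encoding, cropping, a one-byte edit) evades it,
# and the system has no way to weigh context like fair use or a license.
print(is_flagged(b"episode-1-video-bytes!"))  # False
```

An exact matcher like this can only answer "have I seen these exact bytes before?"—a far narrower question than the multi-factor, context-dependent judgment of whether content is child-directed.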

Protecting children’s privacy is important, and we applaud the agency’s ongoing efforts to safeguard our youngest citizens. But it’s also important for the FTC to consider the impact that any potential new rules would have on websites, apps, and platforms of all sizes.

On the Horizon.

  • Engine and the Charles Koch Institute will be holding the final panel in our series on the nuts and bolts of encryption on Friday, Oct. 18, at noon. We’ll be looking at the calculations that would go into determining whether or not to comply with a law enforcement request for an encryption backdoor. Learn more and RSVP here.

  • Two House Energy and Commerce subcommittees have announced plans to hold a joint hearing on Oct. 16 to examine “online content moderation practices and whether consumers are adequately protected under current law.”

  • The House Small Business Committee announced plans to invite officials from Amazon, Google, and Facebook to a hearing this fall to discuss their impact on small businesses.