Startup founders travel to Capitol Hill with Engine to talk content moderation

When policymakers think about changing the rules for how companies host and moderate user content, they tend to focus on three or four large Internet companies. But we know from talking to startups across the country every day that all kinds of companies host user content online. They may not be the companies that are top of mind when policymakers write rules, but they are the companies least equipped to bear the compliance and litigation costs of rules designed with the largest industry players in mind.

That’s why Engine works to highlight the stories and voices of startups, especially around issues where the conversation is dominated by others. And that’s why last week we brought startup founders to Capitol Hill to discuss content moderation with policymakers and their staff. Founders of seven companies from across the country joined us in D.C.: Atlanta-based iAccess Life; Superior, Colorado-based hobbyDB; Mount Laurel, New Jersey-based TeacherCoach; Los Angeles-based Tomodachi; Omaha, Nebraska-based Event Vesta; Rochester, New York-based Arkatecht; and San Francisco-based Fiskkit. The founders told the stories of their companies and, specifically, of their experience hosting and moderating user content, participating in meetings with 10 different House and Senate offices and speaking on a panel about what content moderation looks like for their businesses.

In addition to being from different parts of the country, these companies have different business models and host different kinds of content for different communities of users. TeacherCoach allows school districts to share mental health resources with teachers, which is very different from hobbyDB, which lets collectors share information about collectibles. And iAccess Life, a platform that lets users with mobility impairments leave reviews about the accessibility of public spaces, has a different experience from Event Vesta, a platform that hosts information about local events. But all of them grapple with limited time and resources to dedicate to content moderation, and none of them has the ability to individually review every piece of content uploaded to their services.

Under current intermediary liability frameworks, namely Section 230 and the Digital Millennium Copyright Act, these companies can launch, attract investment and users, and grow while moderating content in ways that make sense for their communities of users, all without having to worry about potentially ruinous litigation over complaints about user content. Policy changes that would require them to approach content moderation in identical ways, such as proactively monitoring all content, or that would force them to navigate costly private litigation over user content, would disproportionately harm startups like these.

These are the kinds of companies policymakers should be thinking about when they want to change the rules around hosting and moderating user-generated content. We’re grateful these founders took the time to travel to D.C. and help educate policymakers on the kinds of platforms in the startup ecosystem and the wide range of companies that would be impacted by policy changes.