Report: Nuts & Bolts of Content Moderation

Presented by Engine and the Charles Koch Institute:

“The Internet has enabled individuals to easily consume, create, and share content with other users across the world. Indeed, we post, comment on, and share everything from political opinions, to cat videos, to product reviews every minute on platforms across the Internet. These Internet platforms rely on several things to host user content while maintaining the integrity of their platforms, including their own tireless content moderation efforts as well as the legal framework that has underpinned the Internet since 1996.

That legal framework, especially Section 230 of the Communications Decency Act, has come under attack from all sides in recent years. Some argue Internet platforms are doing too much content moderation, especially in ways that allegedly disadvantage specific political viewpoints. At the same time, others argue Internet platforms are doing too little content moderation and failing to keep users safe and in compliance with state and local rules. As policymakers think through whether there’s a need to change this foundational Internet law, it’s critical that they understand the ways in which all Internet platforms, not just the biggest two or three, rely on this framework to conduct their current content moderation practices.

In this report, and through a series of events in Washington, D.C. in the summer of 2019, Engine and the Charles Koch Institute sought to unpack the nuts and bolts of content moderation. We examined what everyday content moderation looks like for Internet platforms and the legal framework that makes that moderation possible, debunked myths about content moderation, and asked attendees to put themselves in the shoes of content moderators.”

Read the full report here.