Executive order on content moderation

What’s happening this week: The tech world is still digesting reports of a draft executive order from the White House that would empower the FTC and the FCC to crack down on social media platforms that moderate content that appears on their sites. The draft order, according to a summary obtained by CNN, would ask the FCC to develop regulations clarifying how and when digital platforms are protected when they remove online content. It also reportedly calls for the FTC to take the new regulations into account when it investigates or sues companies. The proposed order is not final and is subject to change, but it would represent a significant escalation of the Trump administration’s allegations that social media sites are biased against conservatives.

Why it matters to startups: Online platforms of all sizes are forced to deal with the inherent difficulties of moderating content. Too much moderation, and a segment of users will complain that their voices are being censored. Too little, and hateful and misleading—although still legal—content can proliferate, leading to allegations that platforms are not doing enough to combat the spread of hate speech and misinformation.

The White House’s draft executive order represents the most recent threat to Section 230 of the Communications Decency Act, which protects platforms that are faced with making these difficult decisions. Section 230 allows websites of any size to receive liability protections for deleting third-party content they find objectionable, while also being protected from liability for any content that they do not remove. These protections were written to allow digital platforms of all sizes to grow without the constant threat of lawsuits undermining their success.

If the executive order—as reported—is signed, it would force websites to host objectionable material simply out of fear of being held liable for removing the offending content. As Evan Engstrom, Engine’s executive director, pointed out in a Morning Consult op-ed, “Making this process even more difficult by putting legal liability on websites when they take down too much or too little, as policymakers seem insistent on doing, will not improve anything.”

The draft executive order isn’t even the first attempt over the past year to roll back Section 230 protections. Sen. Josh Hawley (R-Mo.) introduced legislation in June that would remove liability protections from Internet platforms and hold them liable for any illegal content on their sites if they do not remain “politically neutral,” a largely meaningless and subjective term. Rep. Paul Gosar (R-Ariz.) introduced similar legislation in the House last month, although no formal text has been released to date.

In a meeting with executives from large tech companies on Friday, Trump administration officials also expressed interest in tools that could predict mass shooters by scanning social media posts. Tech leaders reportedly expressed concerns that such technology is not feasible to implement. It’s difficult to understand the administration’s rationale here: officials want to limit platforms’ ability to police their own sites, while also asking them to more closely monitor the content their users post. Preserving Section 230’s protections, and strengthening platforms’ ability to police their own content, is perhaps the most effective way to combat online hate speech while still preserving the freedom of the Internet ecosystem.

On the Horizon. 

  • Congressional Startup Day, a nationwide celebration of entrepreneurial communities organized by Engine and other nonprofit organizations, is next Wednesday, August 21st. During Congressional Startup Day, we help lawmakers meet with startups in their district to learn more about how government can work with startups to support innovation and entrepreneurship. Congressional Startup Day events will also be held over the next several weeks.