3 Content moderation
This chapter covers
- What content moderation is, and why it’s important
- A taxonomy of the content that platforms have to deal with
- Practical challenges of content moderation
- The legal basis of content moderation in the US
Content moderation is the process by which online platforms review and manage the content allowed on their services. For content streaming services, the decision may be made when new content is created or added to the platform, for example, by declining to carry certain movies or TV episodes considered inappropriate. For platforms with user-generated content, such as social media, the task is much more complicated: every user is also a creator of content, which greatly increases the potential for abuse. Platforms have therefore introduced content policies regulating what they allow in their online spaces, and these policies can be thought of as the laws that govern those spaces. And just as laws must be interpreted when they are applied, so must these policies, which are interpreted and enforced by content moderators. The task is far from straightforward, as we will see in the next few pages.

A common misconception is that the removal of content constitutes a breach of free speech rights in the United States. That is incorrect: platforms are private companies and, as such, have the right to decide what content to keep and what to remove, within the bounds of the law.