We have a community moderation problem, not a community-building problem

What if fixing existing communities is more mission-critical than starting new ones? 🛠️


The Good

Online platforms offer community options that weren't available before the 1990s. Before the internet, it could be challenging to find another person interested in, say, competitive duck herding. Online communities gave people all over the world a way to bond over shared interests.

The Bad

The downside to these communities is that they often lack moderation. Moderation is a common feature of offline communities, but it's largely missing in places like Facebook and Twitter. Part of this is an accident of history: Section 230 of the Communications Decency Act established that online platforms should not be treated as publishers. In other words, social media sites like Twitter and Facebook are not liable for what their users write on their platforms.

I am not going to give an exhaustive treatment of the pros and cons of this law. What I will say is that it encouraged unmoderated forums. Threats of rape and murder became routine in comment threads. Rampant disinformation campaigns spread through Facebook threads.

The Future: Thoughtful Moderation

The hate became so overwhelming that many major publications, like CNN, removed the comment sections from their articles. That's a shame, because comment sections can be places of community.

There's a lot of talk about building community, but what if it's really a call for meaningful moderation? What if we're sick of hate speech but don't think shutting down online platforms is the answer?

I think when many of us say we want "community," what we're really saying is that we want thoughtful moderation. We want spaces where members are held accountable. We want moderators who can curate our experiences and help us grow together as a community.