Short answer: No. 

Let’s take a deep dive and find out why! 

When thinking of the independence Web3 offers future content-sharing platforms and their users, one can’t help but wonder whether their content should be moderated in some way. 

Even though preserving freedom of speech is essential, there are circumstances where the absence of any limits would make those same content-sharing platforms less safe for their users. 

But where do we draw the line between content moderation and censorship? 

What does Web3 represent? 

The new phase of the internet will allow people to interact with each other directly, without intermediaries. That means no more middlemen taking a cut of transactions or imposing unnecessary requirements on users before they can access a service. Instead, users will be able to communicate and make payments directly to one another.

Web3 is the natural evolution of the internet. What sets it apart from Web2 is that its applications run on blockchain ecosystems instead of on servers owned by a single corporation. This means that its control and future development lie in the hands of its community. 

See also: What is Web3 And How it Affects Digital Marketing?

This is both exciting and a little bit concerning.

On one hand, we would have more freedom of speech in every aspect – whether in social media, website building, online reviews, and so on. On the other, this would open new paths for malevolent content, such as hate speech, to spread without any control or consequences. 

Why is content moderation important? 

Content moderation is an essential part of any online community. It ensures that people aren’t posting inappropriate content, and helps keep your site safe for everyone. 

But many people find it odd that, even though the ability to publish anything is generally how we advertise Web3 platforms, there is still a need to regulate what kind of content gets posted. They call it censorship.

I’ve had many discussions with people asking the same question: Why is this necessary if Web3’s core concept is to be able to share your thoughts without worrying about corporate regulations? 

And I understand that. But several serious concerns arise if we leave all control to good intentions and wishful thinking. 

Main points of concern that can be prevented with content moderation:

The solution: 

Decentralized concerns call for decentralized measures.

There is already a dApp that offers decentralized content moderation and proof-of-humanity verification. It’s called MODCLUB, and it is fully built and operating on the Internet Computer blockchain. It is based on the idea that Web3 content moderation should naturally rely on community governance. 

Their concept is simple. A platform that needs any kind of user-generated content moderated can hire MODCLUB and provide it with base rules and qualifications for the content it would like monitored. 

After that, MODCLUB’s community of moderators starts reviewing and voting on whether something is appropriate for the nature of the platform or not. 

The exciting thing is that anyone could become a content moderator by simply applying on their website, getting verified, and choosing their preferences on what kind of posts and articles they would like to work on. 

To prevent unfair and false voting, MODCLUB has introduced a reward system for all participants who vote “correctly” (meaning with the majority) on the presented topics. 
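To make the incentive concrete, here is a minimal sketch of how a majority-vote reward scheme like the one described above could work. This is purely illustrative: the function name, the flat token reward, and the vote format are assumptions for the example, not MODCLUB’s actual API or tokenomics.

```python
from collections import Counter

# Illustrative flat reward for siding with the majority (hypothetical value).
REWARD = 10

def tally_votes(votes):
    """Take votes as a dict of moderator -> 'approve' or 'reject'.

    Returns the majority decision and the reward owed to each moderator:
    those who voted with the majority earn REWARD, the rest earn nothing.
    """
    counts = Counter(votes.values())
    decision, _ = counts.most_common(1)[0]  # majority option wins
    rewards = {
        moderator: (REWARD if choice == decision else 0)
        for moderator, choice in votes.items()
    }
    return decision, rewards

# Example: two moderators reject a post, one approves it.
decision, rewards = tally_votes(
    {"alice": "reject", "bob": "reject", "carol": "approve"}
)
# decision == "reject"; alice and bob each earn 10, carol earns 0
```

A real system would also need to handle ties, weight votes by moderator reputation, and penalize consistently dissenting voters, but the core idea – paying out only to the majority side – is what discourages random or malicious voting.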

Here’s where you can find more information on them:

How do you feel about content moderation on Web3?