Content Moderation Basics Made Easy
When you have a platform that encourages user-generated submissions, you’ll need to moderate the content people post. There is no way around that fact. Otherwise, you’ll end up with a mess on your hands.
A set of predetermined guidelines is a must: it keeps your visitors from encountering offensive or age-inappropriate content, or links to unsuitable content. You also want visitors to feel free to express their views without fear, so long as those views fall within your website’s guidelines.
Drawing the line between censorship and upholding community standards is tricky. Decide ahead of time what is acceptable, so your moderators and your community understand what the community’s rules are.
What is Inappropriate Content?
What’s unfit depends on your audience. Typically, sites flag:
- Spam and external links
- Hate speech
- Graphic violence
- Personal information
- Depictions of illegal activity
Flagging Inappropriate Content
Some content, whether a human moderator or a software program flagged it, will still require your own review. For example, footage of an arrest in which police act violently can document a newsworthy event; it does not necessarily depict gratuitous violence. A member of the LGBT community may want to share something someone said to them as an example of the prejudice they experience in daily life, and an automated monitor could flag the quoted remark.
Image moderation and artificial-intelligence tools keep you from getting bogged down finding and deleting content that violates your site’s guidelines. AI can detect spam links, graphic images, and shocking content, but there will always be gray areas requiring a human’s judgment. Automated programs work best at finding banned words, though they are getting more sophisticated every day.
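As a rough illustration, the banned-word check that automated moderators start with can be sketched in a few lines of Python. The word list here is purely hypothetical, and as noted above, an exact-match filter like this misses obfuscated spellings and can misfire on innocent context, which is why humans review its output:

```python
import re

# Hypothetical example list; a real deployment maintains a much larger,
# regularly updated set.
BANNED_WORDS = {"spamword", "badslur"}

def flag_for_review(text, banned=BANNED_WORDS):
    """Return True if the text contains a banned word and should be
    queued for human review."""
    tokens = re.findall(r"[a-z0-9']+", text.lower())
    return any(token in banned for token in tokens)

print(flag_for_review("Buy now, great spamword deal!"))   # True
print(flag_for_review("A perfectly ordinary comment."))   # False
```

A filter this simple only *flags* content; the decision to edit or remove still belongs to a person.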
When to Moderate
Decide if users’ submissions will need moderation before or after posting. The direction you go will depend on the type of site you operate and your audience. Pre-moderation is essential if your audience is primarily children. One suggestive comment from a troll could have parents up in arms. Webmasters with questions about what type of user-submitted content they should allow should consult an attorney.
If you run a platform where users are adults, pre-moderation can discourage people who want to converse with others in real-time. Instead, use automated moderators to flag obviously inappropriate content before it posts and then have humans perform post-moderation and review the AI’s findings.
You can also have a way for members of the community to flag comments. This strategy works best when you have a dedicated core community that has a vested interest in keeping out people who are not interested in starting a discussion or contributing to one.
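One simple way to implement community flagging, sketched here with a hypothetical threshold of three distinct reporters, is to hide a comment from public view once enough members flag it and queue it for a moderator:

```python
from dataclasses import dataclass, field

FLAG_THRESHOLD = 3  # hypothetical; tune to your community's size

@dataclass
class Comment:
    text: str
    flagged_by: set = field(default_factory=set)
    hidden: bool = False

    def flag(self, member_id):
        """Record a flag; hide the comment once enough distinct
        members have reported it (it then awaits moderator review)."""
        self.flagged_by.add(member_id)
        if len(self.flagged_by) >= FLAG_THRESHOLD:
            self.hidden = True

comment = Comment("a borderline remark")
for member in ("alice", "bob", "carol"):
    comment.flag(member)
print(comment.hidden)  # True
```

Counting distinct members, rather than raw flag events, keeps one disgruntled user from hiding content single-handedly.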
Have consequences in place for those who violate your website’s guidelines. These can include editing out portions of the content or removing it altogether. You can also temporarily or permanently suspend a user’s posting privileges. You can require pre-registration to avoid drive-by commentators who will never revisit your site.
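An escalating ladder of consequences like the one described above can be expressed as a simple lookup on a user’s running violation count. The thresholds here are illustrative, not a recommendation:

```python
def consequence(violation_count):
    """Map a user's running violation count to an action.
    Thresholds are hypothetical examples only."""
    if violation_count <= 1:
        return "edit or remove the offending content"
    if violation_count <= 3:
        return "temporary posting suspension"
    return "permanent suspension"

print(consequence(1))  # edit or remove the offending content
print(consequence(5))  # permanent suspension
```

Publishing the ladder alongside your guidelines helps the community see that enforcement is consistent rather than arbitrary.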
Without moderation, your site can quickly descend into anarchy, and you’ll discourage the new members who are essential to your site’s growth.