User Generated Content Moderation – Challenges & Beyond!
August 14, 2018

User Generated Content, or UGC as it is commonly called, has been, and will likely remain, a buzzword in the world of social media. The reason: it is a double-edged sword!


Now, what exactly do we mean by that?


Ever since brands became aware of the power of social media in branding, they have used UGC extensively in their promotional campaigns. The users who post such content act as micro-influencers and promote the brand indirectly. It also goes a long way toward reducing a brand's marketing costs.


However, UGC can wreak havoc on a brand if it is not monitored systematically, because websites have no control over what users think or post. Hence, content moderation has picked up pace!


The task of content moderation can be accomplished in two ways: through human moderators or through automation.


Now, irrespective of how an organization approaches content moderation, the methods broadly fall into the following categories:

    1. Pre-moderation: As the name suggests, posted content is evaluated by a team of moderators (or by a piece of software) before it goes live on the respective website.
    2. Post-moderation: The content goes live and is queued for review at the same time.
    3. Reactive moderation: This method relies on users' collective judgment of a specific post. When the user community flags a post as inappropriate, the moderation team reviews it and acts accordingly.
    4. User-only moderation: Users are given the authority to rate content individually. If a post is rated inappropriate a set number of times, it is removed from the site automatically.
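To make the last category concrete, here is a minimal sketch of user-only moderation in Python. The removal threshold of three flags, the class name, and all the identifiers are invented for illustration; real platforms tune such values per community and track far more signals.

```python
from collections import defaultdict

# Hypothetical flag threshold -- real sites tune this per community.
REMOVAL_THRESHOLD = 3

class UserOnlyModerator:
    """Sketch of user-only moderation: a post is removed automatically
    once enough distinct users rate it as inappropriate."""

    def __init__(self, threshold=REMOVAL_THRESHOLD):
        self.threshold = threshold
        self.flags = defaultdict(set)   # post_id -> set of flagging user ids
        self.removed = set()

    def flag(self, post_id, user_id):
        if post_id in self.removed:
            return "already removed"
        self.flags[post_id].add(user_id)   # a set ignores duplicate flags
        if len(self.flags[post_id]) >= self.threshold:
            self.removed.add(post_id)
            return "removed"
        return "pending"

mod = UserOnlyModerator()
print(mod.flag("post1", "alice"))   # "pending"
print(mod.flag("post1", "alice"))   # duplicate flag, still "pending"
print(mod.flag("post1", "bob"))     # "pending"
print(mod.flag("post1", "carol"))   # third distinct user -> "removed"
```

Counting distinct users (rather than raw flag events) is the design choice that stops a single user from repeatedly flagging a post to force its removal.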

Each of the above techniques has its own downside. In this article, however, we’ll analyze the challenges of UGC moderation (especially those faced by human moderators) in a broader sense.




Content moderation has become a specialized discipline in its own right, not merely a topic of conversation. Given their work profiles, moderators face many dilemmas on a regular basis.


Two of them are:

  A. Distinction Between Right and Wrong
  B. Limitations of the Workforce

Let’s delve into each of these.

    A. Distinction Between Right and Wrong

Doesn’t it seem obvious? Content moderators are mature professionals, aren’t they?

It’s very easy to ask such questions. But there are times when making the distinction becomes genuinely difficult!


Every social media platform has its own set of guidelines. So, technically speaking, what is deemed appropriate on one platform may not be on another, and vice versa.


Outright slang is easy to spot, but what about users who indulge in indirect profanity? For example, mischief-mongers use their own ‘code words’ when targeting a community or religion, which can later create a political storm or stoke communal hatred.


I won’t be able to quote any specific case study in this blog, considering the sensitivity of the issue.


So, the bottom line is that drawing the line between right and wrong is indeed challenging!

    B. Limitations of the Workforce

This covers both the scalability limits and the psychological toll of human content moderation.


As stated before, UGC has gained tremendous momentum in the digital world, so it is practically impossible for moderators to evaluate every single post that goes on the web. This is especially true when a site has a global presence or targets many different demographic groups.


Profanity is not limited to slang and provocative language. It also includes obscenity in the form of images, videos, and more. And moderators have to go through tonnes of such material, daily!


Try putting yourself in their shoes and consider the psychological impact that can have!



Content Moderation System

Content moderation is no longer optional. It is a must-do for keeping unwanted content away from your site(s).


Striking a balance will do the trick!


But a balance between what?


I mentioned automation at the beginning of this post. Remember?


Automation, as in any other arena, will surely ease moderators’ workload to a great extent.


That being said, automation should be as close to foolproof as possible! Effective methods include automated algorithms such as Bayesian filtering, pattern detection of blacklisted words and phrases, colour-tone analysis of images, and user/location profiling.
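As a rough illustration of the Bayesian-filtering idea, here is a toy naive Bayes text filter in pure Python. The labels ("ok"/"flag") and the handful of training phrases are invented for the sketch; a production filter would train on a large, curated corpus and combine this signal with the other methods above.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase and split into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayesFilter:
    """Minimal Bayesian text filter: ranks labels by
    log P(label) + sum of log P(word | label)."""

    def __init__(self):
        self.word_counts = {"ok": Counter(), "flag": Counter()}
        self.doc_counts = {"ok": 0, "flag": 0}

    def train(self, text, label):
        self.doc_counts[label] += 1
        self.word_counts[label].update(tokenize(text))

    def score(self, text, label):
        # Log prior + log likelihoods with add-one (Laplace) smoothing.
        total_docs = sum(self.doc_counts.values())
        log_p = math.log(self.doc_counts[label] / total_docs)
        vocab = set(self.word_counts["ok"]) | set(self.word_counts["flag"])
        total_words = sum(self.word_counts[label].values())
        for word in tokenize(text):
            count = self.word_counts[label][word]   # Counter gives 0 if absent
            log_p += math.log((count + 1) / (total_words + len(vocab)))
        return log_p

    def classify(self, text):
        return max(("ok", "flag"), key=lambda lbl: self.score(text, lbl))

# Toy training data -- real systems need far larger, curated corpora.
nb = NaiveBayesFilter()
nb.train("great product love this brand", "ok")
nb.train("thanks for sharing this photo", "ok")
nb.train("buy cheap pills click this link", "flag")
nb.train("click here free money spam link", "flag")

print(nb.classify("click this link for cheap pills"))  # -> "flag"
print(nb.classify("love this great brand"))            # -> "ok"
```

The Laplace smoothing keeps an unseen word from zeroing out an entire label's probability, which matters precisely because UGC is full of novel spellings and coded terms.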


There is a great article out there that clearly outlines how organizations can approach content moderation. Check it out!




If approached with a rational mindset, content moderation can be fun and, at the same time, will help make the digital community more resourceful and engaging.

