Content moderation is nasty, but essential. How do we make it better?

Content moderation is a dirty job, but someone has to do it. That is the reality of the Internet today, and it's what many executives fail to realise as they cut back on content moderation because it doesn't appear essential to their business. Robust content moderation isn't just something that's nice to have – it's essential if you want your users and customers to return.

Talk to anyone who has sat on the front line of a content moderation team and they will tell you about their experience of viewing murder, suicide, pornography – including child sexual abuse material – tragic road and air accidents, and even cannibalism. Content moderators are not just browsing content and checking it's all OK. They may have to view child sexual abuse material all day so that your end users never see it.

Since the Covid pandemic arrived, many content moderators have been working from home. Imagine introducing a firehose of disgusting filth into your own home. The office environment at least helped to create a psychological barrier; now a moderator may be reviewing child abuse material one minute and caring for their own child a few minutes later.

Technology can help, but only up to a point. Artificial Intelligence (AI) is getting smarter and can screen out a huge amount of content automatically. Try uploading porn to YouTube and it's almost certainly never going to be published – Google has enough faith in its AI to block the upload automatically. In those cases, human content moderators don't need to be involved, but whenever there is doubt or uncertainty in the system, a real person has to view the content and decide whether it violates the rules of the platform.
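As a rough illustration of that routing logic – automated decisions when the model is confident, human review when it isn't – here is a minimal sketch in Python. The classifier, thresholds, and decision structure are hypothetical placeholders, not any real platform's system.

```python
# Minimal sketch of a moderation pipeline: confident decisions are automated,
# uncertain ones are escalated to a human moderator. The classifier, thresholds,
# and decision structure are hypothetical, not any real platform's system.

from dataclasses import dataclass

@dataclass
class Decision:
    action: str   # "block", "publish", or "human_review"
    score: float  # model's estimated probability that the content violates policy

BLOCK_THRESHOLD = 0.95    # confident enough to block without human input
PUBLISH_THRESHOLD = 0.05  # confident enough to publish without human input

def classify(content: bytes) -> float:
    """Placeholder for an ML model that returns a policy-violation probability."""
    raise NotImplementedError

def moderate(content: bytes) -> Decision:
    score = classify(content)
    if score >= BLOCK_THRESHOLD:
        return Decision("block", score)       # AI blocks the upload outright
    if score <= PUBLISH_THRESHOLD:
        return Decision("publish", score)     # clearly benign, publish automatically
    return Decision("human_review", score)    # doubt or uncertainty: a person decides
```

The wider the gap between the two thresholds, the more content lands in the human review queue – which is exactly why the job cannot be automated away.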

It’s nasty, but it is also essential.

If we want to continue using the Internet in the way we have become accustomed to – with a large amount of user-generated content (UGC) – then users need to feel safe. That means the content needs moderation.

It's not just a problem for social networks. You might automatically think of TikTok, YouTube, or Twitter when someone mentions uploading content, but most e-commerce platforms now ask customers for feedback too. If online retailers ask customers to post photos, videos, and reviews, then they are just as vulnerable as the social media platforms.

Elon Musk recently laid off some of his Twitter content moderation team. Engadget even went so far as to call this reduction in content moderation headcount “a monthly occurrence at Twitter.”

When Musk took over at Twitter and set about reinstating banned accounts and reducing content moderation controls in order to 'enable free speech', advertisers fled the platform. Half of the top advertisers left because they did not want to see their adverts featured next to hate-filled or offensive posts. Forrester Research noted that Twitter needs advertisers far more than advertisers need Twitter.

The message from the recent experience at Twitter is clear. If you reduce moderation, for whatever reason, to the point where the average user is frequently offended by what they see online, your users will start leaving – along with the advertisers.

Advertisers and users both want to engage in an environment that most people consider safe. Content moderation is critical and will become even more so as we move into a metaverse of virtual worlds where most of what you experience will be created by the users. Inside the metaverse, UGC is all around you.

So what would I suggest for an improvement to the way that content moderation is managed at present? Here are three ideas:

  1. Greater Specialisation: I understand why the customer service business process outsourcing (BPO) companies are involved in this area. They have experience managing customer service processes that look similar on the surface – a contact centre full of people answering calls or checking content. However, content moderation is a very different process, requiring ongoing psychological support and flexibility. You cannot put people into a role like this and expect them to hit 'sales targets' for how many offensive videos they can review in a shift. A different approach is required, and I would suggest there is scope for new specialist brands or spin-outs from the major BPOs – once they realise it's better to manage this process separately from their bread-and-butter customer experience (CX) business.
  2. Ideal for gig work: this work can be tough and may be better suited to people who focus on it for a shorter period of time – one or two hours a day rather than a continuous ten-hour shift. What if you feel frazzled after five hours of disgusting content? If we switch to paying people for each video moderated, rather than an hourly rate, and remove fixed shifts and work times, then complete flexibility can be handed to the moderator. If they are feeling good, they can work longer. If they are struggling with the content, they can stop for the day. Combine the flexibility of this GigCX approach with counselling and support and the process looks far more robust.
  3. Focus on your people: make it clear they can take breaks as needed. Make it clear that you value the health and wellbeing of your moderators. Compensate them fairly for the job they are doing – protecting customers. Make it easy to use counselling and support services without stigma. Make it clear during the hiring process that patience really is a virtue in this job – people often find they need to place this work in a box in their own mind to avoid lasting trauma. Find the people who can do this more easily and support them.

It may look like a contradiction to say that home-based content moderation is problematic while suggesting that work-from-home gig workers can manage it. The difference lies in the flexibility of the hours and the approach. If a worker has to complete full shifts, the mental separation provided by an office is helpful. If they can control their hours and stop as needed, then working from home may still be an option.

Content moderation is here to stay and is becoming critically important for online safety, as the promise of Web 3.0 centres on more decentralised UGC. For this to work, we all need to feel safe and supported – both the end users and the content moderators.

Let me know what you think about content moderation and how it might develop in 2023. Leave a comment here on the article or get in touch via my LinkedIn.