It takes years to build a powerful brand that resonates with customers. It can take just moments to destroy your customers' trust and loyalty when they are exposed to abuse, fraud, misinformation, or offensive content on your company's digital platforms.
Maintaining control of your brand reputation while engaging with customers in the digital world can be challenging due to the sheer volume of content online. Even prior to 2020, social media platforms were reporting staggering volumes of content: every day, 350 million photos were uploaded to Facebook, 500 million tweets were posted on Twitter, and 720,000 hours of video were uploaded to YouTube. According to Datareportal's Global Overview report, 319 million new users came online in 2020, almost 875,000 new users each day.
Users are not just consuming digital content, they are commenting, posting, interacting and uploading content. User-generated content (UGC) refers to any type of content that is created by people, rather than a brand or business. UGC includes text, images, video, and audio that is shared on social media, company websites, review sites, e-commerce sites, community forums, gaming platforms and other digital channels.
The proliferation of UGC and its power to sway consumers create a significant risk for online brands. Businesses have no control over what users share, which opens the door to spam, hateful or defamatory content, harassment, copyright and trademark infringement, and privacy issues. In August 2020, TikTok reported that it had removed more than 300,000 videos that violated its hate speech policy. It also banned more than 1,300 accounts for hateful content or behavior, and removed over 64,000 hateful comments. NewsGuard has identified more than 430 websites publishing misinformation about COVID-19, from false cures to conspiracy theories to vaccine myths. The case for protecting a brand's customers from potential abuse is compelling.
Content moderation is the process of screening and monitoring UGC (text, images, video and audio) against a predetermined set of rules to filter out content that is spam, abusive, inappropriate, illegal, or that otherwise does not adhere to the site's guidelines for UGC.
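At its simplest, rule-based text moderation can be sketched as a filter that checks incoming content against a policy and returns a decision. The rule set below (blocked terms, a link-count spam heuristic) is a hypothetical illustration, not any platform's actual policy:

```python
import re

# Hypothetical policy for illustration only; real rule sets are far larger
# and are typically combined with machine-learning classifiers.
BLOCKED_TERMS = {"spamword", "slurword"}
MAX_LINKS = 2  # more links than this is treated as likely spam

def moderate(text: str) -> str:
    """Return 'approve', 'reject', or 'review' for a piece of UGC text."""
    lowered = text.lower()
    # Reject outright if any blocked term appears.
    if any(term in lowered for term in BLOCKED_TERMS):
        return "reject"
    # Escalate link-heavy posts to a human moderator instead of auto-rejecting.
    if len(re.findall(r"https?://", lowered)) > MAX_LINKS:
        return "review"
    return "approve"
```

In practice the "review" path is what connects automated filtering to the human moderation services discussed below: ambiguous content is queued for trained moderators rather than decided by the filter alone.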
There are five common approaches to content moderation: pre-moderation (reviewing content before it is published), post-moderation (publishing immediately and reviewing afterward), reactive moderation (relying on users to flag objectionable content), distributed moderation (letting the community rate or vote on content), and automated moderation (using filters and algorithms to screen content at scale).
The explosion of UGC in recent years, and the value that brands place on live consumer engagement, has created a growing demand for real-time content moderation services. A study by Transparency Market Research (TMR) estimates that the content moderation solutions market will reach $11.8 billion by 2027, growing at a CAGR of 10% from 2019 to 2027.
The TMR report pointed out that, in 2018, 60% of the study's end users preferred content moderation services over software, and TMR expects that trend to continue.
CustomerServ’s content and community moderation call center and BPO partners provide:
The pandemic dramatically accelerated the popularity, growth and business value of user-generated content globally. Unfortunately, online abuse has also grown rapidly over the last few years: in 2020, four in ten Americans (41%) reported having experienced some form of online harassment, according to Pew Research Center.
Protecting your customers and potential customers from toxic voices and offensive content is essential for building brand loyalty online. Content moderation provides the means to mitigate the risks associated with user-generated content while providing customers with a safe place to share and connect.