Updated on 2024-03-27 GMT+08:00

What Is Content Moderation?

Content Moderation uses image, text, audio, audio stream, and video detection technologies to identify pornographic content and other images and text that violate applicable laws or regulations, reducing potential business risks.

With the rapid development of the Internet and the resulting information explosion, malicious content such as pornography has proliferated. Products that carry such content may annoy users and erode their trust.

Content Moderation provides its capabilities through open application programming interfaces (APIs). You call the APIs to obtain inference results, which helps you build an intelligent service system and improve service efficiency.
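As a sketch of what this API-driven workflow can look like, the snippet below builds a request body and parses a response for a text check. The field names (`data`, `categories`, `suggestion`) and decision values are illustrative assumptions, not the service's actual contract; consult the API reference for the real schema.

```python
import json

def build_request(text, categories):
    """Build a JSON request body for a hypothetical text-moderation API.
    Field names here are assumptions for illustration."""
    return json.dumps({
        "data": {"text": text},
        "categories": categories,  # e.g. ["porn", "ad", "abuse"]
    })

def parse_suggestion(response_body):
    """Extract the moderation decision from a hypothetical JSON response."""
    return json.loads(response_body).get("suggestion", "pass")

body = build_request("example user comment", ["porn", "ad"])
print(parse_suggestion('{"suggestion": "block"}'))  # prints "block"
```

The request is sent over HTTPS to the service endpoint; the response's decision field tells your system whether to publish, block, or escalate the content for human review.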

Image Moderation

Image Moderation uses deep neural network (DNN) models to accurately identify pornography in images, protecting you from non-compliance risks.
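Image APIs of this kind typically accept the image as base64-encoded data in a JSON body. The following is a minimal sketch under that assumption; the field names are hypothetical:

```python
import base64
import json

def build_image_request(image_bytes, categories=("porn",)):
    """Base64-encode raw image bytes into a JSON body for a
    hypothetical image-moderation API (field names are assumptions)."""
    return json.dumps({
        "image": base64.b64encode(image_bytes).decode("ascii"),
        "categories": list(categories),
    })

body = build_image_request(b"\x89PNG fake bytes")  # placeholder, not a real image
```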

Text Moderation

Text Moderation uses AI-based text detection technology to detect non-compliant content, such as pornography, advertisements, offensive language, and spam, and provides custom text moderation solutions.
Figure 1 Example of Text Moderation
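Custom text moderation is commonly configured as allow/block word lists layered on top of the built-in detection. The toy function below illustrates that layering; the list semantics and decision values are assumptions, not the service's documented behavior:

```python
def moderate_text(text, block_words, allow_words):
    """Return 'block' if the text contains a blocked word that is not
    explicitly allowed; otherwise return 'pass'."""
    for word in text.lower().split():
        if word in block_words and word not in allow_words:
            return "block"
    return "pass"

print(moderate_text("buy cheap meds now", {"meds"}, set()))  # prints "block"
```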

Audio Moderation

Audio Moderation combines a leading speech recognition engine with an intelligent text detection model to accurately identify pornography and abuse in audio, greatly improving user experience.
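The two-stage design described above, speech recognition followed by text detection, can be sketched as a simple pipeline. The stub stages below are placeholders; a real system would call the respective service APIs:

```python
def moderate_audio(audio_bytes, transcribe, moderate_text):
    """Two-stage pipeline: speech-to-text, then text moderation on the
    transcript. Both stages are injected as callables for illustration."""
    transcript = transcribe(audio_bytes)
    return moderate_text(transcript)

# Stub stages for illustration only:
fake_transcribe = lambda audio: "this contains abuse"
fake_moderate = lambda text: "block" if "abuse" in text else "pass"
print(moderate_audio(b"raw-audio", fake_transcribe, fake_moderate))  # prints "block"
```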

Video Moderation

Video Moderation applies advanced AI technologies to video frames, audio tracks, and subtitles to accurately and efficiently identify pornography, violence, and advertisements, improving content governance quality and efficiency.
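Because video checks span several modalities (frames, sound, subtitles), their per-modality decisions must be combined into one verdict. One plausible aggregation rule, shown purely as an assumption about how such results might be merged, is "most severe decision wins":

```python
def aggregate(suggestions):
    """Combine per-modality decisions (frames, audio, subtitles):
    any 'block' blocks the video; otherwise any 'review' flags it."""
    if "block" in suggestions:
        return "block"
    if "review" in suggestions:
        return "review"
    return "pass"

print(aggregate(["pass", "review", "pass"]))  # prints "review"
```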

Audio Stream Moderation

Audio Stream Moderation accurately identifies pornographic content, abuse, and advertisements in various scenarios to defend against content risks, improve audio stream review efficiency, and deliver a better experience.