What Is Content Moderation?
Content Moderation uses image, text, audio, audio stream, and video detection technologies to identify pornography and other images or text that violate applicable laws and regulations, reducing potential business risks.
With the rapid growth of the Internet and the resulting information explosion, malicious content such as pornography has surged. Products carrying such content may annoy users and even cost you their trust.
Content Moderation provides services through open application programming interfaces (APIs). You call an API and obtain the inference result from the response, helping you build an intelligent service system and improve service efficiency.
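The call pattern above can be sketched as follows. This is a minimal illustration only: the endpoint URL, field names (`url`, `categories`, `confidence`), and thresholds are assumptions for the sketch, not the service's actual API; consult the API reference for the real request and response formats.

```python
import json

# Hypothetical endpoint -- replace with the real value from the API reference.
MODERATION_ENDPOINT = "https://moderation.example.com/v2/image"

def build_request_body(image_url: str, categories: list[str]) -> str:
    """Serialize a moderation request for a single image URL (assumed schema)."""
    return json.dumps({"url": image_url, "categories": categories})

def interpret_response(response_body: str, threshold: float = 0.8) -> str:
    """Map an assumed confidence score in the inference result to a
    simple pass/review/block decision."""
    result = json.loads(response_body)
    confidence = result["confidence"]
    if confidence >= threshold:
        return "block"
    if confidence >= 0.5:
        return "review"
    return "pass"

# Example: a response the service might return for a flagged image.
sample = json.dumps({"category": "porn", "confidence": 0.93})
print(interpret_response(sample))  # block
```

In practice you would POST the request body to the endpoint with your credentials and pass the HTTP response body to the interpreter; the thresholds here are placeholders you would tune to your own risk tolerance.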
Image Moderation
Image Moderation uses deep neural network (DNN) models to accurately identify pornography in images, protecting you from non-compliance risks.
Text Moderation
Text Moderation uses an intelligent text detection model to accurately identify pornography, abuse, and advertisements in text, reducing non-compliance risks.
Audio Moderation
Audio Moderation combines a leading speech recognition engine with an intelligent text detection model to accurately identify pornography and abuse in audio, greatly improving user experience.
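The two-stage pipeline described above (speech recognition followed by text detection) can be sketched like this. Both functions are stand-ins for illustration, not real SDK calls, and the blocklist check is a deliberately simplistic substitute for the service's text detection model.

```python
# Hypothetical term list for illustration only.
BLOCKLIST = {"abuse-word"}

def transcribe(audio_clip: bytes) -> str:
    """Stand-in for a speech recognition engine: returns a fixed transcript."""
    return "this clip contains an abuse-word"

def moderate_text(transcript: str) -> str:
    """Flag the transcript if any blocklisted term appears."""
    tokens = transcript.split()
    return "block" if any(t in BLOCKLIST for t in tokens) else "pass"

# Stage 1: audio -> text; stage 2: text -> moderation decision.
print(moderate_text(transcribe(b"...")))  # block
```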
Video Moderation
Video Moderation applies advanced AI technologies to analyze video frames, audio, and subtitles, accurately and efficiently identifying pornography, violence, and advertisements to improve content governance quality and efficiency.
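Because video moderation checks several modalities (frames, audio, subtitles), the per-modality results must be combined into one verdict. A simple, commonly used policy is to take the most severe suggestion; the verdict labels and result structure below are assumptions for the sketch, not the service's actual response format.

```python
# Severity ranking for the assumed verdict labels.
SEVERITY = {"pass": 0, "review": 1, "block": 2}

def combine_verdicts(modality_verdicts: dict[str, str]) -> str:
    """Return the most severe suggestion across all modalities."""
    return max(modality_verdicts.values(), key=SEVERITY.__getitem__)

verdicts = {"frames": "pass", "audio": "review", "subtitles": "block"}
print(combine_verdicts(verdicts))  # block
```

Taking the maximum severity is conservative: a single flagged modality is enough to escalate the whole video, which matches the goal of defending against content risks.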
Audio Stream Moderation
Audio Stream Moderation accurately identifies pornography, abuse, and advertisements in audio streams across various scenarios, defending against content risks, improving review efficiency, and delivering a better experience.