Gcore AI content moderation is a powerful solution for analyzing and filtering video content. It detects inappropriate material and ensures that the videos you deliver are safe and suitable for your users.
You can also use content moderation to detect specific sports activities for better personalization or copyright protection.
Content types we detect in videos and images:
We run multiple AI models on our infrastructure to analyze sensitive and restricted content types in real time. After a video is processed, the original file is deleted from the AI service's local storage.
Content is detected by analyzing keyframes (I-frames) in a video. For example, if a keyframe occurs every 2 seconds, the analysis runs at those intervals. Currently, we don't detect objects between these timestamps, but we're working on a version that analyzes more frames.
You can also process static images; for billing purposes, each image is counted as 1 second of video.
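The keyframe-based sampling above can be sketched in a few lines. This is an illustrative estimate only: the function name and the assumption of a fixed 2-second keyframe interval are hypothetical, not part of the Gcore API.

```python
# Sketch: estimate which timestamps AI moderation would inspect,
# assuming keyframes at a fixed interval (illustrative, not Gcore API).

def analyzed_timestamps(duration_sec: float, keyframe_interval: float = 2.0) -> list[float]:
    """Return the keyframe timestamps that would be analyzed."""
    t, stamps = 0.0, []
    while t < duration_sec:
        stamps.append(t)
        t += keyframe_interval
    return stamps

# A 10-second clip with a keyframe every 2 seconds yields 5 analyzed frames;
# objects appearing only between these timestamps would currently be missed.
print(analyzed_timestamps(10))  # [0.0, 2.0, 4.0, 6.0, 8.0]
```

A static image would simply count as a 1-second video, i.e. a single analyzed frame.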
Check out the Gcore pricing page for detailed information about AI content moderation costs.
AI content moderation supports several use cases:
- Video streaming & TV broadcasting
- Broadcasting of sports events
- Video on demand (VOD) platforms
There's no one-size-fits-all criterion or nudity score that can definitively determine whether a video is inappropriate. Different video hosting services cater to specific audiences, such as adults, children, or educational groups. For instance, an acceptable nudity percentage for a site dedicated to sex education would be higher than for a site hosting entertainment videos intended for children.
You can set a probability threshold to determine when a video is inappropriate for your specific use case. One method is to run your videos through moderation for a day and analyze the resulting probability coefficients.
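The threshold-tuning approach above can be sketched as follows. The per-video result format (a single maximum nudity probability per video) and the function name are assumptions for illustration, not the actual Gcore response schema.

```python
# Sketch: choosing a probability threshold from one day of moderation
# results (the result format here is an assumption, not the Gcore schema).

def flag_videos(results: dict[str, float], threshold: float) -> list[str]:
    """Return IDs of videos whose probability meets or exceeds the threshold."""
    return [video_id for video_id, prob in results.items() if prob >= threshold]

# Hypothetical probabilities collected over one day of analysis.
day_one = {"intro": 0.02, "tutorial": 0.15, "upload_381": 0.93}

# A children's platform might pick a strict (low) threshold,
# an adult-oriented platform a permissive (high) one.
print(flag_videos(day_one, threshold=0.5))  # ['upload_381']
print(flag_videos(day_one, threshold=0.1))  # ['tutorial', 'upload_381']
```

Reviewing which videos each candidate threshold flags makes it easier to pick a value that matches your audience's tolerance.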