Child sexual abuse materials (CSAM)

The child pornography content moderation task detects child sexual abuse materials (CSAM).

To identify this content, we first run the soft nudity and hard nudity detection tasks. If both methods indicate obscene content involving children (a child's face) in a frame, the video is marked as obscene. Frames are labeled with the age category of the identified children.
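
The decision rule above can be illustrated with a short sketch. This is purely illustrative logic, not Gcore's actual implementation; the per-frame fields (soft_nudity, hard_nudity, child_face, age) are hypothetical names used only for the example.

```python
# Illustrative only: hypothetical per-frame detector outputs, not Gcore's implementation.
from dataclasses import dataclass


@dataclass
class FrameResult:
    frame_number: int
    soft_nudity: bool   # soft nudity detector flagged the frame
    hard_nudity: bool   # hard nudity detector flagged the frame
    child_face: bool    # a child's face was detected in the frame
    age: int | None     # estimated age of the detected child, if any


def is_flagged_frame(frame: FrameResult) -> bool:
    """A frame is flagged when both nudity checks fire and a child's face is present."""
    return frame.soft_nudity and frame.hard_nudity and frame.child_face


def video_is_flagged(frames: list[FrameResult]) -> bool:
    """The whole video is marked as obscene if any frame is flagged."""
    return any(is_flagged_frame(f) for f in frames)
```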

The check returns the number of the video frame in which a child's face was found and the child's age. Only objects detected with a probability of at least 30% are included in the response.
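
For illustration, the sketch below shows how the 30% cut-off works. The result structure and field names (frame_number, label, age, confidence) are assumptions for this example, not the documented Gcore response schema.

```python
# Illustration of the 30% confidence cut-off described above.
# The detection structure and field names are assumptions, not the Gcore response schema.
detections = [
    {"frame_number": 120, "label": "child_face", "age": 9, "confidence": 0.82},
    {"frame_number": 348, "label": "child_face", "age": 12, "confidence": 0.27},
]

MIN_CONFIDENCE = 0.30  # objects below 30% probability are not included in the response

reported = [d for d in detections if d["confidence"] >= MIN_CONFIDENCE]
for d in reported:
    print(f"frame {d['frame_number']}: age {d['age']}, confidence {d['confidence']:.0%}")
```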

To run the child pornography detection check:

1. In the Gcore Customer Portal, navigate to Streaming > AI. The AI tasks page will open.

AI tasks page

2. In the Origin URL field, enter the link to your MP4 video. You have two options:

  • Paste video origin URL: If your video is stored externally, provide a URL to its location. Ensure that the video is accessible via HTTP or HTTPS (a quick reachability check is sketched after this list).

    To see an example of a correctly formatted URL, use the link under the field. It autogenerates a sample URL that you can use as a reference to adjust your own.

    Example of the origin URL
  • Select from uploaded videos: Choose a video hosted on the Gcore platform.
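
If you want to verify reachability before creating the task, here is a minimal sketch using Python's requests library; the sample URL is a placeholder, not a real video location.

```python
# Quick pre-flight check that an origin URL is reachable over HTTP(S).
# The URL below is a placeholder; replace it with your own MP4 location.
import requests

origin_url = "https://example.com/videos/sample.mp4"  # hypothetical sample URL

response = requests.head(origin_url, allow_redirects=True, timeout=10)
response.raise_for_status()  # raises if the video is not accessible

content_type = response.headers.get("Content-Type", "")
print(f"Reachable: {response.status_code}, Content-Type: {content_type}")
```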

3. In the Task type dropdown, choose Content moderation.

4. In the following dropdown, select Child pornography detection.

5. Click Generate task.

6. Wait until the task is processed and has the Success status, then click the task ID to view the task details.

7. Check the Task result field. You will see one of the following outputs:

  • Child pornography detection: not found. This means that your video has no child sexual abuse materials.

  • If sensitive content is found, you'll get information about the detected element, the relevant iFrame, and a confidence level (in %) indicating how sure the AI is that the content contains child sexual abuse materials.

Child pornography detection task details
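
The steps above use the Gcore Customer Portal UI. The same workflow (create a content moderation task, poll until it succeeds, read the result) can also be scripted. The sketch below is a hedged example only: the endpoint path, request fields, status values, and response shape are assumptions, so check the Gcore API reference for the exact contract before using it.

```python
# Hedged sketch of the same workflow over HTTP: create a content-moderation task,
# poll until it finishes, then read the result. The endpoint path, field names,
# and status values are assumptions; consult the Gcore API reference.
import time

import requests

API_BASE = "https://api.gcore.com/streaming/ai/tasks"  # assumed endpoint
HEADERS = {"Authorization": "APIKey <your-token>"}      # replace with your API token

# 1) Create the task (origin URL and category value are illustrative).
create = requests.post(
    API_BASE,
    headers=HEADERS,
    json={
        "url": "https://example.com/videos/sample.mp4",  # hypothetical origin URL
        "category": "child_pornography",                 # assumed category name
    },
    timeout=30,
)
create.raise_for_status()
task_id = create.json().get("task_id")

# 2) Poll until the task reaches a terminal status.
while True:
    status = requests.get(f"{API_BASE}/{task_id}", headers=HEADERS, timeout=30)
    status.raise_for_status()
    body = status.json()
    if body.get("status") in ("SUCCESS", "FAILURE"):
        break
    time.sleep(5)

# 3) Inspect the task result (shape is an assumption).
print(body.get("result"))
```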
