
Hard nudity detection

The hard nudity content moderation task detects explicit nudity of the human body (involving genitals) in a video. This check is often used to determine whether a video can be published to all users or whether publication should be blocked because of offensive and inappropriate content.

This task detects fewer object classes than soft nudity detection, so it runs faster and is the better choice when you only need to detect exposed body parts. For the full list of objects that the soft nudity check can detect, read our API documentation.
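The same check can also be created programmatically. The sketch below builds a hypothetical request body for a hard nudity detection task; the field names and values here are assumptions for illustration only, so consult the API documentation for the actual schema:

```python
import json

# Hypothetical request body for creating a hard nudity detection task.
# "task_name", "category", and "url" are assumed field names, not the
# confirmed Gcore API schema -- check the API documentation before use.
task = {
    "task_name": "content-moderation",
    "category": "hard_nudity",
    "url": "https://example.com/videos/source.mp4",  # your MP4 origin URL
}

# Serialize the body as it would be sent in an HTTP request.
body = json.dumps(task)
print(body)
```

The origin URL must point to an MP4 file reachable over HTTP or HTTPS, matching the requirement described in the steps below.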

If hard nudity content is detected, the AI model reports a confidence score (as a percentage) indicating how certain it is that the content is hard nudity.
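As a sketch of how the confidence score might be used, here is a hypothetical moderation policy that blocks publication when any detection meets a threshold. The detection dictionaries and the threshold value are illustrative assumptions, not part of the Gcore API:

```python
# Hypothetical moderation policy built on hard nudity detection scores.
# The detection dicts below are illustrative; the real task result format
# is described in the Gcore API documentation.

def should_block(detections, threshold=40):
    """Block publication if any detection's confidence (%) meets the threshold."""
    return any(d["score"] >= threshold for d in detections)

# Example: one detection at 41% confidence, matching the sample output
# shown later in this article.
detections = [{"label": "FEMALE_BREAST_EXPOSED", "frame": 2, "score": 41}]
print(should_block(detections))      # True: blocked at the default 40% threshold
print(should_block(detections, 60))  # False: allowed with a higher threshold
```

Raising the threshold makes the policy more permissive, so pick it according to how costly false positives are for your platform.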

To run the Hard nudity detection check:

1. In the Gcore Customer Portal, navigate to Streaming > AI. The AI tasks page will open.

AI tasks page

2. In the Origin URL field, enter the link to your MP4 video. You have two options:

  • Paste video origin URL: if your video is stored externally, provide a URL to its location. Make sure the video is accessible via HTTP or HTTPS.

    To see an example of a correctly formatted URL, use the link under the field. It generates a sample URL that you can adapt for your own video.

    Example of the origin URL
  • Select from uploaded videos: choose a video hosted on the Gcore platform.

3. In the Task type dropdown, choose Content moderation.

4. In the following dropdown, select Hard nudity detection.

5. Click Generate task.

6. Wait until the task is processed and reaches the Success status, then click the task ID to view the task details.

7. Check the Task result field. You will get one of the following outputs:

  • Hard nudity detection: not found. This means that your video contains no hard nudity content.

  • If sensitive content is found, you’ll get information about the detected element, the relevant iFrame, and the confidence level (in %) of how sure the AI is that this content is hard nudity. For example, you can get the following output: “FEMALE_BREAST_EXPOSED: detected at frame №2 with score 41%”.

Hard nudity detection task details
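To consume the result programmatically, you could parse the sample string shown above. This is a sketch that assumes the exact format of that example; the pattern is not an official Gcore output contract, so adjust it if the real output differs:

```python
import re

# Parse a result line like "FEMALE_BREAST_EXPOSED: detected at frame №2
# with score 41%". The pattern below assumes that exact illustrative
# format from the article, not a documented schema.
RESULT_RE = re.compile(
    r"(?P<label>\w+): detected at frame №(?P<frame>\d+) with score (?P<score>\d+)%"
)

def parse_result(line):
    """Return the detection as a dict, or None if nothing was detected."""
    m = RESULT_RE.search(line)
    if m is None:
        return None  # e.g. "Hard nudity detection: not found"
    return {"label": m["label"], "frame": int(m["frame"]), "score": int(m["score"])}

print(parse_result("FEMALE_BREAST_EXPOSED: detected at frame №2 with score 41%"))
# → {'label': 'FEMALE_BREAST_EXPOSED', 'frame': 2, 'score': 41}
print(parse_result("Hard nudity detection: not found"))  # → None
```

A parser like this could feed the detected frame number and score into your own review queue or auto-blocking logic.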
