The hard nudity content moderation task detects explicit nudity of the human body (involving genitals) in a video. This check is typically used to decide whether a video can be published to all users or whether publication should be blocked because the content is offensive and inappropriate.

This task detects fewer objects than soft-nudity detection, so it works faster and is the better choice when you only need to detect exposed body parts. For a full list of objects that can be detected with the hard nudity check, read our API documentation.

If hard nudity content is detected, the AI model provides its confidence level (as a percentage) of how sure it is that the content is hard nudity.

To run the hard nudity detection check:

1. In the Gcore Customer Portal, navigate to Streaming > AI. The AI tasks page will open.
2. Choose the video to check in one of the following ways:

- Paste video origin URL: if your video is stored externally, provide a URL to its location. Ensure that the video is accessible via the HTTP or HTTPS protocol (a quick reachability check is sketched after this list).

- Select from uploaded videos: choose a video hosted on the Gcore platform.
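
If your video comes from an external origin, it can be useful to verify that the URL is actually reachable over HTTP or HTTPS before creating the task. The snippet below is a minimal pre-flight sketch only: the helper name and the HEAD-then-GET fallback are illustrative choices, not part of Gcore's documented workflow.

```python
# Minimal pre-flight check for an externally hosted video origin.
# Assumption: the URL must answer over HTTP or HTTPS before the task is created.
import requests

def is_video_reachable(url: str, timeout: float = 10.0) -> bool:
    """Return True if the origin URL responds with a successful status code."""
    if not url.lower().startswith(("http://", "https://")):
        return False  # the task only accepts HTTP or HTTPS origins
    try:
        # HEAD is cheap; some origins reject it, so fall back to a streamed GET.
        response = requests.head(url, allow_redirects=True, timeout=timeout)
        if response.status_code == 405:
            response = requests.get(url, stream=True, timeout=timeout)
        return response.ok
    except requests.RequestException:
        return False

print(is_video_reachable("https://example.com/videos/upload.mp4"))
```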
When the task completes, the result shows one of the following:

- Hard nudity detection: not found. This means that your video contains no hard nudity content.
- If sensitive content is found, you'll get information about the detected element, the relevant frame, and the confidence level (in %) of how sure the AI is that this content is hard nudity. For example, you can get the following output: “FEMALE_BREAST_EXPOSED: detected at frame №2 with score 41%”.
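
As a post-processing example, such detections can be turned into a publish/block decision. The sketch below is illustrative only: the detection structure and the 40% threshold are assumptions modeled on the example output above, not a documented response schema or a recommended policy.

```python
# Sketch of acting on hard nudity detection results. The Detection shape
# (label, frame, score) mirrors the example output above; the real response
# schema may differ -- adapt as needed.
from dataclasses import dataclass

@dataclass
class Detection:
    label: str    # e.g. "FEMALE_BREAST_EXPOSED"
    frame: int    # frame number where the object was found
    score: float  # AI confidence, in percent

# Hypothetical publishing policy: block the video if any detection meets
# the confidence threshold; otherwise allow publication.
CONFIDENCE_THRESHOLD = 40.0  # percent; tune to your moderation policy

def should_block(detections: list[Detection]) -> bool:
    return any(d.score >= CONFIDENCE_THRESHOLD for d in detections)

# Example matching the output shown above: one detection at frame 2 with 41%.
results = [Detection(label="FEMALE_BREAST_EXPOSED", frame=2, score=41.0)]
print("block publication" if should_block(results) else "safe to publish")
```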
