This algorithm quickly detects inappropriate content, classifying a video as NSFW (“Not Safe For Work”) or normal. For general information about capabilities and limits, see the generic “Content Moderation” method.
What is “Not Safe For Work”?
NSFW means the algorithm has recognized inappropriate content in a video that may not be suitable for viewing in public places. The solution returns a confidence level (as a percentage) of how sure it is that the content is NSFW; otherwise, the content most likely does not contain any sexual or similar material.
Unlike soft-nudity-detection and hard-nudity-detection, this model only checks for sensitive material that can be considered not-safe-for-work.

How to use?
Frames within the specified video are analyzed.
The response will contain only frames for which the nsfw class was detected with a confidence of more than 50%.
Example of a response with detected NSFW content:
{
  "nsfw_detected": true,
  "detection_results": ["nsfw"],
  "frames": [
    {
      "label": "nsfw",
      "confidence": 0.93,
      "frame_number": 1
    },
    ...
  ]
}
Example of a response where no inappropriate content was detected:
{
  "nsfw_detected": false,
  "detection_results": [],
  "frames": []
}
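As a minimal sketch, here is how a client might interpret such a response in Python. The summarize_nsfw_result helper is illustrative and not part of the API; the field names mirror the JSON examples above.

```python
# Illustrative helper, not part of the API: summarize an NSFW detection
# response shaped like the JSON examples above.
def summarize_nsfw_result(result: dict) -> str:
    if not result.get("nsfw_detected"):
        return "No inappropriate content detected."
    # The API only returns frames detected with more than 50% confidence,
    # so every listed frame is already considered NSFW.
    frames = result.get("frames", [])
    numbers = [f["frame_number"] for f in frames]
    return f"NSFW content detected in {len(frames)} frame(s): {numbers}"

example = {
    "nsfw_detected": True,
    "detection_results": ["nsfw"],
    "frames": [{"label": "nsfw", "confidence": 0.93, "frame_number": 1}],
}
print(summarize_nsfw_result(example))
# NSFW content detected in 1 frame(s): [1]
```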
Please note that the API only provides a set of data (JSON) about the detected objects; no video is generated. The demo video above was specially created from the API's JSON output for visual demonstration and a better understanding of the capabilities.
Request parameters

Authorization (header) — API key for authentication. Make sure to include the word apikey, followed by a single space and then your token. Example: apikey 1234$abcdef

Task name — Name of the task to be performed. For this method, the value is content-moderation.

Video URL — URL to the MP4 file to analyse. The file must be publicly accessible via HTTP/HTTPS.

Model — Model for analysis (content-moderation only). Determines what exactly needs to be found in the video. For the AI content moderation with NSFW detection algorithm, the value is nsfw.

User meta parameter (up to 256 characters) — Designed to store your own identifier. Can be used by you to tag requests from different end-users. It is not used in any way in video processing.

Entity meta parameter (up to 4096 characters) — Designed to store your own extra information about a video entity: video source, video ID, etc. It is not used in any way in video processing. For example, if an AI task was created automatically when you uploaded a video with the AI auto-processing option (nudity detection, etc.), the ID of the associated video for which the task was performed will be explicitly indicated here.

Response

The response returns the ID of the created AI task. Using this AI task ID, you can check the status and get the video processing result; see the GET /ai/results method.

Task ID — ID of the created AI task, from which you can get the execution result.
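As an illustration only, here is a hedged Python sketch of creating an NSFW content-moderation task and then fetching its result. The task-creation endpoint (/ai/tasks), the request field names, and the status values are assumptions; only GET /ai/results is named in this documentation, so check the full API reference for the actual routes and payload schema.

```python
import time
import requests

API_BASE = "https://api.example.com"  # assumption: replace with the real API host
API_KEY = "1234$abcdef"               # your token, sent as "apikey <token>"

headers = {"Authorization": f"apikey {API_KEY}"}

# Assumed endpoint and field names; only the values ("content-moderation",
# "nsfw") and the public-URL requirement come from the documentation above.
payload = {
    "task_name": "content-moderation",
    "category": "nsfw",
    "url": "https://example.com/video.mp4",  # must be a publicly accessible MP4
}

task = requests.post(f"{API_BASE}/ai/tasks", json=payload, headers=headers).json()
task_id = task["task_id"]  # ID of the created AI task

# Poll GET /ai/results with the task ID until the processing result is ready.
while True:
    result = requests.get(
        f"{API_BASE}/ai/results", params={"task_id": task_id}, headers=headers
    ).json()
    if result.get("status") in ("SUCCESS", "FAILURE"):  # assumed status values
        break
    time.sleep(5)

print(result)
```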