Create AI CM: CSAM task
This algorithm detects Child Sexual Abuse Material (CSAM). For general information about capabilities and limits, see the generic “Content Moderation” method.
What is Child Sexual Abuse Material (CSAM) detection?
This method is intended to prevent this type of content from being distributed over the Internet.
For child pornography detection, we first run the “soft_nudity” and “hard_nudity” tasks. If both methods indicate the presence of obscene content together with a child’s face in a frame, the video is marked as obscene. Frames are annotated with the age category of the identified children.
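The decision rule can be pictured with the following sketch. It is only an illustration: the frame structure (a list of dictionaries with "label" and "age" fields) is an assumption borrowed from the response example below, not the exact output format of the soft_nudity and hard_nudity tasks.

```python
# Illustrative sketch of the decision rule described above.
# The frame structure (dicts with "label" and "age" fields) is an assumption
# borrowed from the CSAM response example below, not the exact output of the
# "soft_nudity" / "hard_nudity" tasks.

def has_child_face(frames: list[dict]) -> bool:
    """True if any frame contains a detected face whose age range starts
    below 18 (age format like "3-9" assumed from the example below)."""
    for frame in frames:
        if not frame.get("label", "").startswith("FACE"):
            continue
        lower = str(frame.get("age", "")).split("-")[0]
        if lower.isdigit() and int(lower) < 18:
            return True
    return False


def is_marked_as_csam(soft_nudity_frames: list[dict],
                      hard_nudity_frames: list[dict]) -> bool:
    """The video is flagged only when BOTH nudity tasks report obscene
    content together with a child's face in at least one frame."""
    return has_child_face(soft_nudity_frames) and has_child_face(hard_nudity_frames)
```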
How to use?
The response includes the number of the video frame in which the child’s face was found, along with the estimated age.
Detection is performed by AI, so each detected object is assigned a probability; only objects with a probability of at least 30% are included in the response.
Video processing speed is approximately 1:10.
Example of a response when child sexual abuse material is detected:
```json
{
  "child_pornography_detected": true,
  "detection_results": ["3-9"],
  "frames": [
    {
      "frame_number": 407,
      "label": "FACE_FEMALE",
      "confidence": 0.78,
      "age": "3-9",
      "age_confidence": 0.65
    },
    ...
  ]
}
```
Example of a response when nothing is found (the arrays are empty):
```json
{
  "child_pornography_detected": false,
  "detection_results": [],
  "frames": []
}
```
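Below is a minimal sketch of reading such a response on the client side, assuming the JSON body has already been fetched; the extra 0.5 threshold is only an example of stricter client-side filtering on top of the API’s own 30% cut-off.

```python
import json

def summarize_csam_result(raw: str, min_confidence: float = 0.5) -> None:
    """Print a short summary of a CSAM detection response (sketch).
    `raw` is the JSON body of the task result; `min_confidence` is an
    optional, stricter client-side filter on top of the API's 30% cut-off."""
    result = json.loads(raw)

    if not result.get("child_pornography_detected"):
        print("No child sexual abuse material detected.")
        return

    print("Detected age groups:", ", ".join(result.get("detection_results", [])))
    for frame in result.get("frames", []):
        if frame.get("confidence", 0.0) < min_confidence:
            continue  # skip low-confidence detections
        print(
            f"frame {frame['frame_number']}: {frame['label']} "
            f"(confidence {frame['confidence']:.2f}), "
            f"age {frame['age']} (age confidence {frame['age_confidence']:.2f})"
        )
```

For the first example above, this would report the detected age group “3-9” and frame 407; for the second example, it only reports that nothing was detected.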
Authorizations
API key for authentication.
Body
Response
The response returns the ID of the created AI task. Using this task ID, you can check the status and retrieve the video processing result via the GET /ai/results method.
The response is of type object.
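As a rough sketch of the end-to-end flow (create the task, then poll for the result by task ID): the base URL, request path, body fields, header scheme, and status values below are placeholders for illustration, not the documented API contract; only the GET /ai/results method is taken from the text above.

```python
import time
import requests

API_BASE = "https://api.example.com"                # placeholder base URL
HEADERS = {"Authorization": "APIKey YOUR_API_KEY"}  # header scheme is an assumption


def create_csam_task(video_url: str) -> str:
    """Create the CSAM content-moderation task and return its ID (sketch).
    The path and body fields are placeholders, not the documented contract."""
    resp = requests.post(
        f"{API_BASE}/ai/tasks",                                       # placeholder path
        headers=HEADERS,
        json={"task": "content_moderation_csam", "url": video_url},   # assumed body
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["task_id"]                                     # assumed field name


def wait_for_result(task_id: str, poll_seconds: int = 15) -> dict:
    """Poll GET /ai/results until the task finishes, then return its payload."""
    while True:
        resp = requests.get(
            f"{API_BASE}/ai/results",
            headers=HEADERS,
            params={"task_id": task_id},
            timeout=30,
        )
        resp.raise_for_status()
        data = resp.json()
        if data.get("status") in ("SUCCESS", "FAILURE"):              # assumed status values
            return data
        time.sleep(poll_seconds)


if __name__ == "__main__":
    task_id = create_csam_task("https://example.com/video.mp4")
    print(wait_for_result(task_id))
```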