CSAM

This page deals with sensitive content, including abuse targeted at children. Please treat this as a content warning before reading on.

Glossary of terms

  • NCMEC = National Center for Missing and Exploited Children - NCMEC works with local and federal agencies in the USA, as well as agencies in other countries, to investigate and report instances of CSAM and other abuse directed at children.
  • CSAM = Child Sexual Abuse Material - material depicting the abuse or exploitation of a minor that is typically sexual in nature.
  • Hash Database - NCMEC maintains a validated hash database of known CSAM. Clavata uses this database to check whether uploaded content is a hash match.

How does Clavata handle this content?


When evaluating image content, Clavata checks the NCMEC hash database to determine whether the content being evaluated matches a known instance of CSAM. When a match is found, your API response will include the precheck failure described below.

When you receive the enum "PRECHECK_FAILURE_TYPE_NCMEC", treat the content with caution: it very likely contains CSAM that was uploaded by a user.
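
As a rough illustration, here is a minimal sketch of how a client might check an evaluation response for this flag. The response shape and field names used here (precheck_failures, failure_type) are assumptions made for the example, not the documented schema; consult the Clavata API reference for the actual response structure.

  # Minimal sketch: checking an evaluation response for the NCMEC precheck failure.
  # The response structure and field names ("precheck_failures", "failure_type")
  # are assumptions for illustration only; the actual schema may differ.
  NCMEC_FAILURE = "PRECHECK_FAILURE_TYPE_NCMEC"

  def contains_ncmec_match(response: dict) -> bool:
      """Return True if the response reports an NCMEC hash match."""
      for failure in response.get("precheck_failures", []):
          if failure.get("failure_type") == NCMEC_FAILURE:
              return True
      return False

  # Hypothetical payload, used only to exercise the check above.
  example_response = {"precheck_failures": [{"failure_type": "PRECHECK_FAILURE_TYPE_NCMEC"}]}
  if contains_ncmec_match(example_response):
      print("NCMEC hash match detected: lock the content down and report it.")

Once you have confirmed a match, work through the small checklist below.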


  1. Lock the content down! Ensure other users on your platform can't view this content (see the sketch after this list).
  2. Head over to NCMEC's CyberTipline to report the content.
    1. When filling out the report, be detailed and provide as much information as you can.
  3. You will typically need to isolate the content on your system for a period of time so that it can be securely shared with NCMEC.
  4. From there, follow the guidance and instructions you receive from NCMEC.
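
As a rough sketch of step 1 (and the retention described in step 3), the example below quarantines a flagged item so other users can no longer view it while it is preserved for the report. The in-memory SQLite table and its column names are hypothetical; adapt the idea to however your platform actually stores content.

  # Hypothetical sketch of step 1 (lock the content down) while preserving the
  # record for step 3. The table and column names are assumptions; your
  # platform's storage model will differ.
  import datetime
  import sqlite3

  def quarantine_content(db: sqlite3.Connection, content_id: str) -> None:
      """Hide the item from all users and record when it was locked down."""
      db.execute(
          "UPDATE content SET visibility = 'quarantined', quarantined_at = ? WHERE id = ?",
          (datetime.datetime.now(datetime.timezone.utc).isoformat(), content_id),
      )
      db.commit()

  # Demo against an in-memory database so the sketch is self-contained.
  db = sqlite3.connect(":memory:")
  db.execute("CREATE TABLE content (id TEXT PRIMARY KEY, visibility TEXT, quarantined_at TEXT)")
  db.execute("INSERT INTO content VALUES ('upload-123', 'public', NULL)")
  quarantine_content(db, "upload-123")
  print(db.execute("SELECT id, visibility FROM content").fetchone())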