Child Sexual Abuse Material (CSAM)

Any content found to sexualize or exploit minors will be removed and reported to the National Center for Missing & Exploited Children (NCMEC) and INHOPE. This applies to photos, videos, animated imagery, written descriptions, sexual chat concerning children, and all other media or text involving children. Our website is protected by the Cloudflare Child Sexual Abuse Material (CSAM) scanning tool, which allows us to proactively identify and take action against any CSAM located on our website, as well as any attempt to upload such material. This service automatically reports any image or video file that matches known CSAM to NCMEC and INHOPE, who will take appropriate action.

The purpose of the Cloudflare CSAM scanning tool is to prevent the spread of child sexual abuse content and to support investigations aimed at stopping its distribution and possession.

For more information please read:

Why are we so strict with Child Sexual Abuse Material (CSAM)? There are a few reasons:

  1. CSAM is illegal and immoral.
  2. CSAM endangers the future of our project and community.
  3. Our community does not want to be associated with CSAM, and posting such material has a severely negative impact on our audience.
  4. Our advertising customers do not wish to be associated with CSAM.