Trust & Safety

Safety & Content Moderation

keni hosts user-generated content (chat media, moments) and takes child safety extremely seriously. This page describes the technical and operational measures we use to detect and report child sexual abuse material (CSAM) and other illegal content.

1. CSAM detection

Images uploaded to keni Moments or sent over chat (whether E2EE or transient) are hashed using Microsoft PhotoDNA and compared against industry-shared databases of known CSAM hashes. PhotoDNA returns a match score; when it exceeds the operator-defined threshold, the upload is blocked and the matching media is preserved as required by 18 U.S.C. § 2258A.

2. NCMEC reporting

Confirmed CSAM matches trigger an automated CyberTipline report to the National Center for Missing & Exploited Children (NCMEC) within 24 hours, including the preserved media bytes, the matched hash, the uploader's account identifiers, and the upload timestamp.

The uploading account's user agent, IP address, and account history are also included so NCMEC can route the report to the appropriate law enforcement agency.
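The fields above can be collected into a single report structure before submission. The real CyberTipline API defines its own schema and submission flow, so the names below are assumptions; treat this only as a checklist of what the policy says a report carries.

```python
import hashlib
from dataclasses import asdict, dataclass, field
from datetime import datetime, timezone

# Hypothetical report structure mirroring the policy's field list; the
# actual CyberTipline schema differs and is defined by NCMEC.


@dataclass
class CyberTipReport:
    media_sha256: str               # hash of the preserved media bytes
    uploader_id: str                # uploader account identifier
    upload_timestamp: str           # ISO 8601, UTC
    uploader_ip: str
    user_agent: str
    account_history: list[str] = field(default_factory=list)


def build_report(media: bytes, uploader_id: str, ip: str, ua: str,
                 history: list[str]) -> dict:
    """Assemble the report fields as a plain dict ready for serialization."""
    return asdict(CyberTipReport(
        media_sha256=hashlib.sha256(media).hexdigest(),
        uploader_id=uploader_id,
        upload_timestamp=datetime.now(timezone.utc).isoformat(),
        uploader_ip=ip,
        user_agent=ua,
        account_history=history,
    ))
```

Keeping the report a flat, serializable structure makes it straightforward to log the exact payload that was filed, which matters for the preservation obligation.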

3. Other illegal content

Beyond CSAM, keni's moderation pipeline also screens for other categories of illegal content.

4. User reporting

Every chat, moment, and profile in the keni app has a "report" action. Reports route to safety@hikeni.com with target ID, reason category, and reporter context.
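A report with those three fields might be packaged for the safety queue as below. Only the destination address and the three fields come from this page; the subject format and field names are illustrative.

```python
from email.message import EmailMessage

# Sketch of packaging a user report for the safety inbox. The envelope
# format is an assumption; only the address and fields are from the policy.

SAFETY_INBOX = "safety@hikeni.com"


def build_safety_email(target_id: str, reason: str,
                       reporter_id: str) -> EmailMessage:
    """Build (but do not send) the report email for the safety queue."""
    msg = EmailMessage()
    msg["To"] = SAFETY_INBOX
    msg["Subject"] = f"[user-report] {reason}: {target_id}"
    msg.set_content(
        f"target_id: {target_id}\n"
        f"reason: {reason}\n"
        f"reporter: {reporter_id}\n"
    )
    return msg
```

Using a machine-parseable `key: value` body keeps the reports easy to triage or ingest into a ticketing system later.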

5. Law enforcement

Valid subpoenas and search warrants from law enforcement agencies are acknowledged within 5 business days. Emergency requests (life-threatening situations) are reviewed within 4 hours. Contact: legal@hikeni.com.
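The two SLAs can be expressed as a deadline calculation. This is a minimal sketch that assumes "business days" means Monday through Friday and ignores holidays; it is not keni's actual case-management logic.

```python
from datetime import datetime, timedelta

# Illustrative SLA computation only: emergency requests get a 4-hour
# window, everything else 5 business days (assumed Mon-Fri, no holidays).


def response_deadline(received: datetime, emergency: bool) -> datetime:
    if emergency:
        return received + timedelta(hours=4)
    deadline = received
    remaining = 5                    # 5 business days
    while remaining > 0:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:   # Mon=0 .. Fri=4
            remaining -= 1
    return deadline
```

For example, a non-emergency request received on a Monday morning is due the following Monday, since the intervening weekend does not count.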

6. Transparency

We will publish an annual transparency report covering: total CSAM matches, NCMEC reports filed, law enforcement requests received and the response rate, and accounts suspended for safety violations.
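Counting those metrics reduces to tallying a safety event log. The event names below are assumptions for illustration, not keni's real schema, and the response-rate calculation is omitted for brevity.

```python
from collections import Counter

# Assumed event names; keni's real event schema may differ.
METRICS = ("csam_match", "ncmec_report", "le_request", "safety_suspension")


def tally(events: list[str]) -> dict[str, int]:
    """Count the four transparency metrics, ignoring unrelated events."""
    counts = Counter(e for e in events if e in METRICS)
    return {m: counts.get(m, 0) for m in METRICS}
```

Emitting a zero for every metric, even in a quiet year, keeps year-over-year reports directly comparable.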