This page sets out the published standards of Colab Services ("Colab Services", "we", "us", or "our") regarding child sexual abuse and exploitation (CSAE) on the Frankli website at https://frankli.in and the Frankli mobile application available on Google Play and other distribution channels (together, the "Platform"). These standards are intended to be read together with our Content and Community Guidelines, Terms of Use, and Privacy Policy.
CSAE refers to content or behaviour that sexually exploits, abuses, or endangers children, including, for example, grooming a child for sexual exploitation, sextortion of a child, trafficking of a child for sex, or otherwise sexually exploiting a child.
CSAM (child sexual abuse material) means any visual depiction, including photographs, videos, and computer-generated imagery, involving a minor engaged in sexually explicit conduct. CSAM is illegal. It is prohibited on the Platform and may not be stored, shared, solicited, or distributed through Frankli.
Our standards
We prohibit CSAE and CSAM on Frankli in all forms. Without limitation, users must not use the Platform to:
- Share, request, offer, or solicit CSAM or any sexual depiction of a minor;
- Groom, coerce, or facilitate sexual contact with a minor, or encourage sexual conduct involving minors;
- Engage in sextortion or threats involving minors;
- Sexually exploit or endanger children in any other way;
- Facilitate trafficking or commercial sexual exploitation of children.
We apply these standards regardless of whether content is created by a user or with the help of third-party tools (including synthetic or AI-generated media). Content that violates these standards or our Community Guidelines may be removed and may result in suspension or termination of accounts and other enforcement consistent with our policies and applicable law.
Addressing CSAM and CSAE
When we obtain actual knowledge of CSAM or CSAE on the Platform, we act in accordance with these published standards and applicable laws. That includes removing or disabling access to violating content where appropriate, restricting or terminating accounts involved in serious or repeated violations, and cooperating with law enforcement and competent authorities when required or permitted by law.
We comply with applicable child safety and intermediary obligations in the jurisdictions where we operate, including India, to the extent they apply to our services.
Reporting concerns
Users can report content or behaviour that may violate these standards using the in-app reporting or feedback tools available on Frankli, where provided. These reports help us identify and address violations.
You may also contact us by email at support@frankli.in with the subject line "Child safety report" and sufficient detail for us to assess the concern. We review reports in line with our policies and applicable law.
Child safety point of contact
For inquiries from regulators or platform partners regarding our CSAE prevention practices and this policy, Colab Services designates a child safety point of contact reachable at support@frankli.in. Please include "Child safety – point of contact" in the subject line. The individual designated in Google Play Console for CSAM-related contact, where applicable, aligns with this channel.
Changes
We may update these standards from time to time. The current version will be published at this URL. The "Last updated" date at the top of this page indicates when material changes were last reflected.
