Seeyaa – Child Safety Standards (India)
At Seeyaa, we are committed to providing a safe and responsible platform for adults. Protecting children from online sexual exploitation and abuse is a top priority, and we strictly comply with Indian law, including the Protection of Children from Sexual Offences (POCSO) Act, 2012, the Information Technology Act, 2000, and other applicable regulations.
1. Zero Tolerance for Child Sexual Abuse Material (CSAM)
- Seeyaa strictly prohibits the creation, distribution, or possession of Child Sexual Abuse Material (CSAM).
- Any attempt to share or request such material will result in immediate account termination and a report to the National Cyber Crime Reporting Portal and relevant law enforcement authorities.
- In line with Section 67B of the Information Technology Act, 2000, we cooperate fully with law enforcement agencies to prevent and prosecute online child sexual exploitation.
2. Age Restrictions
- Seeyaa is strictly for users 18 years of age and older.
- Users found to be misrepresenting their age, or minors attempting to use Seeyaa, will have their accounts terminated.
- Adults attempting to contact or exploit minors will be reported to law enforcement under the POCSO Act, 2012.
3. Reporting Mechanism
- Users can report profiles, content, or conversations that raise child safety concerns directly through the in-app reporting tools or by contacting contact@seeyaa.app.
- Reports related to child protection are treated as the highest priority and acted on immediately.
4. Cooperation with Authorities
- Seeyaa complies with obligations under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 to assist law enforcement and provide required information in child safety cases.
- All verified CSAM reports are forwarded to the National Cyber Crime Reporting Portal and, where applicable, to regional and national child protection agencies.
- Where cross-border cases are detected, we also align with international best practices through organizations such as INHOPE and the National Center for Missing & Exploited Children (NCMEC).
5. Moderation & Detection
- We use a combination of user reporting, proactive moderation, and automated tools to detect, review, and remove harmful content.
- Content that violates child safety standards is permanently removed, and user accounts are banned.
- We maintain secure audit logs for law enforcement without retaining illegal material longer than required for investigation.
6. User Responsibility
- Users must not attempt to upload, request, or share any material involving minors.
- Users are encouraged to immediately report any suspicious activity involving children.
- Failure to comply may lead to permanent suspension and reporting to authorities.
📩 Child Safety Contact
For urgent child safety issues, please contact our dedicated safety team:
- Email: contact@seeyaa.app
- Report In-App: Use the “Report” feature available on all profiles and messages.
⚖️ Legal References (India)
- Protection of Children from Sexual Offences (POCSO) Act, 2012
- Information Technology Act, 2000 (Section 67B)
- Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021