Child Sexual Abuse and Exploitation (CSAE)

Children's Safety, Privacy, and Data Protection Policy

Effective Date: 30 January 2026
Last Updated: 30 January 2026
Version: 1.0
Policy Owner: Chief Trust & Safety Officer

1. Purpose and Scope

This policy establishes comprehensive standards for protecting minors (individuals under 18 years of age) across all jurisdictions where Takchik Limited operates. It applies to all platforms, services, applications, and websites operated by Takchik Limited globally.

2. Age Restrictions and Access

2.1 Minimum Age Requirements

Absolute Minimum Age: No individual under 18 years of age may create an account, register, or use Takchik services in any capacity — even with parental permission.
  • No Parental Permission Exception: Unlike some platforms that permit usage with parental consent for users aged 13–17, Takchik does not allow minors to use the service under any circumstances.
  • No Proxy Registration: Parents, guardians, or adults may not register accounts on behalf of minors or allow minors to use accounts registered in the adult's name.
  • Zero Tolerance: Any account found to be used by a minor will be immediately and permanently terminated.

2.2 Age Verification Mechanisms

Takchik implements multi-layered age verification:

Initial Registration

  • Date of birth declaration (mandatory)
  • Email verification requiring adult email addresses
  • Phone number verification (where legally permissible)
  • Device fingerprinting to detect patterns associated with minor usage

Ongoing Verification

  • AI and machine learning algorithms analysing user behaviour, content patterns, and linguistic markers
  • Facial age estimation technology (where legally compliant)
  • Random verification challenges requiring government-issued ID upload
  • Community reporting mechanisms with rapid response protocols

Enhanced Verification Triggers

  • Suspicious activity patterns
  • Content suggesting underage usage
  • User reports or flags
  • Behavioural indicators inconsistent with declared age

3. Compliance with International Regulations

3.1 United States

COPPA Compliance: Because Takchik does not permit users under 18, our blanket prohibition exceeds COPPA's requirements for children under 13. We do not knowingly collect personal information from anyone under 13 under any circumstances.

  • No targeted advertising to minors
  • Immediate deletion of any data discovered to belong to minors
  • California CCPA/CPRA minors' opt-in consent provisions exceeded
  • Compliance with California Age-Appropriate Design Code Act (AB 2273)
  • Arkansas Social Media Safety Act compliance
  • Utah Minor Protection in Social Media Act compliance

3.2 European Union / EEA

  • GDPR Article 8 provisions regarding children's consent exceeded (EU default: 16 years; Member States may lower this to 13)
  • Enhanced data protection for all users given our adult-only platform
  • Right to erasure particularly enforced for any minor data inadvertently collected
  • Privacy by design and by default principles applied

UK GDPR & Age Appropriate Design Code

  • Full compliance with ICO Children's Code
  • All 15 standards of the Age Appropriate Design Code exceeded through our 18+ policy
  • Geolocation services off by default
  • Options to maximise privacy settings prominently displayed
  • Data minimisation principles applied universally

3.3 Other Jurisdictions

  • Australia: Online Safety Act 2021; eSafety Commissioner standards; Privacy Act 1988
  • Canada: PIPEDA; Quebec's Law 25
  • China: Cybersecurity Law; PIPL minor provisions
  • South Korea: Youth Protection Act; Personal Information Protection Act (PIPA) provisions for minors
  • Japan: APPI provisions for minors
  • India: Digital Personal Data Protection Act, 2023 — parental consent requirements exceeded
  • Singapore: Personal Data Protection Act (PDPA)
  • Brazil: LGPD children's provisions
  • Argentina: Personal Data Protection Law
  • Mexico: Federal Law on Protection of Personal Data
  • UAE: Data protection regulations
  • South Africa: POPIA
  • Saudi Arabia: Personal Data Protection Law (PDPL)

4. Prohibited Activities Related to Minors

4.1 Absolute Prohibitions

Takchik has zero tolerance for the following:
  • Registration or Usage by Minors: Under any circumstances, with or without permission
  • Adult Registration for Minor Use: Adults creating accounts for minors to use
  • Grooming or Predatory Behaviour: Any attempt to contact, solicit, or engage with minors
  • Child Sexual Abuse Material (CSAM): Zero tolerance; immediate reporting to NCMEC, IWF, and relevant law enforcement
  • Child Exploitation Content: Including sexualisation, endangerment, or abuse
  • Age Misrepresentation: Deliberately falsifying age to circumvent restrictions
  • Sharing Minors' Personal Information: Doxxing or sharing identifying information about minors
  • Recruiting Minors: Any content or behaviour designed to attract underage users

4.2 Enforcement Actions

  • Immediate permanent account suspension
  • Device and IP address banning
  • Report to law enforcement when criminal activity is suspected
  • Cooperation with international child protection agencies
  • Legal action against violators where appropriate

5. Data Protection and Privacy for Minors

5.1 Data Handling Protocols

Despite our 18+ policy, if minor data is inadvertently collected:

Immediate Actions

  • Automatic account suspension upon discovery
  • Complete data deletion within 24 hours of verification
  • Notification to relevant data protection authorities if legally required
  • Internal investigation to prevent recurrence
  • No retention of minor data for any purpose, including analytics or machine learning training

Data Minimisation

  • Collection of only essential data for all users
  • No collection of sensitive personal information beyond what is legally required
  • Regular data audits and deletion of unnecessary information

Data Security

  • End-to-end encryption for communications where technically feasible
  • Secure storage with industry-standard encryption (AES-256 or equivalent)
  • Regular security audits and penetration testing
  • Incident response plan with specific protocols for minor data breaches

5.2 Third-Party Data Sharing

  • No sharing of user data with third parties for marketing purposes
  • Limited sharing with service providers under strict data processing agreements
  • Absolute prohibition on sharing any minor data if inadvertently collected
  • Transparency reports published semi-annually detailing data practices

6. Content Moderation and Safety

6.1 Proactive Measures

Content Filtering

  • AI-powered content moderation scanning all uploads
  • Human moderator review of flagged content (24/7 coverage)
  • PhotoDNA and hash-matching technology for CSAM detection
  • Automated blocking of known CSAM material in real-time
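The hash-matching step above can be illustrated with a minimal exact-match sketch. An important caveat: PhotoDNA is a proprietary perceptual hash that tolerates re-encoding and resizing; the cryptographic hash below, and the demo blocklist entry, are simplifications for illustration only and match exact bytes.

```python
import hashlib

# Illustrative blocklist of known-bad file hashes. Production systems use
# perceptual hash sets supplied by organisations such as NCMEC and the IWF;
# the single entry here is the well-known SHA-256 of zero bytes, used purely
# so the demo below has something to match.
KNOWN_BAD_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}


def sha256_hex(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()


def should_block(upload: bytes) -> bool:
    """Return True if the upload's hash appears in the blocklist."""
    return sha256_hex(upload) in KNOWN_BAD_HASHES


# The empty upload hashes to the blocklisted digest; other content does not.
print(should_block(b""))        # True
print(should_block(b"benign"))  # False
```

In a real pipeline this lookup runs before content is published, so a match blocks the upload and triggers the mandatory reporting workflow rather than merely flagging it for review.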

Community Guidelines

  • Clear prohibition of minor-related harmful content
  • Reporting mechanisms prominently displayed
  • Education about online safety for all users
  • Regular safety campaigns and awareness initiatives

Algorithm Design

  • No recommendation algorithms designed to be addictive
  • No amplification of harmful content
  • Transparent algorithm operations (annual transparency reports)

6.2 Reporting and Response

User Reporting

  • One-click reporting for suspected minor accounts or child safety concerns
  • Anonymous reporting options
  • Multi-language support for global accessibility
  • Acknowledgment of reports within 24 hours

Response Protocols

  • Child safety reports prioritised above all other moderation queues
  • Response within 1 hour for CSAM or imminent danger situations
  • Response within 24 hours for other minor-related violations
  • Mandatory reporting to relevant authorities (NCMEC, IWF, local law enforcement)
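The prioritisation rule above (child safety ahead of all other queues, with tiered response deadlines) can be sketched as a priority queue. Tier names, the 72-hour "other" tier, and all identifiers below are illustrative assumptions, not Takchik's actual system; the 1-hour and 24-hour values mirror the stated protocol.

```python
import heapq
from dataclasses import dataclass, field

# Illustrative tiers: lower priority number = handled first.
PRIORITY = {"csam_or_imminent_danger": 0, "minor_related": 1, "other": 2}
SLA_HOURS = {"csam_or_imminent_danger": 1, "minor_related": 24, "other": 72}


@dataclass(order=True)
class Report:
    priority: int
    received_at_hour: float
    kind: str = field(compare=False)  # excluded from ordering


class ModerationQueue:
    """Min-heap keyed on (priority, arrival time): child safety reports
    always surface before anything else, regardless of arrival order."""

    def __init__(self) -> None:
        self._heap: list[Report] = []

    def submit(self, kind: str, received_at_hour: float) -> None:
        heapq.heappush(self._heap, Report(PRIORITY[kind], received_at_hour, kind))

    def next_report(self) -> Report:
        return heapq.heappop(self._heap)


q = ModerationQueue()
q.submit("other", received_at_hour=0.0)
q.submit("minor_related", received_at_hour=1.0)
q.submit("csam_or_imminent_danger", received_at_hour=2.0)
first = q.next_report()
print(first.kind)             # csam_or_imminent_danger, despite arriving last
print(SLA_HOURS[first.kind])  # 1
```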

Law Enforcement Cooperation

  • Dedicated law enforcement liaison team
  • 24/7 emergency contact line for urgent child safety matters
  • Preservation of evidence upon valid legal request
  • Proactive cooperation with investigations

7. Technical Safeguards

7.1 Platform Design

  • No features specifically designed to appeal to minors
  • Visual design and user interface appropriate for adult audiences
  • Content ratings and warnings where applicable
  • Geolocation features off by default
  • Privacy settings maximised by default

7.2 Detection Technology

Machine Learning Systems

  • Behavioural analysis to detect minor usage patterns
  • Natural language processing to identify age-inappropriate discourse
  • Image recognition for visual age estimation
  • Pattern recognition for account takeover or sharing

Continuous Monitoring

  • Real-time monitoring of platform activity
  • Automated alerts for suspicious patterns
  • Regular audits of account authenticity
  • Cross-platform tracking of repeat offenders

8. Parental Rights and Responsibilities

8.1 Parental Awareness

While parents cannot register accounts for minors on Takchik, we provide:

  • Dedicated parent portal with safety information
  • Educational materials about online safety
  • Clear communication about our 18+ policy
  • Guidance on monitoring children's online activity
  • Information about device-level parental control tools
  • Partnerships with parental control software providers
  • Resources to block Takchik on minors' devices

8.2 Parental Reporting

Parents may:

  • Report suspected underage accounts
  • Request removal of content featuring their minor children
  • Request investigation into potential grooming or predatory behaviour
  • Access educational resources about protecting children online

Response Commitments:

  • Priority handling
  • Response within 24 hours
  • Complete confidentiality
  • Follow-up confirmation of actions taken

9. Employee Training and Accountability

9.1 Mandatory Training

All Takchik employees receive:

  • Initial child safety training during onboarding
  • Annual refresher training
  • Role-specific training (enhanced for moderators and customer service)
  • Updates on new regulations and best practices
  • Certification requirements for child safety roles

Training Topics

  • Recognising signs of underage users
  • Identifying grooming behaviour
  • CSAM recognition and reporting protocols
  • Data protection for minors
  • Legal compliance obligations
  • Psychological impacts of child exploitation material (for moderators)

9.2 Employee Responsibilities

  • Mandatory reporting of suspected violations
  • Prohibition on accessing minor data without authorisation
  • Background checks for employees with access to user data or content moderation
  • Confidentiality agreements with specific child protection provisions
  • Whistleblower protections for reporting violations

9.3 Moderator Welfare

  • Mandatory counselling services available
  • Regular mental health check-ins
  • Rotation policies to prevent burnout
  • Peer support programs
  • Time limits on consecutive CSAM review

10. Transparency and Accountability

10.1 Public Reporting

Takchik publishes:

  • Semi-Annual Transparency Reports — accounts terminated for age violations, CSAM reports made to authorities, response times, age verification accuracy metrics, government requests related to minors
  • Annual Child Safety Report — policy updates, technology enhancements, training initiatives, partnerships, and industry collaboration efforts

10.2 External Audits

  • Annual third-party audits of child safety practices
  • Regular penetration testing focused on age verification bypass attempts
  • Compliance audits by relevant data protection authorities
  • Certification pursuits (e.g., kidSAFE Seal Program)

10.3 Stakeholder Engagement

  • Regular consultation with child safety organisations (NCMEC, IWF, NSPCC)
  • Participation in industry coalitions (e.g., Technology Coalition)
  • Advisory board including child safety experts
  • Collaboration with academic researchers
  • Engagement with policymakers and regulators

11. International Enforcement Cooperation

11.1 Law Enforcement Partnerships

  • Memoranda of Understanding with international law enforcement agencies
  • Participation in INTERPOL's International Child Sexual Exploitation database
  • Cooperation with Europol's European Cybercrime Centre
  • Partnership with FBI's Innocent Images National Initiative
  • Collaboration with local police forces globally

11.2 Industry Collaboration

  • Membership in relevant industry associations
  • Hash-sharing partnerships to identify CSAM
  • Best practice exchanges with peer platforms
  • Joint technology development initiatives
  • Coordinated response to emerging threats

12. Emerging Technologies and Future Safeguards

12.1 Artificial Intelligence

  • Investment in AI for improved age detection
  • Machine learning for behavioural pattern recognition
  • Natural language processing for risk assessment
  • Computer vision for visual age estimation
  • Ethical AI principles applied to all child safety technologies

12.2 Biometric Verification (where legally compliant)

  • Facial recognition for age estimation
  • Voice analysis for age verification
  • Liveness detection to prevent fraud
  • Privacy-preserving biometric technologies

12.3 Blockchain and Digital Identity

  • Exploration of decentralised identity verification
  • Age verification tokens without exposing full identity
  • Collaboration with digital identity providers
  • Privacy-preserving age attestation methods

13. Remediation and Continuous Improvement

13.1 Incident Response

When minor access is discovered:

  1. Immediate account suspension
  2. Forensic investigation to determine extent
  3. Data deletion per protocol
  4. Assessment of how verification was bypassed
  5. Implementation of additional controls
  6. Notification to authorities if legally required
  7. Documentation and learning review
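The seven steps above can be sketched as an ordered, audited pipeline. Everything here is illustrative (step names, the audit-log format, and the conditional notification flag are assumptions, not an actual Takchik system); the point is that every step is recorded for the closing learning review, and authority notification runs only when legally required.

```python
from typing import Callable


def run_incident_response(account_id: str, notify_required: bool) -> list[str]:
    """Execute the incident response steps in order, returning an audit log."""
    log: list[str] = []

    def step(name: str, action: Callable[[], None]) -> None:
        action()
        log.append(name)  # every completed step is documented for review

    step("suspend_account", lambda: None)         # 1. immediate suspension
    step("forensic_investigation", lambda: None)  # 2. determine extent
    step("delete_data", lambda: None)             # 3. data deletion per protocol
    step("assess_bypass", lambda: None)           # 4. how verification was bypassed
    step("add_controls", lambda: None)            # 5. additional controls
    if notify_required:                           # 6. notify if legally required
        step("notify_authorities", lambda: None)
    step("learning_review", lambda: None)         # 7. documentation and review
    return log


audit = run_incident_response("acct-123", notify_required=True)
print(audit[0])    # suspend_account
print(len(audit))  # 7
```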

13.2 Policy Review

  • Quarterly review of child safety policies
  • Annual comprehensive policy update
  • Immediate updates in response to new regulations
  • Incorporation of lessons learned from incidents
  • Benchmarking against industry best practices

13.3 Feedback Mechanisms

  • User feedback on safety features
  • Expert consultation on policy improvements
  • Regulator feedback incorporation
  • Academic research integration
  • Community input on safety enhancements

14. Contact and Reporting

14.1 Child Safety Contact Points

  • Child Safety (24/7): childsafety@takchik.com
  • In-App Reporting: one-click child safety reporting button
  • General Safety: safety@takchik.com
  • Data Protection Officer: dpo@takchik.com

14.2 Regulatory & Law Enforcement

  • Law Enforcement (24/7 Emergency): lawenforcement@takchik.com
  • Legal (Non-Emergency): legal@takchik.com

15. Policy Governance

15.1 Ownership and Approval

  • Policy Owner: Chief Trust & Safety Officer
  • Approved by: Board of Directors
  • Effective Date: 30 January 2026
  • Review Frequency: Quarterly, with an annual comprehensive review

15.2 Amendment Process

  • Proposed amendments reviewed by Legal, Trust & Safety, and Privacy teams
  • Consultation with external child safety experts
  • Board approval for substantive changes
  • Public notification of material policy changes
  • 30-day notice period for user-impacting changes (where not immediately required for safety)

15.3 Version Control

  • All policy versions archived and accessible
  • Change log maintained with rationale for amendments
  • Historical versions available upon request for transparency

16. Definitions

  • Minor: Any individual under 18 years of age, regardless of the jurisdiction's legal age of majority
  • Parent/Guardian: Legal parent or court-appointed guardian with parental rights
  • CSAM: Child Sexual Abuse Material, including any visual depiction of sexually explicit conduct involving a minor
  • Age Verification: The process of confirming a user's age through technical and documentary means
  • Grooming: Building a relationship with a minor in order to facilitate abuse or exploitation
  • Personal Information: Any information relating to an identified or identifiable individual, as defined by applicable data protection laws

17. Acknowledgment

By creating an account on Takchik, users acknowledge that:

  • they are 18 years of age or older;
  • they will not allow minors to use their account;
  • they will not create accounts on behalf of minors;
  • they understand that violations result in immediate, permanent termination;
  • they consent to age verification processes; and
  • they have read and agree to this Child Safety Policy.

Appendix A: Relevant Legislation Reference

This policy is designed to comply with and exceed requirements of:

United States

  • Children's Online Privacy Protection Act (COPPA)
  • California Consumer Privacy Act (CCPA/CPRA)
  • California Age-Appropriate Design Code Act (AB 2273)
  • Various state-level social media safety acts

European Union / EEA

  • General Data Protection Regulation (GDPR)
  • ePrivacy Directive
  • Digital Services Act (DSA)
  • Audiovisual Media Services Directive

United Kingdom

  • UK GDPR
  • Age Appropriate Design Code
  • Online Safety Act 2023

International

  • Convention on the Rights of the Child (UN)
  • Optional Protocol on the Sale of Children, Child Prostitution and Child Pornography

Appendix B: International Child Safety Reporting Contacts

  • NCMEC (US): CyberTipline.org
  • INHOPE Network: inhope.org
  • ECPAT International: ecpat.org

National Hotlines

  • US: 1-800-THE-LOST (1-800-843-5678)
  • UK: 0808 800 5000 (NSPCC)
  • EU: 116 000
  • Australia: 131 237
  • Canada: 1-866-387-2623