CHILD SAFETY STANDARDS
1. Introduction
Our platform, Life for love.com, is committed to creating a safe digital environment for all users, especially children. We enforce strict child protection standards based on international frameworks, including:
- UN Convention on the Rights of the Child (UNCRC)
- WePROTECT Global Alliance
- INHOPE / NCMEC best practices
- EU Digital Services Act (DSA)
- US PROTECT Act and related CSAM regulations
Any form of sexual violence, exploitation, harm, or abuse involving children is strictly prohibited.
A “child” is defined as any person under the age of 18, regardless of local legal definitions.
2. Absolute Prohibition of Child Sexual Abuse and Exploitation
We apply a zero-tolerance policy toward the following behaviors:
2.1. Prohibited Content
The following types of content are strictly forbidden:
- Child sexual abuse material (CSAM)
- Any depiction of coercion, restriction of liberty, or harassment of a child
- Any sexual acts involving minors
- Realistic, generated, or manipulated depictions of sexualized minors
- Artistic or animated visualizations portraying minors in sexualized contexts
- Requests, distribution, or exchange of sexual content involving children
This prohibition also includes:
- AI-generated content (deepfakes, animation, 3D models)
- Drawings, stylized media, comics involving minors in sexual situations
- Written descriptions or narratives of sexual activities involving minors
2.2. Prohibited Behavior
The following behaviors are banned on the platform:
- Grooming (manipulative actions intended to sexualize or prepare a child for exploitation)
- Recruitment of minors for any sexual purposes
- Attempts to obtain intimate materials from children
- Any adult contacting a child with the intent to sexualize the interaction
- Blackmail, coercion, threats, trafficking of minors
- Advertising, offering, or promoting services related to child sexual exploitation
3. User Responsibilities
All users must:
- refrain from publishing, distributing, or requesting sexualized content involving minors;
- avoid attempting to contact minors with harmful or exploitative intent;
- immediately report violations using the platform’s reporting tools;
- refrain from impersonating a child or an adult in order to gain the trust of minors.
Any violation may result in account termination and referral to law enforcement authorities.
4. Zero-Tolerance Policy
We enforce a strict Zero-Tolerance Policy, which includes:
- Immediate removal of prohibited content
- Permanent account termination
- Preservation of evidence as required by law
- Referral of relevant information to competent authorities
No justification, artistic intent, or humor is accepted as an excuse.
5. Moderation and Content Detection
To identify and prevent harm, we use a multilayered approach:
- automated CSAM detection systems (where legally permitted);
- hash-matching technologies (PhotoDNA, other hashing systems);
- proactive monitoring of high-risk interactions;
- regular audits and ongoing moderator training;
- manual review of suspicious content in compliance with legal protocols.
Moderators are trained in international child exploitation protocols and operate under safe working conditions.
6. Interactions with Minors
We limit interactions between adults and minors by implementing:
- filtered direct messaging;
- search and discovery restrictions;
- enhanced protection of minor profiles;
- age-verification measures;
- automated detection of risky behaviors.
Any attempt to bypass these safety measures is considered a violation.
7. Reporting and Incident Response
We provide clear and accessible reporting channels:
- “Report” button for content and users
- Prompt review of reports (typically within hours)
- Cooperation with NCMEC, law enforcement, and child protection organizations when necessary
- Notification of users about moderation decisions, where legally permitted
8. Partnerships and Legal Compliance
The platform collaborates with:
- INHOPE hotlines
- CyberTipline (where applicable)
- National child protection agencies
- Police, prosecutors, and cybersecurity services
- Anti-trafficking organizations
We comply with all legal obligations, including reporting requirements, evidence preservation, and timely escalation of confirmed threats.
9. Child Safety Within the Platform Ecosystem
Additional safety measures include:
- content filters for minor users;
- functional restrictions (e.g., private messaging disabled by default);
- parental and educational tools;
- profile visibility limitations;
- technologies preventing re-upload of known CSAM.
10. Risk Assessment and Continuous Improvement
We conduct regular risk assessments (including those required under the EU DSA, COPPA, and the UK Online Safety Act) and update our safety standards accordingly.
We routinely review:
- emerging threats;
- changes in legislation;
- new technologies, including AI models;
- external expert reports.
11. Contact Person for Child Safety
To ensure transparency and compliance with international child safety requirements, our organization designates an official contact person responsible for handling inquiries related to:
- child safety policies and enforcement,
- detection and moderation of content involving minors in sexualized or exploitative contexts,
- compliance with applicable child safety regulations and platform safety obligations,
- risks associated with applications or features that may endanger minors.
This contact person is available to respond to regulatory authorities, platform partners, and other authorized entities when required.
Designated Child Safety Officer (CSO)
Name: Elizabeth Westhead
Position: Child Safety Officer / Trust & Safety Lead
Email: support@lifeforlove.com
Phone: +442045773347
Organization: LFL Team Ltd
Emergency escalation: available 24/7 via Contact Us
The contact information provided above is regularly reviewed and updated to ensure continuous accuracy and availability.
Our correspondence address: 65 London Wall, London, EC2M 5TU