Child Sexual Abuse & Exploitation (CSAE) Policy
1. Purpose & Scope
1.1. The Freedom Machine app is an 18+ motorcycling community. We have zero tolerance for child sexual abuse and exploitation (CSAE).
1.2. This policy explains what is prohibited, how we prevent and detect abuse, how to report concerns, and how we cooperate with UK authorities and specialist organisations.
1.3. It applies to all users, content, groups, messaging, advertising, and third-party services on our platform.
2. Definitions
2.1. Child / Minor: Anyone under 18 years of age.
2.2. CSAE: Any sexual activity, content, contact, or behaviour involving a child, including grooming, sexual solicitation, trafficking, or distributing sexual content.
2.3. CSAM (Child Sexual Abuse Material): Images, videos, or other media depicting sexual activity involving a minor.
2.4. Grooming: Deliberate actions to build a child’s trust in, or to gain influence over, a child for the purpose of sexual abuse or exploitation.
2.5. User Content: Anything uploaded, shared, messaged, or posted by users.
3. Zero Tolerance & Eligibility
3.1. Any content, behaviour, or use of the platform that sexualises, exploits, or endangers children is strictly prohibited.
3.2. All users must be 18 years or older. Accounts belonging to minors or facilitating sexual contact with minors will be suspended or banned, and relevant authorities will be notified.
4. Prohibited Conduct & Content
4.1. Creation, possession, distribution, facilitation, or exchange of CSAM.
4.2. Sexualised messaging or contact with anyone known or reasonably suspected to be under 18.
4.3. Grooming behaviour online or offline directed at minors.
4.4. Advertising, linking, or promoting sexual services, trafficking, or sexual content involving minors.
4.5. Attempts to evade detection (e.g., encoding, cropping, hash evasion).
4.6. Misrepresenting one’s age in order to impersonate a minor or to solicit minors.
5. Age-Gating & Account Verification
5.1. Users must confirm they are 18+ at signup.
5.2. Risk-based verification may be applied for sensitive features (e.g., direct messaging, group access).
5.3. Accounts suspected to belong to minors will be suspended pending investigation.
6. Detection & Preventative Measures
6.1. Industry-standard hash-matching to detect known CSAM.
6.2. Automated content screening before media is visible. Suspicious content is flagged for human review.
6.3. Text analysis to detect grooming language, sexual solicitation, or risky patterns.
6.4. Monitoring of messaging behaviour signals (while respecting privacy laws).
6.5. Limits on new accounts for mass messaging or rapid group-joining.
6.6. Detection of obfuscated content intended to hide abuse.
7. Moderation & Enforcement
7.1. Automated blocking and quarantine of flagged content.
7.2. Safety moderators review flagged content promptly.
7.3. Enforcement actions include content removal, account suspension, or permanent bans.
7.4. Violations are reported to UK authorities and specialist bodies, including the Internet Watch Foundation (IWF) and the National Crime Agency’s Child Exploitation and Online Protection (CEOP) command.
7.5. Users can appeal enforcement actions, which will be reviewed independently.
8. Evidence Preservation & Reporting
8.1. All suspected CSAE content and related metadata are securely preserved for investigation.
8.2. Chain-of-custody procedures are maintained for legal admissibility.
8.3. Confirmed CSAE is reported to the Internet Watch Foundation (IWF) and UK law enforcement, including CEOP/NCA, in accordance with statutory obligations.
8.4. Users who report CSAE may be updated on action taken, where legally safe.
8.5. Law enforcement requests follow formal legal processes unless there is an immediate emergency risk.
9. Privacy & Safety
9.1. We are transparent about what data we monitor and why.
9.2. End-to-end encryption features may include safety-preserving measures, consistent with UK law.
9.3. Data collected for safety purposes is limited to what is necessary and retained in accordance with statutory requirements.
10. User Reporting & Support
10.1. Users can easily report suspected CSAE via visible report buttons on content or messaging.
10.2. Reportable categories include CSAM, grooming, sexual solicitation, suspected minors, and other safety concerns.
10.3. Essential report information includes content links, account handles, timestamps, and optional screenshots.
10.4. We provide updates to reporters on the progress of their reports, where legally and safely possible.
11. Cooperation with Authorities & Specialist Bodies
11.1. We maintain a dedicated single point of contact (SPOC) for UK law enforcement and specialist organisations.
11.2. Emergency requests (e.g., imminent risk of harm) are handled 24/7 according to formal procedures.
11.3. We cooperate with international authorities and safety organisations, subject to UK law and mutual legal assistance agreements.
12. Vendor & Third-Party Requirements
12.1. Vendors processing content or moderation must comply with these CSAE standards, undergo background checks, follow evidence-handling protocols, and provide training.
12.2. Vendor contracts include audit rights, data protection clauses, incident reporting obligations, and termination rights for non-compliance.
13. Metrics & Response Times
13.1. Automated quarantine occurs immediately on detection.
13.2. High-priority content flagged for human review is addressed within defined internal response times.
13.3. Preservation and retention of evidence follow statutory criminal evidence retention rules.
13.4. Records of law enforcement disclosures are maintained.
14. UX & Product Design
14.1. Age warnings are displayed at registration and before sensitive features.
14.2. Report buttons are always visible near content and chats.
14.3. Confirmation steps are required for bulk uploads or mass messaging.
14.4. Trusted community flagging supports rapid review.
14.5. Contextual safety prompts are shown when behaviour patterns indicate risk.
15. Sanctions & Enforcement Examples
15.1. A first confirmed CSAE violation results in a permanent ban, content removal, and reporting to authorities.
15.2. Attempts to evade detection result in permanent ban and forensic review.
15.3. Malicious false reports are subject to sanctions under our abuse policy, without discouraging safe reporting.
16. Legal Compliance
16.1. Compliance with UK laws, including the Online Safety Act 2023, Sexual Offences Acts, and UK GDPR/Data Protection Act, is maintained.
16.2. Differences in Scottish and Northern Irish law are acknowledged and applied where applicable.
16.3. Reporting and cooperation with authorities follow statutory obligations and best practices for victim protection.
17. Community Reporting
17.1. You, our Members who make up Freedom Machine, play a key role in moderation by reporting content or Members that violate this Policy.
17.2. Please remember that disagreeing with a post is not a reason to report it. Misuse of the reporting tools slows our ability to remove genuinely abusive content and to keep the platform welcoming for everyone.
