Compliance
Across our platforms, the safety and wellbeing of our communities are our highest priority. We comply with online safety, digital services, and data protection legislation in every jurisdiction in which we operate.
Recent Legislation Updates
Age verification implemented for Australian users to comply with the under-16 social media ban.
Age verification enabled to comply with Tennessee's Protecting Children from Social Media Act and Mississippi's Social Media Safety Act.
Age restrictions introduced across platforms requiring UK users to verify they are 18+ to access direct messaging and rooms.
Regulatory Frameworks
As a UK-based company, we fully support the UK Online Safety Act (OSA) and are proud to play our part in building safer digital communities. We are committed to leading the way in embedding its principles throughout our platform.
Highly Effective Age Assurance
Our platforms are designed for users aged 16 and above, with private messaging and group chat features reserved for those aged 18+. We implement highly effective age assurance measures, partnering with reputable third-party providers.
Tackling Illegal Harms
We take a zero-tolerance approach to illegal content, with robust systems in place to detect, remove, and escalate material including child sexual exploitation and abuse (CSEA), terrorism and violent extremism, hate crime, fraud, illegal drugs and weapons sales, and revenge pornography.
Safer Communities
With advanced moderation systems, robust reporting tools, and clear community guidelines, we go beyond compliance to create a supportive, protective environment for everyone.
We are fully committed to complying with the EU Digital Services Act (DSA), which sets clear standards for transparency, accountability, and user protection across the European Union.
Transparency of Rules
We provide clear and accessible terms of service and community standards, ensuring users understand what content is permitted and how moderation decisions are made.
User Empowerment
We offer straightforward reporting tools so users can flag harmful or illegal content, supported by accessible appeals processes.
Fair Processes
We respect users' rights to contest and appeal content decisions, and apply moderation policies consistently and fairly across our platform.
Risk Reduction
Our safety measures are designed to reduce exposure to illegal or harmful content, particularly for minors and other vulnerable groups.
In the United States, we are committed to meeting the requirements of emerging state-level age verification laws, particularly where they affect social platforms.
Tennessee & Mississippi
Both states have enacted new legislation requiring enhanced parental consent processes and strict age verification measures. TalkCampus access in these jurisdictions is restricted to users aged 18 and above.
Other States
Similar legislation is under consideration in states including Louisiana, Utah, and Arkansas. We actively track these developments and stand ready to update our policies as each new law takes effect.
Ongoing Monitoring
Our compliance team maintains a continuous legislative review process, allowing us to adapt quickly and remain aligned with the latest regulatory obligations nationwide.
In Australia, we follow the Online Safety Codes, which set important standards for protecting users and reducing online harms. These codes focus on:
- Minimising harmful online content and ensuring timely removal
- Providing clear and accessible complaint mechanisms
- Ensuring age-appropriate design features and protections
We have implemented age verification for Australian users to comply with the under-16 social media ban and will continue to adapt our platform policies and safeguards as the legal framework evolves.
Beyond the UK, EU, US, and Australia, we closely monitor and comply with online safety and age verification requirements in other jurisdictions.
Canada
We are closely monitoring developments around the proposed Online Harms Act and other child-safety legislation, and will adapt our safeguards in line with final requirements.
Global Approach
We review emerging frameworks in Asia-Pacific, Latin America, and other jurisdictions to ensure our safety measures meet or exceed international best practices.
Safeguarding our communities also means safeguarding their data. We treat privacy and data protection as core responsibilities, ensuring that personal information is handled securely, transparently, and in full compliance with global regulations.
We comply with a range of regional and international standards, including:
- UK GDPR & Data Protection Act 2018 - see our dedicated GDPR Compliance page
- EU GDPR for our European users
- CCPA (California Consumer Privacy Act) for users in California
- Other regional privacy regulations, applied where relevant
In addition to meeting these legal requirements, we limit the data we collect, process user data responsibly, and provide clear privacy choices for our users.
We take our responsibility to protect users seriously and work closely with law enforcement agencies where required.
- We proactively escalate certain categories of serious illegal content, such as child sexual exploitation and terrorism-related material, to the relevant authorities.
- We respond to lawful requests for user data only when legally required.
- All requests are carefully reviewed to ensure they are valid, proportionate, and legally binding.
- Emergency requests are prioritised to prevent imminent harm.
Our Ongoing Commitment
Online safety is never static. We continually review and strengthen our compliance frameworks, safety measures, and reporting processes as laws evolve worldwide.