📘 Introduction
The internet is a vital part of modern life—but it comes with serious risks, especially for children and vulnerable users. Governments around the world have introduced online safety laws to hold platforms accountable for the content they host, the data they collect, and how they protect users.
At Sentrium, we track and analyze these laws to understand how platforms respond to them and whether their safety systems are effective.
On this page, you’ll find a region-by-region breakdown of key laws—and how platforms are adapting to comply.
European Union (EU)
Main Laws:
- GDPR (General Data Protection Regulation): Protects personal data, requires a lawful basis (such as user consent) for processing, and gives users control over their information.
- DSA (Digital Services Act): Requires platforms to remove illegal content, assess systemic risks (like disinformation), and publish transparency reports.
How platforms comply:
- Cookie banners, data access requests, recommender system settings, and flagging tools (see the consent-check sketch after this list).
- Meta, YouTube, and TikTok now publish systemic risk assessments and transparency reports under the DSA.
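To make the consent requirement concrete, here is a minimal, hypothetical Python sketch of the kind of check a platform might run before setting non-essential cookies. The `ConsentRecord` class, the purpose names, and the `may_set_cookie` helper are illustrative assumptions, not any platform's real consent API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Hypothetical per-user consent record; real consent-management platforms are far richer."""
    user_id: str
    purposes: dict = field(default_factory=dict)  # e.g. {"analytics": True, "ads": False}
    updated_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

def may_set_cookie(record: ConsentRecord | None, purpose: str) -> bool:
    """Only strictly necessary cookies are allowed without an explicit opt-in."""
    if purpose == "strictly_necessary":
        return True
    # No record, or no explicit opt-in for this purpose: do not set the cookie.
    return bool(record and record.purposes.get(purpose, False))

# Example: an analytics cookie stays blocked until the user opts in via the banner.
alice = ConsentRecord(user_id="alice", purposes={"analytics": False})
print(may_set_cookie(alice, "analytics"))          # False
alice.purposes["analytics"] = True
print(may_set_cookie(alice, "analytics"))          # True
print(may_set_cookie(None, "strictly_necessary"))  # True
```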
United Kingdom (UK)
Main Law:
- Online Safety Act (2023): Targets illegal content, strengthens child safety duties, and introduces rules for age-appropriate design.
How platforms comply:
- Content moderation filters for children, risk assessments, and transparency reports submitted to Ofcom, the UK’s regulator.
- TikTok, Discord, and Roblox are updating age verification systems for UK users (a simplified age-gating sketch follows below).
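As a rough illustration of age-appropriate design, the hypothetical Python sketch below gates features by verified age band. The `AgeBand` values, the `FEATURE_MINIMUM_BAND` table, and the default-off rule for unlisted features are assumptions made for illustration; real age-assurance systems and platform thresholds vary widely.

```python
from enum import Enum

class AgeBand(Enum):
    """Hypothetical age bands used by this sketch."""
    UNDER_13 = "under_13"
    TEEN = "13_to_17"
    ADULT = "18_plus"

# Hypothetical feature gates loosely inspired by age-appropriate design duties.
FEATURE_MINIMUM_BAND = {
    "direct_messages_from_strangers": AgeBand.ADULT,
    "personalised_recommendations": AgeBand.ADULT,
    "public_profile": AgeBand.TEEN,
}

_ORDER = [AgeBand.UNDER_13, AgeBand.TEEN, AgeBand.ADULT]

def feature_enabled(user_band: AgeBand, feature: str) -> bool:
    """Default-off for children: features not listed are treated as adult-only."""
    required = FEATURE_MINIMUM_BAND.get(feature, AgeBand.ADULT)
    return _ORDER.index(user_band) >= _ORDER.index(required)

print(feature_enabled(AgeBand.TEEN, "public_profile"))                  # True
print(feature_enabled(AgeBand.TEEN, "direct_messages_from_strangers"))  # False
```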
United States (US)
Main Laws:
- COPPA (Children’s Online Privacy Protection Act): Requires verifiable parental consent before collecting personal data from children under 13 (a simple consent-gate sketch appears at the end of this section).
- Section 230: Shields platforms from liability for most user-generated content, while also protecting good-faith content moderation.
How platforms comply:
- YouTube Kids limits data collection and offers curated, child-directed content.
- Platforms like Reddit and Discord rely on user reporting and community rules under Section 230 protection.
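The hypothetical Python sketch below shows how a COPPA-style gate might block data collection for users under 13 until verified parental consent is recorded. The function names and the boolean `parental_consent` flag are illustrative assumptions; actually verifying consent (for example, through signed forms or payment-card checks) is a separate and harder problem.

```python
from datetime import date

COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under 13.

def age_in_years(birth_date: date, today: date) -> int:
    """Whole years between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_collect_personal_data(birth_date: date, parental_consent: bool,
                              today: date | None = None) -> bool:
    """Hypothetical gate: block collection for under-13 users without verified parental consent."""
    today = today or date.today()
    if age_in_years(birth_date, today) < COPPA_AGE_THRESHOLD:
        return parental_consent
    return True

# A nine-year-old's data cannot be collected until a parent consents.
print(may_collect_personal_data(date(2015, 6, 1), parental_consent=False, today=date(2025, 1, 1)))  # False
print(may_collect_personal_data(date(2015, 6, 1), parental_consent=True, today=date(2025, 1, 1)))   # True
```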
Australia
Main Law:
- Online Safety Act (2021): Gives the eSafety Commissioner power to order takedowns of harmful content like image-based abuse and cyberbullying.
India
Main Law:
- IT Rules (2021): Requires social media platforms to appoint grievance officers, remove flagged content within 36 hours of a government or court order, and enable tracing of a message's first originator on large messaging services (a deadline sketch follows below).
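As a small illustration of that 36-hour window, the hypothetical Python helper below computes the removal deadline from the time a legal order is received. It sketches the arithmetic only; real compliance workflows also involve legal review, logging, and appeals.

```python
from datetime import datetime, timedelta, timezone

TAKEDOWN_WINDOW = timedelta(hours=36)  # 36-hour window after a valid legal order.

def takedown_deadline(order_received_at: datetime) -> datetime:
    """Hypothetical helper: when the flagged content must be removed by."""
    return order_received_at + TAKEDOWN_WINDOW

def is_overdue(order_received_at: datetime, now: datetime) -> bool:
    """True once the removal deadline has passed."""
    return now > takedown_deadline(order_received_at)

order_time = datetime(2025, 1, 1, 9, 0, tzinfo=timezone.utc)
print(takedown_deadline(order_time))                                            # 2025-01-02 21:00:00+00:00
print(is_overdue(order_time, datetime(2025, 1, 3, 0, 0, tzinfo=timezone.utc)))  # True
```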
Sentrium monitors how effective these systems are, and whether they truly reduce harm.