Our Mission
Sentrium Research investigates the systems, technologies, and language that shape how online platforms moderate content, especially in environments where young people are most at risk.
We focus on language-based moderation, platform accountability, and community-driven insights to illuminate both the power and the pitfalls of digital governance.
We aim to:
- Expose blind spots in automated and human moderation systems.
- Analyze language evasion and false positives in moderation.
- Create tools and datasets that empower platforms, researchers, and users.
- Push for more transparent, consistent, and just moderation practices.
What Makes Sentrium Different?
We are:
- Focused: specializing in moderation for youth-oriented platforms like Roblox, Discord, and TikTok.
- Open and collaborative: we share our tools, findings, and datasets freely.
- Solution-oriented: not just diagnosing issues, but helping platforms fix them.
While other institutions focus on high-level policy or ethics, we work in the trenches of moderation: language models, filter testing, evasion methods, and reporting systems.
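As one illustration of what filter testing against evasion methods can look like, the sketch below checks whether common character substitutions ("leetspeak") slip past a naive word blocklist. Everything here is a hypothetical minimal example, not one of Sentrium's actual tools; the term list and mapping are placeholders.

```python
# Hypothetical minimal filter-evasion check: normalize common character
# substitutions before matching against a blocklist.
LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a",
    "5": "s", "7": "t", "@": "a", "$": "s",
})
BLOCKLIST = {"scam", "grooming"}  # placeholder terms for illustration

def normalize(text: str) -> str:
    """Lowercase, map leetspeak characters, and drop punctuation padding."""
    text = text.lower().translate(LEET_MAP)
    return "".join(ch for ch in text if ch.isalnum() or ch.isspace())

def is_flagged(text: str) -> bool:
    """Return True if any blocklisted term survives normalization."""
    return any(tok in BLOCKLIST for tok in normalize(text).split())
```

A filter that matches only raw tokens would miss `"sc4m"`, while this version catches it after normalization; it still misses other evasion classes (spacing out letters, homoglyphs, coded slang), which is exactly the kind of blind spot filter testing is meant to surface.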
Partner With Us
Sentrium actively partners with:
- Online platforms looking to improve their content moderation systems.
- Nonprofits working on child safety, free expression, or tech accountability.
- Academic institutions and research labs conducting work on language, AI, or digital harm.
Our Partnership Model Includes:
- Remit Reports: We maintain a confidential channel for users to submit content that was not removed despite violating platform rules. If a case meets our internal criteria (e.g., a clear policy violation), we escalate it directly to platform staff for review and action.
- Research Collaboration: We co-author papers, share benchmarks, and advise on real-world moderation models.
- Tool Deployment: We can adapt our detection tools, evasion analyzers, or filter testers to help platforms catch more nuanced harms.
To partner, email us at [email] or reach out via our [Contact Form]. We’re open to NDAs or structured collaborations to ensure trust and discretion.
Let's work together to make digital platforms safer and smarter.