ROBLOX: A PLAYGROUND FOR PREDATORS – Parents Must Pull Their Kids Out Now

Recent signers:
Kemp Kehn and 19 others have signed recently.

The Issue

Children deserve wonder, not predators.

We, parents, survivors, educators, and child-safety advocates, call on the Apple App Store, Google Play, and Roblox Corporation to suspend Roblox’s availability to minors until a credible, independent child-safety overhaul is completed and published.

Recent events highlight a deeper safety crisis. A creator who exposed predatory behavior on Roblox says he was banned and served with a cease-and-desist letter. Separately, a longtime Roblox developer publicly stated that company changes “undermine player safety,” alleging that Roblox removed the age requirement for voice chat and dismantled the automated moderation that previously auto-banned accounts tied to sexualized “condo” games, and that he was later flagged for “defamation” after speaking out.

Additionally, Roblox’s CEO has publicly discussed exploring adult-only, ID-verified “virtual dating” (21+). On a platform where a large share of users are under 13, any move toward adult dating raises grooming-risk and discovery concerns unless adult features are completely segregated, default-off for minors, and independently audited. We oppose launching any adult-dating features until child-safety standards are verified by third-party experts.

These public claims—combined with the lived experience of families—demand independent verification, hard fixes, and real transparency. If a platform can’t prove it protects children, it shouldn’t be in kids’ pockets.

This is not about panic or vigilantism. It’s about minimum, non-negotiable standards for companies profiting from children.

Our demands

Suspend access for minors until all of the following are verifiably implemented and published.

  • Independent child-safety audit (public)
      ◦ Commission qualified third-party experts (not PR firms) to assess grooming risk, age/identity controls, chat/voice systems, moderation tooling, and ban evasion.
      ◦ Publish the full report (findings, methodology, deadlines) and commit to external re-audits at set intervals.
  • Default-safe for minors
      ◦ Robust age and identity verification for all new and existing accounts.
      ◦ DMs and voice chat OFF by default for users under 18; re-enable only with verified parental consent.
      ◦ Block links and invites that steer minors to off-platform chats or encrypted apps.
      ◦ Adult-only features (e.g., “virtual dating”) must be fully siloed: strict 21+ ID verification, separate discovery and search, no cross-DM, voice, or recommendations involving minors, and an independent pre-launch safety review.
      ◦ App stores must withhold approval of any adult-only feature until an independent audit certifies clean separation and ongoing quarterly checks.
  • Proactive moderation, restored and strengthened
      ◦ Reinstate and upgrade automated and human moderation for high-risk content and “condo” activity.
      ◦ Dedicated live escalation for suspected grooming, with strict time-to-action SLAs.
      ◦ Device and payment-instrument bans for repeat offenders, plus meaningful ban-evasion prevention.
  • One-tap reporting and law-enforcement pathways
      ◦ Simple in-app reporting that auto-triages suspected exploitation to specialized safety teams.
      ◦ Documented, timely referrals to NCMEC and law enforcement, plus a survivor/family notification loop when action is taken.
  • Radical transparency
      ◦ Quarterly safety reports with real numbers: grooming reports, enforcement actions, time-to-action, ban-evasion metrics, and outcomes of safety experiments (e.g., voice and age-gating).
      ◦ Public changelogs for safety features and an external advisory panel that includes survivor advocates.
  • Survivor support & restitution
      ◦ Fund counseling and provide a clear pathway for impacted families to get help, including rapid contacts, resources, and case follow-up timelines.

Why this matters

Predators are experts at grooming; children are not equipped to detect it; parents can’t monitor every second. Platforms must build guardrails that make exploitation hard—not easy. If a company can ship a new skin overnight, it can ship child-safety upgrades just as fast. Adult-only features—especially dating—must not be discoverable by, surfaced to, or intersect with minors in any way.

What success looks like

  • Temporary suspension or 18+ age-gating in app stores until the audit is complete and fixes are verified.
  • A public, time-bound remediation plan with independent oversight that includes survivor advocates and child-protection experts.
  • Default-safe settings for minors, plus evidence they work in the real world.

We’re not asking individuals to conduct stings or engage suspects. If you encounter illegal behavior, report it to the NCMEC CyberTipline and to local law enforcement. Accountability belongs with platform leadership and app stores.

Sign and share if you believe children’s safety comes before corporate profits. When families speak with one voice, companies—and app stores—listen.

Fearlessly,
— TheBoldAdvocate, FL MomArmy Battalion Leader, backed by MOM ARMY & DAD ARMY

