Roblox is implementing new safety measures to protect young users following concerns about inappropriate content and potential exploitation. The gaming platform will now offer parents more comprehensive oversight of their children’s online activities.
Starting Monday, parents can access a dashboard showing their child’s interactions, daily screen time, and age-verification status. The platform will restrict users under nine to “mild” content and require parental approval for more mature games. Preteens will also be blocked from chat functions outside of games.
These changes come in response to serious allegations from Hindenburg Research, which claimed the platform contained inappropriate content, including child sexual abuse material and violent games. UK technology officials have pressed Roblox to improve its safety controls.
Key safety enhancements include:
- Parental dashboard with interaction and time tracking
- Content restrictions based on age
- Limitations on chat and messaging functions
- Automated content moderation
From December 3, additional restrictions will prevent users under 13 from accessing new games that have not yet received maturity ratings. The company aims to create a safer environment for its 90 million daily users, primarily children and young teens.