Roblox has announced changes to its security systems following reports that criticized the platform’s failure to adequately protect children.
Last July, an investigative report from Bloomberg highlighted security concerns on Roblox, including that law enforcement had arrested “at least” 24 perpetrators who used the platform. The report drew further scrutiny, culminating in Turkey blocking the platform in August “to protect its children.” In response, Roblox has moved to address these issues, particularly in how it manages preteen users. The changes will be rolled out gradually, and the platform has also introduced a set of guidelines for users and parents to follow to help keep players safe.

First, Roblox will require parental permission for users under 13 to access certain chat features, and users under 9 will need parental consent to play experiences labeled as containing “moderate” content. The new content labeling system will help determine which content is appropriate for different age groups, with a “moderate” label indicating moderate crude humor or violence.
Another key change is the introduction of parent accounts, allowing parents and children to link their Roblox accounts for better control and insights. Parents will be able to monitor their children’s daily screen time, view their Roblox friends, and easily update parental control settings from their own devices.

The upcoming changes were further detailed in an email to parents obtained by Bloomberg, in which Roblox outlined additional updates coming to the platform next month. These include the new content maturity setting, designed to help parents better manage the content their children can access.
While it is still early, Roblox appears to be taking these safety concerns seriously, and parents will soon have more tools to protect their children on the platform. Consumers can visit the official Roblox website for more information on the upcoming changes.