
Roblox launches numerous new safety features

Examples of three new safety features on Roblox.

The gaming platform Roblox announced a suite of new safety features aimed at protecting young users and giving parents more control over their children’s accounts.

In a blog post published Monday, the company said that children under age 13 will no longer be able to directly message other Roblox users, outside of the games or experiences they play.

Roblox is also debuting a new built-in setting that restricts users younger than 13 from directly messaging other users within games and experiences, a default that parents can choose to change. Users under 13 will still be able to share public broadcast messages in games or experiences by default.

Previously, this age group could send direct messages both inside and outside of games, as well as broadcast publicly within a game. Now, users under 13 can only broadcast publicly by default. Online child safety advocates have argued that direct-messaging capabilities can make young users vulnerable to predation.

Parents and caregivers will also now have access to what Roblox describes as remote management. Instead of adjusting parental controls through their child’s account, adults who manage these settings can now link their own Roblox account to their child’s.

In order to link the two accounts, the adult has to verify themselves using government-issued identification or a credit card. Once verified, a parent will be able to manage their child’s settings via a parental control dashboard, from their own device and account. New options in that dashboard include the ability to review and approve requests to play certain categories of games based on their content maturity label, set screen-time limits, and monitor a child’s friend list.

The new parental controls dashboard lets parents review their child’s screen time and friends list. Credit: Roblox

“We believe that these built-in protections are a critical part of our goal of making Roblox safe for all users, by default,” Dina Lamdany, Roblox’s senior product manager for user settings and parental controls, said in a briefing for reporters.

Lamdany noted that the changes were part of a years-long effort to enhance safety on Roblox.

Roblox worked with the Family Online Safety Institute (FOSI) and the National Association for Media Literacy Education, among other partners, on creating the new features.

In a statement provided by Roblox to the media, Stephen Balkam, CEO of FOSI, described the enhanced parental controls as a “considerable leap forward.”

“By offering robust tools for non-intrusive monitoring and privacy, Roblox is providing families with the confidence they need to foster a secure and enriching online environment,” Balkam said.

Concerns about child predators on Roblox

Roblox, however, has also been under fire for disturbing revelations about predation and grooming that occurred or began on the platform.

Last November, a group of families filed a class-action lawsuit against Roblox for exposing underage users to inappropriate or explicit content and allowing them to engage in inappropriate encounters. The company disputed the allegations.

This summer, extensive reporting by Bloomberg Businessweek revealed how easy it can be for predators to meet and groom minors on Roblox. While the platform’s policies already prohibit sexual content, as well as romantic and flirtatious gestures, bad actors have developed strategies for coercing minors into sending them explicit child sexual abuse material. They also know how to evade moderation- and filter-based detection.

In one case, a 15-year-old girl met a popular Roblox game developer on the platform in January 2022. The girl communicated with and received gifts from him over a period of months. In May, she left her home and was transported across state lines to meet him. The 22-year-old man, later identified as Arnold Castillo, sexually assaulted the teen multiple times before police rescued her.

In the wake of that case, Roblox implemented new safety practices and created roles related to child safety investigations and child exploitation moderation, according to Bloomberg Businessweek.

Roblox chief safety officer Matt Kaufman told reporters in a briefing that the new safety features were not a response to any specific incident, and that the company regularly updates its policies and safety systems. He noted that Roblox had introduced more than 30 such improvements in 2024 so far.

Parents can set the content maturity level for their child. Credit: Roblox

Minors will see a restriction notice if the content maturity level exceeds their setting. Credit: Roblox

The new safety features also include content labels that describe the maturity level of each game, replacing the previous age-based labels. Users under 13 will still encounter age-gating for certain experiences, however, based on the kinds of user behavior sometimes demonstrated in those games.

Additionally, users under 13 won’t be able to access experiences designed primarily for socializing with users who are not on their friends list, or that allow free-form writing or drawing.

To access the platform’s most mature content, such as profanity, users have to declare themselves as 17 or older and verify themselves using a government ID.

If you are a child being sexually exploited online, know a child who is being exploited, or have witnessed the online exploitation of a child, you can report it to the CyberTipline, which is operated by the National Center for Missing & Exploited Children.

Mashable