Roblox has announced it will block under-13s from messaging others on the online gaming platform as part of new efforts to protect children.
By default, child users will not be able to send direct messages in games unless a verified parent or guardian gives them permission.
Parents will also be able to view and manage their child's account, including viewing their online friends list and setting daily limits on their play time.
Roblox is the most popular gaming platform among children aged eight to 12 in the UK, according to Ofcom research, but it has been urged to make its experiences safer for children.
The company said it will start rolling out the changes from Monday and they will be fully implemented by the end of March 2025.
This means young children will still be able to take part in public conversations that everyone in a game can see – so they can still talk to their friends – but they will not be able to have private conversations without their parents' consent.
Matt Kaufman, Roblox's chief safety officer, said the platform is used by 88 million people every day, and that more than 10% of the company's employees – thousands of people – work on its safety features.
“As our platform has grown, we have always recognised that our approach to safety needs to grow with it,” he said.
As well as blocking under-13s from sending direct messages (DMs) on the platform, the changes will give parents more ways to view and manage their child's activity.
The platform says parents will be able to more easily manage controls such as what content their child can see and whether they can send direct messages.
To access parental permissions, parents and guardians must verify their identity and age with a government-issued ID or credit card, and manage their child's settings through their own linked account.
But Mr Kaufman acknowledged that verifying users' ages is a challenge facing many tech companies, and called on parents to make sure their child's age is listed correctly on their account.
“Our goal is to ensure the safety of all users, regardless of age,” he said.
“We encourage parents to work with their children to create accounts and hopefully ensure their children use their correct age when signing up.”
Richard Collard, associate head of child safety online policy at UK children's charity the NSPCC, called the changes “a positive step in the right direction”.
But he added that the changes must be backed by effective monitoring and age verification of users if they are to “result in safer experiences for children”.
“Roblox must make this a priority to effectively combat harm on its site and protect young children,” he added.
Maturity guidelines
Roblox also announced plans to simplify content descriptions on the platform.
It is replacing age recommendations for certain games and experiences with “content labels” that instead describe the nature of the experience.
The company says this will let parents make decisions based on their child's maturity, rather than their age.
These range from “minimal”, potentially including mild violence or occasional scares, to “restricted” – potentially containing more mature content such as strong violence, language or lots of realistic gore.
By default, Roblox users under the age of nine will only be able to access “minimal” or “mild” experiences – but parents can give consent for them to play “moderate” games.
Users cannot access “restricted” games until they are at least 17 years old and have verified their age using the platform's tools.
It follows a November announcement that, from Monday, Roblox would block under-13s from accessing “social hangouts”, where players can communicate with each other via text or voice messages.
It has also told developers that, from 3 December, they must specify whether their games are suitable for children – with games that lack this information blocked for under-13s.
The changes come as platforms used by children in the UK prepare to comply with new rules on illegal and harmful content under the Online Safety Act.
Ofcom, the UK's communications regulator, has warned that companies will face sanctions if they fail to keep children safe on their platforms.
Its codes of practice, which companies will have to follow, are due to be published in December.