Roblox child safety overview:
- Who: Roblox is implementing new safety measures affecting millions of users worldwide.
- Why: The changes aim to prevent children from chatting with adult strangers on the platform.
- Where: The measures launch first in Australia, New Zealand, and the Netherlands, with a global rollout to follow.
- How to Get Help: If you or your child has encountered a child predator on Roblox, you may be eligible to take legal action by joining a Roblox class action lawsuit.
Gaming platform Roblox is tightening safety protocols to protect young users from adult strangers.
The company announced the new age checks for communication in a Nov. 18 post on the Roblox website.
The company is introducing mandatory age verification for accounts using chat features, initially launching in Australia, New Zealand, and the Netherlands in December, with a global rollout planned for January.
The Roblox child safety update comes in response to criticism and legal challenges, including scrutiny of the platform for allowing minors to access inappropriate content and interact with adults.
In a BBC report, the platform’s chief executive, Dave Baszucki, said: “If you’re not comfortable, don’t let your kids be on Roblox.”
A BBC test earlier this year demonstrated that users of significantly different ages could communicate with one another on the platform, highlighting the need for stricter controls on adult interactions with minors.
Anna Lucas, online safety supervision director at the United Kingdom’s Office of Communications, or Ofcom, said: “Platforms must now take steps to keep kids safe.”
Roblox enhances child safety with facial age verification
The new measures include facial age verification technology, which estimates a user’s age using the device’s camera. Players can only chat with others in similar age ranges unless they add someone as a “trusted connection,” a feature reserved for people they know.
Roblox’s age verification process relies on facial age estimation technology; images are processed through an external provider and deleted immediately after verification, according to the website announcement.
This approach aims to provide age-appropriate experiences and prevent adult predators from targeting young users. The platform already restricts image and video sharing in chats and limits links to external sites.
The new system will prevent under-13s from sending private messages or participating in certain chats without parental consent.
The changes coincide with a virtual protest by campaign groups ParentsTogether Action and UltraViolet, which are advocating for stronger child-safety measures on Roblox.
Roblox says it is committed to enhancing user safety through these new measures and expects other gaming platforms to adopt similar methods to protect children.
Roblox is also facing an ongoing class action lawsuit alleging the company failed to implement adequate safety measures to protect children from predators and failed to warn parents about the platform’s risks.
What do you think of the new Roblox child safety measures? Let us know in the comments.