Roblox to Block Kids from Chatting With Unknown Adults as New Safety Rules Roll Out

Roblox, one of the world’s biggest gaming platforms, is introducing sweeping new restrictions designed to stop children from communicating with adult strangers online.

Starting in December, users in Australia, New Zealand, and the Netherlands will be required to complete age verification before they can access chat features. The rollout will expand worldwide beginning in January.

The platform—used by millions of children—has come under intense scrutiny for exposing young players to inappropriate content and interactions with adults. Roblox is also facing multiple child-safety lawsuits in the US.

These new measures come just as Australia prepares to ban social media accounts for children under 16, raising pressure on lawmakers to decide whether gaming platforms like Roblox fall under the same restrictions.

“If You’re Not Comfortable, Don’t Let Your Kids Use Roblox”

Roblox CEO Dave Baszucki has argued that the company invests heavily in safety, but he also acknowledges that parental judgment plays an essential role.

“If you’re not comfortable, don’t let your kids be on Roblox,”
Baszucki told the BBC earlier this year.

It’s a blunt message—but one that reflects growing concern about how children navigate online spaces.

A Major Shift: Facial Age Verification Becomes Mandatory

Roblox will soon become the first major gaming platform to require facial age estimation for users who want to access chat tools.

The system uses the device’s camera to scan a person’s face and estimate their age. According to Roblox’s chief safety officer Matt Kaufman, the technology is “very accurate,” generally estimating age within a one- to two-year range for users aged five to 25.

After verification, players are sorted into age brackets:

  • Under 9
  • Ages 9–12
  • Ages 13–15
  • Ages 16–17
  • Ages 18–20
  • 21+

Children will only be able to chat with people in similar age ranges unless they explicitly add someone as a trusted connection—a feature intended for real-life friends or family.
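The bracket-and-trusted-connection rule described above can be sketched as a small check. The bracket boundaries come from the article; the definition of a "similar" age range (same or adjacent bracket) and the trusted-connection override are assumptions for illustration, since Roblox has not published its exact matching rules.

```python
# Hypothetical sketch of age-bracket chat gating.
# Bracket boundaries follow the article's six ranges; the
# "adjacent-or-same bracket" rule is an assumption, not Roblox's
# published policy.

def bracket(age: int) -> int:
    """Map an estimated age to a bracket index (0 = under 9, 5 = 21+)."""
    for i, upper in enumerate([9, 13, 16, 18, 21]):
        if age < upper:
            return i
    return 5  # 21 and older

def can_chat(age_a: int, age_b: int, trusted: bool = False) -> bool:
    """Allow chat if the users are trusted connections, or if their
    brackets are the same or adjacent (an assumed reading of
    'similar age ranges')."""
    if trusted:
        return True
    return abs(bracket(age_a) - bracket(age_b)) <= 1
```

Under this sketch, a 10-year-old and a 30-year-old could not chat unless one had explicitly added the other as a trusted connection.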

Kids under 13 will still be blocked from private messages unless parents enable them. The platform will continue to restrict images, videos, and most external links in chats.

Roblox says all images used for facial checks are processed by a third-party provider and deleted immediately afterward.

A Long-Needed Change, Experts Say

Child-safety organizations have long warned that Roblox has allowed children to interact far too easily with adults.

Rani Govender from the NSPCC said young users had been exposed to “unacceptable risks” on the platform and that the company “must ensure they deliver real change.”

The UK’s Online Safety Act now requires tech companies to take stronger action to protect children. Ofcom—which enforces the law—said it welcomed Roblox’s new verification rules.

Huge Platform, Huge Responsibilities

With more than 80 million daily active users, about 40% of whom are under 13, Roblox is one of the most influential online spaces for young people.

But the platform has repeatedly faced criticism:

  • In a BBC experiment earlier this year, accounts registered as a 27-year-old and a 15-year-old, on separate devices, were able to message each other.
  • Roblox is being sued by Texas, Kentucky, and Louisiana over child-safety concerns.
  • Campaign groups accuse the platform of becoming “a playground for predators.”

This week, advocacy organizations ParentsTogether Action and UltraViolet held a virtual protest inside Roblox, delivering a digital petition with over 12,000 signatures demanding tougher protections.

Roblox’s Big Bet on Safety

Roblox argues that the new verification system will create more age-appropriate experiences and maintain better control over who can interact on the platform.

The company expects other gaming and social platforms to eventually follow suit as pressure mounts globally for stricter child-safety measures.