Roblox is taking a bold step to enhance child safety by introducing facial age checks for all users, a measure designed to block children from chatting with adult strangers. The move comes in response to growing concern about the platform's safety record and the risks it poses to young users. With over 80 million daily players, Roblox has faced criticism for exposing users, especially those under 13, to inappropriate content and to contact with adults. The company's CEO, Dave Baszucki, previously said that worried parents should not let their children use Roblox; the new policy aims to address those concerns directly.

The facial age check technology, currently voluntary, will become mandatory in Australia, New Zealand, and the Netherlands in December, with a global rollout in January. The system sorts users into age groups so that players can only chat with others in similar age ranges. Under-13s will remain restricted from private messages and certain chats unless a parent grants permission.

The initiative follows incidents such as a BBC test in which accounts for a 27-year-old and a 15-year-old, set up on unlinked devices, were able to exchange messages. Roblox argues that the new approach will create more age-appropriate experiences and expects other platforms to follow suit. Meanwhile, a digital petition signed by over 12,000 people demands stronger child-safety measures, underscoring the pressure on Roblox to prioritize user protection.