GAMING
Australian Government Seeks Urgent Meeting with Roblox Over Child Grooming Concerns

Discord and Roblox are under scrutiny over child safety. Discord is adding age verification, while Roblox has drawn criticism from the Australian government over child grooming risks. Both platforms are struggling to implement effective safety measures.

Detailed Analysis
Recent developments in online safety have put platforms such as Discord and Roblox under scrutiny over child protection. Discord, a chat service with 200 million monthly users, is implementing a new age verification system that includes facial recognition to comply with global safety regulations. According to TechJuice, the measure aims to protect younger users by requiring face scans or ID verification before they can access adult content. The Hindu notes that the initiative is part of a broader effort to make the platform a 'teen-appropriate experience' by default, though it has sparked controversy over privacy. Meanwhile, Roblox, a gaming platform with 150 million daily users, faces criticism from the Australian government over reports of child grooming and exposure to harmful content. Business Recorder reports that Communications Minister Anika Wells has expressed 'grave concern' and requested an urgent meeting with Roblox, citing the platform's failure to adequately protect children despite introducing nine safety features last year. The eSafety Commissioner, Julie Inman Grant, has warned that Roblox could face fines of up to $49.5 million if found non-compliant with Australia's Online Safety Act.
COVERAGE ACROSS SOURCES
How different outlets covered this story.
4 outlets · 4 articles
TechJuice — updated 15h ago
Business Recorder — updated 16h ago
BBC US & Canada — updated 1 day ago
TechCrunch — updated 1 day ago