Roblox’s AI-Powered Age Verification Is a Complete Mess

Just days after launching, Roblox’s much-hyped AI-powered age verification system is a complete mess.

Roblox’s face scanning system, which estimates people’s ages before they can access the platform’s chat functions, rolled out in the US and other countries around the world last week, after initially launching in a few locations in December. Roblox says it is implementing the system to allow users to safely chat with others of similar ages.

But players are already in revolt because they can no longer chat with their friends, developers are demanding that Roblox roll back the update, and crucially, experts say that not only is the AI mis-aging young players as adults and vice versa, but the system also does little to address the problem it was designed to tackle: the flood of predators using the platform to groom young children.

In fact, WIRED has found multiple examples of people advertising age-verified accounts for minors as young as 9 years old on eBay for as little as $4.

After WIRED flagged the listings, eBay spokesperson Maddy Martinez said the company was removing them for violating the site’s policies.

Roblox did not respond to requests for comment. However, Roblox addressed some of the criticism in an update on Friday, writing: “We are aware of instances where parents age check on behalf of their children leading to kids being aged to 21+. We are working on solutions to address this and we’ll share more here soon.”

Roblox announced the age verification requirement last July as part of a raft of new features designed to make the platform safer. The company has come under intense pressure in recent months after multiple lawsuits alleged that it failed to protect its youngest users and made it easier for predators to groom children.

The attorneys general of Louisiana, Texas, and Kentucky also filed lawsuits against the company last year making similar claims, while Florida’s attorney general issued criminal subpoenas to assess whether Roblox is “aiding predators in accessing and harming children.”

Roblox claims that requiring people to verify their ages before allowing them to chat with others will prevent adults from freely interacting with children they don’t know.

While the process is optional, refusing to do it means a person will no longer have access to the platform’s chat functions, one of the key reasons most people use Roblox.

To verify their ages, people are asked to take a short video using their device’s camera, which is processed by a company called Persona that estimates their age. Alternatively, users can upload a government-issued photo ID if they are 13 or older.

Roblox says all personal information is “deleted immediately after processing.” However, many users online say they are unwilling to complete age verification over privacy concerns.

People who have verified their ages are only allowed to chat with a small group of other players around their own age. For example, those verified as under 9 can only chat with players up to the age of 13, while players deemed to be 16 can chat with players between 13 and 20.

The company initially rolled out the AI-powered age verification system in Australia, New Zealand, and the Netherlands in November, but this week Roblox revealed that just half of all players there had verified their ages using the system.

In the days since the update was rolled out globally, players and developers have flooded forums, Reddit, and social media platforms like X and TikTok with complaints about the system.

