Roblox Launches Mandatory AI Age Verification Amid Lawsuit

Roblox has introduced a compulsory AI‑driven facial age‑verification check for users who want access to its chat features, aiming to comply with evolving child‑protection laws. Within days, a federal lawsuit alleged that the new safeguards remain inadequate to prevent sexual exploitation, sparking debate over privacy, biometric data handling, and the platform's future safety strategy.

How the AI Age‑Verification System Works

Every user who wants to access chat features must complete a short “video selfie check.” The AI algorithm estimates the user’s age and places them into one of six predefined age brackets. Users aged 13 and older may also verify their identity with a government‑issued ID; younger users rely solely on the facial scan. Roblox states that all biometric images and video are deleted immediately after the age estimate is generated, and no raw identity data is stored on its servers.

Age Bracket Structure

  • Under 9
  • 9‑12
  • 13‑15
  • 16‑17
  • 18‑20
  • 21 and over
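To make the bracketing concrete, here is a minimal sketch of how an AI‑estimated age might map onto the six published brackets. The boundaries follow the list above, but the type and function names are illustrative assumptions, not Roblox's actual API.

```typescript
// Hypothetical mapping from an estimated age to a published bracket.
type AgeBracket = "Under 9" | "9-12" | "13-15" | "16-17" | "18-20" | "21+";

function bracketForEstimatedAge(age: number): AgeBracket {
  if (age < 9) return "Under 9";
  if (age <= 12) return "9-12";
  if (age <= 15) return "13-15";
  if (age <= 17) return "16-17";
  if (age <= 20) return "18-20";
  return "21+";
}

console.log(bracketForEstimatedAge(14)); // "13-15"
```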

The brackets limit cross‑generational communication. For example, a user in the 9‑12 bracket can only chat with peers in the same bracket or the adjacent Under 9 and 13‑15 groups. Adults in the 21 and over bracket are blocked from communicating with minors unless a “Trusted Connections” protocol is activated, which requires mutual consent and, in some cases, parental verification via QR code or contact‑list confirmation.
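This chat rule lends itself to a simple adjacency check. The sketch below assumes the adjacency rule generalizes to every bracket and models the 21‑and‑over restriction with a Trusted Connections override; all names and logic are assumptions, since Roblox has not published its implementation.

```typescript
// Brackets in ascending order; adjacency is distance in this array.
const BRACKETS = ["Under 9", "9-12", "13-15", "16-17", "18-20", "21+"] as const;
type Bracket = (typeof BRACKETS)[number];

const MINOR_MAX = BRACKETS.indexOf("16-17"); // last bracket of minors (under 18)

function canChat(a: Bracket, b: Bracket, trustedConnection = false): boolean {
  const i = BRACKETS.indexOf(a);
  const j = BRACKETS.indexOf(b);
  // 21+ adults are blocked from minors unless Trusted Connections applies.
  const adultWithMinor =
    (a === "21+" && j <= MINOR_MAX) || (b === "21+" && i <= MINOR_MAX);
  if (adultWithMinor) return trustedConnection;
  // Otherwise, only the same or a directly adjacent bracket may communicate.
  return Math.abs(i - j) <= 1;
}

console.log(canChat("9-12", "13-15"));      // true (adjacent brackets)
console.log(canChat("21+", "13-15"));       // false (adult with minor)
console.log(canChat("21+", "13-15", true)); // true via Trusted Connections
```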

Legal Challenge and Federal Complaint

A federal complaint filed on behalf of a minor alleges that Roblox “failed to implement reasonable safeguards, monitoring, and age‑verification measures to prevent foreseeable sexual exploitation.” The filing argues that despite the AI‑driven checks, the platform’s existing controls are insufficient to protect children from grooming and other predatory behavior. The lawsuit seeks stricter enforcement of child‑protection statutes and greater transparency around biometric data handling.

Potential Impact on Users and Developers

The age‑bracket system could reduce cross‑generational interaction, affecting mentorship and community building that many developers rely on. Additionally, Roblox’s “behavior mismatch” feature may trigger re‑verification prompts when a user’s in‑game actions appear inconsistent with the verified age, potentially causing friction for legitimate players.
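Roblox has not documented how the behavior‑mismatch check works, but one plausible shape is a deviation score over behavioral signals that, past a threshold, prompts another age check. Everything in the sketch below, from the signals to the threshold, is hypothetical.

```typescript
// Illustrative only: one way a "behavior mismatch" check might flag an
// account for re-verification. Signals, scoring, and threshold are assumptions.
interface BehaviorSignals {
  vocabularyMatch: number;  // 0-1 similarity to the bracket's language norms
  activeHoursMatch: number; // 0-1 overlap with bracket-typical play times
  spendingMatch: number;    // 0-1 similarity to bracket purchase patterns
}

function needsReverification(s: BehaviorSignals, threshold = 0.35): boolean {
  const scores = [s.vocabularyMatch, s.activeHoursMatch, s.spendingMatch];
  // Average how far each signal deviates from the bracket's expected profile.
  const deviation = scores.reduce((sum, v) => sum + (1 - v), 0) / scores.length;
  return deviation > threshold; // a large combined mismatch prompts a new age check
}

console.log(
  needsReverification({ vocabularyMatch: 0.9, activeHoursMatch: 0.2, spendingMatch: 0.4 })
); // true
```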

From a regulatory perspective, the case highlights the growing pressure on tech platforms to navigate a complex landscape of state and federal age‑verification requirements, especially concerning biometric data, parental consent, and real‑time monitoring.

Next Steps for Roblox and Parents

Roblox has not publicly responded to the federal filing. The company continues to update its developer forum with roadmap details, emphasizing ongoing refinements based on behavioral signals. Parents and guardians are encouraged to review Roblox’s privacy policies, enable available parental controls, and actively monitor their children’s activity while the legal process unfolds.