TAFT, Calif. (KERO) — On April 13 at 10 a.m., the Kern County Sheriff’s Office received a report of a missing 10-year-old girl who was last seen the night before.
“We are bringing this case because of what happened to this little girl, which is nothing short of horrifying,” said Alexandra Walsh, a partner with Anapol Weiss, who is representing the Taft family.
Detectives on the case found the child had allegedly been communicating with 27-year-old Matthew Macatuno Naval, first through the gaming platform Roblox and later on Discord, a free communication platform where users chat through voice, video, and text.
“She was identified, targeted, and groomed by a really dangerous sexual predator on the Roblox platform,” said Walsh.
It was through these apps that the 10-year-old allegedly shared her address with Naval.
“He drove several hours away to the home. Thankfully, the police were able to locate her using GPS. And she’s now home safely with her family,” said Walsh.
Now, the family is suing Roblox and Discord for negligence and strict liability.
Walsh says many families have gone through a similar experience.
“The family wants Roblox to be honest about the dangers that kids face when they’re allowed to play on these platforms,” said Walsh. “Because parents have a lot of tough decisions to make when it comes to their kids and their kids being online.”
23ABC Neighborhood Reporter Avery Elowitt reached out to Roblox for an interview. A company spokesperson responded with the following statement:
"We are deeply troubled by any incident that endangers our users. While we cannot comment on claims raised in litigation, we always strive to hold ourselves to the highest safety standards. This includes strong processes to detect and act on problematic behaviors, including advanced technology and 24/7 human moderation and attempts to direct users off platform, where safety standards and moderation may be less stringent than ours. At Roblox, safety is a top priority and we are continually innovating new safety features – over 100 this year alone – that protect our users and empower parents and caregivers with greater control and visibility. While no system is perfect, Roblox is designed with rigorous built in safety features, and our policies are purposely stricter than those found on other platforms, including limiting chat for younger users, not allowing sharing images through chat and filters designed to block the sharing of personal information. We also understand that this is an industry-wide issue, and we are actively dedicating resources to develop industry-wide standards and solutions focused on keeping children safe online. We partner with law enforcement and leading child safety and mental health organizations worldwide to combat the sexual exploitation of children.” - Roblox spokesperson
Roblox also provided the following additional background information:
- At Roblox, we are constantly innovating safety tools and launching new safeguards. In the past year, Roblox has introduced over 100 new features to protect its youngest users and empower parents and caregivers with greater control, including updated parental controls, stricter defaults for users under 13, and new content maturity labels.
- Roblox has taken an industry-leading stance on age-based communication and recently introduced new age estimation technology to help confirm a user’s age through a simple, quick selfie-video. This technology allows us to provide a safer and more tailored online experience for our users by accurately confirming their age and unlocking appropriate features. Read more in this blog post [corp.roblox.com].
- Roblox started as a platform for children, and while 64% of the user base is now 13 or over, the platform has rigorous safety features built in, and its policies are purposely stricter than those found on social networks and other user-generated content platforms. This includes rigorous text chat filters that block inappropriate language and attempts to direct users under 13 off the platform or solicit personal information. Furthermore, users under 13 cannot directly message others outside of games or experiences unless parental controls are adjusted, and direct image sharing between users is prohibited.
- We dedicate substantial resources to help detect and prevent inappropriate content and behavior. Our Community Standards [en.help.roblox.com] set clear expectations for how to behave on Roblox and define restricted experiences [en.help.roblox.com]. We use advanced AI models together with a large, expertly trained team of thousands of members dedicated to protecting our users and monitoring 24/7 for inappropriate content.
- For inappropriate content: Our Trust & Safety team takes swift action (typically within minutes) to address violative material; accounts are removed through AI scans, user flags, and proactive monitoring by a dedicated enforcement team. [For instance, Diddy experiences violate our Real World Sensitive Events policy and we have a dedicated team working on scrubbing that content from the platform.]
- In August we released our latest open-source model, Roblox Sentinel [corp.roblox.com], an advanced AI-powered system designed to help detect child endangerment interactions early for faster intervention. Human experts continue to be essential for investigating and intervening in the cases Sentinel detects. Read more in this blog post [corp.roblox.com].
- We know safety is critically important to families, and we strive to empower our community of parents and caregivers to help ensure a safe online experience for their children. This includes a suite of easy-to-use parental controls that give parents more control and clarity over what their kids and teens are doing on Roblox. Parents can:
- Block or limit specific experiences based on content maturity ratings
- Block or report people on their child’s friends list
- See which experiences their child is spending the most time in
- Set daily screen time and spending limits
- Families and caregivers can find resources detailing our safety measures here [corp.roblox.com]. (See blog posts from November 2024 [corp.roblox.com] and April 2025 [corp.roblox.com], and new safety tools for teens [ir.roblox.com].)
- Roblox collaborates with law enforcement, government agencies, mental health organizations, and parental advocacy groups to create resources for parents and to keep users safe on the platform. Through vigorous global outreach, we’ve developed deep and lasting relationships with law enforcement at the international, federal, state, and local levels. For example, we maintain direct communication channels with organizations such as the FBI and the National Center for Missing and Exploited Children (NCMEC) for immediate escalation of serious threats that we identify.
- We proactively report potentially harmful content to NCMEC, which is the designated reporting entity for the public and electronic service providers regarding instances of suspected child sexual exploitation. In 2024, we submitted 24,522 reports to NCMEC (0.12% of the 20.3 million total reports submitted to NCMEC).
Discord also said they don’t comment on legal matters, but sent us this statement:
"Discord is deeply committed to safety and we require [discord.com] all users to be at least 13 to use our platform. We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies. We maintain strong systems to prevent the spread of sexual exploitation and grooming on our platform and also work with other technology companies and safety organizations to improve online safety across the internet."
Walsh said, “We have a big fight ahead of us. Roblox is one of the fastest-growing tech companies in the world. It has seemingly endless resources, many, many, many lawyers. I am just honored to represent this family and so amazed by their grit and their determination to make a difference.”
Next, Roblox and Discord will move to compel arbitration in the case.