Roblox chief safety officer on new safety features, past cases of child abuse on the platform

One month after short sellers tarred Roblox as a “pedophile hellscape,” the video game platform is taking new steps to safeguard the tens of millions of kids who log on every day.
“We're trying to minimize the exposure of some of our youngest users to content that might be more appropriate to teenagers,” says Roblox chief safety officer Matt Kaufman. “For our users under 13, we’re limiting some of the communication channels that they have.”
Monday’s update also gives parents more tools to manage a child’s Roblox activity and comes amid longstanding criticism that predators can contact kids too easily on the platform. Earlier this year, Bloomberg reported on more than two dozen cases where criminals were charged with abducting or abusing children they met through Roblox.
“Speaking as a parent and on behalf of Roblox, any time anything happens to a child that puts them at risk is one too many,” says Kaufman. “We will do everything we possibly can to stop that, but I think sometimes we're losing sight of the tens of millions of people where Roblox is an incredibly enriching part of their life.”
5 questions for Matt Kaufman
Let me put a real case to you from August of this year. A 44-year-old man from Maryland was charged with sexual abuse of a minor and possession of child pornography. He is charged with soliciting sexual photos from a 9-year-old who told police she met him on Roblox. Do your mitigation strategies prevent this kind of meeting?
“Let me start off by saying that any instance of a child being hurt or put in danger that has anything to do with Roblox is just absolutely unacceptable for us. We make safety our number one priority and we invest heavily in systems to prevent situations like the one that you're referring to.
“In that particular case, we have done a fair amount of investigation into what happened there. And when we do those types of investigations, we look into the accounts of the perpetrator and the victim, and we try to understand what happened — what happened on our platform, on Roblox, versus what happened on other platforms out there. Oftentimes we find that users on Roblox have access to lots of different ways of communicating, and our goal is to make sure that what is happening on Roblox is as safe as it can possibly be.”
There are something like 50,000 chat messages every second on Roblox. How in the world can you police all those?
“So to put it in context, each day, almost 90 million users log in and use Roblox. When you operate at that scale, you have to build fairly sophisticated systems in order to monitor what's happening on the platform. That requires advanced AI in order to determine if we should let messages go through.
“Now I want to point out some key differences between Roblox and other platforms. The first one is that none of that communication is encrypted. We see all of it. The second thing is that we filter out a lot of that communication when we find that it doesn't meet our standards. So when you're typing, you might say something, but what actually goes over the wire to somebody else will be edited. It almost looks like when you read a spy novel and you see documents that are redacted with black lines through them. That's what our systems do, because we're blocking the content before somebody else sees it.”
I understand some of these protection strategies involve advanced AI doing what you describe, in addition to human moderators. Roblox, I understand, employs about 3,000 moderators, whereas TikTok has 40,000. Do you have enough?
“So when we talk about the scale that we run at, you have to run these first lines of defense using automated systems and advanced AI, given the volumes and the scale. Where the people really come in is that you need experts to train that AI. They need to be experts in all sorts of categories of violative behavior, and they need to be actively involved with the engineering teams building the AI.
“Those experts do more than just training. In some cases, the AI may not be able to make a decision. And when it's not able to make a decision, it goes to our human experts. The other thing those experts do is they look at appeals. So sometimes we make a decision to block content or to remove a user from the platform. And those users say, ‘Hey, you know, I don't think I did anything wrong,’ and they will appeal those decisions. And our humans look at all of those, too, because it's by looking at those appeals that we learn and we make our systems better. You really can't judge the quality of these moderation systems by the number of people.”
I understand that the default setting is not necessarily the safest setting. What do you say to parents asking, ‘Why can’t you just help me out? Can you just have the default setting be the safest for my child? And then, if my child is old enough, is smart enough and I agree to it, I can lower the wall, but start with the wall as high as possible.’
“We do have default settings, which are based on the age of the user. So for our users under 13 for example, we're limiting some of the communication channels that they have. Very specifically, users between 9 and 13 will be able to communicate in games in public where the communication is filtered, it's monitored and it's not encrypted, and lots of people see what's going on. We do allow that.
“Over 13, there are additional communication features that those users have access to. We are also updating how we talk about the content on the platform. So we are moving to a content system which provides more insights into the type of content that people are experiencing. And for our youngest users, we are adding more restrictions to limit what they can do and what they can play on the platform, and those are more restrictive than they were before the release that we're talking about today.”
Bloomberg interviewed eight current or former Roblox workers who say the company favors growth over safety. How can you guarantee that the safety division is not going to be sacrificed for profits, for shareholders?
“I really reject the notion that we would prioritize growth over safety. Safety is the right thing for children. It's the right thing for parents and caregivers. It's the right thing for investors, and it's the right thing as we run the company.
“We have no idea who Bloomberg talked to. They haven't told us. I guess we take their word for it. I think sometimes as an employee in a large company, you're only seeing a small piece of the puzzle. For example, when we think about all of our investigative staff, we did make a decision to bring some staff to be onsite with us. We're in San Mateo, California, and some people look at that as like, ‘Wow. Like, why are you asking people to move?’ But we make choices like that because in that particular case, we wanted our investigators to be right next to our engineers and our data scientists and our policymakers, because that's what makes a safer system.
“Now, one person may not think it's the right decision, but when we run the company and we think about these critical decisions we make about introducing AI, about where our workers come to work every day, about how we staff our teams of experts, we look at those in order to make Roblox as safe as it can be, and it's the right thing to do as a company. So again, I reject these assertions that we make decisions to favor growth over safety.”
This segment aired on November 18, 2024.

