Author: Dean Takahashi / Source: VentureBeat

Roblox has more than 80 million monthly players on its Lego-like virtual world platform for teens, and it only takes a few bad apples to ruin the fun. That’s why the company is moving ahead with a digital civility initiative, which is meant to improve online safety and to start reducing trolling and toxic behavior.
The company woke up to the problem last year when a child’s game character was sexually violated in the virtual world. Tami Bhaumik, vice president of marketing and digital civility for Roblox, briefed me on the San Mateo, California-based company’s progress in an interview at the recent DICE Summit in Las Vegas.
Bhaumik hired Laura Higgins, an online safety expert in the United Kingdom, as Roblox’s first director of digital civility in January. Higgins, who was the online safety operations manager at the South West Grid for Learning, has been advising Roblox for more than 18 months. The company has more than 600 human moderators to patrol the content and behavior on Roblox, and it is dedicating more resources to the task of changing online behavior.
But it’s hard to stay ahead of the problem, Bhaumik said. Kids are smart, and they come up with ingenious ways to get around the rules.

“The industry is always going to have a whack-a-mole problem,” she said. “Kids try to gamify the system. With the combination of technology, filters, artificial intelligence, and human moderation, we try our best to stay ahead of things. We are doing a pretty good job. We have an entire product engineering team dedicated to it. We have third-party filtering software on top of our own.”
She added, “We are in good shape from a safety standpoint. It’s the proactive digital civility that we are moving forward on now. It’s about how to get kids to make smart choices online.”
Roblox…