BANGALORE/SAN FRANCISCO – Profanities and other offensive content that basic word-filtering tools are designed to catch can be found in some game titles and user profiles on children’s gaming platform Roblox, searches of the website show, despite the company’s “no tolerance” policy and assurances it has safeguards to enforce it.
Powered by user-created games, Roblox filed late Thursday for a multibillion-dollar stock market debut, riding the lockdown entertainment boom with its appeal as a place for safe fun and interactions for the youngest gamers.
But parenting groups and investors alike said they were concerned about whether the company’s automated content-moderation systems could effectively delete potentially offensive language and images that pop up on the platform.
Simple Google keyword searches of its site – conducted twice by Reuters since the company announced its stock market plans in October – turned up more than 100 examples of abusive language or imagery. One profile, for example, included “shut up and rape me daddy” in the profile description line, while another had “MOLESTINGKIDSISFUNTOME.”
In response to written questions, company spokeswoman Teresa Brewer said in a statement that Roblox “has no tolerance for inappropriate content, which is why we have a stringent safety system, including proprietary text filtering technology, third-party machine learning solutions, and customized rules on what to block, which we update daily.”
Last month, Roblox removed the examples within hours of Reuters sharing them with the company. Roblox has said it has 1,600 people working full time to eliminate inappropriate content on the platform.
In the stock registration statement filed after this story was published, the company acknowledged as a risk factor that “from time to time inappropriate content is successfully uploaded onto our platform and can be viewed by others prior to being identified and removed by us.”
“This content could cause harm to our audience and to our reputation of providing a safe environment,” the company wrote in the filing.
“If we are unable to prevent, or are perceived as not being able to sufficiently prevent, all or substantially all inappropriate content from appearing on our platform…” the filing continued.
Tech and entertainment watchdog Common Sense Media has raised its suggested minimum age for Roblox players to 13 over the last few years, because abusive language in profiles and sexual content in games kept appearing even after the company said it would remove such material, according to Jeff Haynes, who oversees video gaming coverage for the nonprofit.
Five online safety experts who reviewed the examples found by Reuters said they were surprised such profiles and wording managed to slip through when rudimentary filtering systems can catch and remove such content.
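The experts did not describe a specific implementation, but the kind of rudimentary filter they refer to can be sketched in a few lines of Python. The blocklist and normalization rules below are illustrative assumptions for this sketch, not a description of Roblox’s proprietary system; the point is that even naive normalization catches the concatenated and spaced-out strings quoted in this story.

```python
import re

# Illustrative blocklist for this sketch only; a production system would use
# a far larger curated list plus machine-learning classifiers and custom
# rules, as Roblox describes its own stack.
BLOCKLIST = {"rape", "molesting"}

def normalize(text: str) -> str:
    """Lowercase and strip non-letter characters so that spacing or
    concatenation tricks (e.g. 'MOLESTINGKIDSISFUNTOME') cannot evade
    simple substring matching."""
    return re.sub(r"[^a-z]", "", text.lower())

def is_blocked(text: str) -> bool:
    """Return True if any blocklisted term appears in the normalized text."""
    normalized = normalize(text)
    return any(term in normalized for term in BLOCKLIST)

# Both profile strings quoted in this story would be flagged:
print(is_blocked("shut up and rape me daddy"))   # True
print(is_blocked("MOLESTINGKIDSISFUNTOME"))      # True
print(is_blocked("building a treehouse game"))   # False
```

Naive substring matching of this kind is also prone to false positives on innocent words that happen to contain a blocked term (the so-called Scunthorpe problem), which is one reason real moderation systems layer in the machine-learning models and custom rules that Roblox says it updates daily.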
Larry Magid, a Roblox advisor and CEO of the nonprofit ConnectSafely.org – which takes funding from Roblox and other companies to promote safety guidelines for parents – said the examples Reuters found showed the safeguards did not fully work.
“I think scale is part of it. What I don’t understand is why the software didn’t pick it up,” he told Reuters.
As its stock listing draws near, the company could come under closer public scrutiny from Wall Street, said John Streur, chief executive of Calvert Research and Management, which focuses on socially responsible investing.
“From an investor perspective, it will be a major problem if the headlines months from now reveal that the company is unable to manage the risk of its platform,” Streur told Reuters. Roblox declined to comment on that view.