SAN FRANCISCO—As long as popular video games depend on online services like matchmaking and chat, those games will suffer from toxicity, harassment, and bullying. Or at least that’s the assumption that some panelists at this year’s Game Developers Conference (GDC) are eager to either soften or nullify altogether.
Ahead of the conference’s show floor opening on Wednesday morning, we listened to a few participants offer their hopes for more positive social gaming environments—and three perspectives stood out as a combined pitch for a brighter future. The proof isn’t yet in these pitches’ pudding, but each points to different, seemingly smarter steps toward a better online-gaming ecosystem.
Turning the temperature down on “heat maps”
The first pitch, from game-moderation startup Good Game Well Played (GGWP), suggests aiming an AI-powered laser at the problem. Co-founded by pro gamer and entrepreneur Dennis “Thresh” Fong, GGWP is designed to slot into existing games’ moderation systems to make report-based moderation stronger by coupling it with two types of real-time data: voice chat and gameplay “heat maps.”
The project began near the outset of the 2020 pandemic, Fong tells Ars Technica, after he chatted with game makers about the variety of toxic behaviors in online games. Fong was startled by one unnamed game’s revelation: Less than 1 percent of its user-generated reports were followed up with moderation. (Industrywide stats on that topic remain hard to come by, in part because player bases are fragmented across a number of online portals, ranging from Xbox Live and PlayStation Network to publisher- and game-specific matchmaking queues, and no industrywide audits exist to confirm toxic online trends across them all.)
The issue, Fong says, boils down to available resources. In this unnamed game, insults, bad sportsmanship, and even racist slurs go to the back of the moderation queue; reports of violent threats, self-harm, danger to children, and other extreme cases are what get attention.
This iceberg approach leaves enough annoying, gameplay-centric toxicity intact to frustrate players—or drive them out of certain games entirely. Thus, Fong and his eventual GGWP partners began plotting a system to triangulate any in-game reports with data from the gameplay sessions in question. Fong says that, with one API call, GGWP can funnel voice chat through its systems and use voice recognition to parse whatever language was used by a reported player. Other API calls track each gameplay session’s relevant data, then check whether a reported player has shown negative patterns in other sessions of the same game.
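GGWP hasn’t published its actual API, so as a rough illustration only, the triangulation Fong describes—pairing a player report with a voice-chat transcript and cross-session gameplay data—might look something like this minimal sketch. Every endpoint, field name, weight, and threshold here is invented for the example:

```python
# Hypothetical sketch of report triangulation; none of these field names,
# weights, or thresholds are GGWP's real API. The idea: a report alone is
# weak evidence, but a report plus transcript hits plus a recurring
# behavior pattern is strong enough to escalate.

BLOCKLIST = {"slur_a", "slur_b"}  # stand-in for a real abuse lexicon


def transcript_hits(transcript: list[str]) -> int:
    """Count blocklisted terms in a speech-to-text transcript."""
    return sum(1 for word in transcript if word.lower() in BLOCKLIST)


def recurring_pattern(sessions: list[dict], behavior: str, threshold: int) -> bool:
    """True if the reported behavior recurs across the player's other sessions."""
    return sum(s.get(behavior, 0) for s in sessions) >= threshold


def triage_report(report: dict) -> str:
    """Combine the report, transcript, and gameplay history into a verdict."""
    score = 2 * transcript_hits(report["transcript"])
    if recurring_pattern(report["history"], report["behavior"], threshold=3):
        score += 2
    if score >= 4:
        return "escalate"
    return "queue" if score > 0 else "dismiss"


report = {
    "behavior": "friendly_fire",
    "transcript": ["gg", "slur_a", "slur_a"],
    "history": [{"friendly_fire": 2}, {"friendly_fire": 1}],
}
print(triage_report(report))  # escalate: two transcript hits plus a pattern
```

The point of the combination is that report-only moderation (the sub-1-percent follow-up problem above) can cheaply dismiss unsupported reports while automatically surfacing the corroborated ones.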
The behaviors GGWP tracks include friendly fire, rage-quitting, body blocking (intentionally standing in teammates’ way while they try to attack enemies), and feeding (playing poorly on purpose to let opponents win). Fong also suggests that players who don’t exhibit any of GGWP’s tracked negative traits could benefit from a positive reputation score—though exactly how a game would recognize that remains unclear.
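One way to read Fong’s reputation idea is as a running score that penalizes flagged sessions and rewards clean ones. The weights and behavior names below are invented for illustration; GGWP hasn’t disclosed how (or whether) it scores these traits:

```python
# Hypothetical reputation model: penalties and the clean-session bonus are
# invented to show how tracked traits could roll up into one score,
# including the positive score Fong floated for players with no flags.

PENALTIES = {
    "friendly_fire": 3,
    "rage_quit": 2,
    "body_blocking": 2,
    "feeding": 4,
}
CLEAN_SESSION_BONUS = 1  # reward sessions with no flagged behavior


def reputation(sessions: list[dict]) -> int:
    """Sum penalties for flagged behaviors; credit sessions with none."""
    score = 0
    for session in sessions:
        flags = {k: v for k, v in session.items() if k in PENALTIES and v}
        if not flags:
            score += CLEAN_SESSION_BONUS
        for behavior, count in flags.items():
            score -= PENALTIES[behavior] * count
    return score


# One rage-quit followed by two clean sessions: -2 + 1 + 1 = 0
print(reputation([{"rage_quit": 1}, {}, {}]))  # 0
```

A scheme like this also hints at the recognition problem the article raises: the positive score only exists relative to which behaviors the system bothers to track.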
Fong suggests that GGWP’s voice chat tracking systems will recognize session-specific context, particularly when the audio in question is between friends. He believes this will work by using a constantly evolving AI model trained on in-game chat. However, when pressed about the system’s ability to parse language that might target traditionally marginalized groups, Fong suggested that the phrase “get back in the kitchen” could mean different things to different people in an online game. (As of press time, GGWP’s public website doesn’t include any women listed on its board.)