Comment & Opinion

How blending AI and human moderation built a safer online community in MMO Star Stable

Utopia Analytics and Star Stable outline the importance of content moderation and the challenges around moderating younger communities, following the 2022 launch of the mobile version of the horse-themed MMORPG


Cecilia Munthe is director of customer experience at Star Stable, and Mari-Sanna Paukkeri is CEO and co-founder of Utopia Analytics

Every game publisher whose title is built around online play will have some level of moderation in place. But things get particularly tricky when you’re moderating live chat in 14 different languages for an MMO with over 10 million users – most of whom are under 18 years old.

For Star Stable, the main focus of moderation is keeping players safe, ensuring that interactions between players run smoothly and that everyone abides by the game’s user policy.

It’s by no means a small undertaking, and Star Stable has been refining its approach for several years. The studio uses Utopia Analytics’ AI moderation tools to automatically filter and block the majority of unsafe or abusive chat messages in near real-time, in conjunction with a team of human moderators who oversee all the content being moderated. Star Stable also uses moderators within the game to encourage players to play and engage positively – a sort of pre-moderation nudge approach.

Moderate approaches

For moderation to be successful, a flexible approach is required. Language is constantly evolving, and there are plenty of nuances around certain words depending on how they are used, especially in a gaming environment. On top of that, players might use slang, emojis, and deliberate misspellings to try to get around chat moderation. If players feel that moderation tools are unfair and getting in the way of communicating with other players, it can affect their enjoyment of the game. That frustration is what led Cecilia Munthe, director of customer experience at Star Stable, to pursue Utopia Analytics as a moderation provider.

“Our previous AI moderation system was too strict and caused too many barriers for players,” Munthe explained. “One of the things that moderation tools do is ban certain words, but as soon as you start banning or blacklisting so many different words, you risk taking away the flexibility of language and context.

“Ultimately, our players came to us complaining that the chat function in Star Stable wasn’t flexible enough; they couldn’t use language the way that they wanted and were getting banned for things that weren’t necessarily negative or offensive – and there was no easy way of lifting bans. A community with teens is a little bit like a schoolyard, and there needs to be room for open conversations that can foster growth and personal relationships between players.”

The challenges Munthe describes result from rules-based moderation, still the most widespread approach to text moderation, which automatically detects and blocks certain words according to a fixed set of rules without taking context into account.
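
To illustrate why that approach is so blunt, a rules-based filter is essentially a banned-word lookup. The short Python sketch below, with an invented word list, shows how such a check blocks messages with no sense of context:

    # Minimal sketch of rules-based moderation: a simple banned-word check.
    # The word list is invented for this example. Because there is no notion
    # of context, harmless messages get blocked alongside genuinely abusive ones.
    BANNED_WORDS = {"hate", "stupid", "kill"}

    def is_blocked(message: str) -> bool:
        words = (word.strip(".,!?").lower() for word in message.split())
        return any(word in BANNED_WORDS for word in words)

    # True, even though the message is harmless
    print(is_blocked("I hate it when my horse gets stuck on a fence"))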

AI can do a much better job of analysing text and content at scale. Still, for AI moderation to work at its best, it needs continuous training – as Mari-Sanna Paukkeri, CEO and co-founder of Utopia Analytics, explained.

“Although AI moderation tools significantly reduce the need for manual moderation, these tools still need input from humans so they can constantly evolve while taking new words and language into account. I think Star Stable’s experience with other AI providers before we began working with them perfectly illustrates that not all AI is created equal.

Although AI moderation tools significantly reduce the need for manual moderation, these tools still need input from humans
Mari-Sanna Paukkeri

“The key difference with our AI is that it was created to understand the semantic meaning of entire sentences and not just specific words, and is continuously learning as the system works through millions of messages a day. If the AI is unsure of anything, that content is flagged to the human moderation team for a final decision, which in turn helps improve the AI's accuracy.”
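
In rough terms, that kind of confidence-based routing can be sketched as below; the classifier, threshold and review queue are illustrative assumptions rather than Utopia Analytics’ actual pipeline.

    # Illustrative human-in-the-loop routing, not Utopia Analytics' real system.
    # A classifier scores the whole message; low-confidence results are sent to
    # human moderators, whose decisions can later be used to retrain the model.
    from dataclasses import dataclass

    @dataclass
    class Verdict:
        label: str         # "safe" or "unsafe"
        confidence: float  # 0.0 to 1.0

    CONFIDENCE_THRESHOLD = 0.9  # assumed value for this sketch

    def classify(message: str) -> Verdict:
        # Stand-in for a model that reads the whole sentence, not single words.
        suspicious = "add me on" in message.lower()
        return Verdict(label="unsafe" if suspicious else "safe",
                       confidence=0.6 if suspicious else 0.95)

    def moderate(message: str, human_queue: list) -> bool:
        verdict = classify(message)
        if verdict.confidence < CONFIDENCE_THRESHOLD:
            human_queue.append(message)  # human moderators make the final call
            return False                 # hold the message while it is reviewed
        return verdict.label == "safe"   # automatic decision when the model is sure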

While Munthe says the flexibility of the new AI tools led to a significant reduction in the number of complaints regarding live chat, the young age of Star Stable’s player base, combined with the fact that it’s an MMORPG, means there is always more work to be done.

Keeping the parents informed

One of the biggest challenges that Munthe and her team are working to address is the number of players trying to share personal information so they can communicate with each other outside of the game – something that breaks the rules of Star Stable’s user policy.

While many of these instances are well-intentioned, with Star Stable’s teenage players simply wanting to extend their friendships beyond the game, it’s an area that Munthe and her team treat with the utmost seriousness.

So what role do parents play in all of this? Education around online safety is key, and Munthe believes all parents are responsible for knowing how their children are using the internet and how they’re communicating online.

“There’s no way that we, or any other large game studio for that matter, can have a complete overview of every single conversation that’s happening in our games,” Munthe explained. “We understand that as kids move into their teenage years, having these sorts of conversations can prove more challenging, but think of it this way: it isn’t unusual for parents to ask their kids what they did at school when they come home, is it? I think we should approach online interactions in the same way.”

As well as using moderation tools to tackle this and keep players safe, Star Stable has its in-house team patrolling the game and spending time with players, and also uses human moderators known as ‘Game Masters’, who act as mentors and help mitigate any negative behaviour they notice in the game. Munthe says they’re also a great way of getting constant and reliable feedback on how players communicate with each other in the game.

How effective is this?

Measuring the effectiveness of moderation in terms of numbers can be difficult. Still, Star Stable has built an internal dashboard using Utopia’s data to highlight areas of success and specific areas to monitor. Around four per cent of all messages on Star Stable are classed as negative, a rate Munthe says is much better than its competitors’, especially as the data shows the vast majority of those negative messages result from spam or link sharing.
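
The headline figure itself is straightforward to derive. As a purely illustrative sketch, with an invented log format, the dashboard’s share of negative messages boils down to something like this:

    # Purely illustrative: compute the share of messages labelled negative
    # from a day's moderation log. The log format here is an invented example.
    log = [
        {"text": "nice horse!", "label": "safe"},
        {"text": "free star coins at example.com", "label": "negative"},  # spam / link sharing
        {"text": "anyone up for a race?", "label": "safe"},
    ]

    negative_rate = sum(1 for m in log if m["label"] == "negative") / len(log)
    print(f"{negative_rate:.1%} of messages classed as negative")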

Having this level of insight into how the Star Stable community communicates and responds to moderation allows Munthe and her team to “focus on the positives, as well as the negatives.” These reporting tools also allow Star Stable to monitor certain words and phrases, which has become particularly important given how moderation needs have changed since the pandemic.

Munthe continued: “We started to see a lot more players talking about and contacting us regarding mental health issues. Thankfully, we already had the processes in place to safeguard our users and signpost them to professional mental health services so they can get the support they need.

"While this spike in such queries outlines a wider trend around global mental health issues, the fact that players were contacting us directly in some cases highlights the level of trust between our staff and players, something that I don’t think would be possible without the effectiveness of our moderation in previously safeguarding players.”

Star Stable is live in 180 markets, and that presents its own set of challenges when it comes to keeping up with the various and ever-changing laws on user and data protection, privacy and moderation across the globe, something that Utopia Analytics is all too familiar with.

“As well as providing the tools that companies need to moderate their content effectively, it’s also our job to ensure clients such as Star Stable are up-to-date with the legislation changes that could impact how we moderate their content,” Paukkeri stated.

Meeting the tide of regulation

The European Union sets the legislative agenda for many of the markets in which Star Stable operates, but there is also country-specific legislation, such as Section 230 in the US and the Online Safety Bill in the UK (currently in draft), the latter of which would legally require companies to offer self-moderation, providing users with tools to file reports against other people, or players in the case of Star Stable. Data protection laws such as GDPR can also make it challenging to share important information regarding player protection with other platform holders should the need ever arise.

“We spend a lot of time talking to other companies, taking into account all of the different regulations across the globe, to encourage wider conversations about moderation in video games and tech and what more needs to be done in terms of legislation,” Munthe said.

Ahead of implementing these changes, Star Stable ran a two-month A/B test of positive behavioural nudging in select markets. The results were positive, with a five per cent decrease in negative behaviour in the areas where it was trialled, and the approach could be particularly useful for reducing less serious instances of negative behaviour, such as commenting on another character’s appearance, helping players take other people’s feelings into account.

The move to mobile will present its own set of unique challenges, and we don’t know how these will emerge yet
Cecilia Munthe

“In these instances, rather than telling players, ‘you can’t send this,’ we ask them, ‘are you sure you want to send this?’ which is particularly important as it can help educate our younger players on the importance and power of language,” Munthe said. “Obviously it takes a while to change the players' mindset, but we want to eventually roll this feature out across all regions and would like to integrate specific messaging trees depending on the type of message and its severity.”
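
As a loose sketch of that flow, with severity tiers and wording that are assumptions for illustration rather than Star Stable’s actual rules, the choice between blocking, nudging and sending could look like this:

    # Illustrative nudge flow only; the severity tiers are invented for the example.
    # Serious violations are blocked outright, milder ones trigger a confirmation
    # prompt, and everything else is sent as normal.
    def handle_message(message: str, severity: str) -> str:
        if severity == "serious":  # e.g. sharing personal information, abuse
            return "blocked"
        if severity == "mild":     # e.g. commenting on another character's appearance
            return "nudged: are you sure you want to send this?"
        return "sent"

    print(handle_message("your horse looks weird", "mild"))  # nudged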

Star Stable recently launched on mobile for iOS devices. As the game was previously only available on desktop, the move makes it far easier to access and has led to an influx of new players. Arriving on a new platform could also introduce new moderation challenges.

“We think the move to mobile will present its own set of unique challenges, and we don’t know how these will emerge yet, but we’re constantly monitoring it. That said, it’s been great to see the game arrive on a platform that allows us to find new players while giving existing players the benefit of experiencing Star Stable on the move.”
