Building Community Safety:
How Aiba and MovieStarPlanet Created
the Ultimate AI Moderation Platform
MovieStarPlanet
MovieStarPlanet, headquartered in Copenhagen, Denmark, has been a notable player in the gaming sector since its founding in 2009. The company has made a lasting impact on the industry, creating games that resonate strongly with young audiences, such as MovieStarPlanet 1 & 2 and BlockStarPlanet. With over 400 million registered players worldwide and a talented team of 55 employees, MovieStarPlanet has built a remarkable global presence.
At the heart of their Trust & Safety operations are Warin Jaeger, Senior Trust & Safety Manager, and Vernon Jones, Head of Safety. Both bring extensive experience to the table, managing a team of dedicated content moderators who are vital in maintaining player safety and fostering a healthy community. Warin and Vernon have been instrumental in developing and implementing MovieStarPlanet’s safety measures, significantly shaping the company’s content moderation strategies.
This team has played a crucial role in the success of our partnership, particularly in the development and implementation of Amanda as their new content moderation tool.
The Problem
A continuous rise in negative behavior
MovieStarPlanet faced an escalating challenge with content moderation. Like other games with social features and social media platforms, they were seeing a continuous rise in negative behavior that moderators needed to identify and address, alongside a mounting pile of player reports. This issue was further exacerbated when they had to reduce staff, making it clear that a change in their content moderation strategy was urgently needed.
User satisfaction and community well-being are vital for long-term success
MovieStarPlanet is deeply committed to their users, with user safety being a top priority. They recognize that user satisfaction and community well-being are vital for long-term success. Their goal has always been to build a positive space where users can play and interact safely. A non-toxic environment fosters not only healthier communities but also better player retention and increased engagement, including in-game spending.
The mounting stress was beginning to take a toll on morale
The burden on their moderation team was growing. The constant wave of user-generated content, coupled with an unclear content moderation landscape, was becoming overwhelming. The team was facing an uphill battle, and the mounting stress was beginning to take a toll on morale.
The Solution
Seeking more effective solutions
At Aiba, we were in the final stages of developing the first version of Amanda, our AI-driven content moderation platform, and were actively looking for pilot customers. Simultaneously, the management team at MovieStarPlanet was seeking a more effective content moderation solution, with a strong belief that AI could be the key to advancing their strategy. Fortunately, our paths crossed, and since then, we have built a fruitful partnership.
A product capable of tackling all aspects of content moderation
When we first engaged with MovieStarPlanet, we were primarily a research-based company with a specialized product: AI designed to detect cyber grooming conversations. Through our discussions with MovieStarPlanet and a range of other gaming companies, we realized that while cyber grooming is one of the most critical issues, it is not the most frequent; issues such as toxicity and bullying are far more prevalent in sheer volume. To create a truly compelling product, we needed to address these broader challenges as well. This realization marked the beginning of our journey to expand Amanda into a comprehensive Trust & Safety platform capable of tackling all aspects of content moderation, with MovieStarPlanet as one of our key partners.
We believe that no one understands a customer’s needs better than the customers themselves. Amanda is the outcome of a close collaboration with the industry, designed to meet the challenges they face every day.
We have communicated our needs for the moderation solution, and the dialogue has been highly productive since the start of the project. Aiba did a superb job translating our requirements into the final product.
Warin Jaeger
Senior Trust & Safety Manager,
MovieStarPlanet ApS
Our collaboration with MovieStarPlanet began with a series of workshops, during which we analyzed their existing workflows, current challenges, and pain points. This helped us gain a detailed understanding of the requirements necessary to build a complete content moderation platform: one that could efficiently detect various types of violations, manage player reports effectively, incorporate a compliance module, and much more. From there, we started sketching out solutions. Given the inherently overwhelming nature of content moderation, with its vast amounts of information, we emphasized simplicity, striving to create a solution that was as user-friendly as possible. After several iterations, we were confident that we had developed the core features of a complete Trust & Safety platform, versatile enough to serve any gaming or social media platform.
We strive for easy technical integration
We then moved to the technical development phase, establishing the database structure, training AI models, and building out the platform’s backend. Amanda operates using APIs that connect customer data to Amanda’s AI models to detect harmful behavior. We made considerable efforts to ensure that our system was easily accessible for our customers. Of course, this still required some effort from MovieStarPlanet’s side during the implementation phase, but it was an investment well worth making. Throughout the process, we worked closely with MovieStarPlanet’s technical team to ensure the integration went as smoothly as possible.
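Aiba has not published the details of Amanda’s API, so the Python sketch below is purely illustrative: it shows, under our own assumptions, how a game backend might forward a chat message to a moderation endpoint and act on the returned classification. The endpoint URL, payload fields, and response format are placeholders, not Amanda’s actual interface.

```python
# Hypothetical integration sketch: forwarding a chat message to a
# moderation API. Endpoint, payload fields, and response shape are
# illustrative assumptions, not Amanda's actual interface.
import requests

MODERATION_URL = "https://moderation.example.com/v1/messages"  # placeholder
API_KEY = "YOUR_API_KEY"  # issued by the moderation vendor

def moderate_message(player_id: str, text: str) -> dict:
    """Send a single chat message for classification and return the verdict."""
    payload = {
        "player_id": player_id,  # lets incidents be grouped per player
        "content": text,
        "content_type": "chat",
    }
    response = requests.post(
        MODERATION_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=5,
    )
    response.raise_for_status()
    # e.g. {"labels": ["bullying"], "severity": "high"} -- assumed format
    return response.json()

# Example usage in a game backend (the downstream step is hypothetical):
# verdict = moderate_message("player-123", "some chat text")
# if verdict.get("severity") == "high":
#     queue_for_human_review(verdict)
```

Much of the integration effort described above typically lies in wiring calls like this into every content path, such as chat, profiles, and player reports, and handling the results consistently.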
It has been an absolute pleasure working with Aiba's tech team. We were able to bounce ideas back and forth and collaborate very closely on situations. They are motivated, quick to respond, and just really good!
Dagbjört Jónsdóttir
Director of Technology
MovieStarPlanet ApS
Since June 2024, MovieStarPlanet has been using Amanda as their primary content moderation platform. But don’t just take our word for it: they have expressed their satisfaction with both their choice and the results achieved.
The Result
Following the implementation of Amanda, MovieStarPlanet has already experienced positive changes, even in the early stages. While it is too soon to present specific metrics, the feedback from MovieStarPlanet has been overwhelmingly encouraging. Their team has expressed increased confidence in their ability to manage content moderation effectively and create a safer, more enjoyable environment for their players.
Amanda enables an extremely efficient way of working and is very user-friendly. It also has a lot of advanced and fine features that make it possible to moderate more precisely than before.
Vernon Jones
Head of Safety & Support
MovieStarPlanet ApS
MovieStarPlanet is now better positioned to handle the growing volume of user interactions, attributing much of this progress to Amanda’s capabilities. Amanda has enabled moderators to have a comprehensive overview of user behavior, which has been particularly valuable for moderation and support tasks. The ability to quickly gain a snapshot of community situations, such as understanding user complaints about sanctions, has made their work significantly more efficient compared to previous tools.
We have a lot of information and a good overview in Amanda, and it makes it easier to apply sanctions with precision.
Content Moderator
MovieStarPlanet ApS
Amanda is designed with a player-centric approach, consolidating all incidents involving a user into a single case, which enhances moderation efficiency. This holistic view streamlines the process, allowing moderators to address all aspects of a case at once, avoiding repeated actions on the same user.
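As an illustration of that idea (the data model below is our own assumption, not Amanda’s actual schema), a player-centric case can be thought of as one record that accumulates every incident tied to a player, so a moderator reviews and sanctions once rather than once per incident:

```python
# Illustrative player-centric case structure: every incident for one
# player is grouped under a single case so moderators can act once on
# the full picture. Field names are assumptions, not Amanda's data model.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Incident:
    timestamp: datetime
    category: str   # e.g. "bullying", "toxicity"
    content: str    # the offending message or item
    source: str     # "ai_detection" or "player_report"

@dataclass
class PlayerCase:
    player_id: str
    incidents: list[Incident] = field(default_factory=list)

    def add(self, incident: Incident) -> None:
        self.incidents.append(incident)

    def recent(self, since: datetime) -> list[Incident]:
        """Snapshot of behavior in a time window, e.g. when a player
        disputes a sanction."""
        return [i for i in self.incidents if i.timestamp >= since]
```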
When users are complaining about sanctions, you can go in and get an overview, especially from a certain time period. We could not do that easily with previous tools. I think having a comprehensive overview of a user’s behavior has been the most valuable aspect of using the tool in other areas of moderation and support.
Content Moderator
MovieStarPlanet ApS
Amanda’s advanced features have also enhanced accuracy in identifying problematic behavior, which has improved the overall safety of the platform. The moderation team can now work more analytically, allowing them to be proactive rather than reactive. This shift has been largely due to Amanda’s AI capabilities in quickly detecting and prioritizing harmful content and identifying dangerous users, enabling timely intervention before issues escalate.
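To make the idea of prioritization concrete, here is a minimal sketch, again using assumed category names and scores rather than Amanda’s actual logic, of how detected incidents could be ranked so that the most dangerous cases surface first:

```python
# Illustrative prioritization only: ranking detected incidents so the
# highest-risk cases surface first. The categories and scores are
# assumptions, not a description of Amanda's internal logic.
SEVERITY_SCORE = {
    "grooming": 100,
    "threats": 80,
    "bullying": 60,
    "toxicity": 40,
    "spam": 10,
}

def prioritize(incidents: list[dict]) -> list[dict]:
    """Sort incidents so moderators review the most severe ones first."""
    return sorted(
        incidents,
        key=lambda i: SEVERITY_SCORE.get(i.get("category", ""), 0),
        reverse=True,
    )

# Example:
# queue = prioritize([{"category": "toxicity"}, {"category": "grooming"}])
# queue[0]["category"]  # -> "grooming"
```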
Amanda is more accurate than what we have worked with before, so from a safety point of view, this tool is very good. I really enjoy working in it.
Content Moderator
MovieStarPlanet ApS
As a result, moderators have gained a broader understanding of community trends, such as language usage and behavioral patterns, which leads to better precision in sanctioning and overall moderation efforts. Amanda’s efficiency and user-friendly interface have allowed moderators to perform their tasks with greater accuracy and ease, significantly improving their daily workflow.
Previously, MovieStarPlanet’s moderation approach was more limited, providing only a narrow snapshot of the community. With Amanda, they are now able to gain deeper insights, make data-driven decisions, and ensure a healthier community environment. This has not only improved the efficiency of their processes but also enhanced the safety and quality of the experience for their players.