Blue Scarab Entertainment
Blue Scarab Entertainment is a Stockholm-based game studio founded by developers with experience from titles such as Battlefield, World of Warcraft, and Helldivers. Their first game, Equinox: Homecoming, is a social multiplayer experience built around cooperation, exploration, and player interaction.
For Blue Scarab, community is part of the product.
Players collaborate, communicate, and build relationships as they explore the world together. From the beginning, the team knew the tone of that environment would matter as much as the gameplay itself. They wanted a space players would feel comfortable in and a culture they could be proud of.
The challenge was not fixing a problem. It was preventing one.
In social games, early behavior sets expectations. If harmful content becomes visible or normalized, it shapes the culture quickly and is difficult to reverse.
At the same time, Blue Scarab is a focused team building their first title. Moderation could not become a separate operational burden or require dedicated resources from day one. They needed a solution that would:
- Set clear boundaries from launch
- Work quietly in the background
- Scale as the community grows
- Provide the documentation expected by platforms, partners, and investors
They were also looking for a partner they could work with directly as their needs evolved.
The rollout focused on early impact with minimal operational overhead.
Proactive filtering
A community-specific profanity filter was configured for the tone and audience of Equinox. Contextual detection identifies variations, obfuscation, and attempts to bypass rules, while surfacing only the content that actually requires attention.
Instead of monitoring everything, the team sees what matters and has the context needed to make fast decisions.
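To illustrate the kind of obfuscation handling described above, here is a minimal sketch of a normalization pass that catches character substitutions, separators, and repeated letters before matching. The substitution table, blocklist term, and function names are invented for illustration; real contextual detection relies on much richer models than simple string matching.

```python
import re

# Map common look-alike characters back to letters (illustrative subset).
SUBSTITUTIONS = str.maketrans({"0": "o", "1": "i", "3": "e",
                               "4": "a", "5": "s", "@": "a", "$": "s"})
BLOCKLIST = {"badword"}  # placeholder term, not a real word list

def normalize(text: str) -> str:
    """Lowercase, map look-alikes, drop separators, collapse repeats."""
    text = text.lower().translate(SUBSTITUTIONS)
    text = re.sub(r"[^a-z]", "", text)      # catches "b.a.d w o r d"
    return re.sub(r"(.)\1+", r"\1", text)   # catches "baaadword"

def is_flagged(message: str) -> bool:
    norm = normalize(message)
    return any(term in norm for term in BLOCKLIST)
```

A pass like this is why naive blocklists fail while contextual systems hold up: "B@dw0rd" and "b a d w o r d" both normalize to the same flagged token.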
Giving players a voice
Player reporting was introduced to capture issues automated detection might miss.
Giving players the ability to report harmful behavior does more than flag content. It signals that the studio takes safety seriously and that players have a voice in protecting the space they share.
At the same time, reporting can quickly become overwhelming for a small team. Amanda’s AI moderation platform validates and prioritizes reports based on content, context, and user behavior, ensuring that only the cases that truly require human attention reach the team.
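The triage idea described above can be sketched as a simple scoring function: combine a few signals per report and route only high-scoring cases to a human. The field names, weights, and threshold below are hypothetical, chosen for illustration rather than taken from Amanda's actual model.

```python
from dataclasses import dataclass

@dataclass
class Report:
    matched_filter: bool      # automated detection also flagged the content
    reporter_accuracy: float  # 0..1, share of reporter's past reports upheld
    prior_violations: int     # reported user's confirmed past violations

def triage_score(r: Report) -> float:
    """Blend signals into a single priority score (weights are invented)."""
    score = 0.5 if r.matched_filter else 0.0
    score += 0.3 * r.reporter_accuracy
    score += min(0.2, 0.05 * r.prior_violations)  # cap repeat-offender boost
    return score

def needs_human_review(r: Report, threshold: float = 0.6) -> bool:
    return triage_score(r) >= threshold
```

The design point is the cap and the threshold: low-signal reports are resolved automatically, so queue size stays bounded even as report volume grows with the community.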
Even in early access, the operational impact is clear.
Most routine moderation is handled automatically through filtering, prioritization, and case surfacing. The team reviews only the situations that require human judgment, with full context around content history and user behavior.
For a small studio, this makes a real difference.
Instead of spending time managing moderation, the Blue Scarab team can stay focused on development and the player experience.
Players also see the impact. Issues are handled quickly and consistently, and the reporting system reinforces that their concerns are taken seriously.
From a governance perspective, every action is logged and structured, giving the team the audit trail needed to demonstrate responsible safety practices as the community grows.

