Company
Your Online Community’s Superhero
Aiba helps online communities stay safe, healthy, and productive as they grow.
We build Trust and Safety solutions for platforms where people talk, play, create, and sometimes cross the line. Games, social platforms, creator communities, and interactive apps use Amanda to reduce harm, support fair moderation, and keep their communities running smoothly.
At our core, we care about two things: protecting people, and making Trust and Safety work better for the teams doing the job.
Amanda: an operating system for Trust and Safety
Moderation should not feel like fighting fires all day.
Amanda is designed as an operating system for Trust and Safety. One place to understand what is happening in your community, act quickly, and stay in control as volume and complexity increase.
We focus heavily on:
- Productivity over manual review
- Clear workflows instead of scattered tools
- Design that helps teams make good decisions faster
Everything is built to be easy to use, easy to adapt, and easy to trust. Less noise. More clarity. Better outcomes for both users and moderators.

How we build, together
Aiba is built by a focused team of engineers, Trust and Safety specialists, and product builders who work closely with the customers we serve.
We are a product company, but we are also deeply service-minded. Our customers work directly with us: feedback travels fast, and adjustments happen quickly. We care about how the product feels in real workflows, especially when things get messy, not just how it looks in a demo.
That closeness shapes how we build. We combine AI, thoughtful design, and hands-on moderation experience to create tools that teams actually enjoy using and can trust in their daily work.
Behind the platform is a group of people who care about doing things properly, treating users fairly, and helping customers succeed over the long term.

Leadership & Responsibility
Trust and Safety is serious work.
It deserves clear ownership and real accountability.

Hege Tokerud
CEO & Co-founder
As CEO, Hege leads Aiba's direction, customer relationships, and day-to-day priorities. She works closely with platforms using Aiba to understand their challenges and make sure the product solves real problems, not theoretical ones.

Gard Støe
CTO & Co-founder
As CTO, Gard is responsible for Aiba’s technology, AI systems, and platform architecture. His focus is on building moderation tools that scale without becoming complex or opaque.
Accuracy matters. Explainability matters. Usability matters. The technology is there to support people, not replace them.
The research behind Aiba
Before Aiba became a company, the work was already underway. For Patrick Bours, Professor of Information Security and Communication Technology at NTNU, tragic cases like Amanda Todd's made it clear how often early signals in digital conversations are missed, and how much harm that silence can cause.
Our technology is built on years of his academic research into behavioral patterns in digital communication and how they can be used to detect risk, intent, and misuse online.
Patrick’s work explores how people behave in real interactions, how language evolves, how intent shows up over time, and how harmful patterns can be identified early without relying on simple keyword lists. This research became the foundation for how Aiba understands context, escalation, and behaviour across conversations.
What started as academic work grew into a clear opportunity. The same methods used to study behaviour could help platforms protect users, support moderators, and make online spaces healthier at scale.
Patrick is the inventor behind Aiba’s core technology and continues to be closely involved as Lead Researcher. His role ensures that what we build stays grounded in science, not shortcuts, and that the system evolves alongside real human behaviour.

Professor Patrick Bours

Our story
Online communities move fast. Harmful behavior adapts quickly, and moderation teams are asked to keep up with more risk, more content, and higher expectations than ever before. Younger users often feel the impact first.
For many teams, the tools meant to help have fallen short. Systems that are hard to use and slow to adapt leave too much unseen and push moderation into a constant state of reaction.
We built Amanda to change that. The confidence to do so came from more than ambition. It came from years of research in behavioral biometrics, combined with deep experience building large-scale software systems. That foundation shaped an AI-driven platform designed around real people and real workflows, with a clear focus on catching what matters and helping teams work with confidence.
That focus guides how we build, how we listen, and how we support the teams working every day to keep communities healthy.

