What every game studio should ask its moderation vendor
The regulatory perimeter around player safety has tightened considerably over the past eighteen months. The EU’s Digital Services Act is fully in force and actively enforced. The UK’s Online Safety Act has moved from publication to active duties, with Ofcom issuing codes of practice and information notices. And in the United States, the FTC’s amended COPPA Rule reached its compliance deadline on 22 April 2026.
For studios operating across multiple markets, these three regimes are no longer separate compliance projects. They are converging into a single operational expectation: that platforms know who their users are, that they minimise the data collected about them, that they moderate user-generated content with documented care, and that every third party in the data path is auditable. Enforcement actions, including the FTC’s USD 520 million settlement with Epic Games, have made clear that regulators are willing to pursue substantial penalties.
At Aiba, we sit inside that data path. Our Amanda Moderation Platform processes chat messages, usernames, and other user-generated content on behalf of gaming studios and online communities. That places us squarely within the scope of all three regimes, and it is the reason we treat regulatory readiness as part of the product, not as a legal afterthought.
This post walks through what each framework requires, what it means for studios specifically, and the questions any studio should be asking its moderation vendor in 2026.
The Three Frameworks at a Glance
| Framework | Jurisdiction | Core Focus | Status |
|---|---|---|---|
| Digital Services Act (DSA) | European Union | Illegal content, transparency, user redress, risk assessments for VLOPs | In force; enforcement active |
| Online Safety Act (OSA) | United Kingdom | Illegal harms, child safety, transparency, codes of practice | Phased duties live; Ofcom enforcement active |
| Amended COPPA Rule | United States | Verifiable parental consent, data minimisation, third-party scope | Compliance deadline 22 April 2026 |
All three point to the same underlying expectation: know your users, minimise their data, moderate with documented care, and keep every third party in your data path auditable. A studio that builds privacy-respecting, audit-ready infrastructure once will be positioned for the next wave, whether it comes from Australia, India, Brazil, or Indonesia.
What the DSA Requires of Game Studios
The Digital Services Act applies to any online platform offering services to users in the EU, regardless of where the platform is established. For most game studios, the practical implications fall into four areas.
Notice and action mechanisms. Studios must provide users with a clear, accessible way to report illegal content. The mechanism must be electronic and easy to use, and it must produce a timely response. For multiplayer games, this means structured in-game and out-of-game reporting with a documented review process behind it.
Statement of reasons. Any moderation action against a user, whether that means suspending an account, removing content, or restricting visibility, must come with a clear statement explaining the decision and the appeal route. This is operationally significant. It means moderation actions cannot be opaque, and it requires a case management layer that can produce a reasoned decision per action.
Transparency reporting. Platforms must publish regular transparency reports covering the volume of content moderated, the categories of action taken, the use of automated tools, and the rate of successful appeals.
Risk assessment for larger platforms. Very Large Online Platforms, defined as platforms with at least 45 million average monthly active recipients of the service in the EU, face additional obligations including systemic risk assessments and independent audits. Most studios will not meet that threshold, but the pattern is clear: assessment and documentation are becoming the baseline, not the exception.
The DSA does not specify which moderation tools a studio must use. It specifies the outcomes those tools must produce: timely action, reasoned decisions, auditable records, and meaningful user redress.
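To make the “reasoned decisions” outcome concrete, here is a minimal sketch of the record a case management layer might emit for each moderation action. The field names are illustrative rather than taken from any particular product, but the contents track what Article 17 of the DSA asks a statement of reasons to cover.

```typescript
// Illustrative only: field names are hypothetical, but the content mirrors
// what Article 17 of the DSA expects a statement of reasons to cover.

type RestrictionType =
  | "content_removal"
  | "visibility_restriction"
  | "account_suspension"
  | "account_termination";

interface StatementOfReasons {
  caseId: string;                    // links the statement to the moderation case
  restriction: RestrictionType;      // what action was taken
  factsAndCircumstances: string;     // what the user did, in plain language
  detectedByAutomatedMeans: boolean; // was the content flagged automatically?
  decidedByAutomatedMeans: boolean;  // was the decision itself automated?
  legalGround?: string;              // cited if the content is allegedly illegal
  termsOfServiceGround?: string;     // cited if the content breaches platform rules
  redress: {
    internalAppealUrl: string;       // in-product appeal route
    outOfCourtOptions: string[];     // certified dispute settlement bodies
  };
  issuedAt: string;                  // ISO 8601 timestamp, kept for audit
}
```

A record like this, generated for every action and stored with the case, is what turns a statement of reasons from a policy commitment into something a studio can evidence.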
What the UK Online Safety Act Adds
The UK Online Safety Act covers similar territory to the DSA but with a sharper focus on illegal content categories and child safety. Ofcom’s codes of practice translate the statutory duties into specific operational expectations.
For game studios, the OSA imposes three obligations that matter most.
Illegal content duties. Platforms must take proportionate steps to prevent users from encountering priority illegal content, a defined list of priority offences that includes terrorism and child sexual exploitation and abuse. “Proportionate” is doing real work here. Ofcom expects platforms to demonstrate that their moderation systems are calibrated to the risk profile of their service.
Child safety duties. Platforms likely to be accessed by children must conduct a children’s risk assessment and implement protective measures. This sits adjacent to COPPA but covers a wider age range, up to 18, and a broader set of harms.
Transparency and information powers. Ofcom can compel platforms to produce information about their moderation practices, their use of proactive technology, and their handling of complaints. A studio that cannot evidence its moderation pipeline will struggle to respond to a Section 100 information notice.
The OSA is more prescriptive than the DSA in places, particularly on child safety, and the codes of practice are still expanding. Studios serving UK users should expect the regulatory perimeter to keep moving for the next eighteen to twenty-four months.
What the Amended COPPA Rule Changes
The FTC’s amended COPPA Rule does not replace the 1998 framework. It sharpens it. The three changes that matter most for game studios:
Stricter parental consent standards. Low-friction verification methods face higher scrutiny. The FTC has signalled a clear preference for mechanisms that meaningfully verify the consenting adult is actually a parent or guardian, not just someone who clicked through an email.
Explicit data minimisation. Studios may only collect, retain, and use children’s personal information to the extent reasonably necessary for the specific service the child is using. Analytics pipelines, personalisation features, and retention periods all sit inside the scope of this requirement.
Third-party integrations are in scope. A studio cannot outsource COPPA liability by pointing to a vendor’s privacy policy. SDKs, ad networks, analytics platforms, and moderation vendors all need to fit within the studio’s compliance framework. The FTC has also taken a harder line on the “actual knowledge” standard: general audience games with material under-13 usership can no longer safely rely on the “we do not ask for age” defence.
Vendor Due Diligence Is Now a Regulatory Requirement
The pattern across all three frameworks is unmistakable. Studios are responsible for the conduct of every vendor in their data path, including their moderation vendor. The DSA’s audit obligations, the OSA’s information powers, and COPPA’s third-party scope all converge on the same operational expectation: studios must be able to evidence what their vendors do with user data.
The questions below are the ones any studio should be asking a moderation vendor in 2026. We have included Aiba’s answers to make the evaluation concrete.
What data does the platform collect and process?
Amanda processes only the data passed to it by the studio through our API. That typically includes usernames, chat messages, and metadata required to associate moderation events with the right account. We do not perform independent collection, augment data from external sources, or use customer data to train external models.
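To make that concrete, here is a hypothetical payload shape for a single moderation call. It is not Amanda’s actual API schema; the field names are assumptions made for illustration, and the point is how small the required surface is.

```typescript
// Hypothetical request shape, not Amanda's actual API schema.
// The point is the minimal surface: a pseudonymous player reference,
// the content itself, and just enough context to route the result back.

interface ModerationRequest {
  studioId: string;        // identifies the customer, not the player
  playerRef: string;       // pseudonymous account reference chosen by the studio
  channel: "chat" | "username" | "profile";
  content: string;         // the message or name to evaluate
  sentAt: string;          // ISO 8601 timestamp
  context?: {
    matchId?: string;      // optional, for associating events with a session
    region?: string;
  };
}
```

Notably absent are email addresses, device identifiers, and anything else the moderation decision does not need. That absence is the data minimisation posture GDPR and the amended COPPA Rule both expect.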
How long is data retained?
Default retention periods are configurable per customer and set in the data processing agreement. Retention for flagged content and content that triggered no moderation action is handled separately, with the latter held for shorter windows. Studios can request shorter or longer periods based on their own legal posture.
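As a sketch of how per-customer retention could be expressed, assuming the settings live alongside the data processing agreement (the structure and numbers below are illustrative, not Amanda’s actual configuration or defaults):

```typescript
// Illustrative retention settings agreed in the DPA. Values are placeholders.

interface RetentionPolicy {
  flaggedContentDays: number;   // content that triggered a moderation action
  clearedContentDays: number;   // content that triggered no action; shorter window
  caseRecordDays: number;       // statements of reasons and appeal history, for audit
}

const exampleDpaPolicy: RetentionPolicy = {
  flaggedContentDays: 365,
  clearedContentDays: 30,
  caseRecordDays: 730,
};
```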
Where is the data processed?
Aiba is headquartered in Oslo, Norway, and operates within an EU and EEA data processing footprint. For studios serving EU and UK users, this minimises the friction associated with international data transfers and the standard contractual clauses that go with them. GDPR is not a compliance project for us. It is the legal regime we operate under as a baseline.
Does the vendor sell or share data with third parties?
No. Aiba does not sell, rent, or share customer data with third parties. The specialised language models we deploy for moderation are configured per customer, with customer-specific system prompts, and are not used to train cross-customer models.
What is the vendor’s role under applicable privacy law?
Aiba operates as a Data Processor. The studio remains the Data Controller and retains responsibility for the consent determinations that authorise data collection in the first place, including COPPA’s verifiable parental consent. This distinction matters because it places liability where the framework places it, and it means our documentation supports the studio’s compliance evidence trail.
What is the vendor’s security and compliance posture?
Aiba operates under GDPR as our baseline regulatory framework, which sets the standard for our data handling, access controls, and breach notification processes. Our Technical Security Specification is available to customers and prospects under standard NDA terms, and we work directly with InfoSec teams through their security review processes.
Can players request deletion?
Yes. Deletion requests are routed through the studio as Data Controller, and Aiba executes the deletion against our systems within the timeframes specified in the data processing agreement. This applies to GDPR subject access and erasure rights, UK GDPR equivalents, and COPPA-driven deletion requests.
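To illustrate the controller-to-processor handoff, here is a hypothetical shape for a deletion instruction. The names and fields are assumptions for the sake of the example, not a real interface.

```typescript
// Hypothetical deletion instruction from the studio (Data Controller) to the
// moderation vendor (Data Processor). The studio validates the requester and
// the legal basis; the processor executes and confirms before the DPA deadline.

interface DeletionInstruction {
  playerRef: string;       // the same pseudonymous reference used at ingestion
  legalBasis: "gdpr_erasure" | "uk_gdpr_erasure" | "coppa_deletion";
  receivedAt: string;      // when the studio received the player's request
  executeBy: string;       // deadline for confirmation, per the DPA
}
```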
How does the platform support transparency reporting?
Amanda’s case management and reporting layers are designed to produce the records studios need for DSA transparency reports, OSA information requests, and internal audit. Volume of content moderated, categories of action, automated versus human-reviewed decisions, and appeal outcomes are all captured and exportable.
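For illustration, the aggregate behind a transparency export might look something like the sketch below; the field names are ours for the example, not an actual export format.

```typescript
// Illustrative per-period aggregate behind a transparency report.
// The metrics mirror those listed above: volumes, action categories,
// the automation split, and appeal outcomes.

interface TransparencyPeriodSummary {
  periodStart: string;                         // ISO 8601 date
  periodEnd: string;
  itemsReviewed: number;
  actionsByCategory: Record<string, number>;   // e.g. { harassment: 120, spam: 950 }
  automatedDecisions: number;
  humanReviewedDecisions: number;
  appealsReceived: number;
  appealsUpheld: number;                       // decisions reversed on appeal
}
```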
How does the platform handle the human-in-the-loop requirement?
Both the DSA and the OSA expect that automated decisions affecting users can be reviewed and, where appropriate, reversed. Amanda includes a human-in-the-loop review layer as a core part of the platform. Moderation decisions can be reviewed, escalated, and appealed within the same workflow that produced them. This is the architectural choice that makes Amanda defensible under regulatory scrutiny, not just operationally effective.
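One way to picture that single workflow is as a small state machine over each case. The states and transitions below are a simplified sketch, not Amanda’s internal model.

```typescript
// Simplified sketch: an automated decision is never a dead end. Every case
// can be routed to human review, escalated, appealed, and reversed, and each
// transition lands in the same audit trail as the original decision.

type CaseState =
  | "auto_flagged"
  | "auto_actioned"
  | "in_human_review"
  | "escalated"
  | "appealed"
  | "upheld"
  | "reversed";

const allowedTransitions: Record<CaseState, CaseState[]> = {
  auto_flagged: ["auto_actioned", "in_human_review"],
  auto_actioned: ["appealed", "in_human_review"],
  in_human_review: ["escalated", "upheld", "reversed"],
  escalated: ["upheld", "reversed"],
  appealed: ["in_human_review"],
  upheld: [],
  reversed: [],
};
```

The regulatory value is that a reversal lives in the same record as the automated decision that preceded it, which is exactly what a statement of reasons and an appeal outcome need to reference.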
For a deeper look at how human review fits into an AI moderation pipeline, see this article.
The Bigger Picture
The regulatory environment for online games is not stabilising. It is expanding. Australia’s Online Safety Amendment, India’s Digital Personal Data Protection Act, Brazil’s LGPD, and the United States’ proposed COPPA 2.0 legislation all point in the same direction: more accountability, more transparency, more vendor scrutiny.
Studios that build their moderation infrastructure around data minimisation, auditable decisions, meaningful redress, and EU-grade data handling will meet the next round of regulation without disruption. Those that treat compliance as a checklist will spend the next several years catching up.
Amanda was built inside the EU, under GDPR, with human-in-the-loop review and customer-controlled data handling. The design principle came first. The specific rules came later.