The Digital Services Act (DSA) introduces a new framework that affects how gaming companies, among other online services, handle their content moderation obligations. For small and medium-sized gaming companies, understanding these rules is crucial to staying compliant and avoiding regulatory consequences. This guide, based on the key obligations set out in the DSA, is tailored to help gaming companies understand the most important responsibilities they face.

1. Orders from Authorities: Compliance is Key

When ordered to do so by national or EU authorities, gaming companies must remove the content in question or hand over the requested information. Such orders typically concern content that violates national or EU law. Compliance in these cases is non-negotiable, and timely responses are critical.
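
As a purely illustrative sketch (the field names and example values below are assumptions, not anything prescribed by the DSA), logging each order with timestamps makes it easier to demonstrate timely compliance and to compute the response-time figures needed later for transparency reporting (see section 9):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical log entry for an order received from an authority.
# Tracking received/completed timestamps yields the response-time
# figures required later for transparency reporting (see section 9).
@dataclass
class AuthorityOrder:
    issuing_authority: str          # which authority sent the order
    order_type: str                 # e.g. "content removal" or "information request"
    legal_basis: str                # the national or EU provision cited in the order
    received_at: datetime
    completed_at: Optional[datetime] = None

    def response_hours(self) -> Optional[float]:
        """Hours between receiving the order and acting on it."""
        if self.completed_at is None:
            return None
        return (self.completed_at - self.received_at).total_seconds() / 3600

# Example: a removal order handled within six hours (placeholder values)
order = AuthorityOrder("Example Authority", "content removal",
                       "national court order (example)",
                       datetime(2024, 3, 1, 9), datetime(2024, 3, 1, 15))
print(order.response_hours())  # 6.0
```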

2. Points of Contact

Gaming companies must publicly designate clear points of contact for both authorities and users:

  • For Authorities: Ensure quick and transparent communication with the relevant regulatory bodies.
  • For Users: A clear and accessible point of contact for user concerns and reports must also be provided.
  • EU Requirement: For companies based outside the EU, a legal representative within the EU is necessary for compliance with the DSA.

3. Terms and Conditions Transparency

The DSA places strong emphasis on transparency in moderation processes:

  • Companies must update and make publicly available their terms and conditions (T&Cs), outlining how content moderation decisions are made, including a clear description of the tools used for this purpose.
  • If minors are part of the user base, these T&Cs should be written in a manner that is easy for younger audiences to understand.

4. User Reporting: A Mandatory Process

User reporting is a key requirement under the DSA. The process must include:

  • A reason field outlining why the report is being made.
  • A reference to the illegal content being reported.
  • The name or email address of the reporting user, except for reports of child sexual abuse material (CSAM), which may be submitted anonymously.
  • A statement that the reporter believes, in good faith, that the report is accurate and complete.

In addition, companies must confirm receipt of the report to the user and follow up with a notification of the decision taken. Users who repeatedly submit manifestly unfounded reports can be warned and, if they continue, have the processing of their reports suspended.
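
As a minimal sketch of how these required pieces of information could be captured (the field names are hypothetical; the DSA specifies what a report must contain, not how to model it):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical model of a user report containing the information
# required by the DSA. The reporter's identity may be omitted for
# reports of child sexual abuse material (CSAM).
@dataclass
class UserReport:
    reason: str                      # why the content is being reported
    content_reference: str           # e.g. a URL or in-game message ID
    reporter_contact: Optional[str]  # name or email; None for CSAM reports
    good_faith_statement: bool       # reporter confirms the report is accurate

def acknowledge(report: UserReport) -> str:
    """Return a receipt confirming the report was received."""
    return f"Report on {report.content_reference} received and under review."

report = UserReport("hate speech", "chat-msg-4812", "player@example.com", True)
print(acknowledge(report))
```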

5. Sanctioning Users

Companies are required to implement a system to block or ban users if necessary, but only after a warning has been issued. A clear and consistent policy for sanctioning users should be in place.
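
One simple way to guarantee the warn-first rule is an escalation ladder that never skips the warning step. The specific steps below are illustrative assumptions, not values prescribed by the DSA:

```python
# Illustrative escalation ladder: warn first, then block, then ban.
# The steps themselves are an assumption; the point is that a warning
# precedes any block or ban, as described above.
ESCALATION_LADDER = ["warning", "temporary_block", "permanent_ban"]

def next_sanction(previous_sanctions: list) -> str:
    """Return the next sanction for a user, never skipping the warning."""
    step = min(len(previous_sanctions), len(ESCALATION_LADDER) - 1)
    return ESCALATION_LADDER[step]

print(next_sanction([]))           # -> "warning"
print(next_sanction(["warning"]))  # -> "temporary_block"
```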

6. Appeals: Ensuring Fairness

Users must be able to appeal moderation decisions:

  • Both the sanctioned user and the reporting user can appeal a decision.
  • The appeal window must remain open for at least six months after the decision.
  • Appeal decisions cannot be made solely by automated means such as AI; they must be reviewed by human moderators.
  • Users who repeatedly file manifestly unfounded appeals can be warned and, if they continue, have their appeals suspended from processing.
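
A hedged sketch of an appeal intake check that enforces the six-month window (the names and the exact day count are assumptions); accepted appeals would then be queued for human review rather than decided automatically:

```python
from datetime import datetime, timedelta

# The DSA requires the appeal channel to stay open for at least six
# months after a decision; 183 days is used here as an approximation.
APPEAL_WINDOW = timedelta(days=183)

def accept_appeal(decision_date: datetime, appeal_date: datetime) -> bool:
    """Check whether an appeal was filed within the appeal window.

    Accepted appeals must then go to a human reviewer, since appeal
    decisions may not be taken solely by automated means.
    """
    return appeal_date - decision_date <= APPEAL_WINDOW

decision = datetime(2024, 1, 10)
print(accept_appeal(decision, datetime(2024, 5, 1)))  # True: within the window
print(accept_appeal(decision, datetime(2024, 9, 1)))  # False: window has passed
```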

7. Police Escalation: Protecting Public Safety

If a company becomes aware of information suggesting a potential threat to a person’s life or safety, it must escalate the issue to the relevant authorities. Depending on the circumstances, this will usually be law enforcement in the Member State concerned or, where that state cannot be identified, Europol.

8. Protection of Minors: A Priority but Lacking Precision

The DSA mandates the protection of minors, though how companies should achieve this is somewhat vague. Nonetheless, companies should have mechanisms in place to ensure minors are not exposed to harmful or illegal content.

9. Transparency Reporting: Regular and Detailed

Transparency reporting is a core element of the DSA, and medium-sized gaming companies must make this information accessible in a machine-readable format at least once a year. The report should include:

  • Orders from Authorities: The number of orders received, the nature of the requests, and the average response time to comply.
  • User Reports: A detailed breakdown of user reports, categorized by the type of illegal content reported. This includes the number of reports submitted by trusted flaggers, the actions taken (such as sanctions or content removal), and how the reports are categorized (by legal violations or terms of service breaches).
  • Proactive Moderation: Companies must disclose their use of proactive content moderation tools, including automated systems like AI. This section of the report should describe:
    – The tools and detection methods used to identify harmful or illegal content.
    – The types of restrictions applied based on detected content.
    – The measures taken to prevent false positives and ensure human oversight where necessary.
  • The number and type of content removals and sanctions categorized by violation type.
  • Appeals: A summary of the number of appeals filed, the basis for those appeals, decision outcomes, and the median time needed to resolve them. It should also include how often decisions were reversed.
  • AI in Moderation: A detailed explanation of how AI is used in content moderation, outlining its purpose, accuracy, and potential error margins. Safeguards and oversight mechanisms must also be highlighted to ensure AI’s responsible use.
  • Out-of-Court Disputes: Information about any disputes settled outside of formal legal procedures, the outcomes of these disputes, time to settlement, and the proportion of decisions implemented.
  • User Suspensions: Data on the number of suspensions, categorized by illegal content, unfounded reports, or baseless appeals.

While the full transparency report is only an obligation for medium-sized companies and larger, even small and micro companies have to report every six months:

  • Average Monthly Active Users (MAUs) over the last six months.
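
Because the report must be machine-readable, one straightforward option is a structured JSON document. The skeleton below is a hypothetical sketch mirroring the categories listed above, with placeholder figures; it is not an official or prescribed format:

```python
import json

# Hypothetical skeleton of an annual machine-readable transparency report.
# Keys mirror the reporting categories above; all figures are placeholders.
transparency_report = {
    "reporting_period": "2024",
    "authority_orders": {"received": 12, "median_response_hours": 36},
    "user_reports": {
        "total": 4230,
        "by_category": {"hate_speech": 1800, "csam": 3, "tos_breach": 2427},
        "from_trusted_flaggers": 150,
    },
    "proactive_moderation": {
        "tools": ["keyword filter", "AI classifier"],
        "human_oversight": True,
    },
    "removals_and_sanctions": {"content_removed": 2100, "accounts_sanctioned": 320},
    "appeals": {"filed": 310, "reversed": 42, "median_days_to_resolve": 5},
    "out_of_court_disputes": {"count": 2, "decisions_implemented": 2},
    "suspensions": {"illegal_content": 95, "unfounded_reports": 10,
                    "unfounded_appeals": 4},
}

print(json.dumps(transparency_report, indent=2))
```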

The DSA introduces a robust framework of rules aimed at enhancing transparency, accountability, and user safety across digital platforms. For small and medium-sized gaming companies, especially those targeting minors, navigating these regulations can seem daunting. However, with the right tools and strategies, such as Amanda AI’s content moderation solutions, compliance becomes much more manageable.

Amanda AI offers automated, accurate, and customizable moderation tools that are designed to detect harmful content, ensuring that gaming companies meet the DSA’s requirements without straining their resources. This is particularly beneficial for games aimed at younger audiences, where the protection of minors is paramount. By automating many of the complex moderation tasks, Amanda AI allows companies to efficiently handle user reports, appeals, and transparency reporting, while still maintaining the necessary human oversight where required.

With Amanda AI’s support, gaming companies can streamline their processes, reduce the burden of manual moderation, and ensure they’re fully compliant with the DSA—continuing to provide safe and enjoyable gaming environments for all users, especially minors.

Author: Marcus Hülsdau