A Simple, Self-Service Solution for the
Digital Services Act (DSA)

Rest Easy,
Compliance is Covered

With Amanda as your content moderation and security platform, you can eliminate the stress of complex reporting and compliance worries. Built DSA-compliant by design, Amanda provides all the tools you need to ensure DSA compliance—so you can focus your energy where it’s needed.

Already have your user safety in order?
Check out our Transparency Report Wizard

The 9 DSA principles you need to manage

(Click titles below to expand)

Protection of Minors

The DSA mandates the protection of minors. Companies should have mechanisms in place to ensure minors are not exposed to harmful or illegal content.

See our solution

Terms and Conditions Transparency

The DSA places importance on transparency regarding moderation processes:

  • Companies must update and make publicly available their terms and conditions (T&Cs), outlining how content moderation decisions are made, including a clear description of the tools used for this purpose.
  • If minors are part of the user base, these T&Cs should be written in a manner that is easy for younger audiences to understand.

See our solution

Points of Contact

Gaming companies must publicly provide a clear point of contact for authorities and users:

  • For Authorities: Ensure quick and transparent communication with the relevant regulatory bodies.
  • For Users: A clear and accessible point of contact for user concerns and reports must also be provided.
  • EU Requirement: For companies based outside the EU, a legal representative within the EU is necessary for compliance with the DSA.

See our solution

Content Reporting

Content Reporting is a key requirement under the DSA. The process must include:

  • A reason field outlining why the report is being made.
  • A reference to the illegal content being reported.
  • The name or email of the reporting user, with exceptions for child sexual abuse material (CSAM).
  • A statement verifying the truthfulness of the report.

In addition, companies must provide a receipt of the report to the user and issue a follow-up notification about the decision made. Users who repeatedly report without valid reasons can be warned and eventually ignored if they continue.
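
To make the required fields concrete, here is a minimal sketch of what a compliant report payload could look like. All type and field names are hypothetical; the DSA prescribes the information, not a schema.

```typescript
// Minimal sketch of a DSA-style content report payload.
// All names are illustrative; the DSA prescribes the fields, not a schema.
interface ContentReport {
  reason: string;                 // why the report is being made
  contentReference: string;       // ID or URL of the allegedly illegal content
  reporterNameOrEmail?: string;   // optional only for CSAM reports
  truthfulnessStatement: boolean; // reporter affirms the report is accurate
  submittedAt: string;            // ISO 8601 timestamp
}

// The platform must acknowledge receipt and later notify the reporter
// of the decision taken on the report.
function acknowledge(report: ContentReport): { receiptId: string; receivedAt: string } {
  return { receiptId: crypto.randomUUID(), receivedAt: new Date().toISOString() };
}
```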

See our solution

Sanctioning and Feedback

Companies must establish a system to block or ban users when necessary, but only after issuing a prior warning. Every sanction must include a clear explanation of the reason. Additionally, a well-defined and consistent policy for user sanctions must be in place.
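
As a rough sketch, the warn-before-ban rule can be modeled as simply as this; names are hypothetical and real policies will carry more nuance.

```typescript
// Sketch of a warn-before-ban policy. Names are hypothetical.
type Sanction =
  | { kind: "warning"; reason: string }
  | { kind: "ban"; reason: string };

function decideSanction(priorWarnings: number, reason: string): Sanction {
  // A ban may only follow a prior warning, and every sanction
  // must carry a clear, user-facing reason.
  return priorWarnings > 0
    ? { kind: "ban", reason }
    : { kind: "warning", reason };
}
```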

See our solution

Content Removal and Information Provision

When ordered by authorities, gaming companies must remove content or provide necessary information as required. This is particularly important when the content violates national or EU law. Compliance under these circumstances is non-negotiable, and timely responses are critical.

See our solution

Appeals

Users must be able to appeal moderation decisions:

  • Both the sanctioned user and the reporting user can appeal a decision.
  • The appeal window lasts for six months post-decision.
  • Appeals cannot be processed by artificial intelligence (AI); they must be reviewed by human moderators.
  • Users who appeal without valid reasons repeatedly can be warned and eventually ignored if they continue.
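
A minimal sketch of these rules, assuming a six-month window and a human review queue (all names are hypothetical):

```typescript
// Sketch of the appeal rules above; names are hypothetical.
const APPEAL_WINDOW_MS = 1000 * 60 * 60 * 24 * 183; // roughly six months

function withinAppealWindow(decisionAt: Date, now: Date = new Date()): boolean {
  return now.getTime() - decisionAt.getTime() <= APPEAL_WINDOW_MS;
}

interface Appeal {
  decisionId: string;
  filedBy: "sanctionedUser" | "reportingUser";
  grounds: string;
}

// Appeals go to human moderators; AI may assist triage but never decides.
const humanReviewQueue: Appeal[] = [];

function fileAppeal(appeal: Appeal, decisionAt: Date): void {
  if (!withinAppealWindow(decisionAt)) throw new Error("Appeal window has closed");
  humanReviewQueue.push(appeal);
}
```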

See our solution

Police Escalation

If a company receives information suggesting a potential threat to someone’s life or safety, it must escalate the issue to the relevant authorities. Depending on where the company is located, this could be local law enforcement or, for more severe cases, Europol.

See our solution

Transparency Reporting

Transparency reporting is a core element of the DSA, and medium-sized gaming companies must make this information accessible in a machine-readable format at least once a year. The report should include:

  • Orders from Authorities: The number of orders received, the nature of the requests, and the average response time to comply.
  • Content Reports: A detailed breakdown of content reports, categorized by the type of illegal content reported. This includes the number of reports submitted by trusted flaggers, the actions taken (such as sanctions or content removal), and how the reports are categorized (by legal violations or terms of service breaches).
  • Proactive Moderation: Companies must disclose their use of proactive content moderation tools, including automated systems like AI. This section of the report should describe:
    – The tools and detection methods used to identify harmful or illegal content.
    – The types of restrictions applied based on detected content.
    – The measures taken to prevent false positives and ensure human oversight where necessary.
  • The number and type of content removals and sanctions categorized by violation type.
  • Appeals: A summary of the number of appeals filed, the basis for those appeals, decision outcomes, and the median time needed to resolve them. It should also include how often decisions were reversed.
  • AI in Moderation: A detailed explanation of how AI is used in content moderation, outlining its purpose, accuracy, and potential error margins. Safeguards and oversight mechanisms must also be highlighted to ensure AI’s responsible use.
  • Out-of-Court Disputes: Information about any disputes settled outside of formal legal procedures, the outcomes of these disputes, time to settlement, and the proportion of decisions implemented.
  • User Suspensions: Data on the number of suspensions, categorized by illegal content, unfounded reports, or baseless appeals.

While full transparency reporting is an obligation only for medium-sized companies and larger, even small and micro companies must report every six months:

  • Average Monthly Active Users (MAUs) over the last six months.
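
To give the machine-readable requirement some shape, here is one possible structure for such a report. This is a sketch, not the official DSA Transparency Database schema.

```typescript
// One possible machine-readable shape for an annual transparency report.
// Illustrative only; not the official DSA Transparency Database schema.
interface TransparencyReport {
  period: { from: string; to: string }; // ISO 8601 dates
  authorityOrders: { count: number; byType: Record<string, number>; medianResponseDays: number };
  contentReports: {
    byCategory: Record<string, number>; // keyed by type of illegal content
    fromTrustedFlaggers: number;
    actionsTaken: Record<string, number>;
  };
  proactiveModeration: { tools: string[]; restrictions: Record<string, number>; falsePositiveSafeguards: string[] };
  removalsAndSanctions: Record<string, number>; // keyed by violation type
  appeals: { filed: number; reversed: number; medianResolutionDays: number };
  aiModeration: { purpose: string; accuracy: number; errorMargin: number; safeguards: string[] };
  outOfCourtDisputes: { count: number; medianSettlementDays: number; decisionsImplemented: number };
  suspensions: { illegalContent: number; unfoundedReports: number; baselessAppeals: number };
  averageMonthlyActiveUsers: number; // all providers must publish this every six months
}
```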

See our solution

Amanda Makes Compliance Easy!

Let Amanda do the heavy lifting in DSA compliance,
so you can focus on what matters

Protection of Minors

The DSA makes the protection of minors a priority, which means companies should have mechanisms in place to ensure minors are not exposed to harmful or illegal content.

With Amanda, you can be confident that your platform not only meets but exceeds this standard. Designed to safeguard young gamers and social media users, Amanda excels at detecting, managing, and preventing harmful behavior before it escalates.

Discover Amanda’s powerful safety features here.


Terms and Conditions Transparency

Keeping your Terms & Conditions (T&Cs) up to date and publicly accessible is a key requirement under the DSA—and Amanda makes it simple.

  • Built-in T&C templates to clearly outline your content moderation process.
  • Easy-to-understand language designed for younger audiences.
  • Hosting & updates to keep your T&Cs publicly accessible.

Points of Contact

Amanda offers a dedicated contact management system that ensures fast, transparent communication with both authorities and users—seamlessly integrating with your website and games.

  • For Authorities: A central channel for regulatory bodies to reach your team, ensuring compliance with reporting and oversight requirements.
  • For Users: A clear, accessible contact point for concerns and reports, enhancing trust and accountability.
  • For Non-EU Companies: Amanda supports EU legal representative management, helping you stay compliant without extra administrative burden.

Content Reporting

Amanda offers a fully integrable reporting system that embeds seamlessly into your game or platform. If you already have a reporting system, Amanda can easily integrate with it, enhancing your existing setup.

Our solution includes:

  • In-game & platform-ready reporting tools with required fields for reason, evidence, and user verification.
  • Automated receipt & follow-up notifications, keeping users informed.
  • Advanced spam detection to prevent abuse.
  • Centralized management dashboard, streamlining moderation decisions.
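
For a feel of how the integration could look, here is a hedged sketch that posts a report to a placeholder endpoint; Amanda's actual API surface and endpoint names may differ.

```typescript
// Placeholder sketch of a report submission; Amanda's real API may differ.
async function submitReport(report: {
  reason: string;
  contentReference: string;
  reporterEmail?: string;        // optional only for CSAM reports
  truthfulnessStatement: boolean;
}): Promise<{ receiptId: string }> {
  const res = await fetch("https://api.example.com/v1/reports", { // placeholder URL
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.MODERATION_API_KEY}`, // hypothetical key
    },
    body: JSON.stringify(report),
  });
  if (!res.ok) throw new Error(`Report submission failed: ${res.status}`);
  return res.json(); // receipt and follow-up notifications are handled downstream
}
```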

Content Removal
and Information Provision

Amanda makes it simple to comply with DSA-mandated takedown and data requests, ensuring fast and secure responses to authorities.

  • One-Click User Data Export – Easily generate and provide all necessary user data when required.
  • Integrated Content Takedowns – Remove flagged content seamlessly within your moderation workflow.
  • Efficient Request Management – Track, review, and process takedown orders with an intuitive dashboard.

Sanctioning and Feedback

Amanda ensures DSA-compliant user sanctions, helping gaming companies enforce clear, consistent policies while maintaining user trust.

  • Automated Warnings & Sanctions – Issue prior warnings before blocking or banning users.
  • Clear Justifications – Every sanction includes a detailed explanation, ensuring transparency.
  • Customizable Enforcement Policies – Define and enforce consistent sanctioning rules tailored to your platform.

Appeals

Amanda ensures DSA-compliant appeal handling, providing a structured and transparent process for both sanctioned and reporting users.

  • Human-Reviewed Appeals – All appeals are routed to human moderators, ensuring compliance with AI restrictions.
  • Streamlined Appeal Management – Track, review, and resolve appeals efficiently within Amanda’s dashboard.
  • Abuse Prevention – Identify and warn users who repeatedly submit invalid appeals, preventing system misuse.

Police Escalation

When a potential threat to life or safety is detected, Amanda ensures a fast, structured, and compliant escalation process.

  • Efficient Workflow – Streamlined case escalation to local law enforcement or Europol.
  • One-Click Data & Evidence Export – Instantly compile and package relevant information for authorities.
  • Clear Audit Trails – Maintain records of all escalations for transparency and compliance.

Transparency Reporting

Transparency reporting is a key DSA requirement, and Amanda makes the process simple and clear.

A DSA transparency report must cover these areas:

  • Orders from Authorities – Number of requests, nature, and response times.
  • Content Reports – Breakdown of illegal content reports, actions taken, and trusted flagger submissions.
  • Proactive Moderation – AI tools used, restrictions applied, and safeguards against false positives.
  • Appeals & Reversals – Number of appeals, outcomes, and median resolution time.
  • AI Moderation Insights – Accuracy, oversight, and responsible AI safeguards.
  • User Suspensions & Disputes – Data on suspensions, out-of-court resolutions, and settlement times.
  • MAU Reporting – Automatic tracking of monthly active users for compliance.

Simplify Your Reporting Process

Our self-service Transparency Report Wizard streamlines your reporting process,
ensuring clarity and structure every step of the way.


Seamless Integration with Amanda

If you’re using Amanda as your content moderation platform, most of the heavy lifting is already done—saving you time and effort.

✔ Automatic Data Extraction: Instantly populate a DSA-compliant transparency report with required data.
✔ One-Click Reporting: Export reports in the correct machine-readable format or submit them directly to the DSA database.

Works with Any Moderation Tool

Even if you use other tools, you can enjoy our simple and clear reporting process via either automated API integration or manual input, as sketched below.
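
As an illustration of the automated path, a sketch like this could pull figures from your existing tool and produce machine-readable input for the wizard; all names are placeholders.

```typescript
// Placeholder sketch of the automated path; names are illustrative.
interface ModerationMetrics {
  contentReports: number;
  removals: number;
  appealsFiled: number;
  averageMonthlyActiveUsers: number;
}

// Pull figures from your own moderation tool's API, then serialize
// them as machine-readable input for the Report Wizard.
async function exportForWizard(fetchMetrics: () => Promise<ModerationMetrics>): Promise<string> {
  const metrics = await fetchMetrics();
  return JSON.stringify(metrics, null, 2);
}
```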

No matter your setup, our Report Wizard ensures a simple, clear, and efficient process.

Stay Compliant, Stay in Control

Simple DSA compliance with a complete solution for:

Protecting Minors & User Safety
Transparent Terms & Conditions
Clear Points of Contact & Reporting
Content Removal & Moderation Oversight
Fair Sanctions, Appeals & Feedback
Police Escalations & Authority Compliance
Automated Transparency Reporting

Simplify compliance with Amanda.
Book a demo or a chat with us today!