xHamster DSA Transparency report for the reporting period from 17 February 2024 to 31 December 2024



Name of service provider: Hammy Media Ltd
Date of Publication: 13 February 2025
Period Covered: 17 February 2024 - 31 December 2024
Service: xHamster



Clause 1 of Article 15 of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act or DSA) imposes the following transparency reporting obligations for providers of intermediary services:

“Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a) for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b) for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c) for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d) for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e) any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.”

Clause 1 of Article 24 of the DSA imposes additional transparency reporting obligations for providers of online platforms, namely: “In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b) the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.”

In order to fulfill the requirements of Article 15 and Article 24 of the DSA, we, Hammy Media Ltd, are pleased to publish this DSA Transparency report for the reporting period from 17 February 2024 to 31 December 2024 for the xHamster platform.

1. Information about orders received from Member States’ authorities – Art. 15(1)(a) DSA


1.1. The table below shows the number of orders received from Member States’ authorities under Article 9 of the DSA – orders to act against illegal content, e.g. orders to remove content:

Orders to act against illegal content under Article 9 of the DSA

Member State | Intellectual Property Infringements | Protection of Minors | Non-Consensual Behavior | Animal Welfare | Violence / Abuse (including self-harm) | Data Protection and Privacy Violations | Illegal or Harmful Speech | Scams and/or Fraud | Harassment / Stalking / Threats | Scope of Platform Service | Violation of Other Laws or Regulations
Austria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Cyprus | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Denmark | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Germany | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Greece | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Ireland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Portugal | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Sweden | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0


1.2. The table below shows the number of orders received from Member States’ authorities under Article 10 of the DSA – orders to provide information, e.g. orders to provide information about a specific user profile:



Orders to provide information under Article 10 of the DSA

Member State | Intellectual Property Infringements | Protection of Minors | Non-Consensual Behavior | Animal Welfare | Violence / Abuse (including self-harm) | Data Protection and Privacy Violations | Illegal or Harmful Speech | Scams and/or Fraud | Harassment / Stalking / Threats | Scope of Platform Service | Violation of Other Laws or Regulations
Austria | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Cyprus | 0 | 3 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Denmark | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
France | 0 | 0 | 4 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 1
Germany | 0 | 19 | 12 | 0 | 1 | 0 | 1 | 3 | 1 | 0 | 10
Greece | 0 | 0 | 6 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Ireland | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 3
Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Portugal | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Spain | 0 | 0 | 4 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Sweden | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0
Total | 0 | 25 | 29 | 0 | 3 | 0 | 1 | 3 | 1 | 0 | 15


Median time taken to inform Member States’ authorities of receipt of orders submitted under Articles 9 and 10 of the DSA: xHamster confirms the receipt of an order from a Member State’s authority submitted through the dedicated channels immediately, by sending an automatic confirmation.

Median time taken to give effect to the orders of Member States’ authorities submitted under Articles 9 and 10 of the DSA: 66 hours.

2. Information about notices submitted in accordance with Article 16 of the DSA (user notices) – Art. 15(1)(b) DSA


The table below shows the number of notices submitted in accordance with Article 16 of the DSA (user notices), categorised by the type of alleged illegal content:



Type of illegal (inappropriate) content | Intellectual property infringements | Protection of minors | Nonconsensual behavior | Animal welfare | Violence / abuse (including self-harm) | Data protection and privacy violations | Illegal or harmful speech | Scams and/or fraud | Harassment / stalking / threats | Scope of platform service | Violation of any other laws or regulations
Number of notices by category | 10057892573221119268258339223166


During this reporting period, we received a total of 23,358 valid notices of alleged illegal content. We took appropriate action on all of these valid notices based on our terms and conditions. “Valid notice” means that the company has decided to impose restrictions on content or profiles on the basis of the notice.

At the same time, during this reporting period, we received a total of 10,191 rejected notices of alleged illegal content. “Rejected notice” means that the company has decided not to impose restrictions on content or profiles because the notice was inaccurate, lacked sufficient detail, was submitted incorrectly, was submitted without a sufficiently substantiated explanation of why the content in question should be considered illegal content, or the content did not violate the platform rules or applicable law.

In total, during this reporting period, we received 33,549 notices in accordance with Article 16 of the DSA.

Under the DSA, trusted flaggers can also submit illegal content reports. However, we did not receive any illegal content reports from trusted flaggers during this reporting period.

All notices are handled by our human moderation team, and we do not use automated means to process notices received under Article 16 of the DSA or to make decisions based on such notices.

The median time to take action on the basis of a notice received under Article 16 of the DSA is 14 hours.



3. Information about the content moderation engaged in at the providers’ own initiative – Art. 15(1)(c) DSA


The table below shows meaningful and comprehensible information about the content moderation engaged in at our own initiative. This information includes the number and type of measures taken that affect the availability, visibility, and accessibility of information provided by the recipients (users) of our platform and service, the recipients’ ability to provide information through the service, and other related restrictions of the service. This information is categorised by the type of illegal content or violation of our terms and conditions and by the type of restriction applied:



Type of illegal (inappropriate) content / Content that violates our terms and conditions | Total number of restrictions for this specific type of illegal (inappropriate) content | Removal of content (restriction of visibility) | Disabling of access to content (restriction of visibility) | Suspension of the user account | Termination of the user account | Suspension of the provision of the service in whole or in part | Termination of the provision of the service in whole or in part
Intellectual property infringements | 11820 | 11814 | 0 | 0 | 6 | 0 | 0
Protection of minors | 144 | 0 | 0 | 0 | 144 | 0 | 0
Non-consensual behavior | 12758 | 12731 | 0 | 0 | 27 | 0 | 0
Animal welfare | 850 | 834 | 1 | 0 | 15 | 0 | 0
Violence / abuse (including self-harm) | 1515 | 1491 | 0 | 0 | 24 | 0 | 0
Data protection and privacy violations | 249 | 246 | 0 | 0 | 3 | 0 | 0
Illegal or harmful speech | 25 | 5 | 0 | 0 | 20 | 0 | 0
Scams and/or fraud | 141 | 7 | 0 | 0 | 134 | 0 | 0
Harassment / stalking / threats | 3 | 1 | 0 | 0 | 2 | 0 | 0
Scope of platform service | 8498 | 7931 | 0 | 0 | 567 | 0 | 0
Violation of any other laws or regulations | 2952 | 965 | 1 | 0 | 1986 | 0 | 0


We do not use automated tools to make final decisions about imposing restrictions on uploaded content or existing accounts. All such decisions are made manually in each specific case by our human moderation team.
At the same time, we use a mix of automated tools, artificial intelligence, and human review to combat illegal and inappropriate content and protect our users. You can find more information about how we use automated means and tools for the purpose of content moderation below in this report.



Measures taken by us to provide training and assistance to personnel in charge of content moderation:

  • Training for personnel in charge of content moderation is conducted by the senior moderation and support specialist as soon as a new employee joins the team based on the company's established training plan.
  • The training starts with an overview of the main guides used during the moderation by all personnel in charge of content moderation.
  • Training lasts at least eight days. It involves simulated reviews, hands-on practice and observation, role-playing exercises, multiple evaluations and feedback, and studying subject-specific guides. There are also three distinct tests.
  • After training, the new employee will enter a two-week period of close supervision under the senior moderation and support specialist, during which they will have frequent discussions to address any questions they may have.
  • Personnel in charge of content moderation are encouraged to seek guidance whenever needed. They can reach out to the senior moderation and support specialist or their team lead, both of whom maintain regular communication with the staff.


4. Information about complaints received through the internal complaint-handling system – Art. 15(1)(d) DSA


The table below shows the number of complaints (appeals) received through the internal complaint-handling system in accordance with Article 20 of the DSA and clause 2.13 of the xHamster User Agreement, including the basis for those complaints and the decisions taken in respect of those complaints:


Basis for complaints (appeals) | Number of upheld decisions | Number of reversed decisions | Number of cancelled complaints (appeals)¹
User reports some information (content) to xHamster and is dissatisfied with action(s) taken by xHamster (Complaint regarding a decision not to take action on a notice submitted in accordance with Article 16 DSA) | 7 | 1 | 26
xHamster removes content or information published by a user and the user appeals this action (Complaint regarding a decision to remove or disable access to or restrict visibility of information) | 476 | 1 | 3
xHamster makes a decision to terminate or suspend an account of a user and the user appeals this decision (Complaint regarding a decision to suspend or terminate the provision of the service / Complaint regarding a decision to suspend or terminate an account) | 460 | 1 | 9


¹ The user has mistakenly submitted a complaint through the internal complaint-handling system pursuant to Article 20 of the DSA, or the complaint cannot be resolved for other reasons beyond xHamster’s control.



The median time needed for taking decisions within the internal complaint-handling system is 33 hours.

The number of instances where decisions within the internal complaint-handling system were reversed is 1.



5. Information about automated means for the purpose of content moderation – Art. 15(1)(e) DSA


At xHamster we prioritize safety, privacy, and trust, and we are committed to protecting our users from illegal content. Upholding these values is essential to our corporate culture. As responsible members of the online community, we recognize the importance of devoting sufficient time and resources to combat inappropriate and illegal content, including non-consensual sexual and intimate content and child sexual abuse material (CSAM).

We use a combination of automated tools, artificial intelligence, and human review to combat illegal content and protect our community from it. The xHamster content moderation process incorporates a substantial team of human moderators who are tasked with the responsibility of reviewing each upload prior to its publication. In addition, a comprehensive system has been implemented for the flagging, review, and removal of any material deemed to be in violation of the law. xHamster has also implemented parental control measures. While all content available on the platform is reviewed by human moderators prior to publishing, we also have additional automated tools that help the content moderation team moderate content and scrutinize materials for any potential violations of the xHamster User Agreement.

Automated tools are utilized to assist human moderators in making informed decisions. In instances where an applicable automated tool detects a match between an uploaded piece of content and a hash list of previously identified illegal material, and this match is confirmed, the content is designated as potentially dangerous and illegal. Consequently, a specific warning is displayed to the moderator during the moderation process. xHamster uses the following automated tools and means for content moderation:

1. xHamster is proud to partner with charities and organizations that support noble causes such as combating child exploitation, human trafficking, slavery, and providing general support to adult industry performers. Some of our partnerships include RTA (Restricted to Adults - https://www.rtalabel.org/), ASACP (the Association of Sites Advocating Child Protection - https://www.asacp.org/?content=validate&ql=0b38b549f187edb883db00b7b057cbaa), Revenge Porn Helpline (UK service supporting adults (aged 18+) who are experiencing intimate image abuse - https://revengepornhelpline.org.uk/), and others.

2. xHamster is committed to facilitating parental control over children's access to adult content. All xHamster pages contain "restricted to adults" (RTA) tags, enabling parental tools, controls, and similar solutions to block adult content. In simple words, the RTA tag allows parents to protect minors from adult content on various browsers, devices (mobile phones, laptops), and operating systems by easily setting up parental control measures. More information about such measures can be found in our Parental Controls Policy (https://xhamster.com/info/parentalcontrol).
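For illustration, the RTA label is a simple fixed meta tag in the page markup (documented at https://www.rtalabel.org/) that filtering software can look for. The minimal Python sketch below is our own illustration, not xHamster's implementation; it shows how a parental-control filter could detect the label and decide to block a page:

```python
# Illustration: detecting the RTA label that parental-control software keys on.
from html.parser import HTMLParser

RTA_LABEL = "RTA-5042-1996-1400-1577-RTA"  # the standard RTA label string

class RTADetector(HTMLParser):
    """Scans HTML for a <meta name="RATING" content="RTA-..."> tag."""

    def __init__(self):
        super().__init__()
        self.is_rta_labelled = False

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attributes = dict(attrs)  # html.parser lowercases attribute names
            if attributes.get("name") == "rating" and attributes.get("content") == RTA_LABEL:
                self.is_rta_labelled = True

def should_block(page_html: str) -> bool:
    # A parental-control filter would block the page when the label is present.
    detector = RTADetector()
    detector.feed(page_html)
    return detector.is_rta_labelled

sample = '<html><head><meta name="RATING" content="RTA-5042-1996-1400-1577-RTA"></head></html>'
print(should_block(sample))  # True
```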

3. All members must undergo a verification process to become verified uploaders. The process involves filling out a form on the dedicated web page of the platform. Live age verification is mandatory for all individual uploaders. The platform has contracts with reputable third-party service providers in the sphere of digital identity and age verification to conduct these live checks. Upon successful completion of the age verification process and manual approval by the platform's moderation team, the member attains verified status.
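The flow just described (form submission, a live third-party age check, then manual approval by moderators) can be pictured as a small state machine. The sketch below is hypothetical: the states and helper function are our own, and the actual provider integration is not described in this report:

```python
# Hypothetical sketch of the uploader-verification flow; not the real integration.
from enum import Enum, auto

class VerificationStatus(Enum):
    SUBMITTED = auto()         # form filled out on the dedicated page
    AGE_CHECK_PASSED = auto()  # live age check by a third-party provider
    VERIFIED = auto()          # manual approval by the moderation team
    REJECTED = auto()

def advance(status: VerificationStatus, passed: bool) -> VerificationStatus:
    """Move an application one step forward; any failed step rejects it."""
    if not passed:
        return VerificationStatus.REJECTED
    if status is VerificationStatus.SUBMITTED:
        return VerificationStatus.AGE_CHECK_PASSED
    if status is VerificationStatus.AGE_CHECK_PASSED:
        return VerificationStatus.VERIFIED
    return status

status = advance(VerificationStatus.SUBMITTED, passed=True)  # age check passed
status = advance(status, passed=True)                        # moderator approved
print(status)  # VerificationStatus.VERIFIED
```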

4. xHamster has developed an advanced system which analyzes text related to the content (title, description) for prohibited and suspicious words. Such words are flagged by the system and displayed to the moderation team during the moderation process.
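A minimal sketch of such keyword screening is shown below. The word lists are placeholders (the real system's vocabulary and matching rules are not disclosed in this report), and the output is a set of flags for the moderation team rather than an automatic decision:

```python
# Placeholder sketch of title/description screening; the word lists are invented.
import re

PROHIBITED = {"bannedterm1", "bannedterm2"}    # hypothetical prohibited words
SUSPICIOUS = {"suspectterm1", "suspectterm2"}  # hypothetical suspicious words

def screen_text(title: str, description: str) -> dict:
    """Flag prohibited/suspicious words for display to the moderation team."""
    words = set(re.findall(r"[a-z0-9']+", f"{title} {description}".lower()))
    return {
        "prohibited_hits": sorted(words & PROHIBITED),
        "suspicious_hits": sorted(words & SUSPICIOUS),
        "needs_attention": bool(words & (PROHIBITED | SUSPICIOUS)),
    }

print(screen_text("Some title", "A description with bannedterm1 in it"))
# {'prohibited_hits': ['bannedterm1'], 'suspicious_hits': [], 'needs_attention': True}
```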

5. xHamster collaborates with various leading software providers to detect potentially harmful content. Prior to moderation, all content is processed by such software, which allows the platform to make an initial identification of content that may violate the xHamster User Agreement. The software enables the platform to detect inappropriate content using a shared database of digital hashes (fingerprints) and can also detect inappropriate content based on artificial intelligence technologies.

6. xHamster uses digital fingerprinting technology which has been specifically designed for the platform. This software protects the platform against inappropriate content that has already been removed from the platform in the past. Digital fingerprinting technology compares the hashes (fingerprints) of newly uploaded content with a database of hashes (fingerprints) of previously removed content. If there is a match, this correlation is highlighted for the moderation team and a human-based decision is made about the content. In simple words, this software prevents inappropriate content from being re-uploaded.
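For illustration only, the re-upload check described in items 5 and 6 can be sketched as a lookup against a database of fingerprints of previously removed content. Production systems use perceptual fingerprints that survive re-encoding and cropping; the cryptographic hash below is a simplified stand-in, and all names are our own:

```python
# Simplified stand-in for digital fingerprinting: exact-match hashes only.
import hashlib

removed_content_hashes: set[str] = set()  # fingerprints of removed uploads

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_removal(data: bytes) -> None:
    """Record removed content so that a re-upload can be recognized later."""
    removed_content_hashes.add(fingerprint(data))

def flag_for_review(data: bytes) -> bool:
    """True if the upload matches previously removed content; a human
    moderator still makes the final decision on the flagged item."""
    return fingerprint(data) in removed_content_hashes

register_removal(b"bytes of a previously removed video")
print(flag_for_review(b"bytes of a previously removed video"))  # True -> highlight
print(flag_for_review(b"bytes of a new upload"))                # False
```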

To guarantee the accuracy of the automated tools, all uploaded content is subject to review and approval by our moderation team before being published. This is our quality control mechanism and safeguard for the automated tools.



6. Information about number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21 of the DSA – Art. 24(1)(a) DSA


We inform users, individuals, and entities that if they do not agree with our decisions, they may have the right to challenge the decision in a relevant court and that they may also be able to refer the decision to a certified dispute settlement body. We have clearly mentioned this information in section 17 of the xHamster User Agreement. During the reporting period, we did not receive any disputes from certified out-of-court settlement bodies pursuant to Article 21 of the DSA.



7. Information about number of suspensions imposed pursuant to Article 23 of the DSA – Art. 24(1)(b) DSA


During the reporting period, xHamster imposed 9 suspensions on users under Article 23 of the DSA and clause 2.12 of the xHamster User Agreement for the submission of manifestly unfounded notices.

During the reporting period, xHamster did not impose suspensions on users under Article 23 of the DSA and clause 2.12 of the xHamster User Agreement for (i) the provision of manifestly illegal content and for (ii) the submission of manifestly unfounded complaints.
