Transparency Report 2024

FapHouse DSA Transparency Report for the reporting period from 17 February 2024 to 31 December 2024

I. Article 15(1) of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act, or DSA) imposes the following transparency reporting obligations on providers of intermediary services:

“Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

(a) for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

(b) for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

(c) for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

(d) for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

(e) any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.”

II. Article 24(1) of the DSA imposes additional transparency reporting obligations on providers of online platforms, namely:

“In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

(b) the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.”

In order to fulfill the requirements of Article 15 and Article 24 of the DSA, we, Hammy Media Ltd, are pleased to publish this DSA Transparency report for the reporting period from 17 February 2024 to 31 December 2024 for the FapHouse platform.


1. Information about orders received from Member States’ authorities – Art. 15(1)(a) DSA

1.1. The table below shows the number of orders received from Member States’ authorities under Article 9 of the DSA – orders to act against illegal content, e.g. orders to remove content:

| Member State | Intellectual property infringements | Protection of minors | Non-consensual behavior | Animal welfare | Violence / abuse including self-harm | Data protection and privacy violations | Illegal or harmful speech | Scams and/or fraud | Harassment / stalking / threats | Scope of platform service | Violation of any other laws or regulations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Austria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyprus | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Denmark | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Germany | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Greece | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Ireland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Portugal | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Sweden | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Total | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

1.2. The table below shows the number of orders received from Member States’ authorities under Article 10 of the DSA – orders to provide information, e.g. orders to provide information about a specific user profile:

| Member State | Intellectual property infringements | Protection of minors | Non-consensual behavior | Animal welfare | Violence / abuse including self-harm | Data protection and privacy violations | Illegal or harmful speech | Scams and/or fraud | Harassment / stalking / threats | Scope of platform service | Violation of any other laws or regulations |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Austria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyprus | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Denmark | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Germany | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Greece | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Ireland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Portugal | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Sweden | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Total | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

Median time taken to inform Member States’ authorities of receipt of orders submitted under Articles 9 and 10 of the DSA: FapHouse confirms receipt of an order from a Member State’s authority submitted through the dedicated channels immediately, by sending an automatic confirmation.

Median time taken to give effect to the orders of Member States’ authorities submitted under Articles 9 and 10 of the DSA: 4 days.


2. Information about notices submitted in accordance with Article 16 of the DSA (user notices) – Art. 15(1)(b) DSA

The table below shows the number of notices submitted in accordance with Article 16 of the DSA (user notices), categorised by the type of alleged illegal content:

| Type of alleged illegal (inappropriate) content | Number of notices by category |
| --- | --- |
| Intellectual property infringements | 2 |
| Protection of minors | 2 |
| Non-consensual behavior | 2 |
| Animal welfare | 0 |
| Violence / abuse including self-harm | 3 |
| Data protection and privacy violations | 0 |
| Illegal or harmful speech | 0 |
| Scams and/or fraud | 0 |
| Harassment / stalking / threats | 0 |
| Scope of platform service | 28 |
| Violation of any other laws or regulations | 1 |

During this reporting period, we received a total of 38 notices of alleged illegal content. We took appropriate action on all of these notices based on our terms and conditions.

Under the DSA, trusted flaggers can also submit notices of illegal content. However, we did not receive any notices from trusted flaggers during this reporting period.

All notices are handled by our human moderation team; we do not use automated means to process notices received under Article 16 of the DSA or to make decisions based on such notices.

The median time to take action on the basis of a notice received under Article 16 of the DSA is 3.2 days.


3. Information about the content moderation engaged in at the providers’ own initiative – Art. 15(1)(c) DSA

The table below provides meaningful and comprehensible information about the content moderation engaged in at our own initiative. This information includes the number and type of measures taken that affect the availability, visibility, and accessibility of information provided by the recipients (users) of our platform and service, the recipients’ ability to provide information through the service, and other related restrictions of the service. The information is categorised by the type of illegal content or violation of our terms and conditions and by the type of restriction applied:

| Type of illegal (inappropriate) content / content that violates our terms and conditions | Total number of restrictions for this type | Removal of content | Disabling of access to content | Suspension of the user account | Termination of the user account | Suspension of the provision of the service in whole or in part | Termination of the provision of the service in whole or in part |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Intellectual property infringements | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Protection of minors | 1 | 0 | 0 | 0 | 1 | 0 | 1 |
| Non-consensual behavior | 112 | 112 | 0 | 0 | 0 | 0 | 0 |
| Animal welfare | 10 | 10 | 0 | 0 | 0 | 0 | 0 |
| Violence / abuse including self-harm | 99 | 98 | 0 | 0 | 1 | 0 | 0 |
| Data protection and privacy violations | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Illegal or harmful speech | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Scams and/or fraud | 8 | 0 | 0 | 0 | 8 | 0 | 0 |
| Harassment / stalking / threats | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Scope of platform service | 1457 | 1291 | 0 | 0 | 166 | 0 | 0 |
| Violation of any other laws or regulations | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

Removal of content and disabling of access to content are decisions to restrict visibility; suspension and termination of the user account are decisions to restrict access to the account; suspension and termination of the provision of the service are decisions to restrict the provision of the service.

We do not use automated tools to make final decisions about imposing restrictions on uploaded content or existing accounts. All such decisions are made manually in each specific case by our human moderation team.

At the same time, we use a mix of semi-automated tools and human review to combat illegal and inappropriate content and protect our users. You can find more information about how we use these means and tools for the purpose of content moderation below in this report as well as on our Trust and Safety page available at https://faphouse.com/pages/trust-and-safety.

Measures taken by us to provide training and assistance to personnel in charge of content moderation:

- Training for personnel in charge of content moderation is conducted by the senior moderation and support specialist as soon as a new employee joins the team, in accordance with the company's established training plan.

- The training starts with an overview of the main guides used during the moderation by all personnel in charge of content moderation.

- Training lasts at least eight days. It involves simulated reviews, hands-on practice and observation, role-playing exercises, multiple evaluations and feedback, and studying subject-specific guides. There are also three distinct tests.

- After training, the new employee will enter a two-week period of close supervision under the senior moderation and support specialist, during which they will have frequent discussions to address any questions they may have.

- Personnel in charge of content moderation are encouraged to seek guidance whenever needed. They can reach out to the senior moderation and support specialist or their team lead, who maintain regular communication with the staff.


4. Information about complaints received through the internal complaint-handling system – Art. 15(1)(d) DSA

The table below shows the number of complaints (appeals) received through the internal complaint-handling system in accordance with Article 20 of the DSA and clause 20 of the FapHouse Terms and Conditions of Use, including the basis for those complaints and the decisions taken in respect of them:

| Basis for complaints (appeals) | Number of upheld decisions | Number of reversed decisions | Number of cancelled complaints (appeals) [1] |
| --- | --- | --- | --- |
| User reports some information (content) to FapHouse and is dissatisfied with the action(s) taken by FapHouse (complaint regarding a decision not to take action on a notice submitted in accordance with Article 16 DSA) | 0 | 0 | 0 |
| FapHouse removes content or information published by a user and the user appeals this action (complaint regarding a decision to remove or disable access to or restrict visibility of information) | 0 | 0 | 0 |
| FapHouse decides to terminate or suspend a user's account and the user appeals this decision (complaint regarding a decision to suspend or terminate the provision of the service / complaint regarding a decision to suspend or terminate an account) | 0 | 0 | 4 |

[1] The user mistakenly submitted a complaint through the internal complaint-handling system pursuant to Article 20 of the DSA, or the complaint could not be resolved for other reasons beyond FapHouse’s control.

The median time needed for taking decisions within the internal complaint-handling system is 2 days.

There were no instances where decisions within the internal complaint-handling system were reversed.


5. Information about automated means for the purpose of content moderation – Art. 15(1)(e) DSA

At FapHouse we prioritize safety, privacy, and trust, and we are committed to protecting our users from illegal content. Upholding these values is essential to our corporate culture. As responsible members of the online community, we recognize the importance of devoting sufficient time and resources to combat inappropriate and illegal content, including non-consensual sexual and intimate content and child sexual abuse material (CSAM).

We use a combination of semi-automated tools and human review to combat illegal content and protect our community from it. The FapHouse content moderation process incorporates a substantial team of human moderators who review each upload prior to its publication. In addition, a comprehensive system has been implemented for the flagging, review, and removal of any material deemed to be in violation of the law. FapHouse has also implemented parental control measures. While all content available on the platform is reviewed by human moderators prior to publishing, we also have additional semi-automated tools that help the content moderation team to moderate content and scrutinize materials for any potential violations of the FapHouse Terms and Conditions of Use.

Semi-automated tools are utilized to assist human moderators in making informed decisions. In instances where an applicable semi-automated tool detects a match between an uploaded piece of content and a previously identified hash list of illegal material, and this match is confirmed, the content is designated as potentially dangerous and illegal. Consequently, a specific warning is displayed to the moderator during the moderation process.

FapHouse uses the following automated tools and means for content moderation:

1. FapHouse is proud to partner with charities and organizations that support noble causes such as combating child exploitation, human trafficking, slavery, and providing general support to adult industry performers. Some of our partnerships include RTA (Restricted to Adults), ASACP (the Association of Sites Advocating Child Protection), and others.

2. FapHouse is committed to facilitating parental control over children's access to adult content. All FapHouse pages carry "restricted to adults" (RTA) tags, enabling parental tools, controls, and similar solutions to block adult content. In simple words, the RTA tag allows parents to protect minors from adult content across browsers, devices (mobile phones, laptops), and operating systems by easily setting up parental controls.

3. All members must undergo a verification process to become verified uploaders (Content Providers). The process involves filling out a form on the dedicated web page of the platform. Live age verification is mandatory for all individual uploaders. The platform has contracted reputable third-party service providers in the sphere of digital identity and age verification to conduct these live checks. Upon successful completion of the age verification process and manual approval by the platform's moderation team, the member attains verified status.

4. FapHouse has developed an advanced system which analyzes text related to the content (title, description) for prohibited and suspicious words. Such words are flagged by the system and displayed to the moderation team during the moderation process.

5. FapHouse collaborates with various leading providers of software that detects potentially harmful content. All content is processed by such software prior to moderation, allowing the platform to make an initial identification of content that may violate the FapHouse Terms and Conditions of Use. The software detects inappropriate content using a shared database of digital hashes (fingerprints) and can also detect inappropriate content based on artificial intelligence technologies.

6. FapHouse uses digital fingerprinting technology which has been specifically designed for the platform. This software protects the platform against inappropriate content that has already been removed from the platform in the past. Digital fingerprinting technology compares the hashes (fingerprints) of newly uploaded content with a database of hashes (fingerprints) of previously removed content. If there is a match, this correlation is highlighted for the moderation team and a human-based decision is made about the content. In simple words, this software prevents inappropriate content from being re-uploaded.
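The prohibited-word screening described in point 4 above can be sketched as follows. This is an illustrative sketch only: the word list, function names, and exact matching logic are assumptions for demonstration, not FapHouse's actual implementation.

```python
import re

# Hypothetical word list; any real platform's list is not public.
PROHIBITED_WORDS = {"examplebannedword", "suspiciousterm"}

def flag_suspicious_terms(title: str, description: str) -> set:
    """Return any prohibited or suspicious words found in the text metadata.

    Matches are only flagged and displayed to the human moderation team,
    which makes the final decision on the content."""
    tokens = re.findall(r"[a-z0-9']+", (title + " " + description).lower())
    return PROHIBITED_WORDS.intersection(tokens)
```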

To guarantee the accuracy of the automated tools, all uploaded content is subject to review and approval by our moderation team before being published. This is our quality control mechanism and safeguard for the automated tools.
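The fingerprint comparison described in point 6 above can be sketched as follows. This is a minimal illustration that assumes exact SHA-256 hashes over file bytes and an in-memory store; a production fingerprinting system would typically use perceptual hashes robust to re-encoding and a persistent database. All names here are hypothetical.

```python
import hashlib

# Hypothetical in-memory store of fingerprints of previously removed content.
removed_fingerprints = set()

def fingerprint(content: bytes) -> str:
    """Compute a digital fingerprint of the content (here: an exact SHA-256 hash)."""
    return hashlib.sha256(content).hexdigest()

def record_removal(content: bytes) -> None:
    """Store the fingerprint of removed content so re-uploads can be flagged."""
    removed_fingerprints.add(fingerprint(content))

def matches_removed_content(upload: bytes) -> bool:
    """Return True if a new upload matches previously removed content.

    A match is only highlighted to the moderation team; the decision
    about the content is always made by a human."""
    return fingerprint(upload) in removed_fingerprints
```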


6. Information about the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21 of the DSA – Art. 24(1)(a) DSA

We inform users, individuals, and entities that if they do not agree with our decisions, they may have the right to challenge the decision in a relevant court and may also be able to refer the decision to a certified out-of-court dispute settlement body. This information is clearly stated in section 18 of the FapHouse Terms and Conditions of Use. During the reporting period, we did not receive any disputes from certified out-of-court dispute settlement bodies pursuant to Article 21 of the DSA.


7. Information about the number of suspensions imposed pursuant to Article 23 of the DSA – Art. 24(1)(b) DSA

During the reporting period, FapHouse has not imposed any suspensions on users under Article 23 of the DSA and clause 19 of the FapHouse Terms and Conditions of Use for the submission of manifestly unfounded notices.

Likewise, during the reporting period, FapHouse did not impose suspensions on users under Article 23 of the DSA and clause 19 of the FapHouse Terms and Conditions of Use for (i) the provision of manifestly illegal content or (ii) the submission of manifestly unfounded complaints.


Name of service provider: Tecom Ltd.

Date of Publication: 28 February 2025

Period Covered: 17 February 2024 - 31 December 2024

Service: FapHouse
