Transparency Report 2025

FapHouse DSA Transparency report for the reporting period from 01 January 2025 to 31 December 2025

 

I. Article 15(1) of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act, or DSA) imposes the following transparency reporting obligations on providers of intermediary services:

 

“Providers of intermediary services shall make publicly available, in a machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible reports on any content moderation that they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable:

 

(a) for providers of intermediary services, the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order;

 

(b) for providers of hosting services, the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action;

 

(c) for providers of intermediary services, meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied;

 

(d) for providers of intermediary services, the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

 

(e) any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.”

 

II. Article 24(1) of the DSA imposes additional transparency reporting obligations on providers of online platforms, namely:

 

“In addition to the information referred to in Article 15, providers of online platforms shall include in the reports referred to in that Article information on the following:

 

(a) the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

 

(b) the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.”

 

In order to fulfill the requirements of Article 15 and Article 24 of the DSA, we, Tecom Ltd, are pleased to publish this DSA Transparency report for the reporting period from 01 January 2025 to 31 December 2025 for the FapHouse platform. This DSA Transparency Report consists of two parts: (i) Transparency report for the period from 01 January 2025 to 30 June 2025 and (ii) Transparency report for the period from 01 July 2025 to 31 December 2025, for the reasons specified below.

 

 

 


 

I. TRANSPARENCY REPORT for the period from 01 January 2025 to 30 June 2025

 

1. Information about orders received from Member States’ authorities – Art. 15(1)(a) DSA

 

1.1. The table below shows the number of orders received from Member States’ authorities under Article 9 of the DSA – orders to act against illegal content, e.g. orders to remove content:

 

Columns show the type of illegal (inappropriate) content.

| Member State | Intellectual property infringements | Protection of minors | Non-consensual behavior | Animal welfare | Violence / abuse including self-harm | Data protection and privacy violations | Illegal or harmful speech | Scams and/or fraud | Harassment / stalking / threats | Scope of platform service | Violation of any other laws or regulations |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Austria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyprus | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Denmark | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Germany | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Greece | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Ireland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Portugal | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Sweden | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Total | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

 

1.2. The table below shows the number of orders received from Member States’ authorities under Article 10 of the DSA – orders to provide information, e.g. orders to provide information about a specific user profile:

 

Columns show the type of illegal (inappropriate) content.

| Member State | Intellectual property infringements | Protection of minors | Non-consensual behavior | Animal welfare | Violence / abuse including self-harm | Data protection and privacy violations | Illegal or harmful speech | Scams and/or fraud | Harassment / stalking / threats | Scope of platform service | Violation of any other laws or regulations |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Austria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Belgium | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Bulgaria | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Croatia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyprus | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Czech Republic (Czechia) | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Denmark | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Estonia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Finland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| France | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Germany | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Greece | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Hungary | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Ireland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Italy | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Lithuania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Luxembourg | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Netherlands | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Poland | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Portugal | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Romania | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovakia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Spain | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Sweden | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Total | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

 

Median time taken to inform Member States’ authorities of receipt of orders submitted under Articles 9 and 10 of the DSA: FapHouse confirms receipt of an order from a Member State’s authority submitted through the dedicated channels immediately, by sending an automatic confirmation.

Median time taken to give effect to the orders of Member States’ authorities submitted under Articles 9 and 10 of the DSA: FapHouse did not receive any such orders; accordingly, the median time is 0.


 

2. Information about notices submitted in accordance with Article 16 of the DSA (user notices) – Art. 15(1)(b) DSA

 

The table below shows the number of notices submitted in accordance with Article 16 of the DSA (user notices), categorised by the type of alleged illegal content:

 

| Type of illegal (inappropriate) content | Number of notices by category |
|---|---|
| Intellectual property infringements | 2 |
| Protection of minors | 1 |
| Non-consensual behavior | 0 |
| Animal welfare | 0 |
| Violence / abuse including self-harm | 0 |
| Data protection and privacy violations | 0 |
| Illegal or harmful speech | 0 |
| Scams and/or fraud | 0 |
| Harassment / stalking / threats | 0 |
| Scope of platform service | 3 |
| Violation of any other laws or regulations | 2 |
| Total | 8 |

 

During this reporting period, we received a total of 8 notices of alleged illegal content. We took appropriate action on all of these notices based on our terms and conditions.

 

Under the DSA, trusted flaggers can also submit notices of illegal content. However, we did not receive any such notices from trusted flaggers during this reporting period.

 

All notices are handled by our human moderation team; we do not use automated means to process notices received under Article 16 of the DSA or to make decisions based on such notices.

 

The median time to take action on a notice received under Article 16 of the DSA was 3.2 days.


 

3. Information about the content moderation engaged in at the providers’ own initiative – Art. 15(1)(c) DSA

 

The table below provides meaningful and comprehensible information about the content moderation engaged in at our own initiative. This includes the number and type of measures taken that affect the availability, visibility, and accessibility of information provided by the recipients (users) of our platform and service, the recipients’ ability to provide information through the service, and other related restrictions of the service. The information is categorised by the type of illegal content or violation of our terms and conditions and by the type of restriction applied:

 

Removal of content and disabling of access to content are decisions to restrict visibility; suspension and termination of the user account are decisions to restrict access to the account; suspension and termination of the provision of the service (in whole or in part) are decisions to restrict the provision of the service.

| Type of illegal (inappropriate) content / content that violates our terms and conditions | Total number of restrictions for this type | Removal of content | Disabling of access to content | Suspension of the user account | Termination of the user account | Suspension of the provision of the service in whole or in part | Termination of the provision of the service in whole or in part |
|---|---|---|---|---|---|---|---|
| Intellectual property infringements | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Protection of minors | 1 | 0 | 0 | 0 | 1 | 0 | 1 |
| Non-consensual behavior | 496 | 496 | 0 | 0 | 0 | 0 | 0 |
| Animal welfare | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Violence / abuse including self-harm | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Data protection and privacy violations | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Illegal or harmful speech | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Scams and/or fraud | 753 | 0 | 0 | 0 | 753 | 0 | 753 |
| Harassment / stalking / threats | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Scope of platform service | 4017 | 3986 | 0 | 0 | 31 | 0 | 31 |
| Violation of any other laws or regulations | 0 | 0 | 0 | 0 | 0 | 0 | 0 |

 

We do not use automated tools to make final decisions about imposing restrictions on uploaded content or existing accounts. All such decisions are made manually in each specific case by our human moderation team.

At the same time, we use a mix of semi-automated tools and human review to combat illegal and inappropriate content and protect our users. You can find more information about how we use these means and tools for the purpose of content moderation below in this report as well as on our Trust and Safety page available at https://faphouse.com/pages/trust-and-safety.

 

Measures taken by us to provide training and assistance to personnel in charge of content moderation:

- Training for personnel in charge of content moderation is conducted by the senior moderation and support specialist, based on the company’s established training plan, as soon as a new employee joins the team.

- The training starts with an overview of the main guides used during the moderation by all personnel in charge of content moderation.

- Training lasts at least eight days. It involves simulated reviews, hands-on practice and observation, role-playing exercises, multiple evaluations and feedback, and studying subject-specific guides. There are also three distinct tests.

- After training, the new employee will enter a two-week period of close supervision under the senior moderation and support specialist, during which they will have frequent discussions to address any questions they may have.

- Personnel in charge of content moderation are encouraged to seek guidance whenever needed. They can reach out to the senior moderation and support specialist or their team lead, who maintain regular communication with the staff.

 


 

4. Information about complaints received through the internal complaint-handling system – Art. 15(1)(d) DSA

 

The table below shows the number of complaints (appeals) received through the internal complaint-handling system in accordance with Article 20 of the DSA and in accordance with clause 20 of the FapHouse Terms and Conditions of Use including the basis for those complaints and decisions taken in respect of those complaints:

 

| Basis for complaints (appeals) | Number of upheld decisions | Number of reversed decisions | Number of cancelled complaints (appeals) [1] |
|---|---|---|---|
| User reports some information (content) to FapHouse and is dissatisfied with the action(s) taken by FapHouse (complaint regarding a decision not to take action on a notice submitted in accordance with Article 16 DSA) | 0 | 0 | 0 |
| FapHouse removes content or information published by a user and the user appeals this action (complaint regarding a decision to remove or disable access to or restrict visibility of information) | 0 | 0 | 0 |
| FapHouse decides to terminate or suspend a user’s account and the user appeals this decision (complaint regarding a decision to suspend or terminate the provision of the service / complaint regarding a decision to suspend or terminate an account) | 1 | 0 | 0 |

[1] The user mistakenly submitted a complaint through the internal complaint-handling system pursuant to Article 20 of the DSA, or the complaint could not be resolved for other reasons beyond FapHouse’s control.

 

The median time needed for taking decisions within the internal complaint-handling system is 1 day.

 

There were no instances where decisions within the internal complaint-handling system were reversed.

 


 

5. Information about automated means for the purpose of content moderation – Art. 15(1)(e) DSA

 

At FapHouse we prioritize safety, privacy, and trust, and we are committed to protecting our users from illegal content. Upholding these values is essential to our corporate culture. As responsible members of the online community, we recognize the importance of devoting sufficient time and resources to combat inappropriate and illegal content, including non-consensual sexual and intimate content and child sexual abuse material (CSAM).

 

We use a combination of semi-automated tools and human review to combat illegal content and protect our community from it. The FapHouse content moderation process incorporates a substantial team of human moderators who are responsible for reviewing each upload prior to its publication. In addition, a comprehensive system has been implemented for the flagging, review, and removal of any material deemed to be in violation of the law. FapHouse has also implemented parental control measures. While all content available on the platform is reviewed by human moderators prior to publishing, we also have additional semi-automated tools that help the content moderation teams moderate content and scrutinize materials for any potential violations of the FapHouse Terms and Conditions of Use.

 

Semi-automated tools are utilized to assist human moderators in making informed decisions. In instances where an applicable semi-automated tool detects a match between an uploaded piece of content and a previously identified hash list of illegal material, and this match is confirmed, the content is designated as potentially dangerous and illegal. Consequently, a specific warning is displayed to the moderator during the moderation process.

 

FapHouse uses the following automated tools and means for content moderation:

 

1. FapHouse is proud to partner with charities and organizations that support noble causes such as combating child exploitation, human trafficking, slavery, and providing general support to adult industry performers. Some of our partnerships include RTA (Restricted to Adults), ASACP (the Association of Sites Advocating Child Protection), and others.

 

2. FapHouse is committed to facilitating parental control over children’s access to adult content. All FapHouse pages contain “restricted to adults” (RTA) tags, enabling parental tools, controls, and similar solutions to block adult content. In simple terms, the RTA tag allows parents to protect minors from adult content on various browsers, devices (mobile phones, laptops), and operating systems by easily setting up parental controls.

 

3. All members must undergo a verification process to become verified uploaders (Content Providers). The process involves filling out a form on the dedicated web page of the platform. Live age verification is mandatory for all individual uploaders. The platform has contracts with reputable third-party providers of digital identity and age verification services to conduct these live checks. Upon successful completion of the age verification process and manual approval by the platform’s moderation team, the member attains verified status.

 

4. FapHouse has developed an advanced system which analyzes text related to the content (title, description) for prohibited and suspicious words. Such words are flagged by the system and displayed to the moderation team during the moderation process.
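For illustration only, a text-screening step of this kind can be sketched in a few lines: upload metadata (title, description) is tokenized and checked against a term list, and any hits are surfaced to the human moderator. The platform's actual implementation is not public; the term list, field names, and function names below are invented for the example.

```python
import re

# Hypothetical term list; a real system would maintain a curated,
# regularly updated list of prohibited and suspicious words.
PROHIBITED_TERMS = {"exampleterm1", "exampleterm2"}

def flag_suspicious_text(title: str, description: str) -> list[str]:
    """Return the prohibited terms found in an upload's metadata.

    The returned terms are only flags for the moderation team; the
    final decision about the content remains with a human reviewer.
    """
    words = re.findall(r"[a-z0-9]+", f"{title} {description}".lower())
    return sorted(PROHIBITED_TERMS.intersection(words))

# Example: one flagged term is surfaced to the moderator.
flags = flag_suspicious_text("Example title", "contains exampleterm1 here")
print(flags)  # ['exampleterm1']
```

In practice such screening also has to handle obfuscated spellings and multiple languages, which is why the flags assist rather than replace human moderation.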

 

5. FapHouse collaborates with various leading software providers that detect potentially harmful content. Prior to moderation, all content is processed by such software, which allows the platform to initially identify content that may violate the FapHouse Terms and Conditions of Use. The software detects inappropriate content using a shared database of digital hashes (fingerprints) and can also detect inappropriate content using artificial intelligence technologies.

 

6. FapHouse uses digital fingerprinting technology which has been specifically designed for the platform. This software protects the platform against inappropriate content that has already been removed from the platform in the past. Digital fingerprinting technology compares the hashes (fingerprints) of newly uploaded content with a database of hashes (fingerprints) of previously removed content. If there is a match, this correlation is highlighted for the moderation team and a human-based decision is made about the content. In simple words, this software prevents inappropriate content from being re-uploaded.
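The re-upload prevention idea described above can be illustrated with a minimal hash-matching sketch. An important caveat: production fingerprinting systems use perceptual hashes that survive re-encoding, resizing, and cropping, whereas this self-contained example uses a plain SHA-256 digest, which only matches byte-identical files.

```python
import hashlib

# Database of fingerprints of content previously removed from the platform.
removed_content_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Compute a fingerprint of the content (SHA-256 for this sketch)."""
    return hashlib.sha256(data).hexdigest()

def record_removal(data: bytes) -> None:
    """Store the fingerprint of content that has been removed."""
    removed_content_hashes.add(fingerprint(data))

def matches_removed_content(data: bytes) -> bool:
    """Check a new upload against the removed-content database.

    A match is only highlighted to the moderation team; a human-based
    decision is then made about the content.
    """
    return fingerprint(data) in removed_content_hashes

record_removal(b"bytes of a previously removed video")
assert matches_removed_content(b"bytes of a previously removed video")
assert not matches_removed_content(b"bytes of a new, unrelated upload")
```

The design choice of hash comparison (rather than storing the removed media itself) means the database holds no reproducible copy of the inappropriate content, only its fingerprints.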

 

To guarantee the accuracy of the automated tools, all uploaded content is subject to review and approval by our moderation team before being published. This is our quality control mechanism and safeguard for the automated tools.


 

6. Information about the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21 of the DSA – Art. 24(1)(a) DSA

 

We inform users, individuals, and entities that if they do not agree with our decisions, they may have the right to challenge the decision in a relevant court and that they may also be able to refer the decision to a certified dispute settlement body. We have clearly mentioned this information in section 18 of the FapHouse Terms and Conditions of Use. During the reporting period, we did not receive any disputes from certified out-of-court settlement bodies pursuant to Article 21 of the DSA.

 


 

7. Information about the number of suspensions imposed pursuant to Article 23 of the DSA – Art. 24(1)(b) DSA

 

During the reporting period, FapHouse has not imposed any suspensions on users under Article 23 of the DSA and clause 19 of the FapHouse Terms and Conditions of Use for the submission of manifestly unfounded notices.

 

During the reporting period, FapHouse also did not impose any suspensions on users under Article 23 of the DSA and clause 19 of the FapHouse Terms and Conditions of Use for (i) the provision of manifestly illegal content or (ii) the submission of manifestly unfounded complaints.


II. TRANSPARENCY REPORT for the period from 01 July 2025 to 31 December 2025

 

Pursuant to the Commission Implementing Regulation (EU) 2024/2835 of 4 November 2024 laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022/2065 of the European Parliament and of the Council, particularly point 3 of Annex II thereof, which provides:

 

“3. Transition period

A transition period following the full entry into application of Regulation (EU) 2022/2065 on 17 February 2024 is necessary to align the reporting timelines of providers of intermediary services, providers of hosting services and providers of online platforms with the timelines of providers of very large online platforms and of very large online search engines. The transition period ends on 31 December 2025. As of 1 January 2026, all providers of intermediary services shall follow the reporting periods outlined in Article 2 of this Regulation.

 

For providers of intermediary services, of hosting services, and of online platforms, the first reporting cycle following the full entry into application date of Regulation (EU) 2022/2065 ends with the publication of their first annual transparency report pursuant to Regulation (EU) 2022/2065 and latest on 16 February 2025. The second reporting cycle is a transitional reporting cycle. The transitional reporting cycle is shortened and covers the period until 31 December 2025. The start of the transitional reporting cycle depends on the data included in the first reporting cycle. For example, if a provider of an intermediary service covers the period 17 February 2024 – 31 January 2025 in their first reporting cycle, their transitional reporting cycle covers 1 February 2025 – 31 December 2025.

 

Providers of intermediary services, of hosting services and of online platforms shall collect information in compliance with this Regulation and the instruction outlined in this Annex on any content moderation in which they have engaged as of 1 July 2025. For the transitional reporting cycle, that means that the reporting for the period as of latest 17 February 2025 until 30 June 2025 is pursuant to Regulation (EU) 2022/2065 and the reporting for the period 1 July 2025 – 31 December 2025 must follow the templates set out in Annex I to this Regulation. For example, the provider mentioned above with a transitional reporting cycle covering the period 1 February 2025 – 31 December 2025 shall use the templates set out in Annex I for the period 1 July 2025 – 31 December 2025. For the period 1 February 2025 – 30 June 2025, the provider is encouraged to use the templates pursuant to this Regulation but is not obliged to do so. The reporting for the period 1 February 2025 – 30 June 2025 shall be pursuant to Regulation (EU) 2022/2065.

 

For the transitional reporting cycle, the deadline for publication set out in Article 2 to this Regulation applies. The first full annual reporting cycle of providers of intermediary services, of hosting services, and of online platform that must follow the templates set out in Annex I to this Regulation shall cover the period 1 January until 31 December 2026.

 

Following the entry into force of this Regulation, providers of very large online platforms and providers of very large online search engines shall collect information in compliance with the instructions outlined in this Annex on any content moderation in which they have engaged as of 1 July 2025. The first reporting cycle of providers of very large online platforms and of very large online search engines that must follow the templates set out in Annex I to this Regulation shall cover the period 1 July until 31 December 2025.”

 

Due to the abovementioned requirements, the Transparency Reports made public by our company for the year 2025 are divided into two parts:

1. TRANSPARENCY REPORT for the period from 01 January 2025 to 30 June 2025, prepared in accordance with Article 15 of Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act);

2. TRANSPARENCY REPORT for the period from 01 July 2025 to 31 December 2025, prepared in accordance with the requirements laid down in Annex II to the Commission Implementing Regulation (EU) 2024/2835 of 4 November 2024 laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022/2065 of the European Parliament and of the Council.

 

More information regarding the requirements can be found at https://digital-strategy.ec.europa.eu/en/library/implementing-regulation-laying-down-templates-concerning-transparency-reporting-obligations

 

The Commission Implementing Regulation (EU) 2024/2835 of 4 November 2024 can be found at https://eur-lex.europa.eu/eli/reg_impl/2024/2835/oj/eng

 

Taking the above into account, FapHouse is providing a completed transparency report for the period from 01 July to 31 December 2025. Required parts of the transparency report under the Commission Implementing Regulation (EU) 2024/2835 of 4 November 2024 laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022/2065 of the European Parliament and of the Council are publicly available and can be downloaded using the following links:

 

1. In accordance with Article 15.1.(a) of the Digital Services Act, the information on the following is available here for the relevant period: the number of orders received from Member States’ authorities including orders issued in accordance with Articles 9 and 10, categorised by the type of illegal content concerned, the Member State issuing the order, and the median time needed to inform the authority issuing the order, or any other authority specified in the order, of its receipt, and to give effect to the order.

 

2. In accordance with Article 15.1.(b) of the Digital Services Act, the information on the following is available here for the relevant period: the number of notices submitted in accordance with Article 16, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, the number of notices processed by using automated means and the median time needed for taking the action.

 

3. In accordance with Article 15.1.(c) of the Digital Services Act, the information on the following is available here, here, here and here for the relevant period: meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the measures taken to provide training and assistance to persons in charge of content moderation, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information through the service, and other related restrictions of the service; the information reported shall be categorised by the type of illegal content or violation of the terms and conditions of the service provider, by the detection method and by the type of restriction applied.

 

4. In accordance with Article 15.1.(d), Article 24.1.(a), Article 24.1.(b) of the Digital Services Act, the information on the following is available here for the relevant period:

- the number of complaints received through the internal complaint-handling systems in accordance with the provider’s terms and conditions and additionally, for providers of online platforms, in accordance with Article 20, the basis for those complaints, decisions taken in respect of those complaints, the median time needed for taking those decisions and the number of instances where those decisions were reversed;

- the number of disputes submitted to the out-of-court dispute settlement bodies referred to in Article 21, the outcomes of the dispute settlement, and the median time needed for completing the dispute settlement procedures, as well as the share of disputes where the provider of the online platform implemented the decisions of the body;

- the number of suspensions imposed pursuant to Article 23, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints.

 

5. In accordance with Article 15.1.(e) of the Digital Services Act, the information on the following is available here for the relevant period: any use made of automated means for the purpose of content moderation, including a qualitative description, a specification of the precise purposes, indicators of the accuracy and the possible rate of error of the automated means used in fulfilling those purposes, and any safeguards applied.

 

6. In accordance with Commission Implementing Regulation (EU) 2024/2835 of 4 November 2024 laying down templates concerning the transparency reporting obligations of providers of intermediary services and of providers of online platforms under Regulation (EU) 2022/2065 of the European Parliament and of the Council we made available information on the following:

6.1. General information about the identification of this transparency report for the relevant period can be found here.

6.2. The names of the categories of content that is illegal or incompatible with the terms and conditions can be found here.

6.3. Information about the number of human resources dedicated to content moderation can be found here. Our company is not obliged to publish information in this table as we are not subject to Articles 42.2.(a) and 42.2.(b) of the DSA.

6.4. Information about average monthly active recipients of the service for each Member State can be found here. Our company is not obliged to publish information in this table as we are not subject to Article 42.3 of the DSA.

 


 

Name of service provider: Tecom Ltd.

Date of Publication: 27 February 2026

Period Covered:

I. 01 January 2025 - 30 June 2025

II. 01 July 2025 - 31 December 2025

Service: FapHouse
