Regulation Published 2022-10-27 — Council of the European Union EUR-Lex

Proposal for a Regulation on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM(2020) 825

Chapter I – General provisions

Article 1 Subject matter

1. This Regulation lays down harmonised rules on the provision of intermediary services in the internal market. In particular, it establishes:
(a) a framework for the conditional exemption from liability of providers of intermediary services;
(b) rules on specific due diligence obligations tailored to certain specific categories of providers of intermediary services;
(c) rules on the implementation and enforcement of the requirements set out in this Regulation, including as regards the cooperation of and coordination between the competent authorities.

2. The aims of this Regulation are to:
(a) contribute to the proper functioning of the internal market for intermediary services;
(b) set out uniform harmonised rules for a safe, accessible, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected;
(ba) promote a high level of consumer protection and contribute to increased consumer choice while facilitating innovation, support digital transition and encourage economic growth within the internal market.

3. deleted
4. deleted
5. deleted

Article 1a Scope

1. This Regulation shall apply to intermediary services provided to recipients of the service that have their place of establishment or residence in the Union, irrespective of the place of establishment of the providers of those services.

2. This Regulation shall not apply to any service that is not an intermediary service or to any requirements imposed in respect of such a service, irrespective of whether the service is provided through the use of an intermediary service.

3. This Regulation is without prejudice to the rules laid down by the following:
(a) Directive 2000/31/EC;
(b) Directive 2010/13/EU;
(c) Union law on copyright and related rights, in particular Directive (EU) 2019/790 on copyright and related rights in the Digital Single Market;
(d) Regulation (EU) 2021/784 on addressing the dissemination of terrorist content online;
(e) Regulation (EU) …/… on European Production and Preservation Orders for electronic evidence in criminal matters and Directive (EU) …/… laying down harmonised rules on the appointment of legal representatives for the purpose of gathering evidence in criminal proceedings [e-evidence once adopted];
(f) Regulation (EU) 2019/1148;
(g) Regulation (EU) 2019/1150;
(h) Union law on consumer protection and product safety, including Regulation (EU) 2017/2394, Regulation (EU) 2019/1020 and Directive 2001/95/EC on general product safety;
(i) Union law on the protection of personal data, in particular Regulation (EU) 2016/679 and Directive 2002/58/EC;
(j) Directive (EU) 2019/882;
(k) Directive (EU) 2018/1972;
(l) Directive 2013/11/EU.

4. By [12 months after the entry into force of this Regulation], the Commission shall publish guidelines with regard to the relationship between this Regulation and the legal acts referred to in Article 1a(3).

Article 2 Definitions

For the purpose of this Regulation, the following definitions shall apply:
(a) ‘information society services’ means services as defined in Article 1(1)(b) of Directive (EU) 2015/1535;
(b) ‘recipient of the service’ means any natural or legal person who uses the relevant intermediary service in order to seek information or to make it accessible;
(c) ‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft or profession;
(d) ‘to offer services in the Union’ means enabling legal or natural persons in one or more Member States to use the services of a provider of information society services which has a substantial connection to the Union;
(da) ‘substantial connection to the Union’ means the connection of a provider with one or more Member States resulting either from its establishment in the Union or, in the absence of such an establishment, from the fact that the provider directs its activities towards one or more Member States;
(e) ‘trader’ means any natural person, or any legal person irrespective of whether privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes directly relating to his or her trade, business, craft or profession;
(f) ‘intermediary service’ means one of the following services:
– a ‘mere conduit’ service that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, including technical auxiliary functional services;
– a ‘caching’ service that consists of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;
– a ‘hosting’ service that consists of the storage of information provided by, and at the request of, a recipient of the service;
(g) ‘illegal content’ means any information or activity, including the sale of products or the provision of services, which is not in compliance with Union law or the law of a Member State, irrespective of the precise subject matter or nature of that law;
(h) ‘online platform’ means a provider of a hosting service which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor or a purely ancillary feature of another service or functionality of the principal service which, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;
(i) ‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;
(j) ‘distance contract’ means a contract within the meaning of Article 2(7) of Directive 2011/83/EU;
(k) ‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications, which enables the recipients of the service to access and interact with the relevant intermediary service;
(ka) ‘trusted flagger’ means an entity that has been awarded such status by a Digital Services Coordinator;
(l) ‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the provider of an intermediary service is established or its legal representative resides or is established;
(m) ‘Digital Services Coordinator of destination’ means the Digital Services Coordinator of a Member State where the intermediary service is provided;
(n) ‘advertisement’ means information designed and disseminated to promote the message of a legal or natural person, irrespective of whether to achieve commercial or non-commercial purposes, and displayed by an online platform on its online interface against remuneration specifically in exchange for promoting that message;
(na) ‘remuneration’ means economic compensation consisting of direct or indirect payment for the service provided, including where the intermediary service provider is not directly compensated by the recipient of the service or where the recipient of the service provides data to the service provider, except where such data is collected for the sole purpose of meeting legal requirements;
(o) ‘recommender system’ means a fully or partially automated system used by an online platform to suggest, prioritise or curate in its online interface specific information to recipients of the service, including as a result of a search initiated by the recipient or otherwise determining the relative order or prominence of information displayed;
(p) ‘content moderation’ means the activities, either automated or not, undertaken by providers of intermediary services aimed at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility and accessibility of that illegal content or that information, such as demotion, disabling of access to, delisting, demonetisation or removal thereof, or the recipients’ ability to provide that information, such as the termination or suspension of a recipient’s account;
(q) ‘terms and conditions’ means all terms and conditions or specifications by the service provider, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the services;
(qa) ‘persons with disabilities’ means persons with disabilities within the meaning of Article 3(1) of Directive (EU) 2019/882.

Chapter II – Liability of providers of intermediary services

Article 3 ‘Mere conduit’

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network, the service provider shall not be liable for the information transmitted, on condition that the provider:
(a) does not initiate the transmission;
(b) does not select the receiver of the transmission; and
(c) does not select or modify the information contained in the transmission.

2. The acts of transmission and of provision of access referred to in paragraph 1 include the automatic, intermediate and transient storage of the information transmitted in so far as this takes place for the sole purpose of carrying out the transmission in the communication network, and provided that the information is not stored for any period longer than is reasonably necessary for the transmission.

3. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 4 ‘Caching’

1. Where an information society service is provided that consists of the transmission in a communication network of information provided by a recipient of the service, the service provider shall not be liable for the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient or secure the information's onward transmission to other recipients of the service upon their request, on condition that:
(a) the provider does not modify the information;
(b) the provider complies with conditions on access to the information;
(c) the provider complies with rules regarding the updating of the information, specified in a manner widely recognised and used by industry;
(d) the provider does not interfere with the lawful use of technology, widely recognised and used by industry, to obtain data on the use of the information; and
(e) the provider acts expeditiously to remove or to disable access to the information it has stored upon obtaining actual knowledge of the fact that the information at the initial source of the transmission has been removed from the network, or access to it has been disabled, or that a court or an administrative authority has ordered such removal or disablement.

2. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 5 Hosting

1. Where an information society service is provided that consists of the storage of information provided by a recipient of the service, the service provider shall not be liable for the information stored at the request of a recipient of the service, on condition that the provider:
(a) does not have actual knowledge of illegal activity or illegal content and, as regards claims for damages, is not aware of facts or circumstances from which the illegal activity or illegal content is apparent; or
(b) upon obtaining such knowledge or awareness, acts expeditiously to remove or to disable access to the illegal content.

2. Paragraph 1 shall not apply where the recipient of the service is acting under the authority or the control of the provider.

3. Paragraph 1 shall not apply with respect to liability under consumer protection law of online platforms allowing consumers to conclude distance contracts with traders, where such an online platform presents the specific item of information or otherwise enables the specific transaction at issue in a way that would lead a consumer to believe that the information, or the product or service that is the object of the transaction, is provided either by the online platform itself or by a recipient of the service who is acting under its authority or control.

4. This Article shall not affect the possibility for a judicial or administrative authority, in accordance with Member States' legal systems, of requiring the service provider to terminate or prevent an infringement.

Article 6 Voluntary own-initiative investigations and legal compliance

1. Providers of intermediary services shall not be deemed ineligible for the exemptions from liability referred to in Articles 3, 4 and 5 solely because they carry out voluntary own-initiative investigations or take measures aimed at detecting, identifying and removing, or disabling access to, illegal content, or take the necessary measures to comply with the requirements of national and Union law, including the Charter and the requirements set out in this Regulation.

1a. Providers of intermediary services shall ensure that voluntary own-initiative investigations carried out and measures taken pursuant to paragraph 1 are effective and specific. Such own-initiative investigations and measures shall be accompanied by appropriate safeguards, such as human oversight, documentation, or any additional measure to ensure and demonstrate that those investigations and measures are accurate, non-discriminatory, proportionate, transparent and do not lead to over-removal of content. Providers of intermediary services shall make best efforts to ensure that, where automated means are used, the technology is sufficiently reliable to limit to the maximum extent possible the rate of errors where information is wrongly considered as illegal content.

Article 7 No general monitoring or active fact-finding obligations

1. No general obligation to monitor, neither de jure nor de facto, through automated or non-automated means, the information which providers of intermediary services transmit or store, nor actively to seek facts or circumstances indicating illegal activity, nor to monitor the behaviour of natural persons, shall be imposed on those providers.

1a. Providers of intermediary services shall not be obliged to use automated tools for content moderation or for monitoring the behaviour of natural persons.

1b. Member States shall not prevent providers of intermediary services from offering end-to-end encrypted services.

1c. Member States shall not impose a general obligation on providers of intermediary services to limit the anonymous use of their services. Member States shall not oblige providers of intermediary services to generally and indiscriminately retain personal data of the recipients of their services. Any targeted retention of a specific recipient’s data shall be ordered by a judicial authority in accordance with Union or national law.

Article 8 Orders to act against illegal content

1. Providers of intermediary services shall, upon receipt via a secure communications channel of an order to act against one or more specific items of illegal content, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform the authority issuing the order of the effect given to the order, without undue delay, specifying the actions taken and the moment when the actions were taken.

2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions:
(a) the order contains the following elements:
– a reference to the legal basis for the order;
– a sufficiently detailed statement of reasons explaining why the information is illegal content, by reference to the specific provision of Union or national law in conformity with Union law;
– identification of the issuing authority, including the date, timestamp and electronic signature of the authority, that allows the recipient to authenticate the order, and the contact details of a person of contact within the said authority;
– a clear indication of the exact electronic location of that information, such as the exact URL or URLs where appropriate, or, when the exact electronic location is not precisely identifiable, additional information enabling the identification of the illegal content concerned;
– easily understandable information about redress mechanisms available to the provider of the service and to the recipient of the service who provided the content, including the deadlines for appeal;
– where necessary and proportionate, the decision not to disclose information about the removal of or disabling of access to the content for reasons of public security, such as the prevention, investigation, detection and prosecution of serious crime, not exceeding six weeks from that decision;
(b) the territorial scope of the order, on the basis of the applicable rules of Union and national law in conformity with Union law, including the Charter, and, where relevant, general principles of international law, does not exceed what is strictly necessary to achieve its objective; the territorial scope of the order shall be limited to the territory of the Member State issuing the order unless the illegality of the content derives directly from Union law or the rights at stake require a wider territorial scope, in accordance with Union and international law;
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by the provider in accordance with Article 10, or in one of the official languages of the Member State that issues the order against the specific item of illegal content; in such case, the point of contact of the service provider may request the competent authority to provide a translation into the language declared by the provider;
(ca) the order is in compliance with Article 3 of Directive 2000/31/EC;
(cb) where more than one provider of intermediary services is responsible for hosting the specific items of illegal content, the order is issued to the most appropriate provider that has the technical and operational ability to act against those specific items.

2a. The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1.

2b. Providers of intermediary services who receive an order shall have a right to an effective remedy. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal proceedings in relation to the order. The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order, or to adjust the territorial scope of the order to what is strictly necessary. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulment, cessation or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay.

2c. If the provider cannot comply with the order because it contains manifest errors or does not contain sufficient information for its execution, it shall, without undue delay, inform the judicial or administrative authority that issued the order, asking for the necessary clarification.

2d. The authority issuing the order shall transmit that order, and the information received from the provider of intermediary services as to the effect given to the order, to the Digital Services Coordinator of the Member State of the issuing authority.

4. The conditions and requirements laid down in this Article shall be without prejudice to requirements under national criminal procedural law and administrative procedural law in conformity with Union law, including the Charter. While acting in accordance with such laws, authorities shall not go beyond what is necessary in order to attain the objectives pursued.

4a. Member States shall ensure that the relevant authorities may, at the request of an applicant whose rights are infringed by illegal content, issue against the relevant provider of intermediary services an injunction order in accordance with this Article to remove or disable access to that content.

Article 9 Orders to provide information

1. Providers of intermediary services shall, upon receipt via a secure communications channel of an order to provide a specific item of information about one or more specific individual recipients of the service, issued by the relevant national judicial or administrative authorities on the basis of the applicable Union or national law, in conformity with Union law, inform without undue delay the authority issuing the order of its receipt and of the effect given to the order.

2. Member States shall ensure that the orders referred to in paragraph 1 meet the following conditions:
(a) the order contains the following elements:
– the identification details of the judicial or administrative authority issuing the order and authentication of the order by that authority, including the date, time stamp and electronic signature of the authority issuing the order to provide information;
– a reference to the legal basis for the order;
– a clear indication of the exact electronic location, an account name or a unique identifier of the recipient on whom information is sought;
– a sufficiently detailed statement of reasons explaining the objective for which the information is required and why the requirement to provide the information is necessary and proportionate to determine compliance by the recipients of the intermediary services with applicable Union or national rules, unless such a statement cannot be provided for reasons related to the prevention, investigation, detection and prosecution of criminal offences;
– where the information sought constitutes personal data within the meaning of Article 4, point (1), of Regulation (EU) 2016/679 or Article 3, point (1), of Directive (EU) 2016/680, a justification that the order is in accordance with applicable data protection law;
– information about redress available to the provider and to the recipients of the service concerned, including deadlines for appeal;
– an indication on whether the provider should inform without undue delay the recipient of the service concerned about the data being sought; where information is requested in the context of criminal proceedings, the request for that information shall be in compliance with Directive (EU) 2016/680, and the information to the recipient of the service concerned about that request may be delayed as long as necessary and proportionate to avoid obstructing the relevant criminal proceedings, taking into account the rights of the suspected and accused persons and without prejudice to defence rights and effective legal remedies; such a request shall be duly justified, specify the duration of the obligation of confidentiality and shall be subject to periodic review;
(c) the order is drafted in the language declared by the provider and is sent to the point of contact appointed by that provider in accordance with Article 10, or in one of the official languages of the Member State that issues the order; in such case, the point of contact may request the competent authority to provide a translation into the language declared by the provider.

2a. The Commission shall adopt implementing acts in accordance with Article 70, after consulting the Board, laying down a specific template and form for the orders referred to in paragraph 1.

2b. The provider of intermediary services who receives an order shall have a right to an effective remedy. That right shall include the right to challenge the order before the judicial authorities of the Member State of the issuing competent authority, in particular where such an order is not in compliance with Article 3 of Directive 2000/31/EC. The Digital Services Coordinator of the Member State of establishment may choose to intervene on behalf of the provider in any redress, appeal or other legal proceedings in relation to the order. The Digital Services Coordinator of the Member State of establishment may request the authority issuing the order to withdraw or repeal the order. Where such a request is refused, the Digital Services Coordinator of the Member State of establishment shall be entitled to seek the annulment, cessation or adjustment of the effect of the order before the judicial authorities of the Member State issuing the order. Such proceedings shall be completed without undue delay.

2c. If the provider cannot comply with the order because it contains manifest errors or does not contain sufficient information to enable it to be executed, it shall, without undue delay, inform the judicial or administrative authority that issued the order and request the necessary clarifications.

2d. The authority issuing the order shall transmit that order, and the information received from the provider of intermediary services as to the effect given to the order, to the Digital Services Coordinator of the Member State of the issuing authority.

4. The conditions and requirements laid down in this Article shall be without prejudice to requirements under national criminal procedural law or administrative procedural law in conformity with Union law.

Article 9a Effective remedies for recipients of the service

1. Recipients of the service whose content was removed according to Article 8 or whose information was sought according to Article 9 shall have the right to effective remedies against such orders, including, where applicable, the restoration of content where that content was in compliance with the terms and conditions but was erroneously considered illegal by the service provider, without prejudice to remedies available under Directive (EU) 2016/680 and Regulation (EU) 2016/679.

2. The right to an effective remedy shall be exercised before a judicial authority in the issuing Member State in accordance with national law and shall include the possibility to challenge the legality of the measure, including its necessity and proportionality.

3. Digital Services Coordinators shall develop national tools and guidance for recipients of the service as regards complaint and redress mechanisms applicable in their respective territory.

Chapter III – Due diligence obligations for a transparent and safe online environment

Article 10 Points of contact for Member States’ authorities, the Commission and the Board

1. Providers of intermediary services shall designate a single point of contact enabling them to communicate directly, by electronic means, with Member States’ authorities, the Commission and the Board referred to in Article 47 for the application of this Regulation.

2. Providers of intermediary services shall communicate to the Member States' authorities, the Commission and the Board the information necessary to easily identify and communicate with their single points of contact, including the name, the email address, the physical address and the telephone number, and shall ensure that that information is kept up to date.

2a. Providers of intermediary services may establish the same single point of contact for this Regulation and another single point of contact as required under other Union law. When doing so, the provider shall inform the Commission of this decision.

Article 10a Points of contact for recipients of services

1. Providers of intermediary services shall designate a single point of contact that enables recipients of services to communicate directly with them.

2. In particular, providers of intermediary services shall enable recipients of services to communicate with them by providing rapid, direct and efficient means of communication, such as telephone numbers, email addresses, electronic contact forms, chatbots or instant messaging, as well as the physical address of the establishment of the provider of intermediary services, in a user-friendly and easily accessible manner. Providers of intermediary services shall also enable recipients of services to choose the means of direct communication, which shall not solely rely on automated tools.

3. Providers of intermediary services shall make all reasonable efforts to guarantee that sufficient human and financial resources are allocated to ensure that the communication referred to in paragraph 1 is performed in a timely and efficient manner.

Article 11 Legal representatives

1. Providers of intermediary services which do not have an establishment in the Union but which offer services in the Union shall designate, in writing, a legal or natural person to act as their legal representative in one of the Member States where the provider offers its services. 2. Providers of intermediary services shall mandate their legal representatives to be addressed in addition to or instead of the provider by the Member States’ authorities, the Commission and the Board on all issues necessary for the receipt of, compliance with and enforcement of decisions issued in relation to this Regulation. Providers of intermediary services shall provide their legal representative with the necessary powers and sufficient resources in order to guarantee their efficient and timely cooperation with the Member States’ authorities, the Commission and the Board and comply with any of those decisions. 3. The designated legal representative can be held liable for non-compliance with obligations under this Regulation, without prejudice to the liability and legal actions that could be initiated against the provider of intermediary services. 4. Providers of intermediary services shall notify the name, postal address, electronic mail address and telephone number of their legal representative to the Digital Service Coordinator in the Member State where that legal representative resides or is established. They shall ensure that that information is kept up to date. The Digital Service Coordinator in the Member State where that legal representative resides or is established shall, upon receiving that information, make reasonable efforts to assess its validity. 5a. Providers of intermediary services that qualify as micro, small or medium-sized enterprises (SMEs) within the meaning of the Annex to Recommendation 2003/361/EC, and who have been unsuccessful in obtaining the services of a legal representative after reasonable effort, shall be able to request that the Digital Service Coordinator of the Member State where the enterprise intends to establish a legal representative facilitates further cooperation and recommends possible solutions, including possibilities for collective representation.

Article 12 Terms and conditions

1. Providers of intermediary services shall use fair, non-discriminatory and transparent terms and conditions. Providers of intermediary services shall draft those terms and conditions in clear, plain, user-friendly and unambiguous language and shall make them publicly available in an easily accessible and machine-readable format in the languages of the Member State towards which the service is directed. In their terms and conditions, providers of intermediary services shall respect the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms, as enshrined in the Charter, as well as the rules applicable to the media in the Union. 1a. In their terms and conditions, providers of intermediary services shall include information on any restrictions or modifications that they impose in relation to the use of their service in respect of content provided by the recipients of the service. Providers of intermediary services shall also include easily accessible information on the right of the recipients to terminate the use of their service. Providers of intermediary services shall also include information on any policies, procedures, measures and tools used by the provider of the intermediary service for the purpose of content moderation, including algorithmic decision-making and human review. 1b. Providers of intermediary services shall notify the recipients of the service expeditiously of any significant change to the terms and conditions and provide an explanation thereof. 1c. Where an intermediary service is primarily directed at minors or is predominantly used by them, the provider shall explain the conditions for, and restrictions on, the use of the service in a way that minors can understand. 2. Providers of intermediary services shall act in a fair, transparent, coherent, diligent, timely, non-arbitrary, non-discriminatory and proportionate manner in applying and enforcing the restrictions referred to in paragraph 1, with due regard to the rights and legitimate interests of all parties involved, including the applicable fundamental rights of the recipients of the service as enshrined in the Charter. 2a. Providers of intermediary services shall provide recipients of services with a concise and easily accessible summary of the terms and conditions, in machine-readable format and in clear, user-friendly and unambiguous language. That summary shall identify the main elements of the information requirements, including the possibility of easily opting out from optional clauses and the remedies and redress mechanisms available. 2b. Providers of intermediary services may use graphical elements such as icons or images to illustrate the main elements of the information requirements. 2c. Very large online platforms as defined in Article 25 shall publish their terms and conditions in the official languages of all Member States in which they offer their services. 2d. Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service. 2e. Terms and conditions of providers of intermediary services shall respect the essential principles of fundamental rights enshrined in the Charter. 2f. Terms that do not comply with this Article shall not be binding on recipients.

Article 13 Transparency reporting obligations for providers of intermediary services

1. Providers of intermediary services shall publish, in a standardised and machine-readable format and in an easily accessible manner, at least once a year, clear, easily comprehensible and detailed reports on any content moderation they engaged in during the relevant period. Those reports shall include, in particular, information on the following, as applicable: (a) the number of orders received from Member States’ authorities, categorised by the type of illegal content concerned, including orders issued in accordance with Articles 8 and 9, and the average time needed to inform the authority issuing the order of its receipt and the effect given to the order; (aa) where applicable, the complete number of content moderators allocated for each official language per Member State, and a qualitative description of whether and how automated tools for content moderation are used in each official language; (b) the number of notices submitted in accordance with Article 14, categorised by the type of alleged illegal content concerned, the number of notices submitted by trusted flaggers, any action taken pursuant to the notices by differentiating whether the action was taken on the basis of the law or the terms and conditions of the provider, and the average and median time needed for taking the action; providers of intermediary services may add additional information as to the reasons for the average time for taking the action; (c) meaningful and comprehensible information about the content moderation engaged in at the providers’ own initiative, including the use of automated tools, the number and type of measures taken that affect the availability, visibility and accessibility of information provided by the recipients of the service and the recipients’ ability to provide information, categorised by the type of reason and basis for taking those measures, as well as, where applicable, measures taken to provide training and assistance to members of staff who are engaged in content moderation, and to ensure that non-infringing content is not affected; (d) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed. 1a. The information provided shall be presented per Member State in which services are offered and in the Union as a whole. 2. Paragraph 1 shall not apply to providers of intermediary services that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC which do not also qualify as very large online platforms.
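By way of illustration only: Article 13 requires a "standardised and machine-readable format" but prescribes no concrete schema, so every field name in the following sketch is hypothetical. It shows how a provider might serialise the per-Member-State figures required by points (a) to (d) and paragraph 1a as JSON:

```python
import json

# Hypothetical transparency-report record; field names are illustrative,
# only the categories of information come from Article 13(1) and (1a).
report = {
    "period": "2022",
    "member_state": "DE",  # paragraph 1a: figures presented per Member State
    "authority_orders": {"count": 120, "avg_days_to_inform": 2.5},  # point (a)
    "moderators_per_language": {"de": 40},                          # point (aa)
    "notices": {"count": 5400, "from_trusted_flaggers": 300,
                "avg_days_to_act": 1.2, "median_days_to_act": 0.8},  # point (b)
    "own_initiative_measures": {"removals": 900, "demotions": 150},  # point (c)
    "complaints": {"count": 230, "reversed": 41,
                   "avg_days_to_decide": 6.0,
                   "median_days_to_decide": 4.0},                    # point (d)
}

# Machine-readable, easily accessible output, as paragraph 1 requires.
print(json.dumps(report, indent=2, sort_keys=True))
```

JSON is used here only because it is a common machine-readable interchange format; the Regulation itself leaves the choice of format open.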

Article 13a Online interface design and organisation 1. Providers of intermediary services shall not use the structure, function or manner of operation of their online interface, or any part thereof, to distort or impair recipients of services’ ability to make a free, autonomous and informed decision or choice. In particular, providers of intermediary services shall refrain from: (a) giving more visual prominence to any of the consent options when asking the recipient of the service for a decision; (b) repeatedly requesting that a recipient of the service consent to data processing where such consent has been refused, pursuant to Article 7(3) of Regulation (EU) 2016/679, regardless of the scope or purpose of such processing, especially by presenting a pop-up that interferes with user experience; (c) urging a recipient of the service to change a setting or configuration of the service after the recipient has already made a choice; (d) making the procedure of terminating a service significantly more cumbersome than signing up to it; or (e) requesting consent where the recipient of the service exercises his or her right to object by automated means using technical specifications, in line with Article 21(5) of Regulation (EU) 2016/679. This paragraph shall be without prejudice to Regulation (EU) 2016/679. 2. The Commission is empowered to adopt a delegated act to update the list of practices referred to in paragraph 1. 3. Where applicable, providers of intermediary services shall adapt their design features to ensure a high level of privacy, safety and security by design for minors.

Article 14 Notice and action mechanisms

1. Providers of hosting services shall put mechanisms in place to allow any individual or entity to notify them of the presence on their service of specific items of information that the individual or entity considers to be illegal content. Those mechanisms shall be easy to access, user-friendly, and allow for the submission of notices exclusively by electronic means. 2. The mechanisms referred to in paragraph 1 shall be such as to facilitate the submission of sufficiently precise and adequately substantiated notices. To that end, the providers shall take the necessary measures to enable and facilitate the submission of valid notices containing all of the following elements: (a) an explanation of the reasons why the individual or entity considers the information in question to be illegal content; (aa) where possible, evidence that substantiates the claim; (b) where relevant, a clear indication of the exact electronic location of that information, for example, the exact URL or URLs, or, where necessary, additional information enabling the identification of the illegal content, as applicable to the type of content and to the specific type of hosting service; (c) the name and an electronic mail address of the individual or entity submitting the notice, except in the case of information considered to involve one of the offences referred to in Articles 3 to 7 of Directive 2011/93/EU; (d) a statement confirming the good faith belief of the individual or entity submitting the notice that the information and allegations contained therein are accurate and complete. 3. Notices that include the elements referred to in paragraph 2, on the basis of which a diligent hosting service provider is able to establish the illegality of the content in question without conducting a legal or factual examination, shall be considered to give rise to actual knowledge or awareness for the purposes of Article 5 in respect of the specific item of information concerned. 3a. Information that has been the subject of a notice shall remain accessible while the assessment of its legality is still pending, without prejudice to the right of providers of hosting services to apply their terms and conditions. Providers of hosting services shall not be held liable for failure to remove notified information while the assessment of legality is still pending. 4. Where the notice contains the name and an electronic mail address of the individual or entity that submitted it, the provider of hosting services shall, without undue delay, send a confirmation of receipt of the notice to that individual or entity. 5. The provider shall also, without undue delay, notify that individual or entity of its action in respect of the information to which the notice relates, providing information on the redress possibilities. 5a. The anonymity of individuals who submitted a notice shall be ensured towards the recipient of the service who provided the content, except in cases of alleged violations of personality rights or of intellectual property rights. 6. Providers of hosting services shall process any notices that they receive under the mechanisms referred to in paragraph 1 and take their decisions in respect of the information to which the notices relate, in a timely, diligent, non-discriminatory and non-arbitrary manner. Where they use automated means for that processing or decision-making, they shall include information on such use in the notification referred to in paragraph 4.
Where the provider has no technical, operational or contractual ability to act against specific items of illegal content, it may hand over a notice to the provider that has direct control of specific items of illegal content, while informing the notifying person or entity and the relevant Digital Services Coordinator.
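The elements listed in Article 14(2) lend themselves to a simple completeness check before a notice is accepted into a provider's workflow. A minimal sketch follows; the field names are hypothetical (the Regulation prescribes the elements, not a format), and the split between mandatory and optional fields reflects the "where possible" / "where relevant" qualifiers in points (aa) to (c):

```python
# Illustrative validator for the Article 14(2) notice elements.
REQUIRED = {
    "reasons",                  # point (a): why the notifier deems the content illegal
    "statement_of_good_faith",  # point (d)
}
OPTIONAL = {
    "evidence",  # point (aa): where possible
    "location",  # point (b): e.g. exact URL(s), where relevant
    "name",      # point (c): may be omitted for Directive 2011/93/EU offences
    "email",
}

def validate_notice(notice: dict) -> list[str]:
    """Return a list of problems (empty list = notice is structurally valid)."""
    missing = sorted(REQUIRED - notice.keys())
    unknown = sorted(notice.keys() - REQUIRED - OPTIONAL)
    return missing + [f"unknown field: {u}" for u in unknown]

# A notice containing reasons and a good-faith statement passes the check.
print(validate_notice({"reasons": "...", "statement_of_good_faith": True}))  # → []
```

Note that such a check only tests structural completeness; whether a notice is "sufficiently precise and adequately substantiated" in the sense of paragraph 2 remains a substantive assessment.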

Article 15 Statement of reasons

1. Where a provider of hosting services decides to remove, disable access to, demote or impose other measures with regard to specific items of information provided by the recipients of the service, irrespective of the means used for detecting, identifying or removing or disabling access to that information and of the reason for its decision, it shall inform the recipient, at the latest at the time of the removal or disabling of access, of the decision and provide a clear and specific statement of reasons for that decision. This obligation shall not apply where the content is deceptive high-volume commercial content, or where a judicial or law enforcement authority has requested that the recipient not be informed due to an ongoing criminal investigation, until that investigation is closed. 2. The statement of reasons referred to in paragraph 1 shall at least contain the following information: (a) whether the action entails the removal of, the disabling of access to, the demotion of, or other measures with regard to the information and, where relevant, the territorial scope of the action and its duration, including, where an action was taken pursuant to Article 14, an explanation about why the action did not exceed what was strictly necessary to achieve its purpose; (b) the facts and circumstances relied on in taking the action, including, where relevant, whether the action was taken pursuant to a notice submitted in accordance with Article 14, based on voluntary own-initiative investigations, or pursuant to an order issued in accordance with Article 8, and, where appropriate, the identity of the notifier; (c) where applicable, information on the use made of automated means in taking the action, including where the action was taken in respect of content detected or identified using automated means; (d) where the action concerns allegedly illegal content, a reference to the legal ground relied on and explanations as to why the information is considered to be illegal content on that ground; (e) where the action is based on the alleged incompatibility of the information with the terms and conditions of the provider, a reference to the contractual ground relied on and explanations as to why the information is considered to be incompatible with that ground; (f) clear, user-friendly information on the redress possibilities available to the recipient of the service in respect of the action, in particular, where applicable, through internal complaint-handling mechanisms, out-of-court dispute settlement and judicial redress. 3. The information provided by the providers of hosting services in accordance with this Article shall be clear and easily comprehensible and as precise and specific as reasonably possible under the given circumstances. The information shall, in particular, be such as to reasonably allow the recipient of the service concerned to effectively exercise the redress possibilities referred to in point (f) of paragraph 2. 4. Providers of hosting services shall publish at least once a year the actions and the statements of reasons referred to in paragraph 1 in a publicly accessible machine-readable database managed and published by the Commission. That information shall not contain personal data.
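Paragraph 4 combines two machine-level requirements: a machine-readable record of each action and its statement of reasons, and the removal of personal data before publication. A hedged sketch of how a provider might prepare such a record is shown below; all field names are invented for illustration and do not come from the Regulation or from any Commission specification:

```python
import json

# Hypothetical statement-of-reasons record, loosely mirroring
# points (a) to (f) of Article 15(2). Field names are illustrative.
record = {
    "action": "removal",                # point (a)
    "territorial_scope": "EU",
    "notice_based": True,               # point (b)
    "automated_detection": False,       # point (c)
    "ground": "terms_and_conditions",   # points (d)/(e)
    "redress": ["internal complaint-handling",
                "out-of-court dispute settlement",
                "judicial redress"],    # point (f)
    # Personal data, which must not reach the public database (paragraph 4):
    "notifier_email": "someone@example.org",
}

PERSONAL_FIELDS = {"notifier_email"}

def redact(rec: dict) -> dict:
    """Drop personal data before submission to the public database."""
    return {k: v for k, v in rec.items() if k not in PERSONAL_FIELDS}

print(json.dumps(redact(record), indent=2))
```

The redaction step reflects the last sentence of paragraph 4 ("That information shall not contain personal data"); in practice a provider would also need to screen free-text explanation fields, not only structured ones.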

Article 15a Notification of suspicions of criminal offences 1. Where a provider of hosting services becomes aware of any information giving rise to a suspicion that a serious criminal offence involving an imminent threat to the life or safety of persons has taken place, is taking place or is planned to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide, upon their request, all the relevant information available. 2. Where the provider of hosting services cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative and may inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be planned to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located. For the purpose of this Article, Member States shall notify to the Commission the list of their competent law enforcement or judicial authorities. 3. Unless instructed otherwise by the informed authority, the provider of hosting services shall remove or disable access to the content. 4. Information obtained by a law enforcement or judicial authority of a Member State in accordance with paragraph 1 shall not be used for any purpose other than those directly related to the individual serious criminal offence notified. 5. The Commission shall adopt an implementing act setting out a template for notifications under paragraph 1.

Article 16 Exclusion for micro and small enterprises

1. This Section shall not apply to online platforms that qualify as micro or small enterprises within the meaning of the Annex to Recommendation 2003/361/EC and which do not qualify as very large online platforms as defined in Article 25 of this Regulation. 2. Providers of intermediary services may submit an application, accompanied by a justification, for a waiver from the requirements of this Section, provided that they: (a) do not present significant systemic risks and have limited exposure to illegal content; and (b) qualify as not-for-profit or qualify as a medium enterprise within the meaning of the Annex to Recommendation 2003/361/EC. 3. The application shall be submitted to the Digital Services Coordinator of establishment, who shall conduct a preliminary assessment. The Digital Services Coordinator of establishment shall transmit to the Commission the application, accompanied by its assessment and, where applicable, a recommendation on the Commission’s decision. The Commission shall examine such an application and, after consulting the Board, may issue a total or a partial waiver from the requirements of this Section. 4. Where the Commission grants such a waiver, it shall monitor the use of the waiver by the provider of intermediary services to ensure that the conditions for use of the waiver are respected. 5. Upon the request of the Board, the Digital Services Coordinator of establishment or the provider, or on its own initiative, the Commission may review or revoke the waiver in whole or in part. 6. The Commission shall maintain a list of all waivers issued and their conditions and shall make the list publicly available. 7. The Commission shall be empowered to adopt a delegated act in accordance with Article 69 as to the process and procedure for the implementation of the waiver system in relation to this Article.

Article 17 Internal complaint-handling system

1. Online platforms shall provide recipients of the service, for a period of at least six months following the decision referred to in this paragraph, access to an effective internal complaint-handling system, which enables complaints to be lodged electronically and free of charge, against the following decisions taken by the online platform on the ground that the information provided by the recipients is illegal content or incompatible with its terms and conditions: (a) decisions to remove, demote, disable access to or impose other measures that restrict the visibility, availability or accessibility of the information; (b) decisions to suspend, terminate or limit the provision of the service, in whole or in part, to the recipients; (c) decisions to suspend or terminate the recipients’ account; (ca) decisions to restrict the recipients’ ability to monetise content provided by the recipients. 1a. The period of at least six months set out in paragraph 1 shall be considered to start on the day on which the recipient of the service is informed about the decision in accordance with Article 15. 2. Online platforms shall ensure that their internal complaint-handling systems are easy to access, user-friendly, including for persons with disabilities and minors, non-discriminatory, and enable and facilitate the submission of sufficiently precise and adequately substantiated complaints. Online platforms shall set out the rules of procedure of their internal complaint-handling system in their terms and conditions in a clear, user-friendly and easily accessible manner. 3. Online platforms shall handle complaints submitted through their internal complaint-handling system in a timely, non-discriminatory, diligent and non-arbitrary manner and within ten working days starting on the date on which the online platform received the complaint. Where a complaint contains sufficient grounds for the online platform to consider that the information to which the complaint relates is not illegal and is not incompatible with its terms and conditions, or contains information indicating that the complainant’s conduct does not warrant the suspension or termination of the service or the account, it shall reverse its decision referred to in paragraph 1 without undue delay. 4. Online platforms shall inform complainants without undue delay of the decision they have taken in respect of the information to which the complaint relates and shall inform complainants of the possibility of out-of-court dispute settlement provided for in Article 18 and other available redress possibilities. 5. Online platforms shall ensure that recipients of the service are given the possibility, where necessary, to contact a human interlocutor at the time of the submission of the complaint, and that the decisions referred to in paragraph 4 are not solely taken on the basis of automated means. Online platforms shall ensure that decisions are taken by qualified staff. 5a. Recipients of the service shall have the possibility to seek swift judicial redress in accordance with the laws of the Member States concerned.

Article 18 Out-of-court dispute settlement

1. Recipients of the service addressed by the decisions referred to in Article 17(1) shall be entitled to select any out-of-court dispute settlement body that has been certified in accordance with paragraph 2 in order to resolve disputes relating to those decisions, including complaints that could not be resolved by means of the internal complaint-handling system referred to in that Article. 1a. Both parties shall engage, in good faith, with the independent, external certified body selected with a view to resolving the dispute and shall be bound by the decision taken by the body. The possibility to select any out-of-court dispute settlement body shall be easily accessible on the online interface of the online platform in a clear and user-friendly manner. 2. The Digital Services Coordinator of the Member State where the out-of-court dispute settlement body is established shall, at the request of that body, certify the body for a maximum of three years, which can be renewed, where the body and the persons in charge of the out-of-court dispute settlement body have demonstrated that they meet all of the following conditions: (a) it is independent, including financially independent, and impartial towards online platforms, recipients of the service provided by the online platforms and towards individuals or entities that have submitted notices; (ba) its members are remunerated in a way that is not linked to the outcome of the procedure; (bb) the natural persons in charge of dispute resolution commit not to work for the online platform or a professional organisation or business association of which the online platform is a member for a period of three years after their position in the body has ended, and have not worked for such an organisation for two years prior to taking up this role; (c) the dispute settlement is easily accessible, including for persons with disabilities, through electronic communication technology and provides for the possibility to submit a complaint and the requisite supporting documents online; (e) the dispute settlement takes place in accordance with clear and fair rules of procedure which are clearly visible and easily and publicly accessible. 2a. The Digital Services Coordinator shall reassess on a yearly basis whether the certified out-of-court dispute settlement body continues to fulfil the conditions referred to in paragraph 2. If this is not the case, the Digital Services Coordinator shall revoke the status from the out-of-court dispute settlement body. 2b. The Digital Service Coordinator shall draw up a report every two years listing the number of complaints the out-of-court dispute settlement body has received annually, the outcomes of the decisions delivered, any systematic or sectoral problems identified, and the average time taken to resolve the disputes. The report shall in particular: (a) identify best practices of the out-of-court dispute settlement bodies; (b) report, where appropriate, on any shortcomings, supported by statistics, that hinder the functioning of the out-of-court dispute settlement bodies for both domestic and cross-border disputes; (c) make recommendations on how to improve the effective and efficient functioning of the out-of-court dispute settlement bodies, where appropriate. 2c. Certified out-of-court dispute settlement bodies shall conclude dispute resolution proceedings within a reasonable period of time and no later than 90 calendar days after the date on which the certified body received the complaint. The procedure shall be considered terminated on the date on which the certified body has made the decision of the out-of-court dispute settlement procedure available. 3. If the body decides the dispute in favour of the recipient of the service, or of individuals or entities mandated under Article 68 that have submitted notices, the online platform shall reimburse the recipient, or the individuals or entities that have submitted notices, for any fees and other reasonable expenses that they have paid or are to pay in relation to the dispute settlement. If the body decides the dispute in favour of the online platform, and the body does not find that the recipient acted in bad faith in the dispute, the recipient or the individuals or entities that have submitted notices shall not be required to reimburse any fees or other expenses that the online platform paid or is to pay in relation to the dispute settlement. The fees charged by the body for the dispute settlement shall be reasonable and shall in any event not exceed the costs thereof for online platforms. Out-of-court dispute settlement procedures shall be free of charge or available at a nominal fee for the recipient of the service. Certified out-of-court dispute settlement bodies shall make the fees, or the mechanisms used to determine the fees, known to the recipient of the services and the online platform concerned before engaging in the dispute settlement. 5. Digital Services Coordinators shall notify to the Commission the out-of-court dispute settlement bodies that they have certified in accordance with paragraph 2, as well as out-of-court dispute settlement bodies whose status has been revoked. The Commission shall publish a list of those bodies on a dedicated website and keep it updated. 6. This Article is without prejudice to Directive 2013/11/EU and alternative dispute resolution procedures and entities for consumers established under that Directive.

Article 19 Trusted flaggers

1. Online platforms shall take the necessary technical and organisational measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the mechanisms referred to in Article 14, are processed and decided upon with priority and expeditiously, taking into account due process.
1a. Online platforms shall take the necessary technical and organisational measures to ensure that trusted flaggers can issue correction notices of incorrect removal, restriction or disabling of access to content, or of suspensions or terminations of accounts, and that those notices to restore information are processed and decided upon with priority and without delay.
2. The status of trusted flagger under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, where the applicant has demonstrated that it meets all of the following conditions: (a) it has particular expertise and competence for the purposes of detecting, identifying and notifying illegal content; (b) it represents collective interests and is independent from any online platform; (c) it carries out its activities for the purposes of submitting notices in a timely, diligent, accurate and objective manner; (ca) it has a transparent funding structure, including publishing the sources and amounts of all revenue annually; (cb) it publishes, at least once a year, clear, easily comprehensible, detailed and standardised reports on all notices submitted in accordance with Article 14 during the relevant period. The report shall list: notices categorised by the identity of the provider of hosting services; the type of content notified; the specific legal provisions allegedly breached by the content notified; the action taken by the provider; any potential conflicts of interest and sources of funding; and an explanation of the procedures in place to ensure that the trusted flagger retains its independence. The reports referred to in point (cb) shall be sent to the Commission, which shall make them publicly available.
3. Digital Services Coordinators shall award the trusted flagger status for a period of two years, upon which the status may be renewed where the trusted flagger concerned continues to meet the requirements of this Regulation. The Digital Services Coordinators shall communicate to the Commission and the Board the names, addresses and electronic mail addresses of the entities to which they have awarded the status of trusted flagger in accordance with paragraph 2 or from which they have revoked it in accordance with paragraph 6. The Digital Services Coordinator of the Member State of establishment of the platform shall engage in dialogue with platforms and stakeholders for maintaining the accuracy and efficacy of a trusted flagger system.
4. The Commission shall publish the information referred to in paragraph 3 in a publicly available database in an easily accessible and machine-readable format and keep the database updated.
5. Where an online platform has information indicating that a trusted flagger submitted a significant number of insufficiently precise, inaccurate or inadequately substantiated notices through the mechanisms referred to in Article 14, including information gathered in connection to the processing of complaints through the internal complaint-handling systems referred to in Article 17(3), it shall communicate that information to the Digital Services Coordinator that awarded the status of trusted flagger to the entity concerned, providing the necessary explanations and supporting documents.
6. Upon receiving the information from the online platforms, and if the Digital Services Coordinator considers that there are legitimate reasons to open an investigation, the status of trusted flagger shall be suspended during the period of the investigation. The Digital Services Coordinator that awarded the status of trusted flagger to an entity shall revoke that status if it determines, following an investigation carried out without undue delay, either on its own initiative or on the basis of information received from third parties, including the information provided by an online platform pursuant to paragraph 5, that the entity no longer meets the conditions set out in paragraph 2. Before revoking that status, the Digital Services Coordinator shall afford the entity an opportunity to react to the findings of its investigation and its intention to revoke the entity's status as trusted flagger.
7. The Commission, after consulting the Board, shall issue guidance to assist online platforms and Digital Services Coordinators in the application of paragraphs 2, 5 and 6.
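Point (cb) above enumerates what the standardised trusted flagger report must list. By way of illustration only, such a report could be modelled as in the following Python sketch; every field and class name here is an assumption made for the example, not something prescribed by the Regulation:

```python
from dataclasses import dataclass, field

# Non-normative sketch of the standardised annual report under point (cb).
# All field names are assumptions; the Regulation lists only the categories
# the report must cover.
@dataclass
class NoticeReportEntry:
    hosting_provider: str   # identity of the provider of hosting services
    content_type: str       # type of content notified
    legal_provisions: list  # specific legal provisions allegedly breached
    action_taken: str       # action taken by the provider

@dataclass
class TrustedFlaggerReport:
    year: int
    entries: list = field(default_factory=list)
    conflicts_of_interest: list = field(default_factory=list)
    funding_sources: list = field(default_factory=list)
    independence_procedures: str = ""

    def by_provider(self) -> dict:
        """Notices categorised by the identity of the hosting provider."""
        counts = {}
        for e in self.entries:
            counts[e.hosting_provider] = counts.get(e.hosting_provider, 0) + 1
        return counts
```

A machine-readable structure of this kind would also ease the Commission's task of making the reports publicly available under point (cb).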


Article 19a Accessibility requirements for online platforms
1. Providers of online platforms which offer services in the Union shall ensure that they design and provide services in accordance with the accessibility requirements set out in Section III, Section IV, Section VI, and Section VII of Annex I of Directive (EU) 2019/882.
2. Providers of online platforms shall prepare the necessary information in accordance with Annex V of Directive (EU) 2019/882 and shall explain how the services meet the applicable accessibility requirements. The information shall be made available to the public in an accessible manner for persons with disabilities. Providers of online platforms shall keep that information for as long as the service is in operation.
3. Providers of online platforms shall ensure that information, forms and measures provided pursuant to this Regulation are made available in a manner that is easy to find, easy to understand and accessible to persons with disabilities.
4. Providers of online platforms which offer services in the Union shall ensure that procedures are in place so that the provision of services remains in conformity with the applicable accessibility requirements. Changes in the characteristics of the provision of the service, changes in applicable accessibility requirements and changes in the harmonised standards or in technical specifications by reference to which a service is declared to meet the accessibility requirements shall be adequately taken into account by the provider of intermediary services.
5. In the case of non-conformity, providers of online platforms shall take the corrective measures necessary to bring the service into conformity with the applicable accessibility requirements.
6. They shall cooperate with the competent authority, at the request of that authority, on any action taken to bring the service into compliance with those requirements.
7.
Online platforms which are in conformity with harmonised standards or parts thereof derived from Directive (EU) 2019/882, the references of which have been published in the Official Journal of the European Union, shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those standards or parts thereof cover those requirements.
8. Online platforms which are in conformity with the technical specifications or parts thereof adopted pursuant to Directive (EU) 2019/882 shall be presumed to be in conformity with the accessibility requirements of this Regulation in so far as those technical specifications or parts thereof cover those requirements.

Article 20 Measures and protection against misuse

1. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the provision of their services to recipients of the service that frequently provide manifestly illegal content, for which the illegality can be established without conducting a legal or factual examination, or for which they have received two or more orders to act regarding illegal content in the previous 12 months, unless those orders were later overturned.
2. Online platforms shall be entitled to suspend, for a reasonable period of time and after having issued a prior warning, the processing of notices and complaints submitted through the notice and action mechanisms and internal complaint-handling systems referred to in Articles 14 and 17, respectively, by individuals or entities or by complainants that repeatedly submit notices or complaints that are manifestly unfounded.
3. When deciding on the suspension, providers of online platforms shall assess, on a case-by-case basis and in a timely, diligent and objective manner, whether a recipient, individual, entity or complainant engages in the misuse referred to in paragraphs 1 and 2, taking into account all relevant facts and circumstances apparent from the information available to the provider of the online platform. Those circumstances shall include at least the following: (a) the absolute numbers of items of manifestly illegal content or manifestly unfounded notices or complaints, submitted in the past year; (b) the relative proportion thereof in relation to the total number of items of information provided or notices submitted in the past year; (c) the gravity of the misuses and its consequences; (d) where identifiable, the intention of the recipient, individual, entity or complainant; (da) whether a notice was submitted by an individual user or by an entity or persons with specific expertise related to the content in question, or following the use of an automated content recognition system.
3a. Suspensions referred to in paragraphs 1 and 2 may be declared permanent where: (a) there are compelling reasons of law or public policy, including ongoing criminal investigations; (b) the items removed were components of high-volume campaigns to deceive users or manipulate platform content moderation efforts; (c) a trader has repeatedly offered goods and services that do not comply with Union or national law; (d) the items removed were related to serious crimes.
4. Providers of online platforms shall set out, in a clear, user-friendly and detailed manner, with due regard to their obligations under Article 12(2), their policy in respect of the misuse referred to in paragraphs 1 and 2 in their terms and conditions, including examples of the facts and circumstances that they take into account when assessing whether certain behaviour constitutes misuse, and the duration of the suspension.
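The circumstances (a) to (da) of paragraph 3 can be read as the inputs of a case-by-case assessment. The following is a minimal, non-normative sketch that only aggregates those circumstances into a record for a human reviewer to weigh; all field names and values are assumptions made for the example:

```python
from dataclasses import dataclass

# Non-normative sketch of an Article 20(3)-style assessment record.
# It does not decide anything: thresholds and weighing are left to the
# case-by-case, timely, diligent and objective assessment the paragraph requires.
@dataclass
class MisuseSignals:
    manifestly_illegal_items_past_year: int  # point (a)
    total_items_past_year: int               # denominator for point (b)
    gravity: str                             # point (c), e.g. "low" / "high"
    intention_identifiable: bool             # point (d)
    notice_source: str                       # point (da): "individual", "expert entity", "automated"

def assessment_record(s: MisuseSignals) -> dict:
    share = (s.manifestly_illegal_items_past_year / s.total_items_past_year
             if s.total_items_past_year else 0.0)
    return {
        "absolute_number": s.manifestly_illegal_items_past_year,  # (a)
        "relative_proportion": round(share, 3),                   # (b)
        "gravity": s.gravity,                                     # (c)
        "intention_identifiable": s.intention_identifiable,       # (d)
        "notice_source": s.notice_source,                         # (da)
    }
```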

Article 21 Notification of suspicions of criminal offences

1. Where an online platform becomes aware of any information giving rise to a suspicion that a serious criminal offence involving a threat to the life or safety of persons has taken place, is taking place or is likely to take place, it shall promptly inform the law enforcement or judicial authorities of the Member State or Member States concerned of its suspicion and provide all relevant information available.
2. Where the online platform cannot identify with reasonable certainty the Member State concerned, it shall inform the law enforcement authorities of the Member State in which it is established or has its legal representative, or inform Europol. For the purpose of this Article, the Member State concerned shall be the Member State where the offence is suspected to have taken place, to be taking place or to be likely to take place, or the Member State where the suspected offender resides or is located, or the Member State where the victim of the suspected offence resides or is located.

Article 22 Traceability of traders

1. Online platforms allowing consumers to conclude distance contracts with traders shall ensure that traders can only use their services to promote messages on, or to offer products or services to, consumers located in the Union if, prior to the use of their services for those purposes, they have been provided with the following information: (a) the name, address, telephone number and electronic mail address of the trader; (b) a copy of the identification document of the trader or any other electronic identification as defined by Article 3 of Regulation (EU) No 910/2014 of the European Parliament and of the Council; (c) the bank account details of the trader, where the trader is a natural person; (d) the name, address, telephone number and electronic mail address of the economic operator, within the meaning of Article 3(13) and Article 4 of Regulation (EU) 2019/1020 of the European Parliament and the Council or any relevant act of Union law, including in the area of product safety; (e) where the trader is registered in a trade register or similar public register, the trade register in which the trader is registered and its registration number or equivalent means of identification in that register; (f) a self-certification by the trader committing to only offer products or services that comply with the applicable rules of Union law and, where applicable, confirming that all products have been checked against available databases, such as the Union Rapid Alert System for dangerous non-food products (RAPEX); (fa) the type of products or services the trader intends to offer on the online platform.
2. The online platform allowing consumers to conclude distance contracts with traders shall, upon receiving that information and before allowing the display of the product or service on its online interface, and until the end of the contractual relationship, make best efforts to assess whether the information referred to in points (a) to (fa) of paragraph 1 is reliable and complete. The online platform shall make best efforts to check the information provided by the trader through the use of any freely accessible official online database or online interface made available by an authorised administrator or a Member State or the Union, or through direct requests to the trader to provide supporting documents from reliable sources. No later than one year after the entry into force of this Regulation, the Commission shall publish the list of online databases and online interfaces mentioned in this paragraph and keep it up-to-date. The obligations for online platforms referred to in paragraphs 1 and 2 shall apply with regard to new and existing traders.
2a. The online platform shall make best efforts to identify and prevent the dissemination, by traders using its service, of offers for products or services which do not comply with Union or national law, through measures such as random checks on the products and services offered to consumers, in addition to the obligations referred to in paragraphs 1 and 2 of this Article.
3. Where the online platform obtains sufficient indications or has reasons to believe that any item of information referred to in paragraph 1 obtained from the trader concerned is inaccurate or incomplete, that platform shall request the trader to correct the information in so far as necessary to ensure that all information is accurate and complete, without delay or within the time period set by Union and national law. Where the trader fails to correct or complete that information, the online platform shall swiftly suspend the provision of its service to the trader in relation to the offering of products or services to consumers located in the Union until the request is fully complied with.
3a. If an online platform rejects an application for services or suspends services to a trader, the trader shall have recourse to the mechanisms under Article 17 and Article 43 of this Regulation.
3b. Online platforms allowing consumers to conclude contracts with traders shall ensure that the identity, such as the trademark or logo, of the business user providing content, goods or services is clearly visible alongside the content, goods or services offered. For this purpose, the online platform shall establish a standardised interface for business users.
3c. Traders shall be solely liable for the accuracy of the information provided and shall inform the online platform without delay of any changes to that information.
4. The online platform shall store the information obtained pursuant to paragraphs 1 and 2 in a secure manner for the duration of their contractual relationship with the trader concerned. They shall subsequently delete the information, no later than six months after the final conclusion of a distance contract.
5. Without prejudice to paragraph 2, the platform shall only disclose the information to third parties where so required in accordance with the applicable law, including the orders referred to in Article 9 and any orders issued by Member States' competent authorities or the Commission for the performance of their tasks under this Regulation.
6. The online platform shall make the information referred to in points (a), (d), (e), (f) and (fa) of paragraph 1 available to the recipients of the service in a clear, easily accessible and comprehensible manner, in accordance with the accessibility requirements of Annex I to Directive (EU) 2019/882.
7. The online platform shall design and organise its online interface in a way that enables traders to comply with their obligations regarding pre-contractual information and product safety information under applicable Union law.
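Paragraphs 1 to 3 of Article 22 amount, in effect, to a completeness check over the trader information items (a) to (fa) before offers may be displayed, with suspension until missing items are supplied. A minimal illustrative sketch (not a compliance tool; the field names are assumptions, and what counts as "best efforts" verification is for the platform to determine):

```python
# Non-normative sketch of an Article 22(1)-(3) completeness check.
# Each field name below is an assumed stand-in for one information item.
REQUIRED_FIELDS = {
    "name", "address", "telephone", "email",  # point (a)
    "identification_document",                # point (b)
    "economic_operator",                      # point (d)
    "trade_register_number",                  # point (e), where registered
    "self_certification",                     # point (f)
    "product_types",                          # point (fa)
}

def missing_items(trader: dict, is_natural_person: bool) -> set:
    required = set(REQUIRED_FIELDS)
    if is_natural_person:
        required.add("bank_account")          # point (c) applies
    return {f for f in required if not trader.get(f)}

def may_offer(trader: dict, is_natural_person: bool) -> bool:
    # Paragraph 3: suspend the offering until the information is complete.
    return not missing_items(trader, is_natural_person)
```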


Article 22a Obligation to inform consumers and authorities about illegal products and services
1. Where an online platform allowing consumers to conclude distance contracts with traders becomes aware, irrespective of the means used, that a product or a service offered by a trader on the interface of that platform is illegal with regard to applicable requirements in Union or national law, it shall: (a) remove the illegal product or service from its interface expeditiously and, where appropriate, inform the relevant authorities, such as the market surveillance authority or the customs authority, of the decision taken; (b) where the online platform has the contact details of the recipients of the service, inform those recipients of the service that had acquired such a product or service about the illegality, the identity of the trader and options for seeking redress; (c) compile and make publicly available, through application programming interfaces, a repository containing information about illegal products and services removed from its platform in the past twelve months, along with information about the trader concerned and options for seeking redress.
2. Online platforms allowing consumers to conclude distance contracts with traders shall maintain an internal database of illegal products and services removed and/or recipients suspended pursuant to Article 20.
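Point (c) of paragraph 1 requires a publicly queryable repository of removals from the past twelve months. The following non-normative sketch models such a repository, with a query function standing in for the application programming interface; all names and the twelve-month cutoff implementation are assumptions made for the example:

```python
from datetime import date, timedelta

# Non-normative sketch of the Article 22a(1)(c) repository of removed
# illegal products and services.
class RemovedProductRepository:
    def __init__(self):
        self._entries = []

    def record_removal(self, product: str, trader: str, removed_on: date,
                       redress_options: str) -> None:
        self._entries.append({
            "product": product,
            "trader": trader,            # information about the trader concerned
            "removed_on": removed_on,
            "redress": redress_options,  # options for seeking redress
        })

    def public_listing(self, today: date) -> list:
        """Entries from the past twelve months, per point (c)."""
        cutoff = today - timedelta(days=365)
        return [e for e in self._entries if e["removed_on"] >= cutoff]
```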

Article 23 Transparency reporting obligations for providers of online platforms

1. In addition to the information referred to in Article 13, online platforms shall include in the reports referred to in that Article information on the following: (aa) the number of complaints received through the internal complaint-handling system referred to in Article 17, the basis for those complaints, the decisions taken in respect of those complaints, the average and median time needed for taking those decisions and the number of instances where those decisions were reversed; (b) the number of suspensions imposed pursuant to Article 20, distinguishing between suspensions enacted for the provision of manifestly illegal content, the submission of manifestly unfounded notices and the submission of manifestly unfounded complaints; (c) any use made of automatic means for the purpose of content moderation, including a specification of the precise purposes, indicators of the accuracy of the automated means in fulfilling those purposes and any safeguards applied; (ca) the number of advertisements that were removed, labelled or disabled by the online platform and the justification of those decisions.
2. Online platforms shall publish, at least once every twelve months, information on the average monthly active recipients of the service in each Member State, calculated as an average over the period of the past six months, in accordance with the methodology laid down in the delegated acts adopted pursuant to Article 25(2).
2a. Member States shall refrain from imposing additional transparency reporting obligations on the online platforms, other than specific requests in connection with the exercise of their supervisory powers.
4. The Commission shall adopt implementing acts to establish a set of key performance indicators and lay down templates concerning the form, content and other details of reports pursuant to paragraph 1.
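Paragraph 1 lists the data points the report must contain, while the templates and key performance indicators are left to the Commission's implementing acts under paragraph 4. A purely illustrative sketch of how figures for points (aa), (b) and (ca) might be computed, with every field name assumed:

```python
# Non-normative sketch of an Article 23(1) transparency report section.
# The actual template is for the Commission's implementing acts; all field
# names here are assumptions.
def transparency_report_section(complaints: list, suspensions: dict,
                                ads_actions: int) -> dict:
    decision_times = sorted(c["days_to_decision"] for c in complaints)
    n = len(decision_times)
    median = (decision_times[n // 2] if n % 2 else
              (decision_times[n // 2 - 1] + decision_times[n // 2]) / 2) if n else 0
    return {
        # point (aa): complaints through the Article 17 system
        "complaints_received": n,
        "avg_days_to_decision": sum(decision_times) / n if n else 0,
        "median_days_to_decision": median,
        "decisions_reversed": sum(1 for c in complaints if c["reversed"]),
        # point (b): suspensions under Article 20, broken down by ground
        "suspensions": suspensions,
        # point (ca): advertisements removed, labelled or disabled
        "ad_actions": ads_actions,
    }
```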

Article 24 Online advertising transparency

1. Online platforms that display advertising on their online interfaces shall ensure that the recipients of the service can identify, for each specific advertisement displayed to each individual recipient, in a clear, concise and unambiguous manner and in real time: (a) that the information displayed on the interface or parts thereof is an online advertisement, including through prominent and harmonised marking; (b) the natural or legal person on whose behalf the advertisement is displayed; (ba) the natural or legal person who finances the advertisement, where this person is different from the natural or legal person referred to in point (b); (c) clear, meaningful and uniform information about the main parameters used to determine the recipient to whom the advertisement is displayed and, where applicable, about how to change those parameters.
1a. Online platforms shall ensure that recipients of services can easily make an informed choice on whether to consent, as defined in Article 4(11) and Article 7 of Regulation (EU) 2016/679, to the processing of their personal data for the purposes of advertising, by providing them with meaningful information, including information about how their data will be monetised. Online platforms shall ensure that refusing consent is no more difficult or time-consuming for the recipient than giving consent. Where recipients refuse to consent, or have withdrawn consent, they shall be given other fair and reasonable options to access the online platform.
1b. Targeting or amplification techniques that process, reveal or infer personal data of minors or personal data referred to in Article 9(1) of Regulation (EU) 2016/679 for the purpose of displaying advertisements are prohibited.


Article 24b Additional obligations for platforms primarily used for the dissemination of user-generated pornographic content
Where an online platform is primarily used for the dissemination of user-generated pornographic content, the platform shall take the necessary technical and organisational measures to ensure: (a) that users who disseminate content have verified themselves through a double opt-in e-mail and cell phone registration; (b) professional human content moderation, trained to identify image-based sexual abuse, including content having a high probability of being illegal; (c) the accessibility of a qualified notification procedure in the form that, in addition to the mechanism referred to in Article 14, individuals may notify the platform with the claim that image material depicting them or purporting to be depicting them is being disseminated without their consent and supply the platform with prima facie evidence of their physical identity; content notified through this procedure shall be suspended without undue delay.


Article 24a Recommender system transparency
1. Online platforms shall set out in their terms and conditions, and via a designated online resource that can be directly reached and easily found from the online platform's online interface when content is recommended, in a clear, accessible and easily comprehensible manner, the main parameters used in their recommender systems, as well as any options for the recipient of the service to modify or influence those main parameters that they have made available.
2. The main parameters referred to in paragraph 1 shall include, at a minimum: (a) the main criteria used by the relevant system which individually or collectively are most significant in determining recommendations; (b) the relative importance of those parameters; (c) what objectives the relevant system has been optimised for; and (d) if applicable, an explanation of the role that the behaviour of the recipients of the service plays in how the relevant system produces its outputs. The requirements set out in paragraph 2 shall be without prejudice to rules on the protection of trade secrets and intellectual property rights.
3. Where several options are available pursuant to paragraph 1, online platforms shall provide a clear and easily accessible function on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determine the relative order of information presented to them.
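Paragraph 2 enumerates what the disclosed main parameters must cover, and paragraph 3 requires a user-selectable option per recommender system. A non-normative sketch of one possible disclosure object and option-selection function, with invented example values throughout:

```python
# Non-normative sketch of an Article 24a disclosure. The Regulation prescribes
# the content of the disclosure, not a format; every value below is invented.
RECOMMENDER_DISCLOSURE = {
    "main_criteria": ["recency", "topic similarity", "account follows"],  # point (a)
    "relative_importance": {"recency": 0.5, "topic similarity": 0.3,
                            "account follows": 0.2},                      # point (b)
    "optimised_for": "predicted engagement",                              # point (c)
    "role_of_user_behaviour": "past interactions re-rank candidates",     # point (d)
    "options": ["personalised", "chronological"],  # options under paragraph 3
}

def select_option(preferences: dict, user_id: str, option: str) -> dict:
    """Paragraph 3: let the recipient modify their preferred option at any time."""
    if option not in RECOMMENDER_DISCLOSURE["options"]:
        raise ValueError(f"unknown recommender option: {option}")
    preferences[user_id] = option
    return preferences
```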

Article 25 Very large online platforms

1. This Section shall apply to online platforms which: (a) provide for at least four consecutive months their services to a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, calculated in accordance with the methodology set out in the delegated acts referred to in paragraph 3. Such a methodology shall take into account, in particular: (i) that the number of average monthly active recipients shall be based on each service individually; (ii) that active recipients connected on multiple devices are counted only once; (iii) that indirect use of a service, via a third party or linking, shall not be counted; (iv) that, where an online platform is hosted by another provider of intermediary services, the active recipients are assigned solely to the online platform closest to the recipient; (v) that automated interactions, accounts or data scans by a non-human ('bots') are not included.
3. The Commission shall adopt delegated acts in accordance with Article 69, after consulting the Board, to lay down a specific methodology for calculating the number of average monthly active recipients of the service in the Union, for the purposes of paragraph 1(a). The methodology shall specify, in particular, how to determine the Union's population and criteria to determine the average monthly active recipients of the service in the Union, taking into account different accessibility features.
4. The Digital Services Coordinator of establishment shall verify, at least every six months, whether the number of average monthly active recipients of the service in the Union of online platforms under their jurisdiction is equal to or higher than the number referred to in paragraph 1. On the basis of that verification, it shall adopt a decision designating the online platform as a very large online platform for the purposes of this Regulation, or terminating that designation, and communicate that decision, without undue delay, to the online platform concerned and to the Commission. The Commission shall ensure that the list of designated very large online platforms is published in the Official Journal of the European Union and keep that list updated. The obligations of this Section shall apply, or cease to apply, to the very large online platforms concerned from four months after that publication.
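The counting rules (i) to (v) in paragraph 1(a) lend themselves to a simple deduplicating count. The sketch below is illustrative only; the binding methodology is reserved to the delegated acts under paragraph 3, and the event fields used here are assumptions:

```python
# Non-normative sketch of the Article 25(1)(a) counting rules (i)-(v):
# per-service counting, multi-device recipients counted once, indirect use
# and automated ('bot') interactions excluded.
def monthly_active_recipients(events: list, service: str) -> int:
    recipients = set()
    for e in events:
        if e["service"] != service:  # (i) each service counted individually
            continue
        if e.get("is_bot"):          # (v) automated interactions excluded
            continue
        if e.get("indirect"):        # (iii) use via a third party or linking
            continue
        recipients.add(e["recipient_id"])  # (ii) one count across devices
    return len(recipients)

def average_monthly_active(monthly_counts: list) -> float:
    # Average over the reporting period, against the 45 million threshold.
    return sum(monthly_counts) / len(monthly_counts) if monthly_counts else 0.0
```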

Article 26 Risk assessment

1. Very large online platforms shall effectively and diligently identify, analyse and assess, from the date of application referred to in the second subparagraph of Article 25(4), at least once a year thereafter, and in any event before launching new services, the probability and severity of any significant systemic risks stemming from the design, algorithmic systems, intrinsic characteristics, functioning and use made of their services in the Union. The risk assessment shall take into account risks per Member State in which services are offered and in the Union as a whole, in particular to a specific language or region. This risk assessment shall be specific to their services and activities, including technology design and business-model choices, and shall include the following systemic risks: (a) the dissemination of illegal content through their services or of content that is in breach with their terms and conditions; (b) any actual and foreseeable negative effects for the exercise of fundamental rights, including for consumer protection, the respect for human dignity, private and family life, the protection of personal data and the freedom of expression and information, as well as the freedom and the pluralism of the media, the prohibition of discrimination, the right to gender equality, and the rights of the child, as enshrined in Articles 1, 7, 8, 11, 21, 23, 24 and 38 of the Charter respectively; (c) any malfunctioning or intentional manipulation of their service, including by means of inauthentic use or automated exploitation of the service, or risks inherent to the intended operation of the service, including the amplification of illegal content, of content that is in breach with their terms and conditions or of any other content with an actual or foreseeable negative effect on the protection of public health, of minors and of other vulnerable groups of recipients of the service, on democratic values, media freedom, freedom of expression and civic discourse, or actual or foreseeable effects related to electoral processes and public security; (ca) any actual and foreseeable negative effects on the protection of public health, as well as behavioural addictions or other serious negative consequences to the person's physical, mental, social and financial well-being.
2. When conducting risk assessments, very large online platforms shall take into account, in particular, whether and how their content moderation systems, terms and conditions, community standards, algorithmic systems, recommender systems and systems for selecting and displaying advertisement, as well as the underlying data collection, processing and profiling, influence any of the systemic risks referred to in paragraph 1, including the potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions.
2a. When conducting risk assessments, very large online platforms shall consult, where appropriate, representatives of the recipients of the service, representatives of groups potentially impacted by their services, independent experts and civil society organisations. Their involvement shall be tailored to the specific systemic risks that the very large online platform aims to assess.
2b. The supporting documents of the risk assessment shall be communicated to the Digital Services Coordinator of establishment and to the Commission.
2c. The obligations referred to in paragraphs 1 and 2 shall by no means lead to a general monitoring obligation.

Article 27 Mitigation of risks

1. Very large online platforms shall put in place reasonable, transparent, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 26. Such measures may include, where applicable: (a) adapting content moderation, algorithmic systems, recommender systems and online interfaces, their decision-making processes, the design, the features or functioning of their services, their advertising model or their terms and conditions; (aa) ensuring appropriate resources to deal with notices and internal complaints, including appropriate technical and operational measures or capacities; (b) targeted measures aimed at limiting the display of advertisements in association with the service they provide, or the alternative placement and display of public service advertisements or other related factual information; (ba) where relevant, targeted measures aimed at adapting online interfaces and features to protect minors; (c) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities, in particular as regards detection of systemic risk; (d) initiating or adjusting cooperation with trusted flaggers in accordance with Article 19; (e) initiating or adjusting cooperation with other online platforms through the codes of conduct and the crisis protocols referred to in Articles 35 and 37 respectively.
1a. Very large online platforms shall, where appropriate, design their risk mitigation measures with the involvement of representatives of the recipients of the service, independent experts and civil society organisations. Where no such involvement is foreseen, this shall be made clear in the transparency report referred to in Article 33.
1b. Very large online platforms shall provide a detailed list of the risk mitigation measures taken and their justification to the independent auditors in order to prepare the audit report referred to in Article 28.
1c. The Commission shall evaluate the implementation and effectiveness of mitigating measures undertaken by very large online platforms referred to in Article 27(1) and, where necessary, may issue recommendations.
2. The Board, in cooperation with the Commission, shall publish comprehensive reports once a year. The reports shall include the following: (a) identification and assessment of the most prominent and recurrent systemic risks reported by very large online platforms or identified through other information sources, in particular those provided in compliance with Articles 30, 31 and 33; (b) best practices for very large online platforms to mitigate the systemic risks identified. The reports shall be presented per Member State in which the systemic risks occurred and in the Union as a whole. The reports shall be published in all the official languages of the Member States of the Union.
3. The Commission, in cooperation with the Digital Services Coordinators, and following public consultation, shall issue general guidelines on the application of paragraph 1 in relation to specific risks, in particular to present best practices and recommend possible measures, having due regard to the possible consequences of the measures on the fundamental rights, as enshrined in the Charter, of all parties involved.
3a. The requirement to put in place mitigation measures shall not lead to a general monitoring obligation or active fact-finding obligations.

Article 28 Independent audit

1. Very large online platforms shall be subject, at their own expense and at least once a year, to independent audits to assess compliance with the following:
(a) the obligations set out in Chapter III;
(b) any commitments undertaken pursuant to the codes of conduct referred to in Articles 35 and 36 and the crisis protocols referred to in Article 37.
1a. Very large online platforms shall ensure auditors have access to all relevant data necessary to perform the audit properly.
2. Audits performed pursuant to paragraph 1 shall be performed by organisations having been recognised and vetted by the Commission and which:
(a) are legally and financially independent from, and do not have conflicts of interest with, the very large online platform concerned and other very large online platforms;
(aa) auditors and their employees have not provided any other service to the very large online platform audited 12 months before the audit and commit not to work for the very large online platform audited, or a professional organisation or business association of which the platform is a member, for 12 months after their position in the auditing organisation has ended;
(b) have proven expertise in the area of risk management, technical competence and capabilities;
(c) have proven objectivity and professional ethics, based in particular on adherence to codes of practice or appropriate standards.
3. The organisations that perform the audits shall establish an audit report for each audit subject as referred to in paragraph 1. The report shall be in writing and include at least the following:
(a) the name, address and the point of contact of the very large online platform subject to the audit and the period covered;
(b) the name and address of the organisation performing the audit;
(ba) a declaration of interests;
(c) a description of the specific elements audited, and the methodology applied;
(d) a description of the main findings drawn from the audit and a summary of the main findings;
(da) a description of the third parties consulted as part of the audit;
(e) an audit opinion on whether the very large online platform subject to the audit complied with the obligations and with the commitments referred to in paragraph 1, either positive, positive with comments or negative;
(f) where the audit opinion is not positive, operational recommendations on specific measures to achieve compliance;
(fa) a description of the specific elements that could not be audited, and an explanation of why these could not be audited;
(fb) where the audit opinion could not reach a conclusion for specific elements within the scope of the audit, a statement of reasons for the failure to reach such conclusion.
4. Very large online platforms receiving an audit report that is not positive shall take due account of any operational recommendations addressed to them with a view to take the necessary measures to implement them. They shall, within one month from receiving those recommendations, adopt an audit implementation report setting out those measures. Where they do not implement the operational recommendations, they shall justify in the audit implementation report the reasons for not doing so and set out any alternative measures they may have taken to address any instances of non-compliance identified.
4a. The Commission shall publish and regularly update a list of vetted organisations.
4b. Where a very large online platform receives a positive audit report, it shall be entitled to request from the Commission a seal of excellence.

Article 29 Recommender systems

1. In addition to the requirements set out in Article 24a, very large online platforms that use recommender systems shall provide at least one recommender system which is not based on profiling, within the meaning of Article 4(4) of Regulation (EU) 2016/679, as well as an easily accessible functionality on their online interface allowing the recipient of the service to select and to modify at any time their preferred option for each of the recommender systems that determines the relative order of information presented to them.
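As an illustrative aside (not part of the legal text), the Article 29 obligation, at least one recommender option that does not rely on profiling, plus a functionality letting the recipient switch options at any time, can be sketched as follows. All names (`RecommenderSettings`, `chronological`, `personalised`) are assumptions for illustration only:

```python
from typing import Callable

Item = dict  # illustrative content item, e.g. {"id": ..., "timestamp": ..., "topic": ...}

def chronological(items: list[Item], _profile=None) -> list[Item]:
    # Option not based on profiling: ordering depends only on the content itself.
    return sorted(items, key=lambda i: i["timestamp"], reverse=True)

def personalised(items: list[Item], profile: dict) -> list[Item]:
    # Profiling-based option: ordering depends on a per-recipient profile.
    return sorted(items, key=lambda i: profile.get(i["topic"], 0), reverse=True)

class RecommenderSettings:
    """Per-recipient choice among the available recommender options."""
    OPTIONS: dict[str, Callable] = {
        "chronological": chronological,  # the non-profiling option required by Art. 29
        "personalised": personalised,
    }

    def __init__(self, default: str = "chronological"):
        self.selected = default

    def select(self, name: str) -> None:
        # Modifiable "at any time" via an easily accessible functionality.
        if name not in self.OPTIONS:
            raise ValueError(f"unknown option: {name}")
        self.selected = name

    def rank(self, items: list[Item], profile=None) -> list[Item]:
        return self.OPTIONS[self.selected](items, profile)
```

In this sketch the default option is the non-profiling one; a platform could equally offer it alongside others, provided the recipient can select and change the option for each recommender system.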

Article 30 Additional online advertising transparency

1. Very large online platforms that display advertising on their online interfaces shall compile and make publicly available and searchable, through easy to access, efficient and reliable tools through application programming interfaces, a repository containing the information referred to in paragraph 2, until one year after the advertisement was displayed for the last time on their online interfaces. They shall ensure that multicriterion queries can be performed per advertiser and per all data points present in the advertisement, the target of the advertisement, and the audience the advertiser wishes to reach. They shall ensure that the repository does not contain any personal data of the recipients of the service to whom the advertisement was or could have been displayed and shall make reasonable efforts to ensure that the information is accurate and complete.
2. The repository shall include at least all of the following information:
(a) the content of the advertisement, including the name of the product, service or brand and the object of the advertisement;
(b) the natural or legal person on whose behalf the advertisement is displayed;
(ba) the natural or legal person who paid for the advertisement, where that person is different from the one referred to in point (b);
(c) the period during which the advertisement was displayed;
(d) whether the advertisement was intended to be displayed specifically to one or more particular groups of recipients of the service and if so, the main parameters used for that purpose, including any parameters used to exclude particular groups;
(da) where it is disclosed, a copy of the content of commercial communications published on the very large online platforms that are not marketed, sold or arranged by the very large online platform, and which have, through appropriate channels, been declared as such to the very large online platform;
(e) the total number of recipients of the service reached and, where applicable, aggregate numbers for the group or groups of recipients to whom the advertisement was targeted specifically;
(ea) cases where the advertisement was removed on the basis of a notice submitted in accordance with Article 14 or an order issued pursuant to Article 8.
2a. The Board shall, after consulting vetted researchers, publish guidelines on the structure and organisation of repositories created pursuant to paragraph 1.
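As an illustrative aside (not part of the legal text), paragraph 1's repository obligation, publicly searchable, supporting multicriterion queries, and containing no personal data of recipients, can be sketched as follows. The `AdRecord` fields and `AdRepository` interface are assumptions for illustration; the Regulation prescribes the information items, not a data model:

```python
from dataclasses import dataclass, field

@dataclass
class AdRecord:
    # Fields mirror Article 30(2); deliberately no personal data of
    # recipients of the service (Article 30(1)).
    content: str                 # Art. 30(2)(a): content of the advertisement
    beneficiary: str             # Art. 30(2)(b): on whose behalf it is displayed
    payer: str                   # Art. 30(2)(ba): who paid, if different
    display_period: tuple        # Art. 30(2)(c): (start, end) dates
    targeting_params: dict = field(default_factory=dict)  # Art. 30(2)(d)

class AdRepository:
    """Illustrative repository supporting multicriterion queries (Art. 30(1))."""

    def __init__(self):
        self._records: list[AdRecord] = []

    def add(self, record: AdRecord) -> None:
        self._records.append(record)

    def query(self, **criteria) -> list[AdRecord]:
        # All supplied criteria must match at once, e.g.
        # query(payer="X", beneficiary="Y") narrows by both fields.
        def matches(r: AdRecord) -> bool:
            return all(getattr(r, k) == v for k, v in criteria.items())
        return [r for r in self._records if matches(r)]

repo = AdRepository()
repo.add(AdRecord("Spring sale", "BrandA", "AgencyZ",
                  ("2022-03-01", "2022-03-31"), {"interest": "sports"}))
repo.add(AdRecord("New model", "BrandB", "BrandB",
                  ("2022-04-01", "2022-04-15")))
print(len(repo.query(payer="AgencyZ")))  # -> 1
```

A production repository would expose such queries through the application programming interfaces referred to in paragraph 1 rather than in-process calls.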

Article 30a Deep fakes

Where a very large online platform becomes aware that a piece of content is a generated or manipulated image, audio or video content that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful (deep fakes), the provider shall label the content in a way that informs that the content is inauthentic and that is clearly visible for the recipient of the services.

Article 31 Data access and scrutiny

1. Very large online platforms shall provide the Digital Services Coordinator of establishment or the Commission, upon their reasoned request, within a reasonable period specified in the request and without delay, access to data that are necessary to monitor and assess compliance with this Regulation. That Digital Services Coordinator and the Commission shall only request, access and use that data for those purposes.
1a. The very large online platform shall be obliged to explain the design, logic and the functioning of the algorithms if requested by the Digital Services Coordinator of establishment.
2. Upon a reasoned request from the Digital Services Coordinator of establishment or the Commission, very large online platforms shall, within a reasonable period, as specified in the request, provide access to data to vetted researchers, vetted not-for-profit bodies, organisations or associations, who meet the requirements in paragraph 4 of this Article, for the sole purpose of conducting research that contributes to the identification, mitigation and understanding of systemic risks as set out in Article 26(1) and Article 27(1).
2a. Vetted researchers, vetted not-for-profit bodies, organisations and associations shall have access to aggregate numbers for the total views and view rate of content prior to a removal on the basis of orders issued in accordance with Article 8 or content moderation engaged in at the provider's own initiative and under its terms and conditions.
3. Very large online platforms shall provide access to data pursuant to paragraphs 1 and 2 through online databases or application programming interfaces, as appropriate, and with an easily accessible and user-friendly mechanism to search for multiple criteria.
4. In order to be vetted by the Digital Services Coordinator of establishment or the Commission, researchers, not-for-profit bodies, organisations or associations shall:
(a) be affiliated with academic institutions or civil society organisations representing the public interest and meeting the requirements under Article 68;
(b) be independent from commercial interests, including from any very large online platform;
(c) disclose the funding financing the research;
(d) be independent from any government, administrative or other state bodies, outside the academic institution of affiliation if public;
(e) have proven records of expertise in the fields related to the risks investigated or related research methodologies; and
(f) be in a capacity to preserve the specific data security and confidentiality requirements corresponding to each request.
4a. Where a very large online platform has grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the relevant authority, either the Digital Services Coordinator of establishment or the Commission, which shall decide without undue delay if access shall be withdrawn and when the access shall be restored and under what conditions.
4b. Where the Digital Services Coordinator of establishment, or the Commission, have grounds to believe that a researcher, a not-for-profit body, an organisation or association is acting outside the purpose of paragraph 2 or no longer respects the conditions of paragraph 4, it shall immediately inform the very large online platform. The very large online platform shall be entitled to withdraw access to data upon receiving the information. The Digital Services Coordinator of establishment, or the Commission, shall decide if and when access shall be restored and under what conditions.
5. The Commission shall, after consulting the Board, and no later than one year after entry into force of this legislation, adopt delegated acts laying down the technical conditions under which very large online platforms are to share data pursuant to paragraphs 1 and 2 and the purposes for which the data may be used. Those delegated acts shall lay down the specific conditions under which such sharing of data with vetted researchers or not-for-profit bodies, organisations or associations can take place in compliance with Regulation (EU) 2016/679, taking into account the rights and interests of the very large online platforms and the recipients of the service concerned, including the protection of confidential information, in particular trade secrets, and maintaining the security of their service.
6. Within 15 days following receipt of a request as referred to in paragraphs 1 and 2, a very large online platform may request the Digital Services Coordinator of establishment or the Commission, as applicable, to amend the request, where it considers that it is unable to give access to the data requested for one of the following two reasons:
(a) it does not have access to the data;
(b) giving access to the data will lead to significant vulnerabilities for the security of its service or the protection of confidential information.
7. Requests for amendment pursuant to point (b) of paragraph 6 shall contain proposals for one or more alternative means through which access may be provided to the requested data or other data which are appropriate and sufficient for the purpose of the request. The Digital Services Coordinator of establishment or the Commission shall decide upon the request for amendment within 15 days and communicate to the very large online platform its decision and, where relevant, the amended request and the new time period to comply with the request.
7a. Digital Services Coordinators and the Commission shall, once a year, report the following information:
(a) the number of requests made to them as referred to in paragraphs 1, 2 and 6;
(b) the number of such requests that have been declined or withdrawn and the reasons for which they have been declined or withdrawn, including following a request to the Digital Services Coordinator or the Commission from a very large online platform to amend a request as referred to in paragraphs 1, 2 and 6.
7b. Upon completion of their research, the vetted researchers that have been granted access to data shall publish their findings without disclosing confidential data and in compliance with Regulation (EU) 2016/679.
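As an illustrative aside (not part of the legal text), the gating logic of paragraphs 2 and 4, access only for requesters meeting all the cumulative vetting criteria, and only for systemic-risk research, can be sketched as follows. The `Researcher` fields and function names are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Researcher:
    # One boolean per cumulative condition in Article 31(4).
    affiliated: bool          # (a) academic or qualifying civil society affiliation
    independent: bool         # (b), (d) independence from commercial and state interests
    funding_disclosed: bool   # (c) funding financing the research disclosed
    expertise: bool           # (e) proven records of relevant expertise
    can_secure_data: bool     # (f) capacity to meet security/confidentiality requirements

def is_vetted(r: Researcher) -> bool:
    # The paragraph 4 conditions are cumulative: all must hold.
    return all([r.affiliated, r.independent, r.funding_disclosed,
                r.expertise, r.can_secure_data])

def grant_access(r: Researcher, purpose: str) -> bool:
    # Access is granted for the sole purpose set out in Article 31(2).
    return is_vetted(r) and purpose == "systemic-risk-research"

ok = Researcher(True, True, True, True, True)
print(grant_access(ok, "systemic-risk-research"))  # -> True
print(grant_access(ok, "marketing"))               # -> False
```

In practice the vetting decision rests with the Digital Services Coordinator of establishment or the Commission, not the platform; this sketch only shows that the criteria combine conjunctively.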

Article 32 Compliance officers

1. Very large online platforms shall appoint one or more compliance officers responsible for monitoring their compliance with this Regulation.
2. Very large online platforms shall only designate persons who have the professional qualifications, knowledge, experience and ability necessary to fulfil the tasks referred to in paragraph 3 as compliance officers. Compliance officers may either be staff members of, or fulfil those tasks on the basis of a contract with, the very large online platform concerned.
3. Compliance officers shall have the following tasks:
(a) cooperating with the Digital Services Coordinator of establishment, the Board and the Commission for the purpose of this Regulation;
(b) organising and supervising the very large online platform's activities relating to the independent audit pursuant to Article 28;
(c) informing and advising the management and employees of the very large online platform about relevant obligations under this Regulation;
(d) monitoring the very large online platform's compliance with its obligations under this Regulation.
4. Very large online platforms shall take the necessary measures to ensure that the compliance officers can perform their tasks in an independent manner.
5. Very large online platforms shall communicate the name and contact details of the compliance officer to the Digital Services Coordinator of establishment and the Commission.
6. Very large online platforms shall support the compliance officer in the performance of his or her tasks and provide him or her with the resources necessary to adequately carry out those tasks. The compliance officer shall directly report to the highest management level of the platform.

Article 33 Transparency reporting obligations for very large online platforms

1. Very large online platforms shall publish the reports referred to in Article 13 within six months from the date of application referred to in Article 25(4), and thereafter every six months, in a standardised, machine-readable and easily accessible format.
1a. Such reports shall include content moderation information separated and presented for each Member State in which the services are offered and for the Union as a whole. The reports shall be published in at least one of the official languages of the Member States of the Union in which services are offered.
2. In addition to the reports provided for in Article 13, very large online platforms shall make publicly available and transmit to the Digital Services Coordinator of establishment and the Commission, at least once a year and within 30 days following the adoption of the audit implementing report provided for in Article 28(4):
(a) a report setting out the results of the risk assessment pursuant to Article 26;
(b) the related specific mitigation measures identified and implemented pursuant to Article 27;
(c) the audit report provided for in Article 28(3);
(d) the audit implementation report provided for in Article 28(4);
(da) where appropriate, information about the representatives of the recipients of the service, independent experts and civil society organisations consulted for the risk assessment in accordance with Article 26.
3. Where a very large online platform considers that the publication of information pursuant to paragraph 2 may result in the disclosure of confidential information of that platform or of the recipients of the service, may cause significant vulnerabilities for the security of its service, may undermine public security or may harm recipients, the platform may remove such information from the reports. In that case, that platform shall transmit the complete reports to the Digital Services Coordinator of establishment and the Commission, accompanied by a statement of the reasons for removing the information from the public reports, in compliance with Regulation (EU) 2016/679.

Article 34 Standards

1. The Commission shall support and promote the development and implementation of voluntary industry standards set by relevant European and international standardisation bodies, in accordance with Regulation (EU) No 1025/2012, at least for the following:
(a) electronic submission of notices under Article 14;
(aa) terms and conditions under Article 12, including application programming interfaces, as regards acceptance of and changes to those terms and conditions;
(ab) information on traceability of traders under Article 22;
(ac) advertising practices under Article 24 and recommender systems under Article 24a;
(b) electronic submission of notices by trusted flaggers under Article 19, including through application programming interfaces;
(c) specific interfaces, including application programming interfaces, to facilitate compliance with the obligations set out in Articles 30 and 31;
(d) auditing of very large online platforms pursuant to Article 28;
(e) interoperability of the advertisement repositories referred to in Article 30(2);
(f) transmission of data between advertising intermediaries in support of transparency obligations pursuant to points (b) and (c) of Article 24;
(fa) transparency reporting obligations pursuant to Article 13;
(fb) technical specifications to ensure that intermediary services shall be made accessible for persons with disabilities in accordance with the accessibility requirements of Directive (EU) 2019/882.
1a. The Commission shall support and promote the development and implementation of voluntary standards set by the relevant European and international standardisation bodies aimed at the protection of minors.
2. The Commission shall support the update of the standards in the light of technological developments and the behaviour of the recipients of the services in question.
2a. The Commission shall be empowered to adopt implementing acts laying down common specifications for the items listed in points (a) to (fb) of paragraph 1 where the Commission has requested one or more European standardisation organisations to draft a harmonised standard and there has not been a publication of the reference to that standard in the Official Journal of the European Union within [24 months after the entry into force of this Regulation] or the request has not been accepted by any of the European standardisation organisations.

Article 35 Codes of conduct

1. The Commission and the Board shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level to contribute to the proper application of this Regulation, taking into account in particular the specific challenges of tackling different types of illegal content and systemic risks, in accordance with Union law. Particular attention shall be given to avoiding negative effects on fair competition, data access and security, the general monitoring prohibition and the protection of privacy and personal data. The Commission and the Board shall also encourage and facilitate regular review and adaptation of the codes of conduct to ensure that they are fit for purpose.
2. Where significant systemic risks within the meaning of Article 26(1) emerge and concern several very large online platforms, the Commission may request the very large online platforms concerned, other very large online platforms, other online platforms and other providers of intermediary services, as appropriate, as well as relevant competent authorities, civil society organisations and other relevant stakeholders, to participate in the drawing up of codes of conduct, including by setting out commitments to take specific risk mitigation measures, as well as a regular reporting framework on any measures taken and their outcomes.
3. When giving effect to paragraphs 1 and 2, the Commission and the Board shall aim to ensure that the codes of conduct clearly set out their specific objectives, define the nature of the public policy objective pursued and, where appropriate, the role of competent authorities, contain key performance indicators to measure the achievement of those objectives and take fully into account the needs and interests of all interested parties, and in particular citizens, at Union level. The Commission and the Board shall also aim to ensure that participants report regularly to the Commission and their respective Digital Services Coordinators of establishment on any measures taken and their outcomes, as measured against the key performance indicators that they contain. Key performance indicators and reporting commitments shall take into account differences in size and capacity between different participants.
4. The Commission and the Board shall assess whether the codes of conduct meet the aims specified in paragraphs 1 and 3, and shall regularly monitor and evaluate the achievement of their objectives. They shall publish their conclusions and request that the organisations involved amend their codes of conduct accordingly.
5. The Commission and the Board shall regularly monitor and evaluate the achievement of the objectives of the codes of conduct, having regard to the key performance indicators that they may contain. In case of systematic failure to comply with the codes of conduct, the Commission and the Board may take a decision to temporarily suspend or definitively exclude platforms that do not meet their commitments as signatories to the codes of conduct.

Article 36 Codes of conduct for online advertising

1. The Commission shall encourage and facilitate the drawing up of voluntary codes of conduct at Union level between online platforms and other relevant service providers, such as providers of online advertising intermediary services, organisations representing recipients of the service and civil society organisations or relevant authorities, to contribute to further transparency for all actors in the online advertising ecosystem, beyond the requirements of Articles 24 and 30.
2. The Commission shall aim to ensure that the codes of conduct pursue an effective transmission of information, in full respect for the rights and interests of all parties involved, and a competitive, transparent and fair environment in online advertising, in accordance with Union and national law, in particular on competition and the protection of privacy and personal data. The Commission shall aim to ensure that the codes of conduct address at least:
(a) the transmission of information held by providers of online advertising intermediaries to recipients of the service with regard to requirements set in points (b) and (c) of Article 24;
(b) the transmission of information held by providers of online advertising intermediaries to the repositories pursuant to Article 30;
(ba) the different types of data that can be used.
3. The Commission shall encourage the development of the codes of conduct within one year following the date of application of this Regulation and their application no later than six months after that date. The Commission shall evaluate the application of those codes three years after the application of this Regulation.
3a. The Commission shall encourage all the actors in the online advertising ecosystem referred to in paragraph 1 to endorse and comply with the commitments stated in the codes of conduct.

Article 37 Crisis protocols

1. The Board may recommend the Commission to initiate the drawing up, in accordance with paragraphs 2, 3 and 4, of voluntary crisis protocols for addressing crisis situations strictly limited to extraordinary circumstances affecting public security or public health.
2. The Commission shall encourage and facilitate very large online platforms and, where appropriate, other online platforms, with the involvement of the Commission, to participate in the drawing up, testing and application of those crisis protocols, which include one or more of the following measures:
(a) displaying prominent information on the crisis situation provided by Member States' authorities or at Union level;
(b) ensuring that the point of contact referred to in Article 10 is responsible for crisis management;
(c) where applicable, adapting the resources dedicated to compliance with the obligations set out in Articles 14, 17, 19, 20 and 27 to the needs created by the crisis situation.
3. The Commission may involve, as appropriate, Member States' authorities and Union bodies, offices and agencies in drawing up, testing and supervising the application of the crisis protocols. The Commission may, where necessary and appropriate, also involve civil society organisations or other relevant organisations in drawing up the crisis protocols.
4. The Commission shall aim to ensure that the crisis protocols set out clearly all of the following:
(a) the specific parameters to determine what constitutes the specific extraordinary circumstance the crisis protocol seeks to address and the objectives it pursues;
(b) the role of each participant and the measures they are to put in place in preparation and once the crisis protocol has been activated;
(c) a clear procedure for determining when the crisis protocol is to be activated;
(d) a clear procedure for determining the period during which the measures to be taken once the crisis protocol has been activated are to be taken, which is strictly limited to what is necessary for addressing the specific extraordinary circumstances concerned;
(e) safeguards to address any negative effects on the exercise of the fundamental rights enshrined in the Charter, in particular the freedom of expression and information and the right to non-discrimination;
(f) a process to publicly report on any measures taken, their duration and their outcomes, upon the termination of the crisis situation;
(fa) measures to ensure accessibility for persons with disabilities during the implementation of the crisis protocols, including by providing an accessible description of these protocols.
5. If the Commission considers that a crisis protocol fails to effectively address the crisis situation, or to safeguard the exercise of fundamental rights as referred to in point (e) of paragraph 4, it shall request the participants to revise the crisis protocol, including by taking additional measures.

Chapter IV – Implementation, cooperation, sanctions and enforcement

Article 38 Competent authorities and Digital Services Coordinators

1. Member States shall designate one or more competent authorities as responsible for the application and enforcement of this Regulation ('competent authorities').
2. Member States shall designate one of the competent authorities as their Digital Services Coordinator. The Digital Services Coordinator shall be responsible for all matters relating to application and enforcement of this Regulation in that Member State, unless the Member State concerned has assigned certain specific tasks or sectors to other competent authorities. The Digital Services Coordinator shall in any event be responsible for ensuring coordination at national level in respect of those matters and for contributing to the effective and consistent application and enforcement of this Regulation throughout the Union. For that purpose, Digital Services Coordinators shall cooperate with each other, other national competent authorities, the Board and the Commission, without prejudice to the possibility for Member States to provide for regular exchanges of views with other authorities where relevant for the performance of the tasks of those other authorities and of the Digital Services Coordinator. Where a Member State designates more than one competent authority in addition to the Digital Services Coordinator, it shall ensure that the respective tasks of those authorities and of the Digital Services Coordinator are clearly defined and that they cooperate closely and effectively when performing their tasks. The Member State concerned shall communicate the name of the other competent authorities as well as their respective tasks to the Commission and the Board.
3. Member States shall designate the Digital Services Coordinators within two months from the date of entry into force of this Regulation. Member States shall make publicly available, and communicate to the Commission and the Board, the name of their competent authority designated as Digital Services Coordinator and information on how it can be contacted.
4. The requirements applicable to Digital Services Coordinators set out in Articles 39, 40 and 41 shall also apply to any other competent authorities that the Member States designate pursuant to paragraph 1.
4a. Member States shall ensure that the competent authorities referred to in paragraph 1, and in particular their Digital Services Coordinators, have adequate technical, financial and human resources to carry out their tasks under this Regulation.

Article 39 Requirements for Digital Services Coordinators

1. Member States shall ensure that their Digital Services Coordinators perform their tasks under this Regulation in an impartial, transparent and timely manner. Member States shall ensure that their Digital Services Coordinators have adequate technical, financial and human resources to carry out their tasks.
2. When carrying out their tasks and exercising their powers in accordance with this Regulation, the Digital Services Coordinators shall act with complete independence. They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party.
3. Paragraph 2 is without prejudice to the tasks of Digital Services Coordinators within the system of supervision and enforcement provided for in this Regulation and the cooperation with other competent authorities in accordance with Article 38(2). Paragraph 2 shall not prevent supervision of the authorities concerned in accordance with national constitutional law.

Article 40 Jurisdiction

1 The Member State in which the main establishment of the provider of intermediary services is located shall have jurisdiction for the purposes of Chapters III and IV. 2 A provider of intermediary services which does not have an establishment in the Union but which offers services in the Union shall, for the purposes of Chapters III and IV, be deemed to be under the jurisdiction of the Member State where its legal representative resides or is established. 3 Where a provider of intermediary services fails to appoint a legal representative in accordance with Article 11, all Member States shall have jurisdiction for the purposes of Chapters III and IV. Where a Member State decides to exercise jurisdiction under this paragraph, it shall inform all other Member States and ensure that the principle of ne bis in idem is respected. 4 Paragraphs 1, 2 and 3 are without prejudice to the second subparagraph of Article 50(4) and the second subparagraph of Article 51(2) and the tasks and powers of the Commission under Section 3.

Article 41 Powers of Digital Services Coordinators

1 Where needed for carrying out their tasks, Digital Services Coordinators shall have at least the following powers of investigation, in respect of conduct by providers of intermediary services under the jurisdiction of their Member State: (a) the power to require those providers, as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to a suspected infringement of this Regulation, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period; (b) the power to carry out on-site inspections of any premises that those providers or those persons use for purposes related to their trade, business, craft or profession, or to request other public authorities to do so, in order to examine, seize, take or obtain copies of information relating to a suspected infringement in any form, irrespective of the storage medium; (c) the power to ask any member of staff or representative of those providers or those persons to give explanations without undue delay in respect of any information relating to a suspected infringement and to record the answers.
2 Where needed for carrying out their tasks, Digital Services Coordinators shall have at least the following enforcement powers, in respect of providers of intermediary services under the jurisdiction of their Member State: (a) the power to accept the commitments offered by those providers in relation to their compliance with this Regulation and to make those commitments binding; (b) the power to order the cessation of infringements and, where appropriate, to impose remedies proportionate to the infringement and necessary to bring the infringement effectively to an end; (c) the power to impose fines in accordance with Article 42 for failure to comply with this Regulation, including with any of the orders issued pursuant to paragraph 1; (d) the power to impose a periodic penalty payment in accordance with Article 42 to ensure that an infringement is terminated in compliance with an order issued pursuant to point (b) of this paragraph or for failure to comply with any of the orders issued pursuant to paragraph 1; (e) the power to adopt interim measures, or to request the relevant judicial authority to do so, to avoid the risk of serious harm. As regards points (c) and (d) of the first subparagraph, Digital Services Coordinators shall also have the enforcement powers set out in those points in respect of the other persons referred to in paragraph 1 for failure to comply with any of the orders issued to them pursuant to that paragraph. They shall only exercise those enforcement powers after having provided those other persons in good time with all relevant information relating to such orders, including the applicable time period, the fines or periodic payments that may be imposed for failure to comply and redress possibilities. 3
Where needed for carrying out their tasks, Digital Services Coordinators shall also have, in respect of providers of intermediary services under the jurisdiction of their Member State, where all other powers pursuant to this Article to bring about the cessation of an infringement have been exhausted, the infringement persists or is continuously repeated and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the power to take the following measures: (a) require the management body of the providers, within a reasonable time period which shall in any case not exceed three months, to examine the situation, adopt and submit an action plan setting out the necessary measures to terminate the infringement, ensure that the provider takes those measures, and report on the measures taken; (b) where the Digital Services Coordinator considers that the provider has not sufficiently complied with the requirements of the first indent, that the infringement persists or is continuously repeated and causes serious harm, and that the infringement entails a serious criminal offence involving a threat to the life or safety of persons, request the competent judicial authority of that Member State to order the temporary restriction of access of recipients to the service concerned by the infringement or, only where that is not technically feasible, restriction of access to the online interface of the provider of intermediary services on which the infringement takes place. The Digital Services Coordinator shall, except where it acts upon the Commission’s request referred to in Article 65, prior to submitting the request referred to in point (b) of the first subparagraph, invite interested parties to submit written observations within a time period that shall not be less than two weeks, describing the measures that it intends to request and identifying the intended addressee or addressees thereof.
The provider, the intended addressee or addressees and any other third party demonstrating a legitimate interest shall be entitled to participate in the proceedings before the competent judicial authority. Any measure ordered shall be proportionate to the nature, gravity, recurrence and duration of the infringement, without unduly restricting access to lawful information by recipients of the service concerned. The restriction shall be for a period of four weeks, subject to the possibility for the competent judicial authority, in its order, to allow the Digital Services Coordinator to extend that period for further periods of the same length, subject to a maximum number of extensions set by that judicial authority. The Digital Services Coordinator shall only extend the period where it considers, having regard to the rights and interests of all parties affected by the restriction and all relevant circumstances, including any information that the provider, the addressee or addressees and any other third party that demonstrated a legitimate interest may provide to it, that both of the following conditions have been met: (a) the provider has failed to take the necessary measures to terminate the infringement; (b) the temporary restriction does not unduly restrict access to lawful information by recipients of the service, having regard to the number of recipients affected and whether any adequate and readily accessible alternatives exist. Where the Digital Services Coordinator considers that those two conditions have been met but it cannot further extend the period pursuant to the third subparagraph, it shall submit a new request to the competent judicial authority, as referred to in point (b) of the first subparagraph. 4 The powers listed in paragraphs 1, 2 and 3 are without prejudice to Section 3.
5 The measures taken by the Digital Services Coordinators in the exercise of their powers listed in paragraphs 1, 2 and 3 shall be effective, dissuasive and proportionate, having regard, in particular, to the nature, gravity, recurrence and duration of the infringement or suspected infringement to which those measures relate, as well as the economic, technical and operational capacity of the provider of the intermediary services concerned where relevant. 6 Member States shall ensure that any exercise of the powers pursuant to paragraphs 1, 2 and 3 is subject to adequate safeguards laid down in the applicable national law in conformity with the Charter and with the general principles of Union law. In particular, those measures shall only be taken in accordance with the right to respect for private life and the rights of defence, including the rights to be heard and of access to the file, and subject to the right to an effective judicial remedy of all affected parties. 6a The Commission shall publish guidelines by [six months after the entry into force of this Regulation] on the exercise of the powers listed in paragraphs 1, 2 and 3 by Digital Services Coordinators.

Article 42 Penalties

1 Member States shall lay down the rules on penalties applicable to infringements of this Regulation by providers of intermediary services under their jurisdiction and shall take all the necessary measures to ensure that they are implemented in accordance with Article 41. 2 Penalties shall be effective, proportionate and dissuasive. Member States shall notify the Commission and the Board of those rules and of those measures and shall notify it, without delay, of any subsequent amendments affecting them. 3 Member States shall ensure that the maximum amount of penalties imposed for a failure to comply with the obligations laid down in this Regulation shall not exceed 6 % of the annual income or worldwide turnover of the provider of intermediary services concerned. Penalties for the supply of incorrect, incomplete or misleading information, failure to reply or rectify incorrect, incomplete or misleading information and failure to submit to an on-site inspection shall not exceed 1 % of the annual income or worldwide turnover of the provider concerned. 4 Member States shall ensure that the maximum amount of a periodic penalty payment shall not exceed 5 % of the average daily worldwide turnover of the provider of intermediary services concerned in the preceding financial year per day, calculated from the date specified in the decision concerned. 4a Member States shall ensure that administrative or judicial authorities issuing orders pursuant to Articles 8 and 9 shall only issue penalties or fines in line with this Article.
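The ceilings in paragraphs 3 and 4 can be illustrated with a short calculation. This is a hypothetical sketch only: the turnover figure is invented, and the Regulation leaves it to Member States whether the ceiling is computed on annual income or on worldwide turnover, so using turnover here is an assumption for the example.

```python
def max_fine(annual_worldwide_turnover: float, info_offence: bool = False) -> float:
    """Ceiling under Article 42(3): 6 % of annual worldwide turnover,
    or 1 % where the infringement concerns incorrect, incomplete or
    misleading information, or failure to submit to an on-site inspection."""
    rate = 0.01 if info_offence else 0.06
    return annual_worldwide_turnover * rate

def max_periodic_penalty(avg_daily_worldwide_turnover: float, days: int) -> float:
    """Ceiling under Article 42(4): 5 % of average daily worldwide turnover
    in the preceding financial year, per day of non-compliance."""
    return avg_daily_worldwide_turnover * 0.05 * days

# Hypothetical provider with EUR 1 billion annual worldwide turnover
annual = 1_000_000_000.0
print(max_fine(annual))                     # 6 % ceiling
print(max_fine(annual, info_offence=True))  # 1 % ceiling
print(max_periodic_penalty(annual / 365, days=10))
```

For this hypothetical provider the 6 % ceiling is EUR 60 million, the 1 % ceiling EUR 10 million, and ten days of periodic penalty payments are capped at roughly EUR 1.37 million.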

Article 43 Right to lodge a complaint

1. Recipients of the service shall have the right to lodge a complaint against providers of intermediary services alleging an infringement of this Regulation with the Digital Services Coordinator of the Member State where the recipient resides or is established. During these proceedings, both parties shall have the right to be heard and to receive appropriate information about the status of the proceedings. The Digital Services Coordinator shall assess the complaint and, where appropriate, transmit it to the Digital Services Coordinator of establishment without undue delay. Where the complaint falls under the responsibility of another competent authority in its Member State, the Digital Services Coordinator receiving the complaint shall transmit it to that authority without undue delay. 1a. Upon receipt of the complaint transmitted pursuant to paragraph 1, the Digital Services Coordinator of establishment shall assess the matter in a timely manner and shall inform the Digital Services Coordinator of the Member State where the recipient resides or is established, within six months, whether it intends to proceed with an investigation. If it opens an investigation, it shall provide an update at least every three months. The Digital Services Coordinator of the Member State where the recipient resides or is established shall inform the recipient accordingly.


Article 43a Compensation Without prejudice to Article 5, recipients of the service shall have the right to seek, in accordance with relevant Union and national law, compensation from providers of intermediary services for any direct damage or loss suffered due to an infringement by those providers of the obligations established under this Regulation.

Article 44 Activity reports

1 Digital Services Coordinators shall draw up an annual report on their activities under this Regulation. They shall make the annual reports available to the public in a standardised and machine-readable format, and shall communicate them to the Commission and to the Board. 2 The annual report shall include at least the following information: (a) the number and subject matter of orders to act against illegal content and orders to provide information issued in accordance with Articles 8 and 9 by any national judicial or administrative authority of the Member State of the Digital Services Coordinator concerned, including information on the name of the issuing authority, the name of the provider and the type of action specified in the order, as well as a justification that the order complies with Article 3 of Directive 2000/31/EC; (b) the effects given to those orders, as communicated to the Digital Services Coordinator pursuant to Articles 8 and 9, and the number of appeals made against those orders, as well as the outcome of those appeals. 2a The Commission shall make publicly available a biennial report analysing the annual reports communicated pursuant to paragraph 1 and shall submit it to the European Parliament and the Council. 3 Where a Member State has designated several competent authorities pursuant to Article 38, it shall ensure that the Digital Services Coordinator draws up a single report covering the activities of all competent authorities and that the Digital Services Coordinator receives all relevant information and support needed to that effect from the other competent authorities concerned.

Article 45 Cross-border cooperation among Digital Services Coordinators

1 Where a Digital Services Coordinator has reasons to suspect that a provider of an intermediary service, not under the jurisdiction of the Member State concerned, infringed this Regulation, it shall request the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. Where the Board has reasons to suspect that a provider of intermediary services infringed this Regulation in a manner involving at least three Member States, it may recommend the Digital Services Coordinator of establishment to assess the matter and take the necessary investigatory and enforcement measures to ensure compliance with this Regulation. 2 A request or recommendation pursuant to paragraph 1 shall at least indicate: (a) the point of contact of the provider of the intermediary services concerned as provided for in Article 10; (b) a description of the relevant facts, the provisions of this Regulation concerned and the reasons why the Digital Services Coordinator that sent the request, or the Board, suspects that the provider infringed this Regulation; (c) any other information that the Digital Services Coordinator that sent the request, or the Board, considers relevant, including, where appropriate, information gathered on its own initiative or suggestions for specific investigatory or enforcement measures to be taken, including interim measures. 2a A request pursuant to paragraph 1 shall at the same time be communicated to the Commission. Where the Commission believes that the request is not justified, or where the Commission is currently taking action on the same matter, the Commission can ask for the request to be withdrawn. 3 The Digital Services Coordinator of establishment shall take utmost account of the request or recommendation pursuant to paragraph 1.
Where it considers that it has insufficient information to act upon the request or recommendation and has reasons to consider that the Digital Services Coordinator that sent the request, or the Board, could provide additional information, it may request such information. The time period laid down in paragraph 4 shall be suspended until that additional information is provided. 4 The Digital Services Coordinator of establishment shall, without undue delay and in any event not later than two months following receipt of the request or recommendation, communicate to the Digital Services Coordinator that sent the request, or the Board, its assessment of the suspected infringement, or that of any other competent authority pursuant to national law where relevant, and an explanation of any investigatory or enforcement measures taken or envisaged in relation thereto to ensure compliance with this Regulation. 5 Where the Digital Services Coordinator that sent the request, or, where appropriate, the Board, did not receive a reply within the time period laid down in paragraph 4 or where it does not agree with the assessment of the Digital Services Coordinator of establishment, it may refer the matter to the Commission, providing all relevant information. That information shall include at least the request or recommendation sent to the Digital Services Coordinator of establishment, any additional information provided pursuant to paragraph 3 and the communication referred to in paragraph 4. 6 The Commission shall assess the matter within three months following the referral of the matter pursuant to paragraph 5, after having consulted the Digital Services Coordinator of establishment and, unless it referred the matter itself, the Board. 7
Where, pursuant to paragraph 6, the Commission concludes that the assessment or the investigatory or enforcement measures taken or envisaged pursuant to paragraph 4 are incompatible with this Regulation, it shall request the Digital Services Coordinator of establishment to further assess the matter and take the necessary investigatory or enforcement measures to ensure compliance with this Regulation, and to inform it about those measures taken within two months from that request. This information shall also be transmitted to the Digital Services Coordinator or the Board that initiated the proceedings pursuant to paragraph 1.
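The deadline chain in Article 45 (two months for the Coordinator of establishment to communicate its assessment under paragraph 4, then three months for the Commission to assess a referral under paragraph 6) can be sketched with simple calendar arithmetic. This is an illustrative aid only; the Regulation does not prescribe how months are counted, so calendar-month addition with end-of-month clamping is an assumption here, as are the dates.

```python
from calendar import monthrange
from datetime import date

def add_months(d: date, months: int) -> date:
    """Add calendar months, clamping to the last day of shorter months."""
    y, m = divmod(d.month - 1 + months, 12)
    y += d.year
    return date(y, m + 1, min(d.day, monthrange(y, m + 1)[1]))

# Hypothetical dates for the Article 45 sequence
request_received = date(2023, 1, 31)          # request reaches the Coordinator of establishment
reply_due = add_months(request_received, 2)   # paragraph 4: assessment due within two months
referral = reply_due                          # assume referral to the Commission on expiry
commission_due = add_months(referral, 3)      # paragraph 6: Commission assessment within three months
print(reply_due, commission_due)
```

With these assumed dates, the assessment would be due by 31 March 2023 and, if the matter were referred on that day, the Commission's assessment by 30 June 2023 (February and June being shorter months, the clamping matters).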

Article 46 Joint investigations and requests for Commission intervention

1a Where a Digital Services Coordinator of establishment has reasons to suspect that a provider of intermediary services has infringed this Regulation in a manner involving at least one other Member State, it may propose to the Digital Services Coordinator of destination concerned to launch a joint investigation. The joint investigation shall be based on an agreement between the Member States concerned. 1b Upon request of the Digital Services Coordinator of destination who has reasons to suspect that a provider of intermediary services has infringed this Regulation in its Member State, the Board may recommend to the Digital Services Coordinator of establishment to launch a joint investigation with the Digital Services Coordinator of destination concerned. The joint investigation shall be based on an agreement between the Member States concerned. Where there is no agreement within one month, the joint investigation shall be under the supervision of the Digital Services Coordinator of establishment. Such joint investigations are without prejudice to the tasks and powers of the participating Digital Services Coordinators and the requirements applicable to the performance of those tasks and the exercise of those powers provided for in this Regulation. The participating Digital Services Coordinators shall make the results of the joint investigations available to other Digital Services Coordinators, the Commission and the Board through the system provided for in Article 67 for the fulfilment of their respective tasks under this Regulation.
2 Where a Digital Services Coordinator of establishment has reasons to suspect that a very large online platform infringed this Regulation, it may request the Commission to take the necessary investigatory and enforcement measures to ensure compliance with this Regulation in accordance with Section 3. Such a request shall contain all information listed in Article 45(2) and set out the reasons for requesting the Commission to intervene.

Article 47 European Board for Digital Services

1 An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established. 2 The Board shall advise the Digital Services Coordinators and the Commission in accordance with this Regulation to achieve the following objectives: (a) contributing to the consistent application of this Regulation and effective cooperation of the Digital Services Coordinators and the Commission with regard to matters covered by this Regulation; (b) coordinating and contributing to providing guidance and analysis to the Commission, Digital Services Coordinators and other competent authorities on emerging issues across the internal market with regard to matters covered by this Regulation; (ba) contributing to the effective application of Article 3 of Directive 2000/31/EC to prevent fragmentation of the digital single market; (c) assisting the Digital Services Coordinators and the Commission in the supervision of very large online platforms; (ca) contributing to effective cooperation with the competent authorities of third countries and with international organisations.

Article 48 Structure of the Board

1 The Board shall be composed of the Digital Services Coordinators, who shall be represented by high-level officials. Where provided for by national law, other competent authorities entrusted with specific operational responsibilities for the application and enforcement of this Regulation alongside the Digital Services Coordinator may participate in the Board. Other national authorities may be invited to the meetings, where the issues discussed are of relevance for them. A Board meeting shall be deemed valid where at least two thirds of its members are present. 2 Each Member State shall have one vote, to be cast by the Digital Services Coordinator. The Commission shall not have voting rights. The Board shall adopt its acts by simple majority. 3 The Board shall be chaired by the Commission. The Commission shall convene the meetings and prepare the agenda in accordance with the tasks of the Board pursuant to this Regulation and with its rules of procedure. 4 The Commission shall provide administrative and analytical support for the activities of the Board pursuant to this Regulation. 5 The Board may invite experts and observers to attend its meetings, and may cooperate with other Union bodies, offices, agencies and advisory groups, as well as external experts as appropriate. The Board shall make the results of this cooperation publicly available. 6 The Board shall adopt its rules of procedure by a two-thirds majority of its members, following the consent of the Commission.

Article 49 Tasks of the Board

1 Where necessary to meet the objectives set out in Article 47(2), the Board shall in particular: (a) support the coordination of joint investigations; (b) support the competent authorities in the analysis of reports and results of audits of very large online platforms to be transmitted pursuant to this Regulation; (c) issue opinions, recommendations or advice to Digital Services Coordinators in accordance with this Regulation; (ca) issue specific recommendations for the implementation of Article 13a; (d) advise the Commission to take the measures referred to in Article 51 and, where requested by the Commission, adopt opinions on draft Commission measures concerning very large online platforms in accordance with this Regulation; (da) monitor compliance with Article 3 of Directive 2000/31/EC of measures taken by a Member State restricting the freedom to provide services of intermediary service providers from another Member State and ensure that those measures are strictly necessary and do not restrict the application of this Regulation; (e) support and promote the development and implementation of European standards, guidelines, reports, templates and codes of conduct in close collaboration with relevant stakeholders as provided for in this Regulation, including by issuing opinions, recommendations or advice on matters related to Article 34, as well as the identification of emerging issues, with regard to matters covered by this Regulation. 2 Digital Services Coordinators and other national competent authorities that do not follow the opinions, requests or recommendations addressed to them by the Board shall provide the reasons for this choice, and an explanation of the investigations, actions and measures that they have implemented, when reporting pursuant to this Regulation or when adopting their relevant decisions, as appropriate.


Article 49a Reports 1. The Board shall draw up an annual report regarding its activities. The report shall be made public and be transmitted to the European Parliament, to the Council and to the Commission in all official languages of the Union. 2. The annual report shall include, among other information, a review of the practical application of the opinions, guidelines, recommendations, advice and any other measures taken under Article 49(1).

Article 50 Enhanced supervision for very large online platforms

1 Where the Digital Services Coordinator of establishment adopts a decision finding that a very large online platform has infringed any of the provisions of Section 4 of Chapter III, it shall make use of the enhanced supervision system laid down in this Article. It shall take utmost account of any opinion and recommendation of the Commission and the Board pursuant to this Article. The Commission acting on its own initiative, or the Board acting on its own initiative or upon request of at least three Digital Services Coordinators of destination, may, where it has reasons to suspect that a very large online platform infringed any of the provisions of Section 4 of Chapter III, recommend the Digital Services Coordinator of establishment to investigate the suspected infringement with a view to that Digital Services Coordinator adopting such a decision within a reasonable time period and no later than three months. 2 When communicating the decision referred to in the first subparagraph of paragraph 1 to the very large online platform concerned, the Digital Services Coordinator of establishment shall request it to draw up and communicate to the Digital Services Coordinator of establishment, the Commission and the Board, within one month from that decision, an action plan specifying how that platform intends to terminate or remedy the infringement. The measures set out in the action plan may include, where appropriate, participation in a code of conduct as provided for in Article 35. 3 Within one month following receipt of the action plan, the Board shall communicate its opinion on the action plan to the Digital Services Coordinator of establishment. Within one month following receipt of that opinion, that Digital Services Coordinator shall decide whether the action plan is appropriate to terminate or remedy the infringement.
Where the Digital Services Coordinator of establishment has concerns on the ability of the measures to terminate or remedy the infringement, it may request the very large online platform concerned to subject itself to an additional, independent audit to assess the effectiveness of those measures in terminating or remedying the infringement. In that case, that platform shall send the audit report to that Digital Services Coordinator, the Commission and the Board within four months from the decision referred to in the first subparagraph. When requesting such an additional audit, the Digital Services Coordinator may specify a particular audit organisation that is to carry out the audit, at the expense of the platform concerned, selected on the basis of criteria set out in Article 28(2). 4 The Digital Services Coordinator of establishment shall communicate to the Commission, the Board and the very large online platform concerned its views as to whether the very large online platform has terminated or remedied the infringement and the reasons thereof. It shall do so within the following time periods, as applicable: (a) within one month from the receipt of the audit report referred to in the second subparagraph of paragraph 3, where such an audit was performed; (b) within three months from the decision on the action plan referred to in the first subparagraph of paragraph 3, where no such audit was performed; (c) immediately upon the expiry of the time period set out in paragraph 2, where that platform failed to communicate the action plan within that time period. Pursuant to that communication, the Digital Services Coordinator of establishment shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission.

Article 51 Intervention by the Commission and opening of proceedings

1 The Commission, acting either upon the Board’s recommendation or on its own initiative after consulting the Board, may initiate proceedings in view of the possible adoption of decisions pursuant to Articles 58 and 59 in respect of the relevant conduct by the very large online platform that: (a) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment did not take any investigatory or enforcement measures, pursuant to the request of the Commission referred to in Article 45(7), upon the expiry of the time period set in that request; (b) is suspected of having infringed any of the provisions of this Regulation and the Digital Services Coordinator of establishment requested the Commission to intervene in accordance with Article 46(2), upon the reception of that request; (c) has been found to have infringed any of the provisions of Section 4 of Chapter III, upon the expiry of the relevant time period for the communication referred to in Article 50(4). 2 Where the Commission decides to initiate proceedings pursuant to paragraph 1, it shall notify all Digital Services Coordinators, the Board and the very large online platform concerned. As regards points (a) and (b) of paragraph 1, pursuant to that notification, the Digital Services Coordinator of establishment concerned shall no longer be entitled to take any investigatory or enforcement measures in respect of the relevant conduct by the very large online platform concerned, without prejudice to Article 66 or any other measures that it may take at the request of the Commission.
3 The Digital Services Coordinator referred to in Articles 45(7), 46(2) and 50(1), as applicable, shall, without undue delay upon being informed, transmit to the Commission: (a) any information that that Digital Services Coordinator exchanged relating to the infringement or the suspected infringement, as applicable, with the Board and with the very large online platform concerned; (b) the case file of that Digital Services Coordinator relating to the infringement or the suspected infringement, as applicable; (c) any other information in the possession of that Digital Services Coordinator that may be relevant to the proceedings initiated by the Commission. 4 The Board, and the Digital Services Coordinators making the request referred to in Article 45(1), shall, without undue delay upon being informed, transmit to the Commission any information in their possession that may be relevant to the proceedings initiated by the Commission.

Article 52 Requests for information

1 In order to carry out the tasks assigned to it under this Section, the Commission may by simple reasoned request or by decision require the very large online platforms concerned, their legal representatives as well as any other persons acting for purposes related to their trade, business, craft or profession that may reasonably be aware of information relating to the suspected infringement or the infringement, as applicable, including organisations performing the audits referred to in Articles 28 and 50(3), to provide such information within a reasonable time period. 2 When sending a simple request for information to the very large online platform concerned or other person referred to in Article 52(1), the Commission shall state the legal basis and the purpose of the request, specify what information is required, set the time period within which the information is to be provided, and indicate the penalties provided for in Article 59 for supplying incorrect or misleading information. 3 Where the Commission requires the very large online platform concerned or other person referred to in Article 52(1) to supply information by decision, it shall state the legal basis and the purpose of the request, specify what information is required and set the time period within which it is to be provided. It shall also indicate the penalties provided for in Article 59 and indicate or impose the periodic penalty payments provided for in Article 60. It shall further indicate the right to have the decision reviewed by the Court of Justice of the European Union. 3a The request shall include reasoning on why and how the information is necessary and proportionate to the objective pursued and why it cannot be provided by other means. 4
The owners of the very large online platform concerned or other person referred to in Article 52(1) or their representatives and, in the case of legal persons, companies or firms, or where they have no legal personality, the persons authorised to represent them by law or by their constitution shall supply the information requested on behalf of the very large online platform concerned or other person referred to in Article 52(1). Lawyers duly authorised to act may supply the information on behalf of their clients. The latter shall remain fully responsible if the information supplied is incomplete, incorrect or misleading. 5 At the request of the Commission, the Digital Services Coordinators and other competent authorities shall provide the Commission with all necessary information to carry out the tasks assigned to it under this Section.

Article 53 Power to take interviews and statements

In order to carry out the tasks assigned to it under this Section, the Commission may interview any natural or legal person which consents to being interviewed for the purpose of collecting information relating to the subject-matter of an investigation, in relation to the suspected infringement or infringement, as applicable.

Article 54 Power to conduct on-site inspections

1 In order to carry out the tasks assigned to it under this Section, the Commission may conduct on-site inspections at the premises of the very large online platform concerned or other person referred to in Article 52(1). 2 On-site inspections may also be carried out with the assistance of auditors or experts appointed by the Commission pursuant to Article 57(2). 3 During on-site inspections the Commission and auditors or experts appointed by it may require the very large online platform concerned or other person referred to in Article 52(1) to provide explanations on its organisation, functioning, IT system, algorithms, data-handling and business conduct. The Commission and auditors or experts appointed by it may address questions to key personnel of the very large online platform concerned or other person referred to in Article 52(1). 4 The very large online platform concerned or other person referred to in Article 52(1) is required to submit to an on-site inspection ordered by decision of the Commission. The decision shall specify the subject matter and purpose of the visit, set the date on which it is to begin and indicate the penalties provided for in Articles 59 and 60 and the right to have the decision reviewed by the Court of Justice of the European Union.

Article 55 Interim measures

1 In the context of proceedings which may lead to the adoption of a decision of non-compliance pursuant to Article 58(1), where there is an urgency due to the risk of serious damage for the recipients of the service, the Commission may, by decision, order proportionate interim measures in compliance with fundamental rights against the very large online platform concerned on the basis of a prima facie finding of an infringement. 2 A decision under paragraph 1 shall apply for a specified period of time and may be renewed in so far as this is necessary and appropriate.

Article 56 Commitments

1 If, during proceedings under this Section, the very large online platform concerned offers commitments to ensure compliance with the relevant provisions of this Regulation, the Commission may by decision make those commitments binding on the very large online platform concerned and declare that there are no further grounds for action. 2 The Commission, upon request or on its own initiative, shall reopen the proceedings: (a) where there has been a material change in any of the facts on which the decision was based; (b) where the very large online platform concerned acts contrary to its commitments; or (c) where the decision was based on incomplete, incorrect or misleading information provided by the very large online platform concerned or other person referred to in Article 52(1). 3 Where the Commission considers that the commitments offered by the very large online platform concerned are unable to ensure effective compliance with the relevant provisions of this Regulation, it shall reject those commitments in a reasoned decision when concluding the proceedings.

Article 57 Monitoring actions

1 For the purposes of carrying out the tasks assigned to it under this Section, the Commission may take the necessary actions to monitor the effective implementation and compliance with this Regulation by the very large online platform concerned. The Commission may also order that platform to provide access to, and explanations relating to, its databases and algorithms. 2 The actions pursuant to paragraph 1 may include the appointment of independent external experts and auditors to assist the Commission in monitoring compliance with the relevant provisions of this Regulation and to provide specific expertise or knowledge to the Commission.

Article 58 Non-compliance

1 The Commission shall adopt a non-compliance decision where it finds that the very large online platform concerned does not comply with one or more of the following: (a) the relevant provisions of this Regulation; (b) interim measures ordered pursuant to Article 55; (c) commitments made binding pursuant to Article 56. 2 Before adopting the decision pursuant to paragraph 1, the Commission shall communicate its preliminary findings to the very large online platform concerned. In the preliminary findings, the Commission shall explain the measures that it considers taking, or that it considers that the very large online platform concerned should take, in order to effectively address the preliminary findings. 3 In the decision adopted pursuant to paragraph 1 the Commission shall order the very large online platform concerned to take the necessary measures to ensure compliance with the decision pursuant to paragraph 1 within one month and to provide information on the measures that that platform intends to take to comply with the decision. 4 The very large online platform concerned shall provide the Commission with a description of the measures it has taken to ensure compliance with the decision pursuant to paragraph 1 upon their implementation. 5 Where the Commission finds that the conditions of paragraph 1 are not met, it shall close the investigation by a decision. The decision shall apply with immediate effect.

Article 59 Fines

1 In the decision pursuant to Article 58, the Commission may impose on the very large online platform concerned fines not exceeding 6 % of its total worldwide turnover in the preceding financial year where it finds that the platform, intentionally or negligently: (a) infringes the relevant provisions of this Regulation; (b) fails to comply with a decision ordering interim measures under Article 55; or (c) fails to comply with a voluntary measure made binding by a decision pursuant to Article 56. 2 The Commission may by decision and in compliance with the proportionality principle impose on the very large online platform concerned or other person referred to in Article 52(1) fines not exceeding 1 % of the total worldwide turnover in the preceding financial year, where they intentionally or negligently: (a) supply incorrect, incomplete or misleading information in response to a request pursuant to Article 52 or, when the information is requested by decision, fail to reply to the request within the set time period; (b) fail to rectify within the time period set by the Commission, incorrect, incomplete or misleading information given by a member of staff, or fail or refuse to provide complete information; (c) refuse to submit to an on-site inspection pursuant to Article 54. 3 Before adopting the decision pursuant to paragraph 2, the Commission shall communicate its preliminary findings to the very large online platform concerned or other person referred to in Article 52(1). 4 In fixing the amount of the fine, the Commission shall have regard to the nature, gravity, duration and recurrence of the infringement, any fines issued under Article 42 for the same infringement and, for fines imposed pursuant to paragraph 2, the delay caused to the proceedings.
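The two ceilings in Article 59 are simple percentage caps on the preceding financial year's total worldwide turnover. As an illustrative arithmetic sketch only — the Regulation fixes no formula beyond these upper bounds, and the turnover figure below is hypothetical — the maximum exposure can be computed as:

```python
# Illustrative sketch of the Article 59 fine ceilings. Hypothetical figures;
# the actual fine is set by the Commission below these caps, having regard
# to the criteria in Article 59(4).

def max_fine(turnover_preceding_year: float, substantive: bool) -> float:
    """Upper bound on a fine: 6 % of total worldwide turnover for substantive
    infringements (Art. 59(1)), 1 % for procedural ones such as supplying
    incorrect or misleading information (Art. 59(2))."""
    cap = 0.06 if substantive else 0.01
    return turnover_preceding_year * cap

# Hypothetical platform with EUR 10 billion turnover in the preceding year:
turnover = 10_000_000_000.0
print(max_fine(turnover, substantive=True))
print(max_fine(turnover, substantive=False))
```

Note that the cap is a ceiling, not a tariff: Article 59(4) directs the Commission to the nature, gravity, duration and recurrence of the infringement when fixing the actual amount.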

Article 60 Periodic penalty payments

1 The Commission may, by decision, impose on the very large online platform concerned or other person referred to in Article 52(1), as applicable, periodic penalty payments not exceeding 5 % of the average daily worldwide turnover in the preceding financial year per day, calculated from the date appointed by the decision, in order to compel them to: (a) supply correct and complete information in response to a decision requiring information pursuant to Article 52; (b) submit to an on-site inspection which it has ordered by decision pursuant to Article 54; (c) comply with a decision ordering interim measures pursuant to Article 55(1); (d) comply with commitments made legally binding by a decision pursuant to Article 56(1); (e) comply with a decision pursuant to Article 58(1). 2 Where the very large online platform concerned or other person referred to in Article 52(1) has satisfied the obligation which the periodic penalty payment was intended to enforce, the Commission may fix the definitive amount of the periodic penalty payment at a figure lower than that which would arise under the original decision.
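Unlike the one-off fines of Article 59, the Article 60 payment accrues per day of non-compliance, capped at 5 % of the average daily worldwide turnover. The following is an illustrative sketch under stated assumptions — all figures are hypothetical, and the 365-day averaging convention is an assumption of this example, not something the text prescribes:

```python
# Illustrative sketch of the Article 60 periodic-penalty ceiling.
# Hypothetical figures; averaging over 365 days is an assumption of this
# sketch. Under Art. 60(2) the Commission may fix a lower definitive amount
# once the obligation is satisfied.

def max_periodic_penalty(turnover_preceding_year: float,
                         days_of_non_compliance: int) -> float:
    """Upper bound: 5 % of average daily turnover, per day of non-compliance,
    counted from the date appointed by the decision."""
    average_daily_turnover = turnover_preceding_year / 365
    return 0.05 * average_daily_turnover * days_of_non_compliance

# Hypothetical: EUR 3.65 billion annual turnover, 10 days of non-compliance.
print(max_periodic_penalty(3_650_000_000.0, 10))
```

The per-day structure is the point of the provision: the ceiling grows with every day the platform withholds information, refuses an inspection, or ignores a decision, which is what makes it coercive rather than punitive.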

Article 61 Limitation period for the imposition of penalties

1 The powers conferred on the Commission by Articles 59 and 60 shall be subject to a limitation period of five years. 2 Time shall begin to run on the day on which the infringement is committed. However, in the case of continuing or repeated infringements, time shall begin to run on the day on which the infringement ceases. 3 Any action taken by the Commission or by the Digital Services Coordinator for the purpose of the investigation or proceedings in respect of an infringement shall interrupt the limitation period for the imposition of fines or periodic penalty payments. Actions which interrupt the limitation period shall include, in particular, the following: (a) requests for information by the Commission or by a Digital Services Coordinator; (b) on-site inspection; (c) the opening of a proceeding by the Commission pursuant to Article 51(2). 4 Each interruption shall start time running afresh. However, the limitation period for the imposition of fines or periodic penalty payments shall expire at the latest on the day on which a period equal to twice the limitation period has elapsed without the Commission having imposed a fine or a periodic penalty payment. That period shall be extended by the time during which the limitation period is suspended pursuant to paragraph 5. 5 The limitation period for the imposition of fines or periodic penalty payments shall be suspended for as long as the decision of the Commission is the subject of proceedings pending before the Court of Justice of the European Union.
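Article 61's mechanics are quasi-algorithmic: each interruption restarts the five-year clock, but an absolute cap of twice the limitation period (extended by any suspension) runs from the original start. A minimal sketch, with hypothetical dates and a simplifying 365-day year (the Regulation itself counts in years, not days):

```python
# Illustrative sketch of the Article 61 limitation mechanics.
# Dates are hypothetical; years are approximated as 365 days for simplicity.
from datetime import date, timedelta

LIMITATION_YEARS = 5  # Art. 61(1)

def limitation_expiry(start: date, interruptions: list[date],
                      suspension_days: int = 0) -> date:
    """Each interruption (request for information, on-site inspection,
    opening of proceedings) restarts the five-year clock (Art. 61(3)-(4)),
    but the period expires at the latest after twice the limitation period
    from the start, extended by any suspension (Art. 61(4)-(5))."""
    last_restart = max([start, *interruptions])
    running = last_restart + timedelta(days=365 * LIMITATION_YEARS)
    absolute_cap = start + timedelta(
        days=365 * 2 * LIMITATION_YEARS + suspension_days)
    return min(running, absolute_cap)

# Infringement ceases in 2020; requests for information in 2023 and 2026
# each restart the clock, yet the ten-year absolute cap still binds.
print(limitation_expiry(date(2020, 1, 1),
                        [date(2023, 6, 1), date(2026, 6, 1)]))
```

The `min` of the two dates captures the provision's balance: interruptions keep enforcement alive, while the doubling cap prevents an investigation from being prolonged indefinitely.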

Article 62 Limitation period for the enforcement of penalties

1 The power of the Commission to enforce decisions taken pursuant to Articles 59 and 60 shall be subject to a limitation period of five years. 2 Time shall begin to run on the day on which the decision becomes final. 3 The limitation period for the enforcement of penalties shall be interrupted: (a) by notification of a decision varying the original amount of the fine or periodic penalty payment or refusing an application for variation; (b) by any action of the Commission, or of a Member State acting at the request of the Commission, designed to enforce payment of the fine or periodic penalty payment. 4 Each interruption shall start time running afresh. 5 The limitation period for the enforcement of penalties shall be suspended for so long as: (a) time to pay is allowed; (b) enforcement of payment is suspended pursuant to a decision of the Court of Justice of the European Union.

Article 63 Right to be heard and access to the file

1 Before adopting a decision pursuant to Articles 58(1), 59 or 60, the Commission shall give the very large online platform concerned or other person referred to in Article 52(1) the opportunity of being heard on: (a) preliminary findings of the Commission, including any matter to which the Commission has taken objections; and (b) measures that the Commission may intend to take in view of the preliminary findings referred to in point (a). 2 The very large online platform concerned or other person referred to in Article 52(1) may submit their observations on the Commission’s preliminary findings within a reasonable time period set by the Commission in its preliminary findings, which may not be less than 14 days. 3 The Commission shall base its decisions only on objections on which the parties concerned have been able to comment. 4 The rights of defence of the parties concerned shall be fully respected in the proceedings. They shall be entitled to have access to the Commission's file under the terms of a negotiated disclosure, subject to the legitimate interest of the very large online platform concerned or other person referred to in Article 52(1) in the protection of their business secrets. The right of access to the file shall not extend to confidential information and internal documents of the Commission or Member States’ authorities. In particular, the right of access shall not extend to correspondence between the Commission and those authorities. Nothing in this paragraph shall prevent the Commission from disclosing and using information necessary to prove an infringement. 5 The information collected pursuant to Articles 52, 53 and 54 shall be used only for the purpose of this Regulation.
6 Without prejudice to the exchange and to the use of information referred to in Articles 51(3) and 52(5), the Commission, the Board, Member States’ authorities and their respective officials, servants and other persons working under their supervision, and any other natural or legal person involved, including auditors and experts appointed pursuant to Article 57(2), shall not disclose information acquired or exchanged by them pursuant to this Section and of the kind covered by the obligation of professional secrecy.

Article 64 Publication of decisions

1 The Commission shall publish the decisions it adopts pursuant to Articles 55(1), 56(1), 58, 59 and 60. Such publication shall state the names of the parties and the main content of the decision, including any penalties imposed, along with, where possible, any justified, non-confidential documents or other forms of information on which the decision is based. 2 The publication shall have regard to the rights and legitimate interests of the very large online platform concerned or other person referred to in Article 52(1) and any third parties in the protection of their confidential information.

Article 65 Requests for access restrictions and cooperation with national courts

1 Where all powers pursuant to this Article to bring about the cessation of an infringement of this Regulation have been exhausted, the infringement persists and causes serious harm which cannot be avoided through the exercise of other powers available under Union or national law, the Commission may request the Digital Services Coordinator of establishment of the very large online platform concerned to act pursuant to Article 41(3). Prior to making such request to the Digital Services Coordinator, the Commission shall invite interested parties to submit written observations within a time period that shall not be less than 14 working days, describing the measures it intends to request and identifying the intended addressee or addressees thereof. 2 Where the coherent application of this Regulation so requires, the Commission, acting on its own initiative, may submit written observations to the competent judicial authority referred to in Article 41(3). With the permission of the judicial authority in question, it may also make oral observations. For the purpose of the preparation of its observations only, the Commission may request that judicial authority to transmit or ensure the transmission to it of any documents necessary for the assessment of the case.

Article 66 Implementing acts relating to Commission intervention

1 In relation to the Commission intervention covered by this Section, the Commission may adopt implementing acts concerning the practical arrangements for: (a) the proceedings pursuant to Articles 54 and 57; (b) the hearings provided for in Article 63; (c) the negotiated disclosure of information provided for in Article 63. 2 Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70. Before the adoption of any measures pursuant to paragraph 1, the Commission shall publish a draft thereof and invite all interested parties to submit their comments within the time period set out therein, which shall not be less than one month.

Article 67 Information sharing system

1 The Commission shall establish and maintain a reliable and secure information sharing system supporting communications between Digital Services Coordinators, the Commission and the Board. 2 The Digital Services Coordinators, the Commission and the Board shall use the information sharing system for all communications pursuant to this Regulation. 3 The Commission shall adopt implementing acts laying down the practical and operational arrangements for the functioning of the information sharing system and its interoperability with other relevant systems. Those implementing acts shall be adopted in accordance with the advisory procedure referred to in Article 70.

Article 68 Representation

Without prejudice to Directive (EU) 2020/1828 of the European Parliament and of the Council, recipients of intermediary services shall have the right to mandate a body, organisation or association to exercise the rights referred to in Articles 8, 12, 13, 14, 15, 17, 18, 19, 43 and 43a on their behalf, provided the body, organisation or association meets all of the following conditions: (a) it operates on a not-for-profit basis; (b) it has been properly constituted in accordance with the law of a Member State; (c) its statutory objectives include a legitimate interest in ensuring that this Regulation is complied with.

Article 69 Exercise of the delegation

1 The power to adopt delegated acts is conferred on the Commission subject to the conditions laid down in this Article. 2 The delegation of power referred to in Articles 13a, 16, 23, 25 and 31 shall be conferred on the Commission for five years starting from [date of expected adoption of the Regulation]. The Commission shall draw up a report in respect of the delegation of power not later than nine months before the end of the five-year period. The delegation of power shall be tacitly extended for periods of an identical duration, unless the European Parliament or the Council opposes such extension not later than three months before the end of each period. 3 The delegation of power referred to in Articles 13a, 16, 23, 25 and 31 may be revoked at any time by the European Parliament or by the Council. A decision of revocation shall put an end to the delegation of power specified in that decision. It shall take effect the day following that of its publication in the Official Journal of the European Union or at a later date specified therein. It shall not affect the validity of any delegated acts already in force. 4 As soon as it adopts a delegated act, the Commission shall notify it simultaneously to the European Parliament and to the Council. 5 A delegated act adopted pursuant to Articles 13a, 16, 23, 25 and 31 shall enter into force only if no objection has been expressed by either the European Parliament or the Council within a period of four months of notification of that act to the European Parliament and the Council or if, before the expiry of that period, the European Parliament and the Council have both informed the Commission that they will not object. That period shall be extended by three months at the initiative of the European Parliament or of the Council.

Article 70 Committee

1 The Commission shall be assisted by a Digital Services Committee. That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011. 2 Where reference is made to this Article, Article 4 of Regulation (EU) No 182/2011 shall apply.

Chapter V – Final provisions

Article 71 Deletion of certain provisions of Directive 2000/31/EC

1 Articles 12 to 15 of Directive 2000/31/EC shall be deleted. 2 References to Articles 12 to 15 of Directive 2000/31/EC shall be construed as references to Articles 3, 4, 5 and 7 of this Regulation, respectively.

Article 72 Amendments to Directive 2020/XX/EC on Representative Actions for the Protection of the Collective Interests of Consumers

The following is added to Annex I: “(X) Regulation of the European Parliament and of the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC”

Article 73 Evaluation

1 By three years after the entry into force of this Regulation at the latest, and every three years thereafter, the Commission shall evaluate this Regulation and report to the European Parliament, the Council and the European Economic and Social Committee. 1a Where appropriate, the report referred to in paragraph 1 shall be accompanied by a proposal for amendment of this Regulation. 2 This report shall address in particular: (a) the application of Article 25, including with respect to the number of average monthly active recipients of the service; (b) the application of Article 11; (c) the application of Article 14; (d) the application of Articles 35 and 36. 3 In carrying out the evaluations referred to in paragraph 1, the Commission shall take into account the positions and findings of the European Parliament, the Council and other relevant bodies or sources, and shall pay specific attention to small and medium-sized enterprises and the position of new competitors. 4 By three years from the date of application of this Regulation at the latest, the Commission, after consulting the Board, shall carry out an assessment of the functioning of the Board and shall report it to the European Parliament, the Council and the European Economic and Social Committee, taking into account the first years of application of the Regulation. On the basis of the findings and taking into utmost account the opinion of the Board, that report shall, where appropriate, be accompanied by a proposal for amendment of this Regulation with regard to the structure of the Board.

Article 74 Entry into force and application

1 This Regulation shall enter into force on the twentieth day following that of its publication in the Official Journal of the European Union. 2 It shall apply from [date - six months after its entry into force].
