The responsibilities of online platforms lie at the heart of the Digital Services Act (DSA): it is through these platforms that content is distributed on the internet.
Under the DSA, online platforms are hosting service providers that, at the request of a recipient of the service, store and publicly disseminate information, unless this is merely a minor and ancillary feature of their service. Platforms therefore include social networking sites, online stores and accommodation platforms. To avoid overburdening smaller entities, the obligations applying to online platforms do not extend to micro and small enterprises within the meaning of EU regulations.
Handling of complaints
In line with the principle of proportionality adopted in the DSA, online platforms must comply not only with the obligations expressly assigned to them but also with the requirements binding on all intermediary service providers and hosting providers. Like hosting providers, platforms must enable the reporting of illegal content and remove it (the notice-and-action procedure). In addition, they must allow electronic appeals against decisions taken in response to reports of illegal content. The internal complaint-handling mechanism therefore covers the entity that posted the content, not the user who reported it; where a report is not acted upon, that user may, if necessary, appeal against the decision to a court on general terms. An alternative to the internal procedure is an appeal to an independent, expert out-of-court body whose decisions are to be binding on the platform. These bodies may be public or private entities; if they meet certain conditions, they obtain an appropriate certificate and are placed on a list kept by the European Commission.
Trusted flaggers and the right to suspend services
With regard to online platforms, the DSA also provides mechanisms to improve the notice-and-action procedure. This is primarily the role of trusted flaggers: independent experts whose reports are to be handled as a priority. They may be public or private entities that have demonstrated a proven track record of reporting illegal content in a timely and objective manner. The European Commission will keep a register of these entities, and an entity that fails to perform its role properly may lose its status. Moreover, to protect against abuse by both flaggers and users posting illegal content, platforms will be able to suspend their services to recipients who too often submit manifestly unfounded reports (i.e. reports whose groundlessness would be obvious even to a layperson) or publish manifestly illegal content.
In doing so, platforms should rely on absolute figures and assessment criteria (such as intent and the seriousness of the breach), which must be clearly set out in their terms of service.
The second part of this article will be published soon; we invite you to follow our website.