The responsibilities of online platforms are at the heart of the Digital Services Act (DSA), as it is through these platforms that content is distributed on the Internet.
In addition to the obligations related to the notice & action procedure and the internal complaint-handling system, discussed in the previous article, the DSA imposes further obligations on online platforms concerning the moderation of content posted on them and the transparency of online commerce.
Reporting crime
Online platforms can become a tool for the dissemination of criminal content. Therefore, regardless of the procedure for removing illegal publications, one of the content-moderation obligations imposed on online platforms is to immediately report a suspected crime (typically an attempt to publish prohibited content on the platform) to the competent law enforcement authorities. So that this obligation is not excessive, it covers only "serious crimes" that pose a threat to the life or safety of persons. The preamble of the regulation indicates that this includes, among others, offences relating to child pornography.
Verification of entrepreneurs’ identity
Platforms that enable consumers to conclude distance contracts with entrepreneurs (e.g. accommodation platforms, online stores) should ensure transparency regarding the entities offering their products or services. For this purpose, before allowing an entrepreneur to use the platform, the platform should obtain basic information allowing that entrepreneur to be identified (contact details, entry in the relevant register). Platforms should present this information to consumers in a clear and understandable manner and verify its correctness, in particular using generally available sources. The legislator makes clear in the preamble that such verification is to be proportionate and not excessively costly or time-consuming; ideally, all data should be provided by the entrepreneur himself via the platform's interface. The overriding aim of this requirement is to discourage entrepreneurs from offering products and services in breach of EU law, since they can easily be identified in the event of such a breach.
Transparency of Internet advertising
Advertisements are frequently displayed on online platforms, and which user sees a given advertisement depends largely on the profiling of data collected while they browse the web. To ensure advertising transparency, online platforms will have to inform users in real time that the content displayed to them is an advertisement and on whose behalf it is being published. In addition, platforms must indicate the main parameters used to determine the recipient to whom an advertisement is shown, i.e. why the user sees this particular advertisement and not another; this may consist, for example, in indicating the user's characteristics determined on the basis of profiling their data.
Transparency reports
Information related to the obligations imposed on platforms, e.g. with regard to out-of-court dispute resolution, should be included in the platforms' transparency reports. Platforms will also have to publish information on the number of active users on a regular basis; this is an important criterion for classifying a platform as a very large online platform.