The Digital Services Act: what will its impact be?
On 17 February 2024, Regulation (EU) 2022/2065 of the European Parliament and of the Council on a single market for digital services and amending Directive 2000/31/EC, better known as the Digital Services Act (DSA), became fully applicable.
The DSA applies not only to intermediary service providers but also to very large online platforms (‘VLOPs’) and very large online search engines (‘VLOSEs’), provided there is a substantial connection with the EU, regardless of the country in which they are established. A substantial connection may be understood as follows:
- If the provider has an establishment in the EU.
- If the number of recipients of the service in one or more EU Member States is significant in relation to the population of that Member State.
- If the activities are specifically targeted at one or more EU Member States, e.g. on the basis of the language or currency used, the use of a national top-level domain, etc.
Mere accessibility of the service from the EU should not, by itself, be understood as a substantial connection.
Objectives of the Digital Services Regulation
This regulation is a further effort by the European Commission to lay down rules on transparency, content moderation, disinformation and recommendation algorithms. Its objectives are, in particular:
- To protect users from online risks.
- To prevent the sale of illegal goods and services and the publication of illegal content.
- To combat disinformation and misleading advertising.
- To subject service providers to independent audits, including safeguards for the protection of minors.
- To limit the use of sensitive personal data for advertising purposes.
As regards the liability of these service providers, the existing system remains in place: they are liable for illegal content uploaded by users only from the moment they have actual knowledge of it, and bear no prior liability even if the content uploaded to their platforms is illegal or offensive.
What is now required is that platforms put in place a specific reporting channel or mechanism, a single point of contact, through which users can report illegal content so that it can be removed. Once the platform has actual knowledge of such content, it should be removed as quickly as possible.
Specifically, although the time limits may vary depending on the type of content, for illegal content that incites hatred the maximum period for removal is 24 hours from receipt of the notification.
In addition, where a platform suspends or restricts a user's account or posts, the DSA obliges it to provide a detailed, well-founded statement of reasons for the suspension or restriction, as well as the options available for contesting it.
Another major novelty of the Digital Services Act is its scope of application: it covers all platforms operating in the European Union, regardless of where they are established. In addition, specific requirements are laid down for ‘very large online platforms’, such as Meta, X and Instagram, among others.
The status of ‘very large online platform’ is conferred on platforms with more than 45 million average monthly active users in the EU. Specific due diligence obligations are established for such platforms, which can be highly burdensome and vary according to the service provider in question and the type of services offered. Penalties for non-compliance can amount to up to 6% of the provider's annual worldwide turnover.
Furthermore, to ensure compliance with the transparency requirements applicable to recommendation algorithms, the European Centre for Algorithmic Transparency was inaugurated in Seville in April 2023.
At Letslaw we are digital lawyers, experts in Information Technology Law, so we will be happy to help you with anything you need related to this area.