Commission Recommendation (EU) 2023/2425 of 20 October 2023 on coordinating responses to incidents arising from the dissemination of illegal content before the full entry into application of Regulation (EU) 2022/2065 of the European Parliament and of the Council (DSA)
Digital services in the internal market: coordination of responses to incidents arising from the dissemination of illegal content
Commentary published in the Storting's EU/EØS-nytt 30.10.2023
Previously
- Commission Recommendation published in the Official Journal of the European Union 26.10.2023
Further details
BACKGROUND (from the Commission Recommendation)
(1) The world is witnessing an unprecedented period of conflict and instability, with Russia’s war of aggression against Ukraine and the terrorist attack by Hamas in Israel. With the wide reach of social media, violence and war increasingly reverberate online in the Union. The consequence has been an unprecedented increase in illegal and harmful content disseminated online, including coordinated actions to spread disinformation and misinformation throughout the Union in relation to such international crises.
(2) Online platforms, in particular, play an important role in the dissemination of information throughout the Union. On the one hand, online platforms constitute key channels of communication for Union citizens and can provide meaningful information to governments and public authorities. They facilitate public debate and the dissemination of information, opinions and ideas to the public, and influence how citizens obtain and communicate information online. On the other hand, online platforms can be misused as a means to disseminate and amplify illegal or harmful content online.
(3) With Regulation (EU) 2022/2065 of the European Parliament and of the Council, the Union has laid down ground-breaking rules to secure its online information environment, protecting vital informational freedoms, especially in times of conflict, but also requiring effective responses to the dissemination of illegal content online and threats to civic discourse, elections and public security. That Regulation contributes to the proper functioning of the internal market for intermediary services by setting out harmonised rules for a safe, predictable and trusted online environment that facilitates innovation and in which the fundamental rights enshrined in the Charter of Fundamental Rights of the European Union are effectively protected.
(4) The Regulation does so, in particular, by imposing specific due diligence obligations tailored to specific categories of intermediary service providers, and by putting in place a governance structure to ensure cooperation and coordination between the competent authorities of the Member States and the Commission in monitoring and enforcing those obligations, including the possibility of drawing up crisis protocols pursuant to Article 48.
(5) While Regulation (EU) 2022/2065 will only apply in full as from 17 February 2024, it already applies to the providers of online platforms and of online search engines which the Commission, on 25 April 2023, designated as very large online platforms and as very large online search engines pursuant to Article 33(4) of that regulation. While the Member States are only obliged to designate their Digital Services Coordinators and other national competent authorities responsible for the monitoring and enforcement of Regulation (EU) 2022/2065 by 17 February 2024, the Commission may already deploy the enforcement powers entrusted to it under Section 4 of Chapter IV of that regulation in respect of the very large online platforms and very large online search engines it designated on 25 April 2023.
(6) However, the effective monitoring and enforcement of Regulation (EU) 2022/2065 by the Commission in relation to those designated very large online platforms and very large online search engines requires the assistance of, and active cooperation with, Member State national authorities. In several instances, the provisions of Section 4 of Chapter IV of that regulation explicitly require the Commission to cooperate with the European Board for Digital Services (‘the Board’), the Digital Services Coordinators, and other national competent authorities which the Member States plan to entrust with the monitoring and enforcement of that regulation in their territory.
(7) The fact that several Member States have not yet designated their Digital Services Coordinators and that the Board has not yet been constituted complicates the monitoring and enforcement of that regulation by the Commission, prior to its full entry into force, in relation to the designated very large online platforms and very large online search engines to which Regulation (EU) 2022/2065 already applies. Nevertheless, the Commission is committed to ensuring the full effectiveness of that regulation in relation to providers of such services.
(8) By the date of adoption of this Recommendation, fewer than 10 % of Member States have formally appointed their Digital Services Coordinator. In many Member States, however, existing regulatory authorities have been preliminarily identified to assume the role of Digital Services Coordinator and the national legislative processes have been initiated. To that end, the Commission encourages the Member States, until the governance structure foreseen by Regulation (EU) 2022/2065 is fully in place, to appoint an independent authority to be part of an informal network of prospective Digital Services Coordinators. Their role is essential to identify and tackle incidents, in particular those arising from the dissemination of illegal content, which pose a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof, including incidents which risk leading to a serious threat to public security or public health in the Union or in significant parts of it. They are encouraged to meet regularly among themselves and with the Commission in an informal network to discuss such incidents arising from illegal content disseminated on very large online platforms and very large online search engines, to which that regulation already applies. Such incidents may include, in particular, the dissemination of illegal content in relation to international conflicts, acts of terrorism, public health emergencies, electoral processes, etc.
(9) The Commission also promotes the convening of specific meetings in response to an incident to achieve an agile, coordinated and proportionate response, both in the application of Regulation (EU) 2022/2065 by providers and among Union institutions and Member States, to streamline communication in urgent situations and to allow for widespread situational awareness.
(10) The Member States are also encouraged to assist the Commission in its task of monitoring and enforcing Regulation (EU) 2022/2065 in relation to designated very large online platforms and very large online search engines. In this context, the Member States are encouraged to gather evidence on the dissemination of illegal content through very large online platforms and very large online search engines on their territory and to share that evidence with the Commission so that it can properly and swiftly respond to such content.
(11) Regulation (EU) 2022/2065 does not determine whether a particular type of content qualifies as illegal content. The unlawfulness of content is determined by national laws or, where harmonised, by European rules. Several acts of Union law provide for a legal framework in respect of certain particular types of illegal content that are presented and disseminated online and harmonise what should be considered illegal across the Union. In particular, Directive (EU) 2017/541 of the European Parliament and of the Council establishes minimum rules concerning the definition of criminal offences and sanctions in the area of terrorist offences, offences related to a terrorist group and offences related to terrorist activities, as well as measures of protection of, and support and assistance to, victims of terrorism.
(12) In addition, Regulation (EU) 2021/784 of the European Parliament and of the Council defines specifically what constitutes terrorist content online, namely material that incites the commission of a terrorist offence, glorifies terrorist acts, advocates the commission of such offences, solicits others to commit or contribute to, or to participate in, activities related to terrorist offences, provides instruction on the making of several types of weapons for the purposes of terrorism, or constitutes a threat to commit a terrorist offence. It also provides the legal framework for Member States to send removal orders to hosting service providers, obliging them to remove the content within one hour. It further requires hosting service providers exposed to terrorist content to implement specific measures to prevent the exploitation of their services.
(13) Similarly, Council Framework Decision 2008/913/JHA requires Member States to criminalise several forms of intentional conduct related to public incitement to violence or hatred directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin. It also requires Member States to criminalise intentional conduct condoning, denying or grossly trivialising crimes of genocide, crimes against humanity, war crimes and crimes against peace, directed against a group of persons or a member of such a group defined by reference to race, colour, religion, descent or national or ethnic origin, when the conduct is carried out in a manner likely to incite to violence or hatred against such a group or a member of such a group.
(14) The Commission further recalls that it is already possible for competent national authorities to issue injunctions against intermediary service providers whose services are being used to disseminate illegal content online. In the current context, it is of crucial importance that the competent national authorities proceed swiftly in identifying such illegal online content and issue removal orders on the basis of their national systems. Article 9 of Regulation (EU) 2022/2065 makes clear that such orders can be issued on a cross-border basis. In view of the risk of incidents, it is of primary importance that the competent authorities collect all necessary evidence to allow effective measures against the amplification of illegal online content regarding the often horrendous crimes concerned, and make use of the powers conferred on them by the different instruments of Union law to tackle illegal content.
(15) The multiplicity of national and Union legislation and the different forms of coordination in relation to illegal content increase the need to ensure coordination between Member States in the phase leading up to the full application of Regulation (EU) 2022/2065. Taking swift and coordinated action is key to preventing illegal content, and in particular terrorist content and illegal hate speech, from circulating online, including by going viral. When action taken at national level against the amplification of illegal content online is uncoordinated, it may increase the risk of legal fragmentation and uncertainty, and increase friction and response times. Furthermore, as recognised by Regulation (EU) 2022/2065, the Commission is better placed to enforce that regulation in relation to providers of very large online platforms and of very large online search engines. With this in mind, it is desirable that Member States act in a coordinated manner in support of the eventual enforcement actions that the Commission may take when exercising its powers set out in Section 4 of Chapter IV of Regulation (EU) 2022/2065.
(16) The Commission further recalls that several voluntary cooperation frameworks exist to address the dissemination of illegal content online.
(17) Taking swift and coordinated action in crisis situations is key to preventing illegal content, and in particular terrorist content and illegal hate speech, from spreading virally online. The EU Crisis Protocol, developed in 2019 in the context of the EU Internet Forum and updated in 2023, provides a voluntary mechanism for a coordinated and rapid cross-border response by online service providers and law enforcement to a suspected crisis in the online space stemming from a terrorist or violent extremist act. The EU Crisis Protocol establishes procedures, roles and responsibilities of key actors, in particular to prevent disruption of investigations and ensure evidence gathering, and is based on voluntary cooperation among the EU Internet Forum members. Member States can activate the Protocol in consultation with Europol’s EU Internet Referral Unit (EU IRU). The EU IRU takes a leading role in the coordination between national law enforcement authorities and online service providers. Preservation of removed content is also key to allowing for the reinstatement of unduly removed content and protecting fundamental freedoms.
(18) In the context of the Code of conduct on countering illegal hate speech online, major social media platforms, some of which have been designated as very large online platforms under Regulation (EU) 2022/2065, have committed to assess and, if necessary, remove hate speech content notified to them, in the majority of cases within 24 hours; their compliance is assessed by a network of trusted flaggers. The Commission and the signatories are currently reviewing the Code of conduct, also in the context of the entry into application of Article 45 of Regulation (EU) 2022/2065, to introduce commitments that can help mitigate systemic risks and anticipate threats of waves of illegal hate speech before content has gone viral online.
(19) Regulation (EU) 2022/2065 provides for coordination mechanisms to react to emergency situations. However, as recent events demonstrate, extraordinary circumstances affecting the European digital space are already occurring before 17 February 2024. Such extraordinary circumstances, triggered by specific incidents or crises arising from the dissemination of illegal content, pose a clear risk of intimidating groups of the population and destabilising political and social structures in the Union or parts thereof. This situation requires coordinated action at Union level now, well before the application date of the relevant provisions in Regulation (EU) 2022/2065 (i.e. 17 February 2024).
(20) With regard to such emergency threats, action taken at national level against the amplification of illegal content online risks being uncoordinated, leading to legal fragmentation and uncertainty and increasing friction and response times. Furthermore, as recognised by Regulation (EU) 2022/2065, the Commission is better placed to enforce the Regulation as regards the systemic application of the rules by providers of very large online platforms and very large online search engines. With this in mind, Member States should be encouraged to act in a coordinated manner in support of the eventual enforcement actions that the Commission may take when fulfilling its role set out in Regulation (EU) 2022/2065.
(21) Involving law enforcement authorities in the planning of national responses to tackle illegal content is important so that measures taken or planned do not interfere with their work, in particular when there is an imminent threat to life.
(22) In view of the unprecedented period of conflict and instability affecting the Union, this Recommendation sets out mechanisms of preparedness, cooperation and coordination between the Commission and the Member States ahead of the full application of Regulation (EU) 2022/2065 on 17 February 2024, in a spirit of sincere cooperation, to allow a speedy transition towards the application of that Regulation and to ensure its full effectiveness from the outset. This Recommendation does not aim to replace or supplement the enforcement mechanisms or the framework of obligations laid down in Regulation (EU) 2022/2065.
(23) The Commission will assess the experience in the application of this Recommendation once it expires, i.e. when Regulation (EU) 2022/2065 becomes fully applicable.
(24) This Recommendation should apply until 17 February 2024.