Cyberstalking and Online Platforms’ Due Diligence in the EU Digital Services Act

By Irene Kamara

Cyberstalking: a pattern of abusive online behaviours

Cyberstalking, the act of using electronic communication devices to create a criminal level of intimidation, harassment, and fear in one or more victims,[1] is a form of – usually gender-based – cyberviolence, with immense impacts on the physical and mental well-being of the victim. The Council of Europe Istanbul Convention on preventing and combating violence against women and domestic violence defines stalking as “the intentional conduct of repeatedly engaging in threatening conduct directed at another person, causing her or him to fear for her or his safety.”[2] The defining characteristic of cyberstalking is the repeated nature of the online harassment: it constitutes a pattern of behaviour, rather than one isolated incident.[3] Because of this aspect, while the victim may experience a continuous threat, classifying separate events by a single offender or multiple offenders as one cyberstalking offence and prosecuting it runs into several evidentiary obstacles. One such obstacle is that the victim needs to maintain records of the different events over the extended period that amounts to the cyberstalking offence. Where punishable, cyberstalking usually falls under criminal law provisions on harassment, especially in jurisdictions that have signed and ratified the Istanbul Convention of the Council of Europe. However, regulatory approaches targeting the offender are not the only strategy to mitigate cyberstalking as a phenomenon. Online platforms such as social media platforms de facto offer a means that facilitates cyberstalking, since offenders use them to engage in unwanted communication, such as threats against one or more victims, or to publicise defamatory or image-based abusive material. Several of the most popular platforms have adopted their own community standards on accepted behaviour.
For example, Meta has a policy in place on bullying and harassment,[4] in which, inter alia, the platform commits to “remove content that’s meant to degrade or shame, including, for example, claims about someone’s sexual activity.” Those policies, however, are largely voluntary measures, and their appropriateness is often not reviewed by external state actors, such as an independent supervisory authority.

Cyberstalking and the EU Digital Services Act

Since 2022, the EU has had a new Regulation in place assigning a range of responsibilities to online platforms, such as Meta, to identify and take down illegal content, including cyberstalking content. The Digital Services Act (‘DSA’)[5] aims at providing harmonised EU rules for a “safe, predictable and trusted online environment”,[6] inter alia by establishing due diligence obligations for providers of intermediary services. The DSA modernised some of the provisions of the 2000 e-Commerce Directive[7] and reinstated others, such as the provision clarifying that providers of intermediary services are under no general obligation to monitor the information on their services, nor to engage in active fact-finding to establish whether their services are being abused for illegal activity.[8]

Despite the absence of a general monitoring obligation, providers of intermediary services are subject to several obligations intended to ensure the online safety and trust of the users of their services.[9] Those due diligence obligations, explained in the next section, are centred around the concept of illegal content. The DSA defines illegal content in Article 3(h) as “any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law.” The concept of content is thus very broad, meaning any information, ‘products, services and activities’,[10] and whether this content is illegal is determined by reference to other EU or Member State law. Once information that infringes EU or Member State law is shared, publicised, transmitted, or stored, the due diligence framework established in the DSA applies to the provider of the intermediary service. Recital 12 DSA provides additional interpretative clarity as to the parameters and examples of illegal content: applicable rules might render the content itself illegal, or the content might be rendered illegal because it relates to illegal activities. Examples include the sharing of child sexual abuse material (CSAM), hate speech or terrorist content, and online stalking (cyberstalking). As a result of this broad definition, even acts or information that are not as such illegal, but relate to the illegal activity of cyberstalking, would also qualify as illegal content and would be subject to the DSA.
This is an important step towards regulating cyberstalking, and essentially towards limiting the individual acts of the cyberstalker that cause nuisance or harassment to the victim(s) and to other related targets of the offence, such as the victim’s friends, family, or work environment.

The DSA due diligence framework: placing the responsibility on online platforms?

The e-Commerce Directive already provided an obligation for information society service providers to remove or disable access to information upon obtaining knowledge of an illegal activity.[11] The DSA develops a due diligence framework, which involves service providers undertaking action in a reactive manner (e.g. once a report about an abusive image is filed with an online platform), but also in a proactive manner. The due diligence framework ensures that service providers, and especially large online platforms, have assessed the systemic risks stemming from the design and the functioning of their services.[12] The framework comprises rules relating to transparency, cooperation with law enforcement and judicial authorities, and proactive measures against misuse of the offered services. In terms of proactive measures, very large online platforms must put in place mitigation measures tailored to systemic risks and adapt their moderation processes, in particular in cases of cyberviolence, which includes cyberstalking. The risk of dissemination of CSAM is – according to Recital 80 DSA – one of the categories of such systemic risks. The mitigation measures include the expeditious removal of, or disabling of access to, the illegal content, and adapting the speed and quality of processing notices (Art. 35(1)(c) DSA). In terms of transparency, specifically for online platforms, the DSA imposes strict reporting rules as regards the use of automated moderation tools, including specification of error rates and applied safeguards,[13] but also detailed reporting of the number of suspensions of the provision of services due to misuse.[14] As regards cooperation with law enforcement and judicial authorities, all hosting providers must notify the competent authorities of a suspicion that a criminal offence threatening an individual’s safety or life is taking place. The notification threshold is quite low, since Art. 18(1) DSA requires not proven illegal behaviour, but a suspicion that such behaviour is taking place. This means that, in cases of cyberstalking, any act pointing the service provider to potentially repeated threats directed at an individual, whether directly or indirectly via friends, family, or colleagues, would require a report to the law enforcement authority.

Next steps

The DSA entered into force in 2022 but starts applying in early 2024, since the EU legislator provided a grace period for service providers within the scope of the DSA to adapt to the new set of obligations. While it should be expected that hate speech, CSAM, and copyright-infringing material will, in the first period of the DSA’s application, monopolise the focus of platforms and the related complaints and reports, the DSA will also be tested as a regulatory instrument against cyberstalking, as will the role of intermediaries, in this case online platforms, in combating such abusive online behaviour.


[1] Pittaro, M. L. (2007). Cyber stalking: An Analysis of Online Harassment and Intimidation. International Journal of Cyber Criminology, 1(2), 180–197. https://doi.org/10.5281/zenodo.18794

[2] Article 34 Council of Europe Convention on preventing and combating violence against women and domestic violence (‘Istanbul Convention’), Council of Europe Treaty Series No. 210.

[3] Vidal Verástegui, J., Romanosky, S., Blumenthal, M. S., Brothers, A., Adamson, D. M., Ligor, D. C., … & Schirmer, P. (2023). Cyberstalking: A Growing Challenge for the US Legal System.

[4] Meta, Community Standards: Bullying and Harassment, https://transparency.fb.com/policies/community-standards/bullying-harassment/

[5] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act) OJ L 277, 27.10.2022, p. 1–102.

[6] Article 1(1) DSA.

[7] Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (‘Directive on electronic commerce’) OJ L 178, 17.7.2000, p. 1–16.

[8] Read further on the prohibition of general monitoring obligations: Senftleben, Martin and Angelopoulos, Christina, The Odyssey of the Prohibition on General Monitoring Obligations on the Way to the Digital Services Act: Between Article 15 of the E-Commerce Directive and Article 17 of the Directive on Copyright in the Digital Single Market, Amsterdam/Cambridge, October 2020, https://ssrn.com/abstract=3717022

[9] Recital 41 DSA.

[10] Recital 12 DSA.

[11] Recital 46 e-Commerce Directive.

[12] Article 34 DSA.

[13] Art. 15 DSA.

[14] Art. 24 DSA.
